More stories

  • Decarbonizing heavy industry with thermal batteries

    Whether you’re manufacturing cement, steel, chemicals, or paper, you need a large amount of heat. Almost without exception, manufacturers around the world create that heat by burning fossil fuels.

    In an effort to clean up the industrial sector, some startups are changing manufacturing processes for specific materials. Some are even changing the materials themselves. Daniel Stack SM ’17, PhD ’21 is trying to address industrial emissions across the board by replacing the heat source.

    Since coming to MIT in 2014, Stack has worked to develop thermal batteries that use electricity to heat up a conductive version of ceramic firebricks, which have been used as heat stores and insulators for centuries. In 2021, Stack co-founded Electrified Thermal Solutions, which has since demonstrated that its firebricks can store heat efficiently for hours and discharge it by heating air or gas up to 3,272 degrees Fahrenheit — hot enough to power the most demanding industrial applications.

    Achieving temperatures north of 3,000 F represents a breakthrough for the electric heating industry, as it enables some of the world’s hardest-to-decarbonize sectors to utilize renewable energy for the first time. It also unlocks a new, low-cost model for using electricity when it’s at its cheapest and cleanest.

    “We have a global perspective at Electrified Thermal, but in the U.S. over the last five years, we’ve seen an incredible opportunity emerge in energy prices that favors flexible offtake of electricity,” Stack says. “Throughout the middle of the country, especially in the wind belt, electricity prices in many places are negative for more than 20 percent of the year, and the trend toward decreasing electricity pricing during off-peak hours is a nationwide phenomenon. Technologies like our Joule Hive Thermal Battery will enable us to access this inexpensive, clean electricity and compete head to head with fossil fuels on price for industrial heating needs, without even factoring in the positive climate impact.”

    A new approach to an old technology

    Stack’s research plans changed quickly when he joined MIT’s Department of Nuclear Science and Engineering as a master’s student in 2014.

    “I went to MIT excited to work on the next generation of nuclear reactors, but what I focused on almost from day one was how to heat up bricks,” Stack says. “It wasn’t what I expected, but when I talked to my advisor, [Principal Research Scientist] Charles Forsberg, about energy storage and why it was valuable to not just nuclear power but the entire energy transition, I realized there was no project I would rather work on.”

    Firebricks are ubiquitous, inexpensive clay bricks that have been used for millennia in fireplaces and ovens. In 2017, Forsberg and Stack co-authored a paper showing firebricks’ potential to store heat from renewable resources, but the system still used electric resistance heaters — like the metal coils in toasters and space heaters — which limited its temperature output.

    For his doctoral work, Stack worked with Forsberg to make firebricks that were electrically conductive, replacing the resistance heaters so the bricks produced the heat directly.

    “Electric heaters are your biggest limiter: They burn out too fast, they break down, they don’t get hot enough,” Stack explains. “The idea was to skip the heaters because firebricks themselves are really cheap, abundant materials that can go to flame-like temperatures and hang out there for days.”

    Forsberg and Stack were able to create conductive firebricks by tweaking the chemical composition of traditional firebricks. Electrified Thermal’s bricks are 98 percent similar to existing firebricks and are produced using the same processes, allowing existing manufacturers to make them inexpensively.

    Toward the end of his PhD program, Stack realized the invention could be commercialized. He started taking classes at the MIT Sloan School of Management and spending time at the Martin Trust Center for MIT Entrepreneurship. He also entered the StartMIT program and the I-Corps program, and received support from the U.S. Department of Energy and MIT’s Venture Mentoring Service (VMS).

    “Through the Boston ecosystem, the MIT ecosystem, and with help from the Department of Energy, we were able to launch this from the lab at MIT,” Stack says. “What we spun out was an electrically conductive firebrick, or what we refer to as an e-Brick.”

    Electrified Thermal contains its firebrick arrays in insulated, off-the-shelf metal boxes. Although the system is highly configurable depending on the end use, the company’s standard system can collect and release energy at about 5 megawatts and store about 25 megawatt-hours.

    The company has demonstrated its system’s ability to produce high temperatures and has been cycling its system at its headquarters in Medford, Massachusetts. That work has collectively earned Electrified Thermal $40 million from various Department of Energy offices to scale the technology and work with manufacturers.

    “Compared to other electric heating, we can run hotter and last longer than any other solution on the market,” Stack says. “That means replacing fossil fuels at a lot of industrial sites that couldn’t otherwise decarbonize.”

    Scaling to solve a global problem

    Electrified Thermal is engaging with hundreds of industrial companies, including manufacturers of cement, steel, glass, basic and specialty chemicals, food and beverage, and pulp and paper.

    “The industrial heating challenge affects everyone under the sun,” Stack says. “They all have fundamentally the same problem, which is getting their heat in a way that is affordable and zero carbon for the energy transition.”

    The company is currently building a megawatt-scale commercial version of its system, which it expects to be operational in the next seven months.

    “Next year will be a huge proof point to the industry,” Stack says. “We’ll be using the commercial system to showcase a variety of operating points that customers need to see, and we’re hoping to be running systems on customer sites by the end of the year. It’ll be a huge achievement and a first for electric heating because no other solution in the market can put out the kind of temperatures that we can put out.”

    By working with manufacturers to produce its firebricks and casings, Electrified Thermal hopes to be able to deploy its systems rapidly and at low cost across a massive industry.

    “From the very beginning, we engineered these e-bricks to be rapidly scalable and rapidly producible within existing supply chains and manufacturing processes,” Stack says. “If you want to decarbonize heavy industry, there will be no cheaper way than turning electricity into heat from zero-carbon electricity assets. We’re seeking to be the premier technology that unlocks those capabilities, with double-digit percentages of global energy flowing through our system as we accomplish the energy transition.”
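The economics Stack describes lend themselves to a quick back-of-the-envelope model. The sketch below uses only the 5-megawatt, 25-megawatt-hour figures from the article; the price series, the zero-price threshold, and the function names are hypothetical, and this is not Electrified Thermal's actual control logic:

```python
# Back-of-the-envelope model of a thermal battery that charges only when
# grid electricity is cheap. Illustrative sketch, not a real dispatch model.

POWER_MW = 5        # rated charge/discharge power (from the article)
CAPACITY_MWH = 25   # storage capacity (from the article)

def discharge_hours(capacity_mwh: float, power_mw: float) -> float:
    """Hours of full-power heat delivery from a full charge."""
    return capacity_mwh / power_mw

def charging_cost(hourly_prices: list[float], threshold: float = 0.0) -> float:
    """Charge at rated power in each hour priced at or below `threshold`.

    Negative prices mean the operator is *paid* to absorb electricity,
    mirroring the negative-pricing hours described in the article.
    Returns the total cost of the energy taken (negative = net payment).
    """
    stored = 0.0
    cost = 0.0
    for price in hourly_prices:
        if price <= threshold and stored < CAPACITY_MWH:
            taken = min(POWER_MW, CAPACITY_MWH - stored)  # MWh this hour
            stored += taken
            cost += taken * price
    return cost

# A full charge at 5 MW takes 25 / 5 = 5 hours.
print(discharge_hours(CAPACITY_MWH, POWER_MW))  # -> 5.0

# Hypothetical day: prices dip negative overnight, so charging is free or paid.
day = [-12.0, -8.0, -3.0, 5.0, 20.0, 35.0, 41.0, 30.0, -1.0, 4.0]
print(charging_cost(day))  # -> -120.0 (the operator is paid to charge)
```

The point of the toy: with 20-plus percent of hours priced at or below zero in parts of the wind belt, a dispatchable heat store can fill itself almost entirely during those windows.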

  • New AI tool generates realistic satellite images of future flooding

    Visualizing the potential impacts of a hurricane on people’s homes before it hits can help residents prepare and decide whether to evacuate.

    MIT scientists have developed a method that generates satellite imagery from the future to depict how a region would look after a potential flooding event. The method combines a generative artificial intelligence model with a physics-based flood model to create realistic, bird’s-eye-view images of a region, showing where flooding is likely to occur given the strength of an oncoming storm.

    As a test case, the team applied the method to Houston and generated satellite images depicting what certain locations around the city would look like after a storm comparable to Hurricane Harvey, which hit the region in 2017. The team compared these generated images with actual satellite images taken of the same regions after Harvey hit, as well as with AI-generated images produced without the physics-based flood model.

    The team’s physics-reinforced method generated satellite images of future flooding that were more realistic and accurate. The AI-only method, in contrast, generated images of flooding in places where flooding is not physically possible.

    The team’s method is a proof of concept, meant to demonstrate a case in which generative AI models can generate realistic, trustworthy content when paired with a physics-based model. In order to apply the method to other regions to depict flooding from future storms, it will need to be trained on many more satellite images to learn how flooding would look in other regions.

    “The idea is: One day, we could use this before a hurricane, where it provides an additional visualization layer for the public,” says Björn Lütjens, a postdoc in MIT’s Department of Earth, Atmospheric and Planetary Sciences, who led the research while he was a doctoral student in MIT’s Department of Aeronautics and Astronautics (AeroAstro). “One of the biggest challenges is encouraging people to evacuate when they are at risk. Maybe this could be another visualization to help increase that readiness.”

    To illustrate the potential of the new method, which they have dubbed the “Earth Intelligence Engine,” the team has made it available as an online resource for others to try.

    The researchers report their results today in the journal IEEE Transactions on Geoscience and Remote Sensing. The study’s MIT co-authors include Brandon Leshchinskiy; Aruna Sankaranarayanan; and Dava Newman, professor of AeroAstro and director of the MIT Media Lab; along with collaborators from multiple institutions.

    Generative adversarial images

    The new study is an extension of the team’s efforts to apply generative AI tools to visualize future climate scenarios.

    “Providing a hyper-local perspective of climate seems to be the most effective way to communicate our scientific results,” says Newman, the study’s senior author. “People relate to their own zip code, their local environment where their family and friends live. Providing local climate simulations becomes intuitive, personal, and relatable.”

    For this study, the authors use a conditional generative adversarial network, or GAN, a type of machine learning method that can generate realistic images using two competing, or “adversarial,” neural networks. The first, “generator,” network is trained on pairs of real data, such as satellite images before and after a hurricane. The second, “discriminator,” network is then trained to distinguish between real satellite imagery and imagery synthesized by the first network.

    Each network automatically improves its performance based on feedback from the other network. The idea, then, is that such an adversarial push and pull should ultimately produce synthetic images that are indistinguishable from the real thing. Nevertheless, GANs can still produce “hallucinations,” or factually incorrect features in an otherwise realistic image that shouldn’t be there.

    “Hallucinations can mislead viewers,” says Lütjens, who began to wonder whether such hallucinations could be avoided, such that generative AI tools could be trusted to help inform people, particularly in risk-sensitive scenarios. “We were thinking: How can we use these generative AI models in a climate-impact setting, where having trusted data sources is so important?”

    Flood hallucinations

    In their new work, the researchers considered a risk-sensitive scenario in which generative AI is tasked with creating satellite images of future flooding that could be trustworthy enough to inform decisions of how to prepare and potentially evacuate people out of harm’s way.

    Typically, policymakers can get an idea of where flooding might occur based on visualizations in the form of color-coded maps. These maps are the final product of a pipeline of physical models that usually begins with a hurricane track model, which then feeds into a wind model that simulates the pattern and strength of winds over a local region. This is combined with a flood or storm surge model that forecasts how wind might push any nearby body of water onto land. A hydraulic model then maps out where flooding will occur based on the local flood infrastructure and generates a visual, color-coded map of flood elevations over a particular region.

    “The question is: Can visualizations of satellite imagery add another level to this, that is a bit more tangible and emotionally engaging than a color-coded map of reds, yellows, and blues, while still being trustworthy?” Lütjens says.

    The team first tested how generative AI alone would produce satellite images of future flooding. They trained a GAN on actual images taken by satellites as they passed over Houston before and after Hurricane Harvey.
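The staged pipeline described above (track model, then wind model, then surge model, then hydraulic model) can be sketched as a chain of functions. Every stage here is a toy stand-in with invented coefficients, meant only to show how the stages feed one another, not how real geophysical models work:

```python
# Toy sketch of the staged flood-forecasting pipeline:
# hurricane track -> wind field -> storm surge -> hydraulic flood depth.
# All coefficients are invented for illustration.

def track_model(storm_intensity: float) -> float:
    """Stand-in for a hurricane track model: returns landfall intensity."""
    return storm_intensity

def wind_model(landfall_intensity: float) -> float:
    """Stand-in wind model: peak wind speed (m/s) scales with intensity."""
    return 20.0 * landfall_intensity

def surge_model(wind_speed: float) -> float:
    """Stand-in surge model: surge height (m) grows with wind speed."""
    return wind_speed / 20.0

def hydraulic_model(surge_height_m: float, ground_elevation_m: float) -> float:
    """Stand-in hydraulic model: flood depth is surge above local ground."""
    return max(0.0, surge_height_m - ground_elevation_m)

def flood_depth(storm_intensity: float, ground_elevation_m: float) -> float:
    """Compose the four stages, mirroring the pipeline described above."""
    wind = wind_model(track_model(storm_intensity))
    return hydraulic_model(surge_model(wind), ground_elevation_m)

print(flood_depth(4.0, 1.0))  # 4 m surge over 1 m ground -> 3.0
print(flood_depth(4.0, 5.0))  # high ground stays dry -> 0.0
```

The output of such a chain is exactly the kind of physics-derived flood extent that, in the paper, constrains what the generative model is allowed to show.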
    When they tasked the generator to produce new flood images of the same regions, they found that the images resembled typical satellite imagery, but a closer look revealed hallucinations in some images, in the form of floods where flooding should not be possible (for instance, in locations at higher elevation).

    To reduce hallucinations and increase the trustworthiness of the AI-generated images, the team paired the GAN with a physics-based flood model that incorporates real, physical parameters and phenomena, such as an approaching hurricane’s trajectory, storm surge, and flood patterns. With this physics-reinforced method, the team generated satellite images around Houston that depict the same flood extent, pixel by pixel, as forecasted by the flood model.

    “We show a tangible way to combine machine learning with physics for a use case that’s risk-sensitive, which requires us to analyze the complexity of Earth’s systems and project future actions and possible scenarios to keep people out of harm’s way,” Newman says. “We can’t wait to get our generative AI tools into the hands of decision-makers at the local community level, which could make a significant difference and perhaps save lives.”

    The research was supported, in part, by the MIT Portugal Program, the DAF-MIT Artificial Intelligence Accelerator, NASA, and Google Cloud.
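In spirit, the physics-reinforced step constrains where the generative model may paint water: generated flooding survives only where the flood model forecasts it. A minimal per-pixel sketch of that constraint (a simplification of the paper's method, with hypothetical array names) might look like:

```python
import numpy as np

# Sketch: constrain a generated flood image with a physics-based flood-extent
# mask, so water never appears where the flood model says flooding is
# impossible (e.g., at high elevation). A simplification for illustration,
# not the paper's actual architecture.

def apply_flood_mask(generated: np.ndarray,
                     pre_storm: np.ndarray,
                     physics_flood_mask: np.ndarray) -> np.ndarray:
    """Keep generated (flooded) pixels only where physics forecasts flooding;
    elsewhere fall back to the pre-storm image."""
    mask = physics_flood_mask.astype(bool)[..., None]  # broadcast over RGB
    return np.where(mask, generated, pre_storm)

# Toy 2x2 RGB scene: the GAN "hallucinated" flooding everywhere,
# but physics says only the left column can flood.
pre = np.zeros((2, 2, 3))   # pre-storm imagery
gen = np.ones((2, 2, 3))    # generated imagery, flooded everywhere
physics = np.array([[1, 0],
                    [1, 0]])

out = apply_flood_mask(gen, pre, physics)
print(out[:, 0].sum())  # left column keeps generated flood -> 6.0
print(out[:, 1].sum())  # right column reverts to pre-storm -> 0.0
```

The pixel-by-pixel agreement with the flood model that the article describes is exactly what a constraint of this kind enforces, though the real system integrates it into training rather than as a post-hoc mask.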

  • A vision for U.S. science success

    White House science advisor Arati Prabhakar expressed confidence in U.S. science and technology capacities during a talk on Wednesday about major issues the country must tackle.

    “Let me start with the purpose of science and technology and innovation, which is to open possibilities so that we can achieve our great aspirations,” said Prabhakar, who is the director of the Office of Science and Technology Policy (OSTP) and a co-chair of the President’s Council of Advisors on Science and Technology (PCAST). “The aspirations that we have as a country today are as great as they have ever been,” she added.

    Much of Prabhakar’s talk focused on three major issues in science and technology development: cancer prevention, climate change, and AI. In the process, she also emphasized the necessity for the U.S. to sustain its global leadership in research across domains of science and technology, which she called “one of America’s long-time strengths.”

    “Ever since the end of the Second World War, we said we’re going in on basic research, we’re going to build our universities’ capacity to do it, we have an unparalleled basic research capacity, and we should always have that,” said Prabhakar.

    “We have gotten better, I think, in recent years at commercializing technology from our basic research,” Prabhakar added, noting, “Capital moves when you can see profit and growth.” The Biden administration, she said, has invested in a variety of new ways for the public and private sectors to work together to massively accelerate the movement of technology into the market.

    Wednesday’s talk drew a capacity audience of nearly 300 people in MIT’s Wong Auditorium and was hosted by the Manufacturing@MIT Working Group. The event included introductory remarks by Suzanne Berger, an Institute Professor and a longtime expert on the innovation economy, and Nergis Mavalvala, dean of the School of Science and an astrophysicist and leader in gravitational-wave detection.

    Introducing Mavalvala, Berger said the 2015 announcement of the discovery of gravitational waves “was the day I felt proudest and most elated to be a member of the MIT community,” and noted that U.S. government support helped make the research possible. Mavalvala, in turn, said MIT was “especially honored” to hear Prabhakar discuss leading-edge research and acknowledge the role of universities in strengthening the country’s science and technology sectors.

    Prabhakar has extensive experience in both government and the private sector. She has been OSTP director and co-chair of PCAST since October of 2022. She served as director of the Defense Advanced Research Projects Agency (DARPA) from 2012 to 2017 and director of the National Institute of Standards and Technology (NIST) from 1993 to 1997. She has also held executive positions at Raychem and Interval Research, and spent a decade at the investment firm U.S. Venture Partners. An engineer by training, Prabhakar earned a BS in electrical engineering from Texas Tech University in 1979, an MA in electrical engineering from Caltech in 1980, and a PhD in applied physics from Caltech in 1984.

    Among other remarks about medicine, Prabhakar touted the Biden administration’s “Cancer Moonshot” program, which aims to cut the cancer death rate in half over the next 25 years through multiple approaches, from better health care provision and cancer detection to limiting public exposure to carcinogens. We should be striving, Prabhakar said, for “a future in which people take good health for granted and can get on with their lives.”

    On AI, she heralded both the promise of and concerns about the technology, saying, “I think it’s time for active steps to get on a path to where it actually allows people to do more and earn more.”

    When it comes to climate change, Prabhakar said, “We all understand that the climate is going to change. But it’s in our hands how severe those changes get. And it’s possible that we can build a better future.” She noted the bipartisan infrastructure bill signed into law in 2021 and the Biden administration’s Inflation Reduction Act as important steps forward in this fight.

    “Together those are making the single biggest investment anyone anywhere on the planet has ever made in the clean energy transition,” she said. “I used to feel hopeless about our ability to do that, and it gives me tremendous hope.”

    After her talk, Prabhakar was joined onstage for a group discussion with the three co-presidents of the MIT Energy and Climate Club: Laurentiu Anton, a doctoral candidate in electrical engineering and computer science; Rosie Keller, an MBA candidate at the MIT Sloan School of Management; and Thomas Lee, a doctoral candidate in MIT’s Institute for Data, Systems, and Society.

    Asked about the seemingly sagging public confidence in science today, Prabhakar offered a few thoughts.

    “The first thing I would say is, don’t take it personally,” Prabhakar said, noting that any dip in public regard for science is less severe than the diminished public confidence in other institutions.

    Adding some levity, she observed that in polling about which occupations are regarded as desirable for a marriage partner to have, “scientist” still ranks highly. “Scientists still do really well on that front, we’ve got that going for us,” she quipped.

    More seriously, Prabhakar observed, rather than “preaching” at the public, scientists should recognize that “part of the job for us is to continue to be clear about what we know are the facts, and to present them clearly but humbly, and to be clear that we’re going to continue working to learn more.” At the same time, she continued, scientists can always reinforce that “oh, by the way, facts are helpful things that can actually help you make better choices about how the future turns out. I think that would be better in my view.”

    Prabhakar said that her White House work had been guided, in part, by one of the overarching themes that President Biden has often reinforced.

    “He thinks about America as a nation that can be described in a single word, and that word is ‘possibilities,’” she said. “And that idea, that is such a big idea, it lights me up. I think of what we do in the world of science and technology and innovation as really part and parcel of creating those possibilities.”

    Ultimately, Prabhakar said, at all times and all points in American history, scientists and technologists must continue “to prove once more that when people come together and do this work … we do it in a way that builds opportunity and expands opportunity for everyone in our country. I think this is the great privilege we all have in the work we do, and it’s also our responsibility.”

  • Catherine Wolfram: High-energy scholar

    In the mid-2000s, Catherine Wolfram PhD ’96 reached what she calls “an inflection point” in her career. After about a decade of studying U.S. electricity markets, she had come to recognize that “you couldn’t study the energy industries without thinking about climate mitigation,” as she puts it.

    At the same time, Wolfram understood that the trajectory of energy use in the developing world was a massively important part of the climate picture. To get a comprehensive grasp on global dynamics, she says, “I realized I needed to start thinking about the rest of the world.”

    An accomplished scholar and policy expert, Wolfram has been on the faculty at Harvard University, the University of California at Berkeley — and now MIT, where she is the William Barton Rogers Professor in Energy. She has also served as deputy assistant secretary for climate and energy economics at the U.S. Treasury.

    Yet even leading experts want to keep learning. So, when she hit that inflection point, Wolfram started carving out a new phase of her research career.

    “One of the things I love about being an academic is, I could just decide to do that,” Wolfram says. “I didn’t need to check with a boss. I could just pivot my career to being more focused on thinking about energy in the developing world.”

    Over the last decade, Wolfram has published a wide array of original studies about energy consumption in the developing world. From Kenya to Mexico to South Asia, she has shed light on the dynamics of economic growth and energy consumption — while spending some of that time serving the government too. Last year, Wolfram joined the faculty of the MIT Sloan School of Management, where her work bolsters the Institute’s growing effort to combat climate change.

    Studying at MIT

    Wolfram largely grew up in Minnesota, where her father was a legal scholar, although he moved to Cornell University around the time she started high school. As an undergraduate, she majored in economics at Harvard University, and after graduation she worked first for a consultant, then for the Massachusetts Department of Public Utilities, the agency regulating energy rates. In the latter job, Wolfram kept noticing that people were often citing the research of two MIT scholars: Paul Joskow (now the Elizabeth and James Killian Professor of Economics Emeritus in MIT’s Department of Economics) and Richard Schmalensee (a former dean of the MIT Sloan School of Management and now the Howard W. Johnson Professor of Management Emeritus). Seeing how consequential economics research could be for policymaking, Wolfram decided to get a PhD in the field and was accepted into MIT’s doctoral program.

    “I went into graduate school with an unusually specific view of what I wanted to do,” Wolfram says. “I wanted to work with Paul Joskow and Dick Schmalensee on electricity markets, and that’s how I wound up here.”

    At MIT, Wolfram also ended up working extensively with Nancy Rose, the Charles P. Kindleberger Professor of Applied Economics and a former head of the Department of Economics, who helped oversee Wolfram’s thesis; Rose has extensively studied market regulation as well.

    Wolfram’s dissertation research largely focused on price-setting behavior in the U.K.’s newly deregulated electricity markets, which, it turned out, applied handily to the U.S., where a similar process was taking place. “I was fortunate because this was around the time California was thinking about restructuring, as it was known,” Wolfram says. She spent four years on the faculty at Harvard, then moved to UC Berkeley. Wolfram’s studies have shown that deregulation has had some medium-term benefits, for instance in making power plants operate more efficiently.

    Turning on the AC

    By around 2010, though, Wolfram began shifting her scholarly focus in earnest, conducting innovative studies about energy in the developing world. One strand of her research has centered on Kenya, to better understand how more energy access for people without electricity might fit into growth in the developing world.

    In this case, Wolfram’s perhaps surprising conclusion is that electrification itself is not a magic ticket to prosperity; people without electricity are more eager to adopt it when they have a practical economic need for it. Meanwhile, they have other essential needs that are not necessarily being addressed.

    “The 800 million people in the world who don’t have electricity also don’t have access to good health care or running water,” Wolfram says. “Giving them better housing infrastructure is important, and harder to tackle. It’s not clear that bringing people electricity alone is the single most useful thing from a development perspective. Although electricity is a super-important component of modern living.”

    Wolfram has even delved into topics such as air conditioner use in the developing world — an important driver of energy use. As her research shows, many countries, with a combined population far bigger than the U.S., are among the fastest-growing adopters of air conditioners and have an even greater need for them, based on their climates. Adoption of air conditioning within those countries is also characterized by marked economic inequality.

    From early 2021 until late 2022, Wolfram also served in the administration of President Joe Biden, where her work also centered on global energy issues. Among other things, Wolfram was part of the team working out a price-cap policy for Russian oil exports, a concept that she thinks could be applied to many other products globally. Working with countries heavily dependent on exporting energy materials, she notes, will always require careful engagement.

    “We need to be mindful of that dependence and importance as we go through this massive effort to decarbonize the energy sector and shift it to a whole new paradigm,” Wolfram says.

    At MIT again

    Still, she notes, the world does need a whole new energy paradigm, and fast. Her arrival at MIT overlaps with the emergence of a new Institute-wide effort, the Climate Project at MIT, that aims to accelerate and scale climate solutions and good climate policy, including through the new Climate Policy Center at MIT Sloan. That kind of effort, Wolfram says, matters to her.

    “It’s part of why I’ve come to MIT,” Wolfram says. “Technology will be one part of the climate solution, but I do think an innovative mindset, how can we think about doing things better, can be productively applied to climate policy.” On being at MIT, she adds: “It’s great, it’s awesome. One of the things that pleasantly surprised me is how tight-knit and friendly the MIT faculty all are, and how many interactions I’ve had with people from other departments.”

    Wolfram has also been enjoying her teaching at MIT, and will be offering a large class in spring 2025, 15.016 (Climate and Energy in the Global Economy), that she debuted this past academic year.

    “It’s super fun to have students from around the world, who have personal stories and knowledge of energy systems in their countries and can contribute to our discussions,” she says.

    When it comes to tackling climate change, many things seem daunting. But there is still a world of knowledge to be acquired while we try to keep the planet from overheating, and Wolfram has a can-do attitude about learning more and applying those lessons.

    “We’ve made a lot of progress,” Wolfram says. “But we still have a lot more to do.”

  • Advancing urban tree monitoring with AI-powered digital twins

    The Irish philosopher George Berkely, best known for his theory of immaterialism, once famously mused, “If a tree falls in a forest and no one is around to hear it, does it make a sound?”What about AI-generated trees? They probably wouldn’t make a sound, but they will be critical nonetheless for applications such as adaptation of urban flora to climate change. To that end, the novel “Tree-D Fusion” system developed by researchers at the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL), Google, and Purdue University merges AI and tree-growth models with Google’s Auto Arborist data to create accurate 3D models of existing urban trees. The project has produced the first-ever large-scale database of 600,000 environmentally aware, simulation-ready tree models across North America.“We’re bridging decades of forestry science with modern AI capabilities,” says Sara Beery, MIT electrical engineering and computer science (EECS) assistant professor, MIT CSAIL principal investigator, and a co-author on a new paper about Tree-D Fusion. “This allows us to not just identify trees in cities, but to predict how they’ll grow and impact their surroundings over time. We’re not ignoring the past 30 years of work in understanding how to build these 3D synthetic models; instead, we’re using AI to make this existing knowledge more useful across a broader set of individual trees in cities around North America, and eventually the globe.”Tree-D Fusion builds on previous urban forest monitoring efforts that used Google Street View data, but branches it forward by generating complete 3D models from single images. While earlier attempts at tree modeling were limited to specific neighborhoods, or struggled with accuracy at scale, Tree-D Fusion can create detailed models that include typically hidden features, such as the back side of trees that aren’t visible in street-view photos.The technology’s practical applications extend far beyond mere observation. 
City planners could use Tree-D Fusion to one day peer into the future, anticipating where growing branches might tangle with power lines, or identifying neighborhoods where strategic tree placement could maximize cooling effects and air quality improvements. These predictive capabilities, the team says, could change urban forest management from reactive maintenance to proactive planning.A tree grows in Brooklyn (and many other places)The researchers took a hybrid approach to their method, using deep learning to create a 3D envelope of each tree’s shape, then using traditional procedural models to simulate realistic branch and leaf patterns based on the tree’s genus. This combo helped the model predict how trees would grow under different environmental conditions and climate scenarios, such as different possible local temperatures and varying access to groundwater.Now, as cities worldwide grapple with rising temperatures, this research offers a new window into the future of urban forests. In a collaboration with MIT’s Senseable City Lab, the Purdue University and Google team is embarking on a global study that re-imagines trees as living climate shields. Their digital modeling system captures the intricate dance of shade patterns throughout the seasons, revealing how strategic urban forestry could hopefully change sweltering city blocks into more naturally cooled neighborhoods.“Every time a street mapping vehicle passes through a city now, we’re not just taking snapshots — we’re watching these urban forests evolve in real-time,” says Beery. 
“This continuous monitoring creates a living digital forest that mirrors its physical counterpart, offering cities a powerful lens to observe how environmental stresses shape tree health and growth patterns across their urban landscape.”AI-based tree modeling has emerged as an ally in the quest for environmental justice: By mapping urban tree canopy in unprecedented detail, a sister project from the Google AI for Nature team has helped uncover disparities in green space access across different socioeconomic areas. “We’re not just studying urban forests — we’re trying to cultivate more equity,” says Beery. The team is now working closely with ecologists and tree health experts to refine these models, ensuring that as cities expand their green canopies, the benefits branch out to all residents equally.It’s a breezeWhile Tree-D fusion marks some major “growth” in the field, trees can be uniquely challenging for computer vision systems. Unlike the rigid structures of buildings or vehicles that current 3D modeling techniques handle well, trees are nature’s shape-shifters — swaying in the wind, interweaving branches with neighbors, and constantly changing their form as they grow. The Tree-D fusion models are “simulation-ready” in that they can estimate the shape of the trees in the future, depending on the environmental conditions.“What makes this work exciting is how it pushes us to rethink fundamental assumptions in computer vision,” says Beery. “While 3D scene understanding techniques like photogrammetry or NeRF [neural radiance fields] excel at capturing static objects, trees demand new approaches that can account for their dynamic nature, where even a gentle breeze can dramatically alter their structure from moment to moment.”The team’s approach of creating rough structural envelopes that approximate each tree’s form has proven remarkably effective, but certain issues remain unsolved. 
Perhaps the most vexing is the “entangled tree problem”: when neighboring trees grow into each other, their intertwined branches create a puzzle that no current AI system can fully unravel.The scientists see their dataset as a springboard for future innovations in computer vision, and they’re already exploring applications beyond street view imagery, looking to extend their approach to platforms like iNaturalist and wildlife camera traps.“This marks just the beginning for Tree-D Fusion,” says Jae Joong Lee, a Purdue University PhD student who developed, implemented, and deployed the Tree-D Fusion algorithm. “Together with my collaborators, I envision expanding the platform’s capabilities to a planetary scale. Our goal is to use AI-driven insights in service of natural ecosystems — supporting biodiversity, promoting global sustainability, and ultimately, benefiting the health of our entire planet.”Beery and Lee’s co-authors are Jonathan Huang, Scaled Foundations head of AI (formerly of Google); and four others from Purdue University: PhD student Bosheng Li, Professor and Dean’s Chair of Remote Sensing Songlin Fei, Assistant Professor Raymond Yeh, and Professor and Associate Head of Computer Science Bedrich Benes. Their work is based on efforts supported by the United States Department of Agriculture’s (USDA) Natural Resources Conservation Service and is directly supported by the USDA’s National Institute of Food and Agriculture. The researchers presented their findings at the European Conference on Computer Vision this month.  More
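The procedural half of the hybrid pipeline described above, simulating genus-appropriate branch patterns inside a learned 3D envelope, is classically built on L-systems (Lindenmayer systems). The toy grammar below is purely illustrative: the axiom, rewrite rule, and symbols are my own, not Tree-D Fusion's published parameters.

```python
# Minimal L-system sketch of procedural branch generation. The grammar here
# is a hypothetical example, not the genus-specific rules used by Tree-D Fusion.

def expand(axiom: str, rules: dict, iterations: int) -> str:
    """Rewrite every symbol in parallel -- the defining step of an L-system."""
    s = axiom
    for _ in range(iterations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# "F" = grow a segment, "[" / "]" = push/pop a branch point, "+" / "-" = turn.
rules = {"F": "F[+F]F[-F]F"}
skeleton = expand("F", rules, 2)
print(len(skeleton))  # 61 symbols: each pass turns every "F" into 11 symbols
```

A renderer would then interpret the string with turtle graphics, with branch angles and segment lengths parameterized per genus, which is how procedural models produce species-plausible crowns from a compact rule set.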

  • in

    Reality check on technologies to remove carbon dioxide from the air

In 2015, 195 nations plus the European Union signed the Paris Agreement and pledged to undertake plans designed to limit the global temperature increase to 1.5 degrees Celsius. Yet in 2023, the world exceeded that target for most, if not all, of the year — calling into question the long-term feasibility of achieving that target.To do so, the world must reduce the levels of greenhouse gases in the atmosphere, and strategies for achieving levels that will “stabilize the climate” have been both proposed and adopted. Many of those strategies combine dramatic cuts in carbon dioxide (CO2) emissions with the use of direct air capture (DAC), a technology that removes CO2 from the ambient air. As a reality check, a team of researchers in the MIT Energy Initiative (MITEI) examined those strategies, and what they found was alarming: The strategies rely on overly optimistic — indeed, unrealistic — assumptions about how much CO2 could be removed by DAC. As a result, the strategies won’t perform as predicted. Nevertheless, the MITEI team recommends that work to develop the DAC technology continue so that it’s ready to help with the energy transition — even if it’s not the silver bullet that solves the world’s decarbonization challenge.DAC: The promise and the realityIncluding DAC in plans to stabilize the climate makes sense. Much work is now under way to develop DAC systems, and the technology looks promising. While companies may never run their own DAC systems, they can already buy “carbon credits” based on DAC. Today, a multibillion-dollar market exists on which entities or individuals that face high costs or excessive disruptions to reduce their own carbon emissions can pay others to take emissions-reducing actions on their behalf. Those actions can involve undertaking new renewable energy projects or “carbon-removal” initiatives such as DAC or afforestation/reforestation (planting trees in areas that have never been forested or that were forested in the past). 
DAC-based credits are especially appealing for several reasons, explains Howard Herzog, a senior research engineer at MITEI. With DAC, measuring and verifying the amount of carbon removed is straightforward; the removal is immediate, unlike with planting forests, which may take decades to have an impact; and when DAC is coupled with CO2 storage in geologic formations, the CO2 is kept out of the atmosphere essentially permanently — in contrast to, for example, sequestering it in trees, which may one day burn and release the stored CO2.Will current plans that rely on DAC be effective in stabilizing the climate in the coming years? To find out, Herzog and his colleagues Jennifer Morris and Angelo Gurgel, both MITEI principal research scientists, and Sergey Paltsev, a MITEI senior research scientist — all affiliated with the MIT Center for Sustainability Science and Strategy (CS3) — took a close look at the modeling studies on which those plans are based.Their investigation identified three unavoidable engineering challenges that together lead to a fourth challenge — high costs for removing a single ton of CO2 from the atmosphere. The details of their findings are reported in a paper published in the journal One Earth on Sept. 20.Challenge 1: Scaling upWhen it comes to removing CO2 from the air, nature presents “a major, non-negotiable challenge,” notes the MITEI team: The concentration of CO2 in the air is extremely low — just 420 parts per million, or roughly 0.04 percent. In contrast, the CO2 concentration in flue gases emitted by power plants and industrial processes ranges from 3 percent to 20 percent. Companies now use various carbon capture and sequestration (CCS) technologies to capture CO2 from their flue gases, but capturing CO2 from the air is much more difficult. 
To explain, the researchers offer the following analogy: “The difference is akin to needing to find 10 red marbles in a jar of 25,000 marbles of which 24,990 are blue [the task representing DAC] versus needing to find about 10 red marbles in a jar of 100 marbles of which 90 are blue [the task for CCS].”Given that low concentration, removing a single metric ton (tonne) of CO2 from air requires processing about 1.8 million cubic meters of air, which is roughly equivalent to the volume of 720 Olympic-sized swimming pools. And all that air must be moved across a CO2-capturing sorbent — a feat requiring large equipment. For example, one recently proposed design for capturing 1 million tonnes of CO2 per year would require an “air contactor” equivalent in size to a structure about three stories high and three miles long.Recent modeling studies project DAC deployment on the scale of 5 to 40 gigatonnes of CO2 removed per year. (A gigatonne equals 1 billion metric tonnes.) But in their paper, the researchers conclude that the likelihood of deploying DAC at the gigatonne scale is “highly uncertain.”Challenge 2: Energy requirementGiven the low concentration of CO2 in the air and the need to move large quantities of air to capture it, it’s no surprise that even the best DAC processes proposed today would consume large amounts of energy — energy that’s generally supplied by a combination of electricity and heat. Including the energy needed to compress the captured CO2 for transportation and storage, most proposed processes require an equivalent of at least 1.2 megawatt-hours of electricity for each tonne of CO2 removed.The source of that electricity is critical. For example, using coal-based electricity to drive an all-electric DAC process would generate 1.2 tonnes of CO2 for each tonne of CO2 captured. The result would be a net increase in emissions, defeating the whole purpose of the DAC. 
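Both the air-volume figure and the coal-electricity result above can be reproduced with a few lines of ideal-gas arithmetic. The roughly 72 percent capture fraction and the coal emission intensity below are illustrative assumptions of mine, not values from the MITEI paper.

```python
# Back-of-the-envelope check of the DAC figures quoted above. The capture
# fraction and coal carbon intensity are illustrative assumptions, not
# numbers taken from the MITEI paper.

CO2_MOLAR_MASS = 44.01      # g/mol
MOLAR_VOLUME = 24.06        # L/mol for an ideal gas at 20 C, 1 atm
CO2_PPM = 420e-6            # ambient CO2 concentration
CAPTURE_FRACTION = 0.72     # assumed: sorbents don't strip every CO2 molecule

mol_co2 = 1e6 / CO2_MOLAR_MASS                    # moles in 1 tonne of CO2
mol_air = mol_co2 / (CO2_PPM * CAPTURE_FRACTION)  # moles of air to process
air_m3 = mol_air * MOLAR_VOLUME / 1000            # litres -> cubic metres
print(f"{air_m3:,.0f} m^3 of air per tonne")      # ~1.8 million m^3
print(f"{air_m3 / 2500:,.0f} Olympic pools")      # pool ~2,500 m^3 -> ~720

# Why coal-powered DAC is self-defeating: 1.2 MWh per tonne captured, times
# an assumed ~1 tonne CO2 per MWh for coal generation, re-emits ~1.2 tonnes.
print(1.2 * 1.0, "tonnes emitted per tonne captured")
```

With these assumptions the arithmetic lands on the article's ~1.8 million cubic metres and ~720 pools, which is why the capture fraction was chosen in that range.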
So clearly, the energy requirement must be satisfied using either low-carbon electricity or electricity generated using fossil fuels with CCS. All-electric DAC deployed at large scale — say, 10 gigatonnes of CO2 removed annually — would require 12,000 terawatt-hours of electricity, which is more than 40 percent of total global electricity generation today.Electricity consumption is expected to grow due to increasing overall electrification of the world economy, so low-carbon electricity will be in high demand for many competing uses — for example, in power generation, transportation, industry, and building operations. Using clean electricity for DAC instead of for reducing CO2 emissions in other critical areas raises concerns about the best uses of clean electricity.Many studies assume that a DAC unit could also get energy from “waste heat” generated by some industrial process or facility nearby. In the MITEI researchers’ opinion, “that may be more wishful thinking than reality.” The heat source would need to be within a few miles of the DAC plant for transporting the heat to be economical; given its high capital cost, the DAC plant would need to run nonstop, requiring constant heat delivery; and heat at the temperature required by the DAC plant would have competing uses, for example, for heating buildings. Finally, if DAC is deployed at the gigatonne per year scale, waste heat will likely be able to provide only a small fraction of the needed energy.Challenge 3: SitingSome analysts have asserted that, because air is everywhere, DAC units can be located anywhere. But in reality, siting a DAC plant involves many complex issues. As noted above, DAC plants require significant amounts of energy, so having access to enough low-carbon energy is critical. Likewise, having nearby options for storing the removed CO2 is also critical. 
If storage sites or pipelines to such sites don’t exist, major new infrastructure will need to be built, and building new infrastructure of any kind is expensive and complicated, involving issues related to permitting, environmental justice, and public acceptability — issues that are, in the words of the researchers, “commonly underestimated in the real world and neglected in models.”Two more siting needs must be considered. First, meteorological conditions must be acceptable. By definition, any DAC unit will be exposed to the elements, and factors like temperature and humidity will affect process performance and process availability. And second, a DAC plant will require some dedicated land — though how much is unclear, as the optimal spacing of units is as yet unresolved. Like wind turbines, DAC units need to be properly spaced to ensure maximum performance such that one unit is not sucking in CO2-depleted air from another unit.Challenge 4: CostConsidering the first three challenges, the final challenge is clear: The cost per tonne of CO2 removed is inevitably high. Recent modeling studies assume DAC costs as low as $100 to $200 per tonne of CO2 removed. But the researchers found evidence suggesting far higher costs.To start, they cite typical costs for power plants and industrial sites that now use CCS to remove CO2 from their flue gases. The cost of CCS in such applications is estimated to be in the range of $50 to $150 per tonne of CO2 removed. As explained above, the far lower concentration of CO2 in the air will lead to substantially higher costs.As explained under Challenge 1, the DAC units needed to capture the required amount of air are massive. The capital cost of building them will be high, given labor, materials, permitting costs, and so on. Some estimates in the literature exceed $5,000 per tonne captured per year.Then there are the ongoing costs of energy. 
As noted under Challenge 2, removing 1 tonne of CO2 requires the equivalent of 1.2 megawatt-hours of electricity. If that electricity costs $0.10 per kilowatt-hour, the cost of just the electricity needed to remove 1 tonne of CO2 is $120. The researchers point out that assuming such a low price is “questionable,” given the expected increase in electricity demand, future competition for clean energy, and higher costs on a system dominated by renewable — but intermittent — energy sources.Then there’s the cost of storage, which is ignored in many DAC cost estimates.Clearly, many considerations show that prices of $100 to $200 per tonne are unrealistic, and assuming such low prices will distort assessments of strategies, leading them to underperform going forward.The bottom lineIn their paper, the MITEI team calls DAC a “very seductive concept.” Using DAC to suck CO2 out of the air and generate high-quality carbon-removal credits can offset reduction requirements for industries that have hard-to-abate emissions. By doing so, DAC would minimize disruptions to key parts of the world’s economy, including air travel, certain carbon-intensive industries, and agriculture. However, the world would need to generate billions of tonnes of CO2 credits at an affordable price. That prospect doesn’t look likely. 
The largest DAC plant in operation today removes just 4,000 tonnes of CO2 per year, and the price to buy the company’s carbon-removal credits on the market today is $1,500 per tonne.The researchers recognize that there is room for energy efficiency improvements in the future, but DAC units will always be subject to higher work requirements than CCS applied to power plant or industrial flue gases, and there is not a clear pathway to reducing work requirements much below the levels of current DAC technologies.Nevertheless, the researchers recommend that work to develop DAC continue “because it may be needed for meeting net-zero emissions goals, especially given the current pace of emissions.” But their paper concludes with this warning: “Given the high stakes of climate change, it is foolhardy to rely on DAC to be the hero that comes to our rescue.” More
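Stacking the per-tonne figures scattered through this article shows why the researchers call $100 to $200 per tonne unrealistic. The discount rate, plant lifetime, and storage cost below are my own illustrative assumptions, not the MITEI team's numbers.

```python
# Rough per-tonne cost stack for all-electric DAC, using figures from the
# article plus illustrative financing assumptions (discount rate, lifetime,
# and storage cost are NOT from the MITEI paper).

# Scale check: 10 gigatonnes/yr at 1.2 MWh per tonne.
twh_needed = 10e9 * 1.2 / 1e6       # MWh -> TWh
print(twh_needed, "TWh")            # 12,000 TWh, >40% of ~29,000 TWh generated globally today

# Electricity: 1.2 MWh per tonne at $0.10/kWh.
electricity = 1.2 * 1000 * 0.10     # = $120 per tonne

# Capital: $5,000 per tonne-per-year of capacity, annualized with a capital
# recovery factor (assumed 8% discount rate over a 20-year plant life).
r, years = 0.08, 20
crf = r * (1 + r) ** years / ((1 + r) ** years - 1)
capital = 5000 * crf                # ~ $510 per tonne

storage = 15                        # assumed $/tonne for geologic storage

print(round(electricity + capital + storage), "$ per tonne")  # far above $100-200
```

Even with generous assumptions, electricity alone matches the bottom of the modeled $100 to $200 range before any capital or storage cost is counted.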

  • in

    Turning automotive engines into modular chemical plants to make green fuels

Reducing methane emissions is a top priority in the fight against climate change because of its propensity to trap heat in the atmosphere: Methane is 84 times more potent than CO2 over a 20-year timescale.And yet, as the main component of natural gas, methane is also a valuable fuel and a precursor to several important chemicals. The main barrier to using methane emissions to create carbon-negative materials is that human sources of methane gas — landfills, farms, and oil and gas wells — are relatively small and spread out across large areas, while traditional chemical processing facilities are huge and centralized. That makes it prohibitively expensive to capture, transport, and convert methane gas into anything useful. As a result, most companies burn or “flare” their methane at the site where it’s emitted, seeing it as a sunk cost and an environmental liability.The MIT spinout Emvolon is taking a new approach to processing methane by repurposing automotive engines to serve as modular, cost-effective chemical plants. The company’s systems can take methane gas and produce liquid fuels like methanol and ammonia on-site; these fuels can then be used or transported in standard truck containers.“We see this as a new way of chemical manufacturing,” Emvolon co-founder and CEO Emmanuel Kasseris SM ’07, PhD ’11 says. “We’re starting with methane because methane is an abundant emission that we can use as a resource. With methane, we can solve two problems at the same time: About 15 percent of global greenhouse gas emissions come from hard-to-abate sectors that need green fuel, like shipping, aviation, heavy-duty trucks, and rail. Then another 15 percent of emissions come from distributed methane emissions like landfills and oil wells.”By using mass-produced engines and eliminating the need to invest in infrastructure like pipelines, the company says it’s making methane conversion economically attractive enough to be adopted at scale. 
The system can also take green hydrogen produced by intermittent renewables and turn it into ammonia, another fuel that can also be used to decarbonize fertilizers.“In the future, we’re going to need green fuels because you can’t electrify a large ship or plane — you have to use a high-energy-density, low-carbon-footprint, low-cost liquid fuel,” Kasseris says. “The energy resources to produce those green fuels are either distributed, as is the case with methane, or variable, like wind. So, you cannot have a massive plant [producing green fuels] that has its own zip code. You either have to be distributed or variable, and both of those approaches lend themselves to this modular design.”From a “crazy idea” to a companyKasseris first came to MIT to study mechanical engineering as a graduate student in 2004, when he worked in the Sloan Automotive Lab on a report on the future of transportation. For his PhD, he developed a novel technology for improving internal combustion engine fuel efficiency for a consortium of automotive and energy companies, which he then went to work for after graduation.Around 2014, he was approached by Leslie Bromberg ’73, PhD ’77, a serial inventor with more than 100 patents, who has been a principal research engineer in MIT’s Plasma Science and Fusion Center for nearly 50 years.“Leslie had this crazy idea of repurposing an internal combustion engine as a reactor,” Kasseris recalls. “I had looked at that while working in industry, and I liked it, but my company at the time thought the work needed more validation.”Bromberg had done that validation through a U.S. Department of Energy-funded project in which he used a diesel engine to “reform” methane — a high-pressure chemical reaction in which methane is combined with steam and oxygen to produce hydrogen. 
The work impressed Kasseris enough to bring him back to MIT as a research scientist in 2016.“We worked on that idea in addition to some other projects, and eventually it had reached the point where we decided to license the work from MIT and go full throttle,” Kasseris recalls. “It’s very easy to work with MIT’s Technology Licensing Office when you are an MIT inventor. You can get a low-cost licensing option, and you can do a lot with that, which is important for a new company. Then, once you are ready, you can finalize the license, so MIT was instrumental.”Emvolon continued working with MIT’s research community, sponsoring projects with Professor Emeritus John Heywood and participating in the MIT Venture Mentoring Service and the MIT Industrial Liaison Program.An engine-powered chemical plantAt the core of Emvolon’s system is an off-the-shelf automotive engine that runs “fuel rich” — with a higher ratio of fuel to air than what is needed for complete combustion.“That’s easy to say, but it takes a lot of [intellectual property], and that’s what was developed at MIT,” Kasseris says. “Instead of burning the methane in the gas to carbon dioxide and water, you partially burn it, or partially oxidize it, to carbon monoxide and hydrogen, which are the building blocks to synthesize a variety of chemicals.”The hydrogen and carbon monoxide are intermediate products used to synthesize different chemicals through further reactions. Those processing steps take place right next to the engine, which makes its own power. Each of Emvolon’s standalone systems fits within a 40-foot shipping container and can produce about 8 tons of methanol per day from 300,000 standard cubic feet of methane gas.The company is starting with green methanol because it’s an ideal fuel for hard-to-abate sectors such as shipping and heavy-duty transport, as well as an excellent feedstock for other high-value chemicals, such as sustainable aviation fuel. 
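The quoted throughput, about 8 tons of methanol per day from 300,000 standard cubic feet of methane, can be checked against the partial-oxidation chemistry Kasseris describes. The standard-condition molar volume and the metric-ton reading of "tons" below are my assumptions.

```python
# Sanity check of Emvolon's quoted throughput. Standard conditions (60 F,
# 1 atm) and interpreting "tons" as metric tonnes are assumptions on my part.

SCF_TO_L = 28.3168            # litres per standard cubic foot
MOLAR_VOLUME = 23.69          # L/mol for an ideal gas at 60 F (288.7 K), 1 atm
CH4, CH3OH = 16.04, 32.04     # molar masses, g/mol

mol_ch4 = 300_000 * SCF_TO_L / MOLAR_VOLUME
ch4_tonnes = mol_ch4 * CH4 / 1e6          # ~5.8 t of methane per day

# Partial oxidation, then methanol synthesis -- at best one CH3OH per CH4:
#   CH4 + 1/2 O2 -> CO + 2 H2,   then   CO + 2 H2 -> CH3OH
max_methanol = mol_ch4 * CH3OH / 1e6      # ~11.5 t/day at 100% carbon yield
print(round(max_methanol, 1), "t/day ideal;",
      round(8 / max_methanol * 100), "% implied carbon yield")
```

Under these assumptions, 8 tonnes per day implies roughly a 70 percent carbon yield, a plausible figure for a single-pass engine reformer, which is consistent with the article's numbers.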
Many shipping vessels have already converted to run on green methanol in an effort to meet decarbonization goals.This summer, the company also received a grant from the Department of Energy to adapt its process to produce clean liquid fuels from power sources like solar and wind.“We’d like to expand to other chemicals like ammonia, but also other feedstocks, such as biomass and hydrogen from renewable electricity, and we already have promising results in that direction,” Kasseris says. “We think we have a good solution for the energy transition and, in the later stages of the transition, for e-manufacturing.”A scalable approachEmvolon has already built a system capable of producing up to six barrels of green methanol a day in its 5,000-square-foot headquarters in Woburn, Massachusetts.“For chemical technologies, people talk about scale-up risk, but with an engine, if it works in a single cylinder, we know it will work in a multicylinder engine,” Kasseris says. “It’s just engineering.”Last month, Emvolon announced an agreement with Montauk Renewables to build a commercial-scale demonstration unit next to a Texas landfill that will initially produce up to 15,000 gallons of green methanol a year and later scale up to 2.5 million gallons. That project could be expanded tenfold by scaling across Montauk’s other sites.“Our whole process was designed to be a very realistic approach to the energy transition,” Kasseris says. “Our solution is designed to produce green fuels and chemicals at prices that the markets are willing to pay today, without the need for subsidies. Using the engines as chemical plants, we can get the capital expenditure per unit output close to that of a large plant, but at a modular scale that enables us to be next to low-cost feedstock. 
Furthermore, our modular systems require small investments — of $1 million to $10 million — that are quickly deployed, one at a time, within weeks, as opposed to massive chemical plants that require multiyear capital construction projects and cost hundreds of millions.” More

  • in

    Ensuring a durable transition

    To fend off the worst impacts of climate change, “we have to decarbonize, and do it even faster,” said William H. Green, director of the MIT Energy Initiative (MITEI) and Hoyt C. Hottel Professor, MIT Department of Chemical Engineering, at MITEI’s Annual Research Conference.“But how the heck do we actually achieve this goal when the United States is in the middle of a divisive election campaign, and globally, we’re facing all kinds of geopolitical conflicts, trade protectionism, weather disasters, increasing demand from developing countries building a middle class, and data centers in countries like the U.S.?”Researchers, government officials, and business leaders convened in Cambridge, Massachusetts, Sept. 25-26 to wrestle with this vexing question at the conference that was themed, “A durable energy transition: How to stay on track in the face of increasing demand and unpredictable obstacles.”“In this room we have a lot of power,” said Green, “if we work together, convey to all of society what we see as real pathways and policies to solve problems, and take collective action.”The critical role of consensus-building in driving the energy transition arose repeatedly in conference sessions, whether the topic involved developing and adopting new technologies, constructing and siting infrastructure, drafting and passing vital energy policies, or attracting and retaining a skilled workforce.Resolving conflictsThere is “blowback and a social cost” in transitioning away from fossil fuels, said Stephen Ansolabehere, the Frank G. Thompson Professor of Government at Harvard University, in a panel on the social barriers to decarbonization. “Companies need to engage differently and recognize the rights of communities,” he said.Nora DeDontney, director of development at Vineyard Offshore, described her company’s two years of outreach and negotiations to bring large cables from ocean-based wind turbines onshore.“Our motto is, ‘community first,’” she said. 
Her company works to mitigate any impacts towns might feel because of offshore wind infrastructure construction with projects, such as sewer upgrades; provides workforce training to Tribal Nations; and lays out wind turbines in a manner that provides safe and reliable areas for local fisheries.Elsa A. Olivetti, professor in the Department of Materials Science and Engineering at MIT and the lead of the Decarbonization Mission of MIT’s new Climate Project, discussed the urgent need for rapid scale-up of mineral extraction. “Estimates indicate that to electrify the vehicle fleet by 2050, about six new large copper mines need to come on line each year,” she said. To meet the demand for metals in the United States means pushing into Indigenous lands and environmentally sensitive habitats. “The timeline of permitting is not aligned with the temporal acceleration needed,” she said.Larry Susskind, the Ford Professor of Urban and Environmental Planning in the MIT Department of Urban Studies and Planning, is trying to resolve such tensions with universities playing the role of mediators. He is creating renewable energy clinics where students train to participate in emerging disputes over siting. “Talk to people before decisions are made, conduct joint fact finding, so that facilities reduce harms and share the benefits,” he said.Clean energy boom and pressureA relatively recent and unforeseen increase in demand for energy comes from data centers, which are being built by large technology companies for new offerings, such as artificial intelligence.“General energy demand was flat for 20 years — and now, boom,” said Sean James, Microsoft’s senior director of data center research. “It caught utilities flatfooted.” With the expansion of AI, the rush to provision data centers with upwards of 35 gigawatts of new (and mainly renewable) power in the near future, intensifies pressure on big companies to balance the concerns of stakeholders across multiple domains. 
Google is pursuing 24/7 carbon-free energy by 2030, said Devon Swezey, the company’s senior manager for global energy and climate.“We’re pursuing this by purchasing more and different types of clean energy locally, and accelerating technological innovation such as next-generation geothermal projects,” he said. Ferrovial Digital, which designs and constructs data centers, incorporates renewable energy into its projects, contributing to decarbonization goals and benefiting the locales where the centers are sited, said Pedro Gómez Lopez, the company’s strategy and development director. “We can create a new supply of power, taking the heat generated by a data center to residences or industries in neighborhoods through District Heating initiatives,” he said.The Inflation Reduction Act and other legislation have ramped up employment opportunities in clean energy nationwide, touching every region, including those most tied to fossil fuels. “At the start of 2024 there were about 3.5 million clean energy jobs, with ‘red’ states showing the fastest growth in clean energy jobs,” said David S. Miller, managing partner at Clean Energy Ventures. “The majority (58 percent) of new jobs in energy are now in clean energy — that transition has happened. And one-in-16 new jobs nationwide were in clean energy, with clean energy jobs growing more than three times faster than job growth economy-wide.”In this rapid expansion, the U.S. Department of Energy (DoE) is prioritizing economically marginalized places, according to Zoe Lipman, lead for good jobs and labor standards in the Office of Energy Jobs at the DoE. “The community benefit process is integrated into our funding,” she said. “We are creating the foundation of a virtuous circle,” encouraging benefits to flow to disadvantaged and energy communities, spurring workforce training partnerships, and promoting well-paid union jobs. 
“These policies incentivize proactive community and labor engagement, and deliver community benefits, both of which are key to building support for technological change.”Hydrogen opportunity and challengeWhile engagement with stakeholders helps clear the path for implementation of technology and the spread of infrastructure, there remain enormous policy, scientific, and engineering challenges to solve, said multiple conference participants. In a “fireside chat,” Prasanna V. Joshi, vice president of low-carbon-solutions technology at ExxonMobil, and Ernest J. Moniz, professor of physics and special advisor to the president at MIT, discussed efforts to replace natural gas and coal with zero-carbon hydrogen in order to reduce greenhouse gas emissions in such major industries as steel and fertilizer manufacturing.“We have gone into an era of industrial policy,” said Moniz, citing a new DoE program offering incentives to generate demand for hydrogen — more costly than conventional fossil fuels — in end-use applications. “We are going to have to transition from our current approach, which I would call carrots-and-twigs, to ultimately, carrots-and-sticks,” Moniz warned, in order to create “a self-sustaining, major, scalable, affordable hydrogen economy.”To achieve net zero emissions by 2050, ExxonMobil intends to use carbon capture and sequestration in natural gas-based hydrogen and ammonia production. Ammonia can also serve as a zero-carbon fuel. Industry is exploring burning ammonia directly in coal-fired power plants to extend the hydrogen value chain. But there are challenges. “How do you burn 100 percent ammonia?” asked Joshi. 
“That’s one of the key technology breakthroughs that’s needed.” Joshi believes that collaboration with MIT’s “ecosystem of breakthrough innovation” will be essential to breaking logjams around the hydrogen and ammonia-based industries.MIT ingenuity essentialThe energy transition is placing very different demands on different regions around the world. Take India, where today per capita power consumption is one of the lowest. But Indians “are an aspirational people … and with increasing urbanization and industrial activity, the growth in power demand is expected to triple by 2050,” said Praveer Sinha, CEO and managing director of the Tata Power Co. Ltd., in his keynote speech. For that nation, which currently relies on coal, the move to clean energy means bringing another 300 gigawatts of zero-carbon capacity online in the next five years. Sinha sees this power coming from wind, solar, and hydro, supplemented by nuclear energy.“India plans to triple nuclear power generation capacity by 2032, and is focusing on advancing small modular reactors,” said Sinha. “The country also needs the rapid deployment of storage solutions to firm up the intermittent power.” The goal is to provide reliable electricity 24/7 to a population living both in large cities and in geographically remote villages, with the help of long-range transmission lines and local microgrids. “India’s energy transition will require innovative and affordable technology solutions, and there is no better place to go than MIT, where you have the best brains, startups, and technology,” he said.These assets were on full display at the conference. 
Among them was a cluster of young businesses, including: the MIT spinout Form Energy, which has developed a 100-hour iron battery as a backstop to renewable energy sources in case of multi-day interruptions; the startup Noya, which aims for direct air capture of atmospheric CO2 using carbon-based materials; the firm Active Surfaces, with a lightweight material for putting solar photovoltaics in previously inaccessible places; Copernic Catalysts, with new chemistry for making ammonia and sustainable aviation fuel far more inexpensively than current processes; and Sesame Sustainability, a software platform spun out of MITEI that gives industries a full financial analysis of the costs and benefits of decarbonization.The pipeline of research talent extended into the undergraduate ranks, with a conference “slam” competition showcasing students’ summer research projects in areas from carbon capture using enzymes to 3D design for the coils used in fusion energy confinement.“MIT students like me are looking to be the next generation of energy leaders, looking for careers where we can apply our engineering skills to tackle exciting climate problems and make a tangible impact,” said Trent Lee, a junior in mechanical engineering researching improvements in lithium-ion energy storage. “We are stoked by the energy transition, because it’s not just the future, but our chance to build it.” More