More stories

  • A new catalyst can turn methane into something useful

    Although it is less abundant than carbon dioxide, methane gas contributes disproportionately to global warming because it traps more heat in the atmosphere than carbon dioxide, due to its molecular structure.

    MIT chemical engineers have now designed a new catalyst that can convert methane into useful polymers, which could help reduce greenhouse gas emissions.

    “What to do with methane has been a longstanding problem,” says Michael Strano, the Carbon P. Dubbs Professor of Chemical Engineering at MIT and the senior author of the study. “It’s a source of carbon, and we want to keep it out of the atmosphere but also turn it into something useful.”

    The new catalyst works at room temperature and atmospheric pressure, which could make it easier and more economical to deploy at sites of methane production, such as power plants and cattle barns.

    Daniel Lundberg PhD ’24 and MIT postdoc Jimin Kim are the lead authors of the study, which appears today in Nature Catalysis. Former postdoc Yu-Ming Tu and postdoc Cody Ritt are also authors of the paper.

    Capturing methane

    Methane is produced by bacteria known as methanogens, which are often highly concentrated in landfills, swamps, and other sites of decaying biomass. Agriculture is a major source of methane, and methane gas is also generated as a byproduct of transporting, storing, and burning natural gas. Overall, it is believed to account for about 15 percent of global temperature increases.

    At the molecular level, methane is made of a single carbon atom bound to four hydrogen atoms. In theory, this molecule should be a good building block for making useful products such as polymers. However, converting methane to other compounds has proven difficult because getting it to react with other molecules usually requires high temperatures and pressures.

    To achieve methane conversion without that input of energy, the MIT team designed a hybrid catalyst with two components: a zeolite and a naturally occurring enzyme. Zeolites are abundant, inexpensive clay-like minerals, and previous work has found that they can be used to catalyze the conversion of methane to carbon dioxide.

    In this study, the researchers used a zeolite called iron-modified aluminum silicate, paired with an enzyme called alcohol oxidase. Bacteria, fungi, and plants use this enzyme to oxidize alcohols.

    This hybrid catalyst performs a two-step reaction in which the zeolite converts methane to methanol, and then the enzyme converts methanol to formaldehyde. That reaction also generates hydrogen peroxide, which is fed back into the zeolite to provide a source of oxygen for the conversion of methane to methanol.

    This series of reactions can occur at room temperature and doesn’t require high pressure. The catalyst particles are suspended in water, which can absorb methane from the surrounding air. For future applications, the researchers envision that it could be painted onto surfaces.

    “Other systems operate at high temperature and high pressure, and they use hydrogen peroxide, which is an expensive chemical, to drive the methane oxidation. But our enzyme produces hydrogen peroxide from oxygen, so I think our system could be very cost-effective and scalable,” Kim says.

    Creating a system that incorporates both enzymes and artificial catalysts is a “smart strategy,” says Damien Debecker, a professor at the Institute of Condensed Matter and Nanosciences at the University of Louvain, Belgium.

    “Combining these two families of catalysts is challenging, as they tend to operate in rather distinct operation conditions. By unlocking this constraint and mastering the art of chemo-enzymatic cooperation, hybrid catalysis becomes key-enabling: It opens new perspectives to run complex reaction systems in an intensified way,” says Debecker, who was not involved in the research.

    Building polymers

    Once formaldehyde is produced, the researchers showed they could use that molecule to generate polymers by adding urea, a nitrogen-containing molecule found in urine. This resin-like polymer, known as urea-formaldehyde, is now used in particle board, textiles, and other products.

    The researchers envision that this catalyst could be incorporated into pipes used to transport natural gas. Within those pipes, the catalyst could generate a polymer that could act as a sealant to heal cracks in the pipes, which are a common source of methane leakage. The catalyst could also be applied as a film to coat surfaces that are exposed to methane gas, producing polymers that could be collected for use in manufacturing, the researchers say.

    Strano’s lab is now working on catalysts that could be used to remove carbon dioxide from the atmosphere and combine it with nitrate to produce urea. That urea could then be mixed with the formaldehyde produced by the zeolite-enzyme catalyst to produce urea-formaldehyde.

    The research was funded by the U.S. Department of Energy.
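
    The two-step cycle described above can be summarized with textbook stoichiometries. This is an illustrative sketch only; the balanced reactions reported in the Nature Catalysis paper may differ in detail:

        % Illustrative stoichiometries for the zeolite-enzyme cycle (assumed, not quoted from the paper)
        \begin{align*}
        \text{zeolite:} \quad \mathrm{CH_4} + \mathrm{H_2O_2} &\longrightarrow \mathrm{CH_3OH} + \mathrm{H_2O} \\
        \text{enzyme:}  \quad \mathrm{CH_3OH} + \mathrm{O_2}  &\longrightarrow \mathrm{CH_2O} + \mathrm{H_2O_2} \\
        \text{net:}     \quad \mathrm{CH_4} + \mathrm{O_2}    &\longrightarrow \mathrm{CH_2O} + \mathrm{H_2O}
        \end{align*}

    Under these assumed stoichiometries, the hydrogen peroxide produced in the enzyme step is exactly the oxidant consumed in the zeolite step, which matches the article’s description of the peroxide being fed back into the zeolite.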

  • Q&A: Transforming research through global collaborations

    The MIT Global Seed Funds (GSF) program fosters global research collaborations between MIT faculty and their peers abroad — creating partnerships that tackle complex global issues, from climate change to health-care challenges and beyond. Administered by the MIT Center for International Studies (CIS), the GSF program has awarded more than $26 million to over 1,200 faculty research projects since its inception in 2008. Through its unique funding structure — comprising a general fund for unrestricted geographical use and several specific funds within individual countries, regions, and universities — GSF supports a wide range of projects. The current call for proposals from MIT faculty and researchers with principal investigator status is open until Dec. 10.

    CIS recently sat down with faculty recipients Josephine Carstensen and David McGee to discuss the value and impact GSF added to their research. Carstensen, the Gilbert W. Winslow Career Development Associate Professor of Civil and Environmental Engineering, generates computational designs for large-scale structures with the intent of designing novel low-carbon solutions. McGee, the William R. Kenan, Jr. Professor in the Department of Earth, Atmospheric and Planetary Sciences (EAPS), reconstructs the patterns, pace, and magnitudes of past hydro-climate changes.

    Q: How did the Global Seed Funds program connect you with global partnerships related to your research?

    Carstensen: One of the projects my lab is working on is to unlock the potential of complex cast-glass structures. Through our GSF partnership with researchers at TU Delft (Netherlands), my group was able to leverage our expertise in generative design algorithms alongside the TU Delft team, who are experts in the physical casting and fabrication of glass structures. Our initial connection to TU Delft was actually through one of my graduate students who was at a conference and met TU Delft researchers. He was inspired by their work and felt there could be synergy between our labs. The question then became: How do we connect with TU Delft? And that was what led us to the Global Seed Funds program.

    McGee: Our research is based in fieldwork conducted in partnership with experts who have a rich understanding of local environments. These locations range from lake basins in Chile and Argentina to caves in northern Mexico, Vietnam, and Madagascar. GSF has been invaluable for helping foster partnerships with collaborators and universities in these different locations, enabling the pilot work and relationship-building necessary to establish longer-term, externally funded projects.

    Q: Tell us more about your GSF-funded work.

    Carstensen: In my research group at MIT, we live mainly in a computational regime, and we do very little proof-of-concept testing. To that point, we do not even have the facilities or the experience to physically build large-scale structures, or even specialized structures. GSF has enabled us to connect with the researchers at TU Delft, who do much more experimental testing than we do. Being able to work with the experts at TU Delft within their physical realm provided valuable insights into their way of approaching problems. And, likewise, the researchers at TU Delft benefited from our expertise. It has been fruitful in ways we couldn’t have imagined within our lab at MIT.

    McGee: The collaborative work supported by the GSF has focused on reconstructing how past climate changes impacted rainfall patterns around the world, using natural archives like lake sediments and cave formations. One particularly successful project has been our work in caves in northeastern Mexico, which has been conducted in partnership with researchers from the National Autonomous University of Mexico (UNAM) and a local caving group. This project has involved several MIT undergraduate and graduate students, sponsored a research symposium in Mexico City, and helped us obtain funding from the National Science Foundation for a longer-term project.

    Q: You both mentioned the involvement of your graduate students. How exactly has the GSF augmented the research experience of your students?

    Carstensen: The collaboration has especially benefited the graduate students from both the MIT and TU Delft teams. The opportunity presented through this project to engage in research at an international peer institution has been extremely beneficial for their academic growth and maturity. It has facilitated training in new and complementary technical areas that they would not have had otherwise and allowed them to engage with leading world experts. An example of this aspect of the project’s success is that the collaboration has inspired one of my graduate students to actively pursue postdoc opportunities in Europe (including at TU Delft) after his graduation.

    McGee: MIT students have traveled to caves in northeastern Mexico and to lake basins in northern Chile to conduct fieldwork and build connections with local collaborators. Samples enabled by GSF-supported projects became the focus of two graduate students’ PhD theses, two EAPS undergraduate senior theses, and multiple UROP [Undergraduate Research Opportunity Program] projects.

    Q: Were there any unexpected benefits to the work funded by GSF?

    Carstensen: The success of this project would not have been possible without this specific international collaboration. Both the Delft and MIT teams bring highly different, essential expertise that has been necessary for the successful project outcome. It allowed both the Delft and MIT teams to gain an in-depth understanding of the expertise areas and resources of the other collaborators. Both teams have been deeply inspired. This partnership has fueled conversations about potential future projects and provided multiple outcomes, including a plan to publish two journal papers on the project outcome. The first invited publication is being finalized now.

    McGee: GSF’s focus on reciprocal exchange has enabled external collaborators to spend time at MIT, sharing their work and exchanging ideas. Other funding is often focused on sending MIT researchers and students out, but GSF has helped us bring collaborators here, making the relationship more equal. A GSF-supported visit by Argentinian researchers last year made it possible for them to interact not just with my group, but with students and faculty across EAPS.

  • Is there enough land on Earth to fight climate change and feed the world?

    Capping global warming at 1.5 degrees Celsius is a tall order. Achieving that goal will not only require a massive reduction in greenhouse gas emissions from human activities, but also a substantial reallocation of land to support that effort and sustain the biosphere, including humans. More land will be needed to accommodate a growing demand for bioenergy and nature-based carbon sequestration while ensuring sufficient acreage for food production and ecological sustainability.

    The expanding role of land in a 1.5 C world will be twofold — to remove carbon dioxide from the atmosphere and to produce clean energy. Land-based carbon dioxide removal strategies include bioenergy with carbon capture and storage; direct air capture; and afforestation/reforestation and other nature-based solutions (NBS). Land-based clean energy production includes wind and solar farms and sustainable bioenergy cropland. Any decision to allocate more land for climate mitigation must also address competing needs for long-term food security and ecosystem health.

    Land-based climate mitigation choices vary in terms of costs (the amount of land required, implications for food security, and impacts on biodiversity and other ecosystem services) and benefits (the potential for sequestering greenhouse gases and producing clean energy).

    Now a study in the journal Frontiers in Environmental Science provides the most comprehensive analysis to date of competing land-use and technology options to limit global warming to 1.5 C. Led by researchers at the MIT Center for Sustainability Science and Strategy (CS3), the study applies the MIT Integrated Global System Modeling (IGSM) framework to evaluate the costs and benefits of different land-based climate mitigation options in Sky2050, a 1.5 C climate-stabilization scenario developed by Shell.

    Under this scenario, demand for bioenergy and natural carbon sinks increases along with the need for sustainable farming and food production. To determine if there’s enough land to meet all these growing demands, the research team uses the global hectare (gha) — one billion hectares — as the standard unit of measurement, along with current estimates of the Earth’s total habitable land area (about 10 gha) and the land area used for food production and bioenergy (about 5 gha).

    The team finds that with transformative changes in policy, land management practices, and consumption patterns, global land is sufficient to provide a sustainable supply of food and ecosystem services throughout this century while also reducing greenhouse gas emissions in alignment with the 1.5 C goal. These transformative changes include policies to protect natural ecosystems; stop deforestation and accelerate reforestation and afforestation; promote advances in sustainable agriculture technology and practice; reduce agricultural and food waste; and incentivize consumers to purchase sustainably produced goods.

    If such changes are implemented, 2.5–3.5 gha of land would be used for NBS practices to sequester 3–6 gigatonnes (Gt) of CO2 per year, and 0.4–0.6 gha of land would be allocated for energy production — 0.2–0.3 gha for bioenergy and 0.2–0.35 gha for wind and solar power generation.

    “Our scenario shows that there is enough land to support a 1.5 degree C future as long as effective policies at national and global levels are in place,” says CS3 Principal Research Scientist Angelo Gurgel, the study’s lead author. “These policies must not only promote efficient use of land for food, energy, and nature, but also be supported by long-term commitments from government and industry decision-makers.”
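
    As a quick back-of-the-envelope check on the ranges above (an illustration, not a calculation from the study), dividing the reported sequestration totals by the NBS land area gives the implied average removal rate per hectare:

        # Illustrative arithmetic on the ranges reported above; not from the study itself.
        # Assumes gha denotes one billion hectares, consistent with the land-area figures.

        GHA_TO_HECTARES = 1e9          # hectares per gha
        GT_TO_TONNES = 1e9             # tonnes per gigatonne

        nbs_land_gha = (2.5, 3.5)      # land devoted to nature-based solutions
        sequestration_gt = (3.0, 6.0)  # CO2 removed per year, in Gt

        # Lowest rate: least CO2 over the most land; highest rate: most CO2 over the least land.
        low = sequestration_gt[0] * GT_TO_TONNES / (nbs_land_gha[1] * GHA_TO_HECTARES)
        high = sequestration_gt[1] * GT_TO_TONNES / (nbs_land_gha[0] * GHA_TO_HECTARES)

        print(f"Implied NBS removal rate: {low:.2f} to {high:.2f} tonnes CO2 per hectare per year")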

  • Decarbonizing heavy industry with thermal batteries

    Whether you’re manufacturing cement, steel, chemicals, or paper, you need a large amount of heat. Almost without exception, manufacturers around the world create that heat by burning fossil fuels.

    In an effort to clean up the industrial sector, some startups are changing manufacturing processes for specific materials. Some are even changing the materials themselves. Daniel Stack SM ’17, PhD ’21, is trying to address industrial emissions across the board by replacing the heat source.

    Since coming to MIT in 2014, Stack has worked to develop thermal batteries that use electricity to heat up a conductive version of ceramic firebricks, which have been used as heat stores and insulators for centuries. In 2021, Stack co-founded Electrified Thermal Solutions, which has since demonstrated that its firebricks can store heat efficiently for hours and discharge it by heating air or gas up to 3,272 degrees Fahrenheit — hot enough to power the most demanding industrial applications.

    Achieving temperatures north of 3,000 F represents a breakthrough for the electric heating industry, as it enables some of the world’s hardest-to-decarbonize sectors to utilize renewable energy for the first time. It also unlocks a new, low-cost model for using electricity when it’s at its cheapest and cleanest.

    “We have a global perspective at Electrified Thermal, but in the U.S. over the last five years, we’ve seen an incredible opportunity emerge in energy prices that favors flexible offtake of electricity,” Stack says. “Throughout the middle of the country, especially in the wind belt, electricity prices in many places are negative for more than 20 percent of the year, and the trend toward decreasing electricity pricing during off-peak hours is a nationwide phenomenon. Technologies like our Joule Hive Thermal Battery will enable us to access this inexpensive, clean electricity and compete head to head with fossil fuels on price for industrial heating needs, without even factoring in the positive climate impact.”

    A new approach to an old technology

    Stack’s research plans changed quickly when he joined MIT’s Department of Nuclear Science and Engineering as a master’s student in 2014.

    “I went to MIT excited to work on the next generation of nuclear reactors, but what I focused on almost from day one was how to heat up bricks,” Stack says. “It wasn’t what I expected, but when I talked to my advisor, [Principal Research Scientist] Charles Forsberg, about energy storage and why it was valuable to not just nuclear power but the entire energy transition, I realized there was no project I would rather work on.”

    Firebricks are ubiquitous, inexpensive clay bricks that have been used for millennia in fireplaces and ovens. In 2017, Forsberg and Stack co-authored a paper showing firebricks’ potential to store heat from renewable resources, but the system still used electric resistance heaters — like the metal coils in toasters and space heaters — which limited its temperature output.

    For his doctoral work, Stack worked with Forsberg to make firebricks that were electrically conductive, replacing the resistance heaters so the bricks produced the heat directly.

    “Electric heaters are your biggest limiter: They burn out too fast, they break down, they don’t get hot enough,” Stack explains. “The idea was to skip the heaters because firebricks themselves are really cheap, abundant materials that can go to flame-like temperatures and hang out there for days.”

    Forsberg and Stack were able to create conductive firebricks by tweaking the chemical composition of traditional firebricks. Electrified Thermal’s bricks are 98 percent similar to existing firebricks and are produced using the same processes, allowing existing manufacturers to make them inexpensively.

    Toward the end of his PhD program, Stack realized the invention could be commercialized. He started taking classes at the MIT Sloan School of Management and spending time at the Martin Trust Center for MIT Entrepreneurship. He also entered the StartMIT program and the I-Corps program, and received support from the U.S. Department of Energy and MIT’s Venture Mentoring Service (VMS).

    “Through the Boston ecosystem, the MIT ecosystem, and with help from the Department of Energy, we were able to launch this from the lab at MIT,” Stack says. “What we spun out was an electrically conductive firebrick, or what we refer to as an e-Brick.”

    Electrified Thermal contains its firebrick arrays in insulated, off-the-shelf metal boxes. Although the system is highly configurable depending on the end use, the company’s standard system can charge and discharge at a rate of about 5 megawatts and store about 25 megawatt-hours.

    The company has demonstrated its system’s ability to produce high temperatures and has been cycling its system at its headquarters in Medford, Massachusetts. That work has collectively earned Electrified Thermal $40 million from various Department of Energy offices to scale the technology and work with manufacturers.

    “Compared to other electric heating, we can run hotter and last longer than any other solution on the market,” Stack says. “That means replacing fossil fuels at a lot of industrial sites that couldn’t otherwise decarbonize.”

    Scaling to solve a global problem

    Electrified Thermal is engaging with hundreds of industrial companies, including manufacturers of cement, steel, glass, basic and specialty chemicals, food and beverage, and pulp and paper.

    “The industrial heating challenge affects everyone under the sun,” Stack says. “They all have fundamentally the same problem, which is getting their heat in a way that is affordable and zero carbon for the energy transition.”

    The company is currently building a megawatt-scale commercial version of its system, which it expects to be operational in the next seven months.

    “Next year will be a huge proof point to the industry,” Stack says. “We’ll be using the commercial system to showcase a variety of operating points that customers need to see, and we’re hoping to be running systems on customer sites by the end of the year. It’ll be a huge achievement and a first for electric heating because no other solution in the market can put out the kind of temperatures that we can put out.”

    By working with manufacturers to produce its firebricks and casings, Electrified Thermal hopes to be able to deploy its systems rapidly and at low cost across a massive industry.

    “From the very beginning, we engineered these e-Bricks to be rapidly scalable and rapidly producible within existing supply chains and manufacturing processes,” Stack says. “If you want to decarbonize heavy industry, there will be no cheaper way than turning electricity into heat from zero-carbon electricity assets. We’re seeking to be the premier technology that unlocks those capabilities, with double-digit percentages of global energy flowing through our system as we accomplish the energy transition.”
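
    For scale, a minimal sketch of what the standard unit described above implies, using only the 5 MW and 25 MWh figures from the article; the once-per-day cycling assumption is purely illustrative, not a company specification:

        # Back-of-the-envelope figures for a 5 MW / 25 MWh thermal battery unit.
        power_mw = 5.0          # charge/discharge rate, MW
        capacity_mwh = 25.0     # storage capacity, MWh

        hours_at_full_power = capacity_mwh / power_mw       # 25 / 5 = 5 hours per full discharge
        heat_per_cycle_gj = capacity_mwh * 3.6              # 1 MWh = 3.6 GJ
        annual_heat_mwh = capacity_mwh * 365                # assuming one full cycle per day

        print(f"Full-power discharge duration: {hours_at_full_power:.0f} h")
        print(f"Heat delivered per full cycle: {heat_per_cycle_gj:.0f} GJ")
        print(f"Heat delivered per year at one cycle per day: {annual_heat_mwh:,.0f} MWh")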

  • New AI tool generates realistic satellite images of future flooding

    Visualizing the potential impacts of a hurricane on people’s homes before it hits can help residents prepare and decide whether to evacuate.

    MIT scientists have developed a method that generates satellite imagery from the future to depict how a region would look after a potential flooding event. The method combines a generative artificial intelligence model with a physics-based flood model to create realistic, bird’s-eye-view images of a region, showing where flooding is likely to occur given the strength of an oncoming storm.

    As a test case, the team applied the method to Houston and generated satellite images depicting what certain locations around the city would look like after a storm comparable to Hurricane Harvey, which hit the region in 2017. The team compared these generated images with actual satellite images taken of the same regions after Harvey hit, as well as with AI-generated images that did not incorporate the physics-based flood model.

    The team’s physics-reinforced method generated satellite images of future flooding that were more realistic and accurate. The AI-only method, in contrast, generated images of flooding in places where flooding is not physically possible.

    The team’s method is a proof of concept, meant to demonstrate a case in which generative AI models can generate realistic, trustworthy content when paired with a physics-based model. In order to apply the method to other regions to depict flooding from future storms, it will need to be trained on many more satellite images to learn how flooding would look in other regions.

    “The idea is: One day, we could use this before a hurricane, where it provides an additional visualization layer for the public,” says Björn Lütjens, a postdoc in MIT’s Department of Earth, Atmospheric and Planetary Sciences, who led the research while he was a doctoral student in MIT’s Department of Aeronautics and Astronautics (AeroAstro). “One of the biggest challenges is encouraging people to evacuate when they are at risk. Maybe this could be another visualization to help increase that readiness.”

    To illustrate the potential of the new method, which they have dubbed the “Earth Intelligence Engine,” the team has made it available as an online resource for others to try.

    The researchers report their results today in the journal IEEE Transactions on Geoscience and Remote Sensing. The study’s MIT co-authors include Brandon Leshchinskiy; Aruna Sankaranarayanan; and Dava Newman, professor of AeroAstro and director of the MIT Media Lab; along with collaborators from multiple institutions.

    Generative adversarial images

    The new study is an extension of the team’s efforts to apply generative AI tools to visualize future climate scenarios.

    “Providing a hyper-local perspective of climate seems to be the most effective way to communicate our scientific results,” says Newman, the study’s senior author. “People relate to their own zip code, their local environment where their family and friends live. Providing local climate simulations becomes intuitive, personal, and relatable.”

    For this study, the authors use a conditional generative adversarial network, or GAN, a type of machine learning method that can generate realistic images using two competing, or “adversarial,” neural networks. The first “generator” network is trained on pairs of real data, such as satellite images before and after a hurricane. The second “discriminator” network is then trained to distinguish between the real satellite imagery and the imagery synthesized by the first network.

    Each network automatically improves its performance based on feedback from the other network. The idea, then, is that such an adversarial push and pull should ultimately produce synthetic images that are indistinguishable from the real thing. Nevertheless, GANs can still produce “hallucinations,” or factually incorrect features in an otherwise realistic image that shouldn’t be there.

    “Hallucinations can mislead viewers,” says Lütjens, who began to wonder whether such hallucinations could be avoided, such that generative AI tools can be trusted to help inform people, particularly in risk-sensitive scenarios. “We were thinking: How can we use these generative AI models in a climate-impact setting, where having trusted data sources is so important?”

    Flood hallucinations

    In their new work, the researchers considered a risk-sensitive scenario in which generative AI is tasked with creating satellite images of future flooding that could be trustworthy enough to inform decisions about how to prepare and potentially evacuate people out of harm’s way.

    Typically, policymakers can get an idea of where flooding might occur based on visualizations in the form of color-coded maps. These maps are the final product of a pipeline of physical models that usually begins with a hurricane track model, which then feeds into a wind model that simulates the pattern and strength of winds over a local region. This is combined with a flood or storm surge model that forecasts how wind might push any nearby body of water onto land. A hydraulic model then maps out where flooding will occur based on the local flood infrastructure and generates a visual, color-coded map of flood elevations over a particular region.

    “The question is: Can visualizations of satellite imagery add another level to this, that is a bit more tangible and emotionally engaging than a color-coded map of reds, yellows, and blues, while still being trustworthy?” Lütjens says.

    The team first tested how generative AI alone would produce satellite images of future flooding. They trained a GAN on actual satellite images taken by satellites as they passed over Houston before and after Hurricane Harvey. When they tasked the generator to produce new flood images of the same regions, they found that the images resembled typical satellite imagery, but a closer look revealed hallucinations in some images, in the form of floods where flooding should not be possible (for instance, in locations at higher elevation).

    To reduce hallucinations and increase the trustworthiness of the AI-generated images, the team paired the GAN with a physics-based flood model that incorporates real, physical parameters and phenomena, such as an approaching hurricane’s trajectory, storm surge, and flood patterns. With this physics-reinforced method, the team generated satellite images around Houston that depict the same flood extent, pixel by pixel, as forecasted by the flood model.

    “We show a tangible way to combine machine learning with physics for a use case that’s risk-sensitive, which requires us to analyze the complexity of Earth’s systems and project future actions and possible scenarios to keep people out of harm’s way,” Newman says. “We can’t wait to get our generative AI tools into the hands of decision-makers at the local community level, which could make a significant difference and perhaps save lives.”

    The research was supported, in part, by the MIT Portugal Program, the DAF-MIT Artificial Intelligence Accelerator, NASA, and Google Cloud.
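
    To make the physics-reinforced idea concrete, here is a minimal, hypothetical sketch of a conditional GAN whose generator is conditioned on a flood mask from a physics-based model. The channel counts, architecture, and random placeholder data are invented for illustration and are not the Earth Intelligence Engine implementation:

        # Hypothetical sketch: a conditional GAN whose generator sees a pre-storm image
        # plus a physics-based flood mask. Placeholder tensors stand in for real imagery.
        import torch
        import torch.nn as nn

        class Generator(nn.Module):
            """Maps (pre-storm RGB, flood mask) to a synthetic post-storm RGB image."""
            def __init__(self):
                super().__init__()
                self.net = nn.Sequential(
                    nn.Conv2d(3 + 1, 64, 3, padding=1), nn.ReLU(),
                    nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
                    nn.Conv2d(64, 3, 3, padding=1), nn.Tanh(),
                )
            def forward(self, pre_image, flood_mask):
                return self.net(torch.cat([pre_image, flood_mask], dim=1))

        class Discriminator(nn.Module):
            """Scores (conditioning, image) pairs as real or generated."""
            def __init__(self):
                super().__init__()
                self.net = nn.Sequential(
                    nn.Conv2d(3 + 1 + 3, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
                    nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
                    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(128, 1),
                )
            def forward(self, pre_image, flood_mask, post_image):
                return self.net(torch.cat([pre_image, flood_mask, post_image], dim=1))

        G, D = Generator(), Discriminator()
        opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
        opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
        bce = nn.BCEWithLogitsLoss()

        # Placeholder batch: pre-storm RGB, flood-model mask, real post-storm RGB.
        pre = torch.rand(4, 3, 64, 64)
        mask = (torch.rand(4, 1, 64, 64) > 0.7).float()   # stands in for hydraulic-model output
        real_post = torch.rand(4, 3, 64, 64)

        for step in range(3):  # a few illustrative training steps
            # Discriminator: push real pairs toward 1 and generated pairs toward 0.
            fake_post = G(pre, mask).detach()
            loss_d = bce(D(pre, mask, real_post), torch.ones(4, 1)) + \
                     bce(D(pre, mask, fake_post), torch.zeros(4, 1))
            opt_d.zero_grad(); loss_d.backward(); opt_d.step()

            # Generator: fool the discriminator while staying close to the target imagery.
            fake_post = G(pre, mask)
            loss_g = bce(D(pre, mask, fake_post), torch.ones(4, 1)) + \
                     nn.functional.l1_loss(fake_post, real_post)
            opt_g.zero_grad(); loss_g.backward(); opt_g.step()

    Feeding the flood mask to both networks as an explicit conditioning channel is one simple way to realize the requirement the article describes: generated images should depict the same flood extent that the physics model forecasts, rather than hallucinated water.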

  • A vision for U.S. science success

    White House science advisor Arati Prabhakar expressed confidence in U.S. science and technology capacities during a talk on Wednesday about major issues the country must tackle.

    “Let me start with the purpose of science and technology and innovation, which is to open possibilities so that we can achieve our great aspirations,” said Prabhakar, who is the director of the Office of Science and Technology Policy (OSTP) and a co-chair of the President’s Council of Advisors on Science and Technology (PCAST). “The aspirations that we have as a country today are as great as they have ever been,” she added.

    Much of Prabhakar’s talk focused on three major issues in science and technology development: cancer prevention, climate change, and AI. In the process, she also emphasized the necessity for the U.S. to sustain its global leadership in research across domains of science and technology, which she called “one of America’s long-time strengths.”

    “Ever since the end of the Second World War, we said we’re going in on basic research, we’re going to build our universities’ capacity to do it, we have an unparalleled basic research capacity, and we should always have that,” said Prabhakar.

    “We have gotten better, I think, in recent years at commercializing technology from our basic research,” Prabhakar added, noting, “Capital moves when you can see profit and growth.” The Biden administration, she said, has invested in a variety of new ways for the public and private sectors to work together to massively accelerate the movement of technology into the market.

    Wednesday’s talk drew a capacity audience of nearly 300 people in MIT’s Wong Auditorium and was hosted by the Manufacturing@MIT Working Group. The event included introductory remarks by Suzanne Berger, an Institute Professor and a longtime expert on the innovation economy, and Nergis Mavalvala, dean of the School of Science and an astrophysicist and leader in gravitational-wave detection.

    Introducing Mavalvala, Berger said the 2016 announcement of the discovery of gravitational waves “was the day I felt proudest and most elated to be a member of the MIT community,” and noted that U.S. government support helped make the research possible. Mavalvala, in turn, said MIT was “especially honored” to hear Prabhakar discuss leading-edge research and acknowledge the role of universities in strengthening the country’s science and technology sectors.

    Prabhakar has extensive experience in both government and the private sector. She has been OSTP director and co-chair of PCAST since October of 2022. She served as director of the Defense Advanced Research Projects Agency (DARPA) from 2012 to 2017 and director of the National Institute of Standards and Technology (NIST) from 1993 to 1997.

    She has also held executive positions at Raychem and Interval Research, and spent a decade at the investment firm U.S. Venture Partners. An engineer by training, Prabhakar earned a BS in electrical engineering from Texas Tech University in 1979, an MA in electrical engineering from Caltech in 1980, and a PhD in applied physics from Caltech in 1984.

    Among other remarks about medicine, Prabhakar touted the Biden administration’s “Cancer Moonshot” program, which aims to cut the cancer death rate in half over the next 25 years through multiple approaches, from better health care provision and cancer detection to limiting public exposure to carcinogens. We should be striving, Prabhakar said, for “a future in which people take good health for granted and can get on with their lives.”

    On AI, she highlighted both the promise of and the concerns about the technology, saying, “I think it’s time for active steps to get on a path to where it actually allows people to do more and earn more.”

    When it comes to climate change, Prabhakar said, “We all understand that the climate is going to change. But it’s in our hands how severe those changes get. And it’s possible that we can build a better future.” She noted the bipartisan infrastructure bill signed into law in 2021 and the Biden administration’s Inflation Reduction Act as important steps forward in this fight.

    “Together those are making the single biggest investment anyone anywhere on the planet has ever made in the clean energy transition,” she said. “I used to feel hopeless about our ability to do that, and it gives me tremendous hope.”

    After her talk, Prabhakar was joined onstage for a group discussion with the three co-presidents of the MIT Energy and Climate Club: Laurentiu Anton, a doctoral candidate in electrical engineering and computer science; Rosie Keller, an MBA candidate at the MIT Sloan School of Management; and Thomas Lee, a doctoral candidate in MIT’s Institute for Data, Systems, and Society.

    Asked about the seemingly sagging public confidence in science today, Prabhakar offered a few thoughts.

    “The first thing I would say is, don’t take it personally,” Prabhakar said, noting that any dip in public regard for science is less severe than the diminished public confidence in other institutions.

    Adding some levity, she observed that in polling about which occupations are regarded as being desirable for a marriage partner to have, “scientist” still ranks highly.

    “Scientists still do really well on that front, we’ve got that going for us,” she quipped.

    More seriously, Prabhakar observed, rather than “preaching” at the public, scientists should recognize that “part of the job for us is to continue to be clear about what we know are the facts, and to present them clearly but humbly, and to be clear that we’re going to continue working to learn more.” At the same time, she continued, scientists can always reinforce that “oh, by the way, facts are helpful things that can actually help you make better choices about how the future turns out. I think that would be better in my view.”

    Prabhakar said that her White House work had been guided, in part, by one of the overarching themes that President Biden has often reinforced.

    “He thinks about America as a nation that can be described in a single word, and that word is ‘possibilities,’” she said. “And that idea, that is such a big idea, it lights me up. I think of what we do in the world of science and technology and innovation as really part and parcel of creating those possibilities.”

    Ultimately, Prabhakar said, at all times and all points in American history, scientists and technologists must continue “to prove once more that when people come together and do this work … we do it in a way that builds opportunity and expands opportunity for everyone in our country. I think this is the great privilege we all have in the work we do, and it’s also our responsibility.”

  • Catherine Wolfram: High-energy scholar

    In the mid-2000s, Catherine Wolfram PhD ’96 reached what she calls “an inflection point” in her career. After about a decade of studying U.S. electricity markets, she had come to recognize that “you couldn’t study the energy industries without thinking about climate mitigation,” as she puts it.

    At the same time, Wolfram understood that the trajectory of energy use in the developing world was a massively important part of the climate picture. To get a comprehensive grasp on global dynamics, she says, “I realized I needed to start thinking about the rest of the world.”

    An accomplished scholar and policy expert, Wolfram has been on the faculty at Harvard University, the University of California at Berkeley — and now MIT, where she is the William Barton Rogers Professor in Energy. She has also served as deputy assistant secretary for climate and energy economics at the U.S. Treasury.

    Yet even leading experts want to keep learning. So, when she hit that inflection point, Wolfram started carving out a new phase of her research career.

    “One of the things I love about being an academic is, I could just decide to do that,” Wolfram says. “I didn’t need to check with a boss. I could just pivot my career to being more focused on thinking about energy in the developing world.”

    Over the last decade, Wolfram has published a wide array of original studies about energy consumption in the developing world. From Kenya to Mexico to South Asia, she has shed light on the dynamics of economic growth and energy consumption — while spending some of that time serving in government, too. Last year, Wolfram joined the faculty of the MIT Sloan School of Management, where her work bolsters the Institute’s growing effort to combat climate change.

    Studying at MIT

    Wolfram largely grew up in Minnesota, where her father was a legal scholar, although he moved to Cornell University around the time she started high school. As an undergraduate, she majored in economics at Harvard University, and after graduation she worked first for a consultant, then for the Massachusetts Department of Public Utilities, the agency regulating energy rates. In the latter job, Wolfram kept noticing that people were often citing the research of MIT scholars Paul Joskow (who is now the Elizabeth and James Killian Professor of Economics Emeritus in MIT’s Department of Economics) and Richard Schmalensee (a former dean of the MIT Sloan School of Management and now the Howard W. Johnson Professor of Management Emeritus). Seeing how consequential economics research could be for policymaking, Wolfram decided to get a PhD in the field and was accepted into MIT’s doctoral program.

    “I went into graduate school with an unusually specific view of what I wanted to do,” Wolfram says. “I wanted to work with Paul Joskow and Dick Schmalensee on electricity markets, and that’s how I wound up here.”

    At MIT, Wolfram also ended up working extensively with Nancy Rose, the Charles P. Kindleberger Professor of Applied Economics and a former head of the Department of Economics, who helped oversee Wolfram’s thesis; Rose has extensively studied market regulation as well.

    Wolfram’s dissertation research largely focused on price-setting behavior in the U.K.’s newly deregulated electricity markets, which, it turned out, applied handily to the U.S., where a similar process was taking place. “I was fortunate because this was around the time California was thinking about restructuring, as it was known,” Wolfram says.

    She spent four years on the faculty at Harvard, then moved to UC Berkeley. Wolfram’s studies have shown that deregulation has had some medium-term benefits, for instance in making power plants operate more efficiently.

    Turning on the AC

    By around 2010, though, Wolfram began shifting her scholarly focus in earnest, conducting innovative studies about energy in the developing world. One strand of her research has centered on Kenya, to better understand how more energy access for people without electricity might fit into growth in the developing world.

    In this case, Wolfram’s perhaps surprising conclusion is that electrification itself is not a magic ticket to prosperity; people without electricity are more eager to adopt it when they have a practical economic need for it. Meanwhile, they have other essential needs that are not necessarily being addressed.

    “The 800 million people in the world who don’t have electricity also don’t have access to good health care or running water,” Wolfram says. “Giving them better housing infrastructure is important, and harder to tackle. It’s not clear that bringing people electricity alone is the single most useful thing from a development perspective. Although electricity is a super-important component of modern living.”

    Wolfram has even delved into topics such as air conditioner use in the developing world — an important driver of energy use. As her research shows, many countries, with a combined population far bigger than the U.S., are among the fastest-growing adopters of air conditioners and have an even greater need for them, based on their climates. Adoption of air conditioning within those countries also is characterized by marked economic inequality.

    From early 2021 until late 2022, Wolfram also served in the administration of President Joe Biden, where her work also centered on global energy issues. Among other things, Wolfram was part of the team working out a price-cap policy for Russian oil exports, a concept that she thinks could be applied to many other products globally, although, she notes, working with countries heavily dependent on exporting energy materials will always require careful engagement.

    “We need to be mindful of that dependence and importance as we go through this massive effort to decarbonize the energy sector and shift it to a whole new paradigm,” Wolfram says.

    At MIT again

    Still, she notes, the world does need a whole new energy paradigm, and fast. Her arrival at MIT overlaps with the emergence of a new Institute-wide effort, the Climate Project at MIT, that aims to accelerate and scale climate solutions and good climate policy, including through the new Climate Policy Center at MIT Sloan. That kind of effort, Wolfram says, matters to her.

    “It’s part of why I’ve come to MIT,” Wolfram says. “Technology will be one part of the climate solution, but I do think an innovative mindset, how can we think about doing things better, can be productively applied to climate policy.” On being at MIT, she adds: “It’s great, it’s awesome. One of the things that pleasantly surprised me is how tight-knit and friendly the MIT faculty all are, and how many interactions I’ve had with people from other departments.”

    Wolfram has also been enjoying her teaching at MIT, and will be offering a large class in spring 2025, 15.016 (Climate and Energy in the Global Economy), that she debuted this past academic year.

    “It’s super fun to have students from around the world, who have personal stories and knowledge of energy systems in their countries and can contribute to our discussions,” she says.

    When it comes to tackling climate change, many things seem daunting. But there is still a world of knowledge to be acquired while we try to keep the planet from overheating, and Wolfram has a can-do attitude about learning more and applying those lessons.

    “We’ve made a lot of progress,” Wolfram says. “But we still have a lot more to do.”

  • Advancing urban tree monitoring with AI-powered digital twins

    The Irish philosopher George Berkeley, best known for his theory of immaterialism, once famously mused, “If a tree falls in a forest and no one is around to hear it, does it make a sound?”

    What about AI-generated trees? They probably wouldn’t make a sound, but they will be critical nonetheless for applications such as adaptation of urban flora to climate change. To that end, the novel “Tree-D Fusion” system developed by researchers at the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL), Google, and Purdue University merges AI and tree-growth models with Google’s Auto Arborist data to create accurate 3D models of existing urban trees. The project has produced the first-ever large-scale database of 600,000 environmentally aware, simulation-ready tree models across North America.

    “We’re bridging decades of forestry science with modern AI capabilities,” says Sara Beery, MIT electrical engineering and computer science (EECS) assistant professor, MIT CSAIL principal investigator, and a co-author on a new paper about Tree-D Fusion. “This allows us to not just identify trees in cities, but to predict how they’ll grow and impact their surroundings over time. We’re not ignoring the past 30 years of work in understanding how to build these 3D synthetic models; instead, we’re using AI to make this existing knowledge more useful across a broader set of individual trees in cities around North America, and eventually the globe.”

    Tree-D Fusion builds on previous urban forest monitoring efforts that used Google Street View data, but branches it forward by generating complete 3D models from single images. While earlier attempts at tree modeling were limited to specific neighborhoods, or struggled with accuracy at scale, Tree-D Fusion can create detailed models that include typically hidden features, such as the back side of trees that aren’t visible in street-view photos.

    The technology’s practical applications extend far beyond mere observation. City planners could use Tree-D Fusion to one day peer into the future, anticipating where growing branches might tangle with power lines, or identifying neighborhoods where strategic tree placement could maximize cooling effects and air quality improvements. These predictive capabilities, the team says, could change urban forest management from reactive maintenance to proactive planning.

    A tree grows in Brooklyn (and many other places)

    The researchers took a hybrid approach to their method, using deep learning to create a 3D envelope of each tree’s shape, then using traditional procedural models to simulate realistic branch and leaf patterns based on the tree’s genus. This combo helped the model predict how trees would grow under different environmental conditions and climate scenarios, such as different possible local temperatures and varying access to groundwater.

    Now, as cities worldwide grapple with rising temperatures, this research offers a new window into the future of urban forests. In a collaboration with MIT’s Senseable City Lab, the Purdue University and Google team is embarking on a global study that re-imagines trees as living climate shields. Their digital modeling system captures the intricate dance of shade patterns throughout the seasons, revealing how strategic urban forestry could hopefully change sweltering city blocks into more naturally cooled neighborhoods.

    “Every time a street mapping vehicle passes through a city now, we’re not just taking snapshots — we’re watching these urban forests evolve in real-time,” says Beery. “This continuous monitoring creates a living digital forest that mirrors its physical counterpart, offering cities a powerful lens to observe how environmental stresses shape tree health and growth patterns across their urban landscape.”

    AI-based tree modeling has emerged as an ally in the quest for environmental justice: By mapping urban tree canopy in unprecedented detail, a sister project from the Google AI for Nature team has helped uncover disparities in green space access across different socioeconomic areas. “We’re not just studying urban forests — we’re trying to cultivate more equity,” says Beery. The team is now working closely with ecologists and tree health experts to refine these models, ensuring that as cities expand their green canopies, the benefits branch out to all residents equally.

    It’s a breeze

    While Tree-D Fusion marks some major “growth” in the field, trees can be uniquely challenging for computer vision systems. Unlike the rigid structures of buildings or vehicles that current 3D modeling techniques handle well, trees are nature’s shape-shifters — swaying in the wind, interweaving branches with neighbors, and constantly changing their form as they grow. The Tree-D Fusion models are “simulation-ready” in that they can estimate the shape of the trees in the future, depending on the environmental conditions.

    “What makes this work exciting is how it pushes us to rethink fundamental assumptions in computer vision,” says Beery. “While 3D scene understanding techniques like photogrammetry or NeRF [neural radiance fields] excel at capturing static objects, trees demand new approaches that can account for their dynamic nature, where even a gentle breeze can dramatically alter their structure from moment to moment.”

    The team’s approach of creating rough structural envelopes that approximate each tree’s form has proven remarkably effective, but certain issues remain unsolved. Perhaps the most vexing is the “entangled tree problem”: when neighboring trees grow into each other, their intertwined branches create a puzzle that no current AI system can fully unravel.

    The scientists see their dataset as a springboard for future innovations in computer vision, and they’re already exploring applications beyond street view imagery, looking to extend their approach to platforms like iNaturalist and wildlife camera traps.

    “This marks just the beginning for Tree-D Fusion,” says Jae Joong Lee, a Purdue University PhD student who developed, implemented, and deployed the Tree-D Fusion algorithm. “Together with my collaborators, I envision expanding the platform’s capabilities to a planetary scale. Our goal is to use AI-driven insights in service of natural ecosystems — supporting biodiversity, promoting global sustainability, and ultimately, benefiting the health of our entire planet.”

    Beery and Lee’s co-authors are Jonathan Huang, Scaled Foundations head of AI (formerly of Google), and four others from Purdue University: PhD student Bosheng Li, Professor and Dean’s Chair of Remote Sensing Songlin Fei, Assistant Professor Raymond Yeh, and Professor and Associate Head of Computer Science Bedrich Benes. Their work is based on efforts supported by the United States Department of Agriculture’s (USDA) Natural Resources Conservation Service and is directly supported by the USDA’s National Institute of Food and Agriculture. The researchers presented their findings at the European Conference on Computer Vision this month.
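
    As a rough illustration of the two-stage pipeline described above (a learned shape envelope followed by genus-aware procedural growth), here is a toy sketch; every name, number, and growth rate is hypothetical and not taken from Tree-D Fusion:

        # Toy sketch of the "envelope + procedural growth" idea; all values are invented.
        from dataclasses import dataclass

        @dataclass
        class TreeEnvelope:
            """Coarse 3D shape a learned model might predict from a single street-level image."""
            height_m: float
            crown_radius_m: float
            genus: str

        def predict_envelope(image_path: str) -> TreeEnvelope:
            # Stand-in for the deep-learning stage; returns a fixed, plausible envelope.
            return TreeEnvelope(height_m=8.0, crown_radius_m=2.5, genus="Acer")

        def procedural_growth(env: TreeEnvelope, years: int, water_access: float) -> TreeEnvelope:
            """Grow the envelope forward with a genus-specific rate, scaled by water access."""
            rate = {"Acer": 0.35, "Quercus": 0.25}.get(env.genus, 0.3)  # meters per year, invented
            growth = rate * years * water_access
            return TreeEnvelope(env.height_m + growth, env.crown_radius_m + 0.3 * growth, env.genus)

        today = predict_envelope("street_view_tile_001.jpg")           # hypothetical input image
        in_ten_years = procedural_growth(today, years=10, water_access=0.8)
        print(today)
        print(in_ten_years)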