More stories

  • MIT community in 2022: A year in review

    In 2022, MIT returned to a bit of normalcy after the challenge of Covid-19 began to subside. The Institute prepared to bid farewell to its president and later announced his successor; announced five flagship projects in a new competition aimed at tackling climate’s greatest challenges; made new commitments toward ensuring support for diverse voices; and celebrated the reopening of a reimagined MIT Museum — as well as a Hollywood blockbuster featuring scenes from campus. Here are some of the top stories in the MIT community this year.

    Presidential transition

    In February, MIT President L. Rafael Reif announced that he planned to step down at the end of 2022. In more than 10 years as president, Reif guided MIT through a period of dynamic growth, greatly enhancing its global stature and magnetism. At the conclusion of his term at the end of this month, Reif will take a sabbatical, then return to the faculty of the Department of Electrical Engineering and Computer Science. In September, Reif expressed his gratitude to the MIT community at an Institute-wide dance celebration, and he was honored with a special MIT Dome lighting earlier this month.

    After an extensive presidential search, Sally Kornbluth, a cell biologist and the current provost of Duke University, was announced in October as MIT’s 18th president. Following an introduction to MIT that included a press conference, welcoming event, and community celebration, Kornbluth will assume the MIT presidency on Jan. 1, 2023.

    In other administrative transitions: Cynthia Barnhart was appointed provost after Martin Schmidt stepped down to become president of Rensselaer Polytechnic Institute; Sanjay Sarma stepped down as vice president for open learning after nine years in the role; professors Brent Ryan and Anne White were named associate provosts, while White was also named associate vice president for research administration; and Agustín Rayo was named dean of the School of Humanities, Arts, and Social Sciences.

    Climate Grand Challenges

    MIT announced five flagship projects in its first-ever Climate Grand Challenges competition. These multiyear projects focus on unraveling some of the toughest unsolved climate problems and bringing high-impact, science-based solutions to the world on an accelerated basis. Representing the most promising concepts to emerge from the two-year competition that yielded 27 finalist projects, the five flagship projects will receive additional funding and resources from MIT and others to develop their ideas and swiftly transform them into practical solutions at scale.

    CHIPS and Science Act

    President Reif and Vice President for Research Maria Zuber were among several MIT representatives to witness President Biden’s signing of the $52 billion “CHIPS and Science” bill into law in August. Reif helped shape aspects of the bill and was a vocal advocate for it among university and government officials, while Zuber served on two government science advisory boards during the bill’s gestation and consideration. Earlier in the year, MIT.nano hosted U.S. Secretary of Commerce Gina Raimondo, while MIT researchers released a key report on U.S. microelectronics research and manufacturing.

    MIT Morningside Academy for Design

    Supported by a $100 million founding gift, the MIT Morningside Academy for Design launched as a major interdisciplinary center that aims to build on the Institute’s leadership in design-focused education. Housed in the School of Architecture and Planning, the academy provides a hub that will encourage design work at MIT to grow and cross disciplines among engineering, science, management, computing, architecture, urban planning, and the arts.

    Reports of the Institute

    A number of key Institute reports and announcements were released in 2022. They include: an announcement of the future of gift acceptance for MIT; an announcement of priority MIT investments; a new MIT Values Statement; a renewed commitment to Indigenous scholarship and community; the Strategic Action Plan for Belonging, Achievement, and Composition; a report on MIT’s engagement with China; a report of the Working Group on Reimagining Public Safety at MIT; a report of the Indigenous Working Group; and a report of the Ad Hoc Committee on Arts, Culture, and DEI.

    Nobel Prizes

    MIT affiliates were well-represented among new and recent Nobel laureates who took part in the first in-person Nobel Prize ceremony since the start of the Covid-19 pandemic. MIT-affiliated winners for 2022 included Ben Bernanke PhD ’79, K. Barry Sharpless, and Carolyn Bertozzi. Winners in attendance from 2020 and 2021 included Professor Joshua Angrist, David Julius ’77, and Andrea Ghez ’87.

    New MIT Museum

    A reimagined MIT Museum opened this fall in a new 56,000-square-foot space in the heart of Cambridge’s Kendall Square. The museum invites visitors to explore the Institute’s innovations in science, technology, engineering, arts, and math — and to take part in that work with hands-on learning labs and maker spaces, interactive exhibits, and venues to discuss the impact of science and technology on society.

    “Wakanda Forever”

    In November, the Institute Office of Communications and the Division of Student Life hosted a special screening of Marvel Studios’ “Black Panther: Wakanda Forever.” The MIT campus had been used as a filming location in summer 2021, as one of the film’s characters, Riri Williams (also known as Ironheart), is portrayed as a student at the Institute.

    In-person Commencement returns

    After two years of online celebrations due to Covid-19, MIT Commencement returned to Killian Court at the end of May. World Trade Organization Director-General Ngozi Okonjo-Iweala MCP ’78, PhD ’81 delivered the Commencement address, while poet Kealoha Wong ’99 spoke at a special ceremony for the classes of 2020 and 2021.

    Students win distinguished fellowships

    As in previous years, MIT students continued to shine. This year, exceptional undergraduates were awarded Fulbright, Marshall, Mitchell, Rhodes, and Schwarzman scholarships.

    Remembering those we’ve lost

    Among MIT community members who died this year were Robert Balluffi, Louis Braida, Ashton Carter, Tom Eagar, Dick Eckaus, Octavian-Eugen Ganea, Peter Griffith, Patrick Hale, Frank Sidney Jones, Nonabah Lane, Leo Marx, Bruce Montgomery, Joel Moses, Brian Sousa Jr., Mohamed Magdi Taha, John Tirman, Richard Wurtman, and Markus Zahn.

    In case you missed it:

    Additional top community stories of 2022 included MIT students dominating the 82nd Putnam Mathematical Competition, an update on MIT’s reinstating the SAT/ACT requirement for admissions, a new mathematics program for Ukrainian students and refugees, a roundup of new books from MIT authors, the renaming of the MIT.nano building, an announcement of winners of this year’s MIT $100K Entrepreneurship Competition, the new MIT Wright Brothers Wind Tunnel, and MIT students winning the 45th International Collegiate Programming Contest for the first time in 44 years.

  • MIT scientists contribute to National Ignition Facility fusion milestone

    On Monday, Dec. 5, at around 1 a.m., a tiny sphere of deuterium-tritium fuel surrounded by a cylindrical can of gold called a hohlraum was targeted by 192 lasers at the National Ignition Facility (NIF) at Lawrence Livermore National Laboratory (LLNL) in California. Over the course of billionths of a second, the lasers fired, generating X-rays inside the gold can and imploding the sphere of fuel.

    On that morning, for the first time ever, the lasers delivered 2.1 megajoules of energy and yielded 3.15 megajoules in return, achieving a historic fusion energy gain well above 1 — a result verified by diagnostic tools developed by the MIT Plasma Science and Fusion Center (PSFC). These diagnostic tools and their importance were highlighted by Arthur Pak, an LLNL staff scientist who spoke at a U.S. Department of Energy press event on Dec. 13 announcing the NIF’s success.
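    As a quick sanity check on those figures, the quoted energy gain is simply the ratio of fusion yield to delivered laser energy:

```python
# Fusion energy gain Q = fusion energy out / laser energy delivered,
# using the figures quoted for the Dec. 5 NIF shot.
laser_energy_in = 2.1     # megajoules delivered by the 192 lasers
fusion_energy_out = 3.15  # megajoules of fusion yield

gain = fusion_energy_out / laser_energy_in
print(round(gain, 2))  # → 1.5
```

    Note that this gain counts only the laser energy delivered to the target, not the far larger amount of electricity needed to power the lasers themselves.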

    Johan Frenje, head of the PSFC High-Energy-Density Physics division, notes that this milestone “will have profound implications for laboratory fusion research in general.”

    Since the late 1950s, researchers worldwide have pursued fusion ignition and energy gain in a laboratory, considering it one of the grand challenges of the 21st century. Ignition can only be reached when the internal fusion heating power is high enough to overcome the physical processes that cool the fusion plasma, creating a positive thermodynamic feedback loop that very rapidly increases the plasma temperature. In the case of inertial confinement fusion, the method used at the NIF, ignition can initiate a “fuel burn propagation” into the surrounding dense and cold fuel, and when done correctly, enable fusion-energy gain.

    Frenje and his PSFC division initially designed dozens of diagnostic systems that were implemented at the NIF, including the vitally important magnetic recoil neutron spectrometer (MRS), which measures the neutron energy spectrum, the data from which fusion yield, plasma ion temperature, and spherical fuel pellet compression (“fuel areal density”) can be determined. Overseen by PSFC Research Scientist Maria Gatu Johnson since 2013, the MRS is one of two systems at the NIF relied upon to measure the absolute neutron yield from the Dec. 5 experiment because of its unique ability to accurately interpret an implosion’s neutron signals.

    “Before the announcement of this historic achievement could be made, the LLNL team wanted to wait until Maria had analyzed the MRS data to an adequate level for a fusion yield to be determined,” says Frenje.

    Response around MIT to NIF’s announcement has been enthusiastic and hopeful. “This is the kind of breakthrough that ignites the imagination,” says Vice President for Research Maria Zuber, “reminding us of the wonder of discovery and the possibilities of human ingenuity. Although we have a long, hard path ahead of us before fusion can deliver clean energy to the electrical grid, we should find much reason for optimism in today’s announcement. Innovation in science and technology holds great power and promise to address some of the world’s biggest challenges, including climate change.”

    Frenje also credits the rest of the team at the PSFC’s High-Energy-Density Physics division, the Laboratory for Laser Energetics at the University of Rochester, LLNL, and other collaborators for their support and involvement in this research, as well as the National Nuclear Security Administration of the Department of Energy, which has funded much of their work since the early 1990s. He is also proud of the number of MIT PhDs that have been generated by the High-Energy-Density Physics Division and subsequently hired by LLNL, including the experimental lead for this experiment, Alex Zylstra PhD ’15.

    “This is really a team effort,” says Frenje. “Without the scientific dialogue and the extensive know-how at the HEDP Division, the critical contributions made by the MRS system would not have happened.”

  • A healthy wind

    Nearly 10 percent of today’s electricity in the United States comes from wind power. The renewable energy source benefits climate, air quality, and public health by displacing emissions of greenhouse gases and air pollutants that would otherwise be produced by fossil-fuel-based power plants.

    A new MIT study finds that the health benefits associated with wind power could more than quadruple if operators prioritized turning down output from the most polluting fossil-fuel-based power plants when energy from wind is available.

    In the study, published today in Science Advances, researchers analyzed the hourly activity of wind turbines, as well as the reported emissions from every fossil-fuel-based power plant in the country, between the years 2011 and 2017. They traced emissions across the country and mapped the pollutants to affected demographic populations. They then calculated the regional air quality and associated health costs to each community.

    The researchers found that in 2014, wind power that was associated with state-level policies improved air quality overall, resulting in $2 billion in health benefits across the country. However, only roughly 30 percent of these health benefits reached disadvantaged communities.

    The team further found that if the electricity industry were to reduce the output of the most polluting fossil-fuel-based power plants, rather than the most cost-saving plants, in times of wind-generated power, the overall health benefits could quadruple to $8.4 billion nationwide. However, the results would have a similar demographic breakdown.

    “We found that prioritizing health is a great way to maximize benefits in a widespread way across the U.S., which is a very positive thing. But it suggests it’s not going to address disparities,” says study co-author Noelle Selin, a professor in the Institute for Data, Systems, and Society and the Department of Earth, Atmospheric and Planetary Sciences at MIT. “In order to address air pollution disparities, you can’t just focus on the electricity sector or renewables and count on the overall air pollution benefits addressing these real and persistent racial and ethnic disparities. You’ll need to look at other air pollution sources, as well as the underlying systemic factors that determine where plants are sited and where people live.”

    Selin’s co-authors are lead author and former MIT graduate student Minghao Qiu PhD ’21, now at Stanford University, and Corwin Zigler at the University of Texas at Austin.

    Turn-down service

    In their new study, the team looked for patterns between periods of wind power generation and the activity of fossil-fuel-based power plants, to see how regional electricity markets adjusted the output of power plants in response to influxes of renewable energy.

    “One of the technical challenges, and the contribution of this work, is trying to identify which are the power plants that respond to this increasing wind power,” Qiu notes.

    To do so, the researchers compared two historical datasets from the period between 2011 and 2017: an hour-by-hour record of energy output of wind turbines across the country, and a detailed record of emissions measurements from every fossil-fuel-based power plant in the U.S. The datasets covered each of seven major regional electricity markets, each market providing energy to one or multiple states.

    “California and New York are each their own market, whereas the New England market covers around seven states, and the Midwest covers more,” Qiu explains. “We also cover about 95 percent of all the wind power in the U.S.”

    In general, they observed that, in times when wind power was available, markets adjusted by essentially scaling back the power output of natural gas and sub-bituminous coal-fired power plants. They noted that the plants that were turned down were likely chosen for cost-saving reasons, as certain plants were less costly to turn down than others.

    The team then used a sophisticated atmospheric chemistry model to simulate the wind patterns and chemical transport of emissions across the country, and determined where and at what concentrations the emissions generated fine particulates and ozone — two pollutants that are known to damage air quality and human health. Finally, the researchers mapped the general demographic populations across the country, based on U.S. census data, and applied a standard epidemiological approach to calculate a population’s health cost as a result of their pollution exposure.
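    The standard epidemiological approach mentioned above typically runs a pollutant increment through a concentration-response function and then monetizes the attributable mortality. The sketch below is a generic, illustrative version of that idea, not the study’s actual model; every parameter value is a placeholder assumption.

```python
import math

def health_cost(delta_pm25, population, baseline_mortality=0.008,
                beta=0.0058, value_per_death=9e6):
    """Illustrative annual health cost (USD) of a PM2.5 increment (ug/m3).

    All parameters are hypothetical round numbers:
    beta ~ ln(1.06)/10, i.e., roughly a 6% mortality increase per 10 ug/m3;
    value_per_death is a stand-in for a value of a statistical life.
    """
    relative_risk = math.exp(beta * delta_pm25)          # log-linear response
    attributable_fraction = 1 - 1 / relative_risk        # share of deaths attributable
    deaths = population * baseline_mortality * attributable_fraction
    return deaths * value_per_death

# A 1 ug/m3 increment over a city of one million people:
print(health_cost(1.0, 1_000_000))
```

    Summing such costs over every affected census tract, and subtracting them when wind power displaces emissions, yields the kind of nationwide health-benefit totals the study reports.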

    This analysis revealed that, in the year 2014, a general cost-saving approach to displacing fossil-fuel-based energy in times of wind energy resulted in $2 billion in health benefits, or savings, across the country. A smaller share of these benefits went to disadvantaged populations, such as communities of color and low-income communities, though this disparity varied by state.

    “It’s a more complex story than we initially thought,” Qiu says. “Certain population groups are exposed to a higher level of air pollution, and those would be low-income people and racial minority groups. What we see is, developing wind power could reduce this gap in certain states but further increase it in other states, depending on which fossil-fuel plants are displaced.”

    Tweaking power

    The researchers then examined how the pattern of emissions and the associated health benefits would change if they prioritized turning down different fossil-fuel-based plants in times of wind-generated power. They tweaked the emissions data to reflect several alternative scenarios: one in which the most health-damaging, polluting power plants are turned down first; and two other scenarios in which plants producing the most sulfur dioxide and carbon dioxide, respectively, are first to reduce their output.

    They found that while each scenario increased health benefits overall, and the first scenario in particular could quadruple health benefits, the original disparity persisted: Communities of color and low-income communities still experienced smaller health benefits than more well-off communities.

    “We got to the end of the road and said, there’s no way we can address this disparity by being smarter in deciding which plants to displace,” Selin says.

    Nevertheless, the study can help identify ways to improve the health of the general population, says Julian Marshall, a professor of environmental engineering at the University of Washington.

    “The detailed information provided by the scenarios in this paper can offer a roadmap to electricity-grid operators and to state air-quality regulators regarding which power plants are highly damaging to human health and also are likely to noticeably reduce emissions if wind-generated electricity increases,” says Marshall, who was not involved in the study.

    “One of the things that makes me optimistic about this area is, there’s a lot more attention to environmental justice and equity issues,” Selin concludes. “Our role is to figure out the strategies that are most impactful in addressing those challenges.”

    This work was supported, in part, by the U.S. Environmental Protection Agency, and by the National Institutes of Health.

  • Earth can regulate its own temperature over millennia, new study finds

    The Earth’s climate has undergone some big changes, from global volcanism to planet-cooling ice ages and dramatic shifts in solar radiation. And yet life, for the last 3.7 billion years, has kept on beating.

    Now, a study by MIT researchers in Science Advances confirms that the planet harbors a “stabilizing feedback” mechanism that acts over hundreds of thousands of years to pull the climate back from the brink, keeping global temperatures within a steady, habitable range.

    Just how does it accomplish this? A likely mechanism is “silicate weathering” — a geological process by which the slow and steady weathering of silicate rocks involves chemical reactions that ultimately draw carbon dioxide out of the atmosphere and into ocean sediments, trapping the gas in rocks.

    Scientists have long suspected that silicate weathering plays a major role in regulating the Earth’s carbon cycle. The mechanism of silicate weathering could provide a geologically constant force in keeping carbon dioxide — and global temperatures — in check. But there’s never been direct evidence for the continual operation of such a feedback, until now.

    The new findings are based on a study of paleoclimate data that record changes in average global temperatures over the last 66 million years. The MIT team applied a mathematical analysis to see whether the data revealed any patterns characteristic of stabilizing phenomena that reined in global temperatures on a geologic timescale.

    They found that indeed there appears to be a consistent pattern in which the Earth’s temperature swings are dampened over timescales of hundreds of thousands of years. The duration of this effect is similar to the timescales over which silicate weathering is predicted to act.

    The results are the first to use actual data to confirm the existence of a stabilizing feedback, the mechanism of which is likely silicate weathering. This stabilizing feedback would explain how the Earth has remained habitable through dramatic climate events in the geologic past.

    “On the one hand, it’s good because we know that today’s global warming will eventually be canceled out through this stabilizing feedback,” says Constantin Arnscheidt, a graduate student in MIT’s Department of Earth, Atmospheric and Planetary Sciences (EAPS). “But on the other hand, it will take hundreds of thousands of years to happen, so not fast enough to solve our present-day issues.”

    The study is co-authored by Arnscheidt and Daniel Rothman, professor of geophysics at MIT.

    Stability in data

    Scientists have previously seen hints of a climate-stabilizing effect in the Earth’s carbon cycle: Chemical analyses of ancient rocks have shown that the flux of carbon in and out of Earth’s surface environment has remained relatively balanced, even through dramatic swings in global temperature. Furthermore, models of silicate weathering predict that the process should have some stabilizing effect on the global climate. And finally, the fact of the Earth’s enduring habitability points to some inherent, geologic check on extreme temperature swings.

    “You have a planet whose climate was subjected to so many dramatic external changes. Why did life survive all this time? One argument is that we need some sort of stabilizing mechanism to keep temperatures suitable for life,” Arnscheidt says. “But it’s never been demonstrated from data that such a mechanism has consistently controlled Earth’s climate.”

    Arnscheidt and Rothman sought to confirm whether a stabilizing feedback has indeed been at work, by looking at data of global temperature fluctuations through geologic history. They worked with a range of global temperature records compiled by other scientists, from the chemical composition of ancient marine fossils and shells, as well as preserved Antarctic ice cores.

    “This whole study is only possible because there have been great advances in improving the resolution of these deep-sea temperature records,” Arnscheidt notes. “Now we have data going back 66 million years, with data points at most thousands of years apart.”

    Speeding to a stop

    To the data, the team applied the mathematical theory of stochastic differential equations, which is commonly used to reveal patterns in widely fluctuating datasets.

    “We realized this theory makes predictions for what you would expect Earth’s temperature history to look like if there had been feedbacks acting on certain timescales,” Arnscheidt explains.

    Using this approach, the team analyzed the history of average global temperatures over the last 66 million years, considering the entire period over different timescales, such as tens of thousands of years versus hundreds of thousands, to see whether any patterns of stabilizing feedback emerged within each timescale.

    “To some extent, it’s like your car is speeding down the street, and when you put on the brakes, you slide for a long time before you stop,” Rothman says. “There’s a timescale over which frictional resistance, or a stabilizing feedback, kicks in, when the system returns to a steady state.”

    Without stabilizing feedbacks, fluctuations of global temperature should grow with timescale. But the team’s analysis revealed a regime in which fluctuations did not grow, implying that a stabilizing mechanism reined in the climate before fluctuations grew too extreme. The timescale for this stabilizing effect — hundreds of thousands of years — coincides with what scientists predict for silicate weathering.
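    The intuition can be illustrated with a toy simulation (a minimal sketch, not the authors’ actual analysis): a temperature that follows a pure random walk shows fluctuations that keep growing with timescale, while one subject to a linear restoring feedback (an Ornstein-Uhlenbeck process) shows fluctuations that stop growing beyond the feedback’s characteristic timescale.

```python
import random

random.seed(0)

def simulate(restoring, n=20000, dt=1.0, noise=1.0):
    """Euler-Maruyama simulation of dT = -restoring*T*dt + noise*dW.

    restoring=0 gives a pure random walk (no feedback)."""
    T, path = 0.0, []
    for _ in range(n):
        T += -restoring * T * dt + noise * random.gauss(0, dt ** 0.5)
        path.append(T)
    return path

def rms_fluctuation(path, lag):
    """Root-mean-square temperature change over a given timescale (lag in steps)."""
    diffs = [path[i + lag] - path[i] for i in range(0, len(path) - lag, lag)]
    return (sum(d * d for d in diffs) / len(diffs)) ** 0.5

walk = simulate(restoring=0.0)     # no feedback: fluctuations keep growing
damped = simulate(restoring=0.05)  # stabilizing feedback kicks in after ~20 steps

for lag in (10, 100, 1000):
    print(lag, rms_fluctuation(walk, lag), rms_fluctuation(damped, lag))
```

    In the random-walk case the RMS fluctuation grows roughly as the square root of the timescale; with the restoring term it saturates, which is the kind of signature the team looked for in the paleoclimate records.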

    Interestingly, Arnscheidt and Rothman found that on longer timescales, the data did not reveal any stabilizing feedbacks. That is, there doesn’t appear to be any recurring pull-back of global temperatures on timescales longer than a million years. Over these longer timescales, then, what has kept global temperatures in check?

    “There’s an idea that chance may have played a major role in determining why, after more than 3 billion years, life still exists,” Rothman offers.

    In other words, as the Earth’s temperature fluctuates over longer stretches, those fluctuations may simply happen to remain small enough, in the geologic sense, for a stabilizing feedback such as silicate weathering to periodically keep the climate in check, and, more to the point, within a habitable zone.

    “There are two camps: Some say random chance is a good enough explanation, and others say there must be a stabilizing feedback,” Arnscheidt says. “We’re able to show, directly from data, that the answer is probably somewhere in between. In other words, there was some stabilization, but pure luck likely also played a role in keeping Earth continuously habitable.”

    This research was supported, in part, by a MathWorks fellowship and the National Science Foundation.

  • Methane research takes on new urgency at MIT

    One of the most notable climate change provisions in the 2022 Inflation Reduction Act is the first U.S. federal tax on a greenhouse gas (GHG). That the fee targets methane (CH4), rather than carbon dioxide (CO2), emissions is indicative of the urgency the scientific community has placed on reducing this short-lived but powerful gas. Methane persists in the air about 12 years — compared to more than 1,000 years for CO2 — yet it immediately causes about 120 times more warming upon release. The gas is responsible for at least a quarter of today’s gross warming. 

    “Methane has a disproportionate effect on near-term warming,” says Desiree Plata, the director of MIT Methane Network. “CH4 does more damage than CO2 no matter how long you run the clock. By removing methane, we could potentially avoid critical climate tipping points.” 

    Because GHGs have a runaway effect on climate, reductions made now will have a far greater impact than the same reductions made in the future. Cutting methane emissions will slow the thawing of permafrost, which could otherwise lead to massive methane releases, as well as reduce increasing emissions from wetlands.  

    “The goal of MIT Methane Network is to reduce methane emissions by 45 percent by 2030, which would save up to 0.5 degree C of warming by 2100,” says Plata, an associate professor of civil and environmental engineering at MIT and director of the Plata Lab. “When you consider that governments are trying for a 1.5-degree reduction of all GHGs by 2100, this is a big deal.” 

    Under normal concentrations, methane, like CO2, poses no health risks. Yet methane assists in the creation of high levels of ozone. In the lower atmosphere, ozone is a key component of air pollution, which leads to “higher rates of asthma and increased emergency room visits,” says Plata. 

    Methane-related projects at the Plata Lab include a filter made of zeolite — the same clay-like material used in cat litter — designed to convert methane into CO2 at dairy farms and coal mines. At first glance, the technology would appear to be a bit of a hard sell, since it converts one GHG into another. Yet the zeolite filter’s low carbon and dollar costs, combined with the disproportionate warming impact of methane, make it a potential game-changer.

    The sense of urgency about methane has been amplified by recent studies that show humans are generating far more methane emissions than previously estimated, and that the rates are rising rapidly. Exactly how much methane is in the air is uncertain. Current methods for measuring atmospheric methane, such as ground, drone, and satellite sensors, “are not readily abundant and do not always agree with each other,” says Plata.  

    The Plata Lab is collaborating with Tim Swager in the MIT Department of Chemistry to develop low-cost methane sensors. “We are developing chemiresistive sensors that cost about a dollar that you could place near energy infrastructure to back-calculate where leaks are coming from,” says Plata.

    The researchers are working on improving the accuracy of the sensors using machine learning techniques and are planning to integrate internet-of-things technology to transmit alerts. Plata and Swager are not alone in focusing on data collection: the Inflation Reduction Act adds significant funding for methane sensor research. 

    Other research at the Plata Lab includes the development of nanomaterials and heterogeneous catalysis techniques for environmental applications. The lab also explores mitigation solutions for industrial waste, particularly those related to the energy transition. Plata is the co-founder of a lithium-ion battery recycling startup called Nth Cycle.

    On a more fundamental level, the Plata Lab is exploring how to develop products with environmental and social sustainability in mind. “Our overarching mission is to change the way that we invent materials and processes so that environmental objectives are incorporated along with traditional performance and cost metrics,” says Plata. “It is important to do that rigorous assessment early in the design process.”

    Video: MIT amps up methane research

    The MIT Methane Network brings together 26 researchers from MIT along with representatives of other institutions “that are dedicated to the idea that we can reduce methane levels in our lifetime,” says Plata. The organization supports research such as Plata’s zeolite and sensor projects, as well as designing pipeline-fixing robots, developing methane-based fuels for clean hydrogen, and researching the capture and conversion of methane into liquid chemical precursors for pharmaceuticals and plastics. Other members are researching policies to encourage more sustainable agriculture and land use, as well as methane-related social justice initiatives. 

    “Methane is an especially difficult problem because it comes from all over the place,” says Plata. A recent Global Carbon Project study estimated that half of methane emissions are caused by humans. This is led by waste and agriculture (28 percent), including cow and sheep belching, rice paddies, and landfills.  

    Fossil fuels represent 18 percent of the total budget. Of this, about 63 percent is derived from oil and gas production and pipelines, 33 percent from coal mining activities, and 5 percent from industry and transportation. Human-caused biomass burning, primarily from slash-and-burn agriculture, emits about 4 percent of the global total.  

    The other half of the methane budget includes natural methane emissions from wetlands (20 percent) and other natural sources (30 percent). The latter includes permafrost melting and natural biomass burning, such as forest fires started by lightning.  
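    As a bookkeeping check, the budget shares quoted in the last few paragraphs can be tallied. Note that the oil, gas, coal, and industry figures are shares of the 18 percent fossil-fuel slice, not of the whole budget:

```python
# Methane-budget shares quoted above, in percent of total emissions.
anthropogenic = {"waste and agriculture": 28, "fossil fuels": 18, "biomass burning": 4}
natural = {"wetlands": 20, "other natural sources": 30}

assert sum(anthropogenic.values()) == 50  # "half of methane emissions are caused by humans"
assert sum(anthropogenic.values()) + sum(natural.values()) == 100

# Sub-shares of the 18-percent fossil-fuel slice:
fossil_split = {"oil and gas": 63, "coal": 33, "industry and transport": 5}
print(sum(fossil_split.values()))  # prints 101: the source's rounded figures slightly overshoot 100
```

    The one-point overshoot in the fossil-fuel breakdown simply reflects rounding in the source figures.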

    With increases in global warming and population, the line between anthropogenic and natural causes is getting fuzzier. “Human activities are accelerating natural emissions,” says Plata. “Climate change increases the release of methane from wetlands and permafrost and leads to larger forest and peat fires.”  

    The calculations can get complicated. For example, wetlands provide benefits such as CO2 capture, biological diversity, and sea level rise resiliency that more than compensate for methane releases. Meanwhile, draining swamps for development increases emissions.

    Over 100 nations have signed onto the U.N.’s Global Methane Pledge to reduce at least 30 percent of anthropogenic emissions within the next 10 years. The U.N. report estimates that this goal can be achieved using proven technologies and that about 60 percent of these reductions can be accomplished at low cost. 

    Much of the savings would come from greater efficiencies in fossil fuel extraction, processing, and delivery. The methane fees in the Inflation Reduction Act are primarily focused on encouraging fossil fuel companies to accelerate ongoing efforts to cap old wells, flare off excess emissions, and tighten pipeline connections.  

    Fossil fuel companies have already made far greater pledges to reduce methane than they have with CO2, which is central to their business. This is due, in part, to the potential savings, as well as in preparation for methane regulations expected from the Environmental Protection Agency in late 2022. The regulations build upon existing EPA oversight of drilling operations, and will likely be exempt from the U.S. Supreme Court’s ruling that limits the federal government’s ability to regulate GHGs. 

    Zeolite filter targets methane in dairy and coal 

The “low-hanging fruit” of gas-stream mitigation addresses most of the 20 percent of total methane emissions in which the gas is released at concentrations high enough for flaring. Plata’s zeolite filter aims to address the thornier challenge of reducing the 80 percent of emissions that are too dilute to burn. 

    Plata found inspiration in decades-old catalysis research for turning methane into methanol. One strategy has been to use an abundant, low-cost aluminosilicate clay called zeolite.  

    “The methanol creation process is challenging because you need to separate a liquid, and it has very low efficiency,” says Plata. “Yet zeolite can be very efficient at converting methane into CO2, and it is much easier because it does not require liquid separation. Converting methane to CO2 sounds like a bad thing, but there is a major anti-warming benefit. And because methane is much more dilute than CO2, the relative CO2 contribution is minuscule.”  
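The “anti-warming benefit” of oxidizing methane to CO2 can be seen with rough numbers. The sketch below uses standard molar masses and methane’s roughly 80-fold 20-year warming potential (a widely cited IPCC figure); it is an illustration, not part of Plata’s analysis:

```python
# Rough illustration of why converting CH4 to CO2 reduces net warming.
CH4_MOLAR_MASS = 16.04  # g/mol
CO2_MOLAR_MASS = 44.01  # g/mol
GWP20_CH4 = 80          # approx. 20-year global warming potential (CO2 = 1)

tonnes_ch4 = 1.0
# Full oxidation: each CH4 molecule becomes one CO2 molecule.
tonnes_co2_produced = tonnes_ch4 * CO2_MOLAR_MASS / CH4_MOLAR_MASS  # ~2.74 t

warming_before = tonnes_ch4 * GWP20_CH4  # 80 t CO2-equivalent
warming_after = tonnes_co2_produced      # ~2.74 t CO2-equivalent

print(f"CO2 produced: {tonnes_co2_produced:.2f} t")
print(f"Warming impact cut ~{warming_before / warming_after:.0f}-fold")  # ~29-fold
```

Converting a tonne of methane produces under three tonnes of CO2 but removes roughly 80 tonnes of CO2-equivalent near-term warming, which is why the trade is so favorable.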

    Using zeolite to create methanol requires highly concentrated methane, high temperatures and pressures, and industrial processing conditions. Yet Plata’s process, which dopes the zeolite with copper, operates in the presence of oxygen at much lower temperatures under typical pressures. “We let the methane proceed the way it wants from a thermodynamic perspective from methane to methanol down to CO2,” says Plata. 

    Researchers around the world are working on other dilute methane removal technologies. Projects include spraying iron salt aerosols into sea air where they react with natural chlorine or bromine radicals, thereby capturing methane. Most of these geoengineering solutions, however, are difficult to measure and would require massive scale to make a difference.  

    Plata is focusing her zeolite filters on environments where concentrations are high, but not so high as to be flammable. “We are trying to scale zeolite into filters that you could snap onto the side of a cross-ventilation fan in a dairy barn or in a ventilation air shaft in a coal mine,” says Plata. “For every packet of air we bring in, we take a lot of methane out, so we get more bang for our buck.”  

The major challenge is creating a filter that can handle high flow rates without clogging or falling apart. Dairy barn air handlers can push air at up to 5,000 cubic feet per minute (CFM), and coal mine handlers can approach 500,000 CFM. 

    Plata is exploring engineering options including fluidized bed reactors with floating catalyst particles. Another filter solution, based in part on catalytic converters, features “higher-order geometric structures where you have a porous material with a long path length where the gas can interact with the catalyst,” says Plata. “This avoids the challenge with fluidized beds of containing catalyst particles in the reactor. Instead, they are fixed within a structured material.”  

    Competing technologies for removing methane from mine shafts “operate at temperatures of 1,000 to 1,200 degrees C, requiring a lot of energy and risking explosion,” says Plata. “Our technology avoids safety concerns by operating at 300 to 400 degrees C. It reduces energy use and provides more tractable deployment costs.” 

    Potentially, energy and dollar costs could be further reduced in coal mines by capturing the heat generated by the conversion process. “In coal mines, you have enrichments above a half-percent methane, but below the 4 percent flammability threshold,” says Plata. “The excess heat from the process could be used to generate electricity using off-the-shelf converters.” 

    Plata’s dairy barn research is funded by the Gerstner Family Foundation and the coal mining project by the U.S. Department of Energy. “The DOE would like us to spin out the technology for scale-up within three years,” says Plata. “We cannot guarantee we will hit that goal, but we are trying to develop this as quickly as possible. Our society needs to start reducing methane emissions now.” 


    Machine learning facilitates “turbulence tracking” in fusion reactors

    Fusion, which promises practically unlimited, carbon-free energy using the same processes that power the sun, is at the heart of a worldwide research effort that could help mitigate climate change.

    A multidisciplinary team of researchers is now bringing tools and insights from machine learning to aid this effort. Scientists from MIT and elsewhere have used computer-vision models to identify and track turbulent structures that appear under the conditions needed to facilitate fusion reactions.

Monitoring the formation and movements of these structures, called filaments or “blobs,” is important for understanding the heat and particle flows exiting from the reacting fuel, which ultimately determines the engineering requirements for the reactor walls to meet those flows. However, scientists typically study blobs using averaging techniques, which trade details of individual structures in favor of aggregate statistics. To gather information about individual blobs, researchers must mark them manually in video data. 

    The researchers built a synthetic video dataset of plasma turbulence to make this process more effective and efficient. They used it to train four computer vision models, each of which identifies and tracks blobs. They trained the models to pinpoint blobs in the same ways that humans would.
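The paper’s models are deep computer-vision networks, but the core task — finding contiguous bright regions in a frame — can be illustrated with a much simpler, classical approach. The sketch below thresholds a toy “brightness” frame and groups connected pixels with a flood fill; the threshold value is hypothetical and this is a stand-in for, not a reproduction of, the authors’ method:

```python
# Classical blob detection on a toy brightness frame: threshold the
# image, then group connected bright pixels via flood fill.

def find_blobs(frame, threshold=0.5):
    """Return a list of blobs; each blob is a set of (row, col) pixels."""
    rows, cols = len(frame), len(frame[0])
    seen = set()
    blobs = []
    for r in range(rows):
        for c in range(cols):
            if frame[r][c] >= threshold and (r, c) not in seen:
                # Flood fill a new connected component of bright pixels.
                blob, stack = set(), [(r, c)]
                while stack:
                    y, x = stack.pop()
                    if (y, x) in seen:
                        continue
                    if 0 <= y < rows and 0 <= x < cols and frame[y][x] >= threshold:
                        seen.add((y, x))
                        blob.add((y, x))
                        stack.extend([(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)])
                blobs.append(blob)
    return blobs

# Toy frame with two separated bright regions.
frame = [
    [0.0, 0.9, 0.8, 0.0, 0.0],
    [0.0, 0.7, 0.0, 0.0, 0.6],
    [0.0, 0.0, 0.0, 0.9, 0.8],
]
print(len(find_blobs(frame)))  # 2 blobs
```

Real plasma frames are noisy and blobs merge, split, and change speed, which is exactly why the researchers turned to learned models rather than fixed rules like these.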

    When the researchers tested the trained models using real video clips, the models could identify blobs with high accuracy — more than 80 percent in some cases. The models were also able to effectively estimate the size of blobs and the speeds at which they moved.

    Because millions of video frames are captured during just one fusion experiment, using machine-learning models to track blobs could give scientists much more detailed information.

    “Before, we could get a macroscopic picture of what these structures are doing on average. Now, we have a microscope and the computational power to analyze one event at a time. If we take a step back, what this reveals is the power available from these machine-learning techniques, and ways to use these computational resources to make progress,” says Theodore Golfinopoulos, a research scientist at the MIT Plasma Science and Fusion Center and co-author of a paper detailing these approaches.

    His fellow co-authors include lead author Woonghee “Harry” Han, a physics PhD candidate; senior author Iddo Drori, a visiting professor in the Computer Science and Artificial Intelligence Laboratory (CSAIL), faculty associate professor at Boston University, and adjunct at Columbia University; as well as others from the MIT Plasma Science and Fusion Center, the MIT Department of Civil and Environmental Engineering, and the Swiss Federal Institute of Technology at Lausanne in Switzerland. The research appears today in Nature Scientific Reports.

    Heating things up

    For more than 70 years, scientists have sought to use controlled thermonuclear fusion reactions to develop an energy source. To reach the conditions necessary for a fusion reaction, fuel must be heated to temperatures above 100 million degrees Celsius. (The core of the sun is about 15 million degrees Celsius.)

    A common method for containing this super-hot fuel, called plasma, is to use a tokamak. These devices utilize extremely powerful magnetic fields to hold the plasma in place and control the interaction between the exhaust heat from the plasma and the reactor walls.

    However, blobs appear like filaments falling out of the plasma at the very edge, between the plasma and the reactor walls. These random, turbulent structures affect how energy flows between the plasma and the reactor.

    “Knowing what the blobs are doing strongly constrains the engineering performance that your tokamak power plant needs at the edge,” adds Golfinopoulos.

    Researchers use a unique imaging technique to capture video of the plasma’s turbulent edge during experiments. An experimental campaign may last months; a typical day will produce about 30 seconds of data, corresponding to roughly 60 million video frames, with thousands of blobs appearing each second. This makes it impossible to track all blobs manually, so researchers rely on average sampling techniques that only provide broad characteristics of blob size, speed, and frequency.
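The data rates quoted above make the scale of the manual-tracking problem concrete. A quick back-of-the-envelope check (the blobs-per-second figure is a hypothetical value consistent with “thousands”):

```python
# Back-of-the-envelope check on the imaging numbers quoted above.
seconds_of_data_per_day = 30
frames_per_day = 60_000_000

frames_per_second = frames_per_day // seconds_of_data_per_day
print(f"{frames_per_second:,} frames per second")  # 2,000,000

# At, say, 3,000 blobs per second (hypothetical, consistent with
# "thousands of blobs appearing each second"):
blobs_per_day = 3_000 * seconds_of_data_per_day
print(f"~{blobs_per_day:,} blobs per experimental day")  # ~90,000
```

A two-million-frame-per-second camera producing tens of thousands of blobs per day, over a months-long campaign, is far beyond what human annotators can label by hand.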

    “On the other hand, machine learning provides a solution to this by blob-by-blob tracking for every frame, not just average quantities. This gives us much more knowledge about what is happening at the boundary of the plasma,” Han says.

    He and his co-authors took four well-established computer vision models, which are commonly used for applications like autonomous driving, and trained them to tackle this problem.

    Simulating blobs

    To train these models, they created a vast dataset of synthetic video clips that captured the blobs’ random and unpredictable nature.

    “Sometimes they change direction or speed, sometimes multiple blobs merge, or they split apart. These kinds of events were not considered before with traditional approaches, but we could freely simulate those behaviors in the synthetic data,” Han says.

    Creating synthetic data also allowed them to label each blob, which made the training process more effective, Drori adds.

    Using these synthetic data, they trained the models to draw boundaries around blobs, teaching them to closely mimic what a human scientist would draw.

    Then they tested the models using real video data from experiments. First, they measured how closely the boundaries the models drew matched up with actual blob contours.

    But they also wanted to see if the models predicted objects that humans would identify. They asked three human experts to pinpoint the centers of blobs in video frames and checked to see if the models predicted blobs in those same locations.

The models were able to draw accurate blob boundaries, overlapping with the brightness contours considered ground truth, about 80 percent of the time. The models’ evaluations were similar to those of human experts, and the models successfully predicted the theory-defined regime of the blobs, which agrees with the results from a traditional method.
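One standard way to score how well a predicted boundary matches a ground-truth contour is intersection-over-union (IoU). The paper’s exact metric may differ, so treat this as an illustrative sketch using the pixel-set representation of a blob:

```python
def iou(pred, truth):
    """Intersection-over-union of two blobs given as sets of (row, col) pixels."""
    if not pred and not truth:
        return 1.0  # two empty regions match trivially
    return len(pred & truth) / len(pred | truth)

predicted = {(0, 0), (0, 1), (1, 0), (1, 1)}
ground_truth = {(0, 1), (1, 1), (0, 2), (1, 2)}
print(f"IoU = {iou(predicted, ground_truth):.2f}")  # 2 shared / 6 total = 0.33
# A common convention treats IoU >= 0.5 as a correct detection.
```

Scoring each predicted blob against its nearest ground-truth contour, then averaging across frames, yields the kind of aggregate accuracy figure reported above.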

    Now that they have shown the success of using synthetic data and computer vision models for tracking blobs, the researchers plan to apply these techniques to other problems in fusion research, such as estimating particle transport at the boundary of a plasma, Han says.

    They also made the dataset and models publicly available, and look forward to seeing how other research groups apply these tools to study the dynamics of blobs, says Drori.

    “Prior to this, there was a barrier to entry that mostly the only people working on this problem were plasma physicists, who had the datasets and were using their methods. There is a huge machine-learning and computer-vision community. One goal of this work is to encourage participation in fusion research from the broader machine-learning community toward the broader goal of helping solve the critical problem of climate change,” he adds.

    This research is supported, in part, by the U.S. Department of Energy and the Swiss National Science Foundation.


    Coordinating climate and air-quality policies to improve public health

    As America’s largest investment to fight climate change, the Inflation Reduction Act positions the country to reduce its greenhouse gas emissions by an estimated 40 percent below 2005 levels by 2030. But as it edges the United States closer to achieving its international climate commitment, the legislation is also expected to yield significant — and more immediate — improvements in the nation’s health. If successful in accelerating the transition from fossil fuels to clean energy alternatives, the IRA will sharply reduce atmospheric concentrations of fine particulates known to exacerbate respiratory and cardiovascular disease and cause premature deaths, along with other air pollutants that degrade human health. One recent study shows that eliminating air pollution from fossil fuels in the contiguous United States would prevent more than 50,000 premature deaths and avoid more than $600 billion in health costs each year.

    While national climate policies such as those advanced by the IRA can simultaneously help mitigate climate change and improve air quality, their results may vary widely when it comes to improving public health. That’s because the potential health benefits associated with air quality improvements are much greater in some regions and economic sectors than in others. Those benefits can be maximized, however, through a prudent combination of climate and air-quality policies.

Several past studies have evaluated the likely health impacts of various policy combinations, but their usefulness has been limited due to a reliance on a small set of standard policy scenarios. More versatile tools are needed to model a wide range of climate and air-quality policy combinations and assess their collective effects on air quality and human health. Now researchers at the MIT Joint Program on the Science and Policy of Global Change and the MIT Institute for Data, Systems, and Society (IDSS) have developed a publicly available, flexible scenario tool that does just that.

    In a study published in the journal Geoscientific Model Development, the MIT team introduces its Tool for Air Pollution Scenarios (TAPS), which can be used to estimate the likely air-quality and health outcomes of a wide range of climate and air-quality policies at the regional, sectoral, and fuel-based level. 

    “This tool can help integrate the siloed sustainability issues of air pollution and climate action,” says the study’s lead author William Atkinson, who recently served as a Biogen Graduate Fellow and research assistant at the IDSS Technology and Policy Program’s (TPP) Research to Policy Engagement Initiative. “Climate action does not guarantee a clean air future, and vice versa — but the issues have similar sources that imply shared solutions if done right.”

    The study’s initial application of TAPS shows that with current air-quality policies and near-term Paris Agreement climate pledges alone, short-term pollution reductions give way to long-term increases — given the expected growth of emissions-intensive industrial and agricultural processes in developing regions. More ambitious climate and air-quality policies could be complementary, each reducing different pollutants substantially to give tremendous near- and long-term health benefits worldwide.

    “The significance of this work is that we can more confidently identify the long-term emission reduction strategies that also support air quality improvements,” says MIT Joint Program Deputy Director C. Adam Schlosser, a co-author of the study. “This is a win-win for setting climate targets that are also healthy targets.”

    TAPS projects air quality and health outcomes based on three integrated components: a recent global inventory of detailed emissions resulting from human activities (e.g., fossil fuel combustion, land-use change, industrial processes); multiple scenarios of emissions-generating human activities between now and the year 2100, produced by the MIT Economic Projection and Policy Analysis model; and emissions intensity (emissions per unit of activity) scenarios based on recent data from the Greenhouse Gas and Air Pollution Interactions and Synergies model.
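The accounting at the heart of this design — emissions as activity times emissions intensity, projected under different policy scenarios — can be sketched in miniature. All numbers below are hypothetical illustrations, not TAPS output:

```python
# Sketch of the activity x intensity accounting that TAPS builds on.
# All values are hypothetical illustrations, not model output.

# Projected activity levels (arbitrary units, e.g., fuel combusted) by year.
activity = {2030: 100.0, 2050: 120.0}

# Emissions intensity (pollutant per unit activity) under two policy paths.
intensity = {
    "current policies": {2030: 0.90, 2050: 0.85},
    "ambitious policies": {2030: 0.70, 2050: 0.40},
}

for scenario, factors in intensity.items():
    emissions = {yr: activity[yr] * factors[yr] for yr in activity}
    print(scenario, emissions)
# Under "current policies," growing activity outpaces modest intensity
# gains (90 -> 102), echoing the long-term increases the study found;
# under "ambitious policies," emissions fall (70 -> 48).
```

TAPS layers this logic over a detailed global emissions inventory, sectoral activity projections, and intensity scenarios, so the same calculation can be run per region, sector, fuel, and pollutant.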

    “We see the climate crisis as a health crisis, and believe that evidence-based approaches are key to making the most of this historic investment in the future, particularly for vulnerable communities,” says Johanna Jobin, global head of corporate reputation and responsibility at Biogen. “The scientific community has spoken with unanimity and alarm that not all climate-related actions deliver equal health benefits. We’re proud of our collaboration with the MIT Joint Program to develop this tool that can be used to bridge research-to-policy gaps, support policy decisions to promote health among vulnerable communities, and train the next generation of scientists and leaders for far-reaching impact.”

    The tool can inform decision-makers about a wide range of climate and air-quality policies. Policy scenarios can be applied to specific regions, sectors, or fuels to investigate policy combinations at a more granular level, or to target short-term actions with high-impact benefits.

    TAPS could be further developed to account for additional emissions sources and trends.

    “Our new tool could be used to examine a large range of both climate and air quality scenarios. As the framework is expanded, we can add detail for specific regions, as well as additional pollutants such as air toxics,” says study supervising co-author Noelle Selin, professor at IDSS and the MIT Department of Earth, Atmospheric and Planetary Sciences, and director of TPP.    

    This research was supported by the U.S. Environmental Protection Agency and its Science to Achieve Results (STAR) program; Biogen; TPP’s Leading Technology and Policy Initiative; and TPP’s Research to Policy Engagement Initiative.


    Finding community in high-energy-density physics

    Skylar Dannhoff knew one thing: She did not want to be working alone.

    As an undergraduate at Case Western Reserve University, she had committed to a senior project that often felt like solitary lab work, a feeling heightened by the pandemic. Though it was an enriching experience, she was determined to find a graduate school environment that would foster community, one “with lots of people, lots of collaboration; where it’s impossible to work until 3 a.m. without anyone noticing.” A unique group at the Plasma Science and Fusion Center (PSFC) looked promising: the High-Energy-Density Physics (HEDP) division, a lead partner in the National Nuclear Security Administration’s Center for Excellence at MIT.

    “It was a shot in the dark, just more of a whim than anything,” she says of her request to join HEDP on her application to MIT’s Department of Physics. “And then, somehow, they reached out to me. I told them I’m willing to learn about plasma. I didn’t know anything about it.”

What she did know was that the HEDP group collaborates with other U.S. laboratories on an approach to creating fusion energy known as inertial confinement fusion (ICF). One version of the technique, known as direct-drive ICF, aims multiple laser beams symmetrically onto a spherical capsule filled with nuclear fuel. The other, indirect-drive ICF, instead aims multiple laser beams into a gold cylindrical cavity called a hohlraum, within which the spherical fuel capsule is positioned. The laser beams are configured to hit the inner hohlraum wall, generating a “bath” of X-rays, which in turn compress the fuel capsule.

Imploding the capsule generates intense fusion energy within a tiny fraction of a second (on the order of tens of picoseconds). In August 2021, the National Ignition Facility (NIF) at Lawrence Livermore National Laboratory (LLNL) used this method to produce a historic fusion yield of 1.3 megajoules, putting researchers within reach of “ignition,” the point where the self-sustained fusion burn spreads into the surrounding fuel, leading to a high fusion-energy gain.  

    Joining the group just a month before this long-sought success, Dannhoff was impressed more with the response of her new teammates and the ICF community than with the scientific milestone. “I got a better appreciation for people who had spent their entire careers working on this project, just chugging along doing their best, ignoring the naysayers. I was excited for the people.”

Dannhoff is now working toward extending the success of NIF and other ICF experiments, like the OMEGA laser at the University of Rochester’s Laboratory for Laser Energetics. Under the supervision of Senior Research Scientist Chikang Li, she is studying what happens to the flow of plasma within the hohlraum cavity during indirect ICF experiments, particularly for hohlraums with inner-wall aerogel foam linings. Experiments over the last decade have shown just how excruciatingly precise the symmetry in ICF targets must be. The more symmetric the X-ray drive, the more effective the implosion, and it is possible that these foam linings will improve the X-ray symmetry and drive efficiency.

    Dannhoff is specifically interested in studying the behavior of silicon and tantalum-based foam liners. She is as concerned with the challenges of the people at General Atomics (GA) and LLNL who are creating these targets as she is with the scientific outcome.

    “I just had a meeting with GA yesterday,” she notes. “And it’s a really tricky process. It’s kind of pushing the boundaries of what is doable at the moment. I got a much better sense of how demanding this project is for them, how much we’re asking of them.”

    What excites Dannhoff is the teamwork she observes, both at MIT and between ICF institutions around the United States. With roughly 10 graduate students and postdocs down the hall, each with an assigned lead role in lab management, she knows she can consult an expert on almost any question. And collaborators across the country are just an email away. “Any information that people can give you, they will give you, and usually very freely,” she notes. “Everyone just wants to see this work.”

    That Dannhoff is a natural team player is also evidenced in her hobbies. A hockey goalie, she prioritizes playing with MIT’s intramural teams, “because goalies are a little hard to come by. I just play with whoever needs a goalie on that night, and it’s a lot of fun.”

    She is also a member of the radio community, a fellowship she first embraced at Case Western — a moment she describes as a turning point in her life. “I literally don’t know who I would be today if I hadn’t figured out radio is something I’m interested in,” she admits. The MIT Radio Society provided the perfect landing pad for her arrival in Cambridge, full of the kinds of supportive, interesting, knowledgeable students she had befriended as an undergraduate. She credits radio with helping her realize that she could make her greatest contributions to science by focusing on engineering.

    Dannhoff gets philosophical as she marvels at the invisible waves that surround us.

    “Not just radio waves: every wave,” she asserts. “The voice is everywhere. Music, signal, space phenomena: it’s always around. And all we have to do is make the right little device and have the right circuit elements put in the right order to unmix and mix the signals and amplify them. And bada-bing, bada-boom, we’re talking with the universe.”

    “Maybe that epitomizes physics to me,” she adds. “We’re trying to listen to the universe, and it’s talking to us. We just have to come up with the right tools and hear what it’s trying to say.”