More stories

  • Featured video: Investigating our blue ocean planet

    A five-year doctoral degree program, the MIT – Woods Hole Oceanographic Institution (WHOI) Joint Program in Oceanography/Applied Ocean Science and Engineering combines the strengths of MIT and WHOI to create one of the largest oceanographic facilities in the world. Graduate study in oceanography encompasses virtually all the basic sciences as they apply to the marine environment: physics, chemistry, geochemistry, geology, geophysics, and biology.

    “As a species and as a society we really want to understand the planet that we live on and our place in it,” says Professor Michael Follows, who serves as director of the MIT-WHOI Joint Program.

    “The reason I joined the program was because we cannot afford to wait to be able to address the climate crisis,” explains graduate student Paris Smalls. “The freedom to be able to execute on and have your interests come to life has been incredibly rewarding.”

    “If you have a research problem, you can think of the top five people in that particular niche of a topic and they’re either down the hallway or have some association with WHOI,” adds graduate student Samantha Clevenger. “It’s a really incredible place in terms of connections and just having access to really anything you need.”

    Video by: Melanie Gonick/MIT | 5 min, 12 sec

  • A healthy wind

    Nearly 10 percent of today’s electricity in the United States comes from wind power. The renewable energy source benefits climate, air quality, and public health by displacing emissions of greenhouse gases and air pollutants that would otherwise be produced by fossil-fuel-based power plants.

    A new MIT study finds that the health benefits associated with wind power could more than quadruple if operators prioritized turning down output from the most polluting fossil-fuel-based power plants when energy from wind is available.

    In the study, published today in Science Advances, researchers analyzed the hourly activity of wind turbines, as well as the reported emissions from every fossil-fuel-based power plant in the country, between the years 2011 and 2017. They traced emissions across the country and mapped the pollutants to affected demographic populations. They then calculated the regional air quality and associated health costs to each community.

    The researchers found that in 2014, wind power that was associated with state-level policies improved air quality overall, resulting in $2 billion in health benefits across the country. However, only roughly 30 percent of these health benefits reached disadvantaged communities.

    The team further found that if the electricity industry were to reduce the output of the most polluting fossil-fuel-based power plants, rather than the most cost-saving plants, in times of wind-generated power, the overall health benefits could quadruple to $8.4 billion nationwide. However, the results would have a similar demographic breakdown.

    “We found that prioritizing health is a great way to maximize benefits in a widespread way across the U.S., which is a very positive thing. But it suggests it’s not going to address disparities,” says study co-author Noelle Selin, a professor in the Institute for Data, Systems, and Society and the Department of Earth, Atmospheric and Planetary Sciences at MIT. “In order to address air pollution disparities, you can’t just focus on the electricity sector or renewables and count on the overall air pollution benefits addressing these real and persistent racial and ethnic disparities. You’ll need to look at other air pollution sources, as well as the underlying systemic factors that determine where plants are sited and where people live.”

    Selin’s co-authors are lead author and former MIT graduate student Minghao Qiu PhD ’21, now at Stanford University, and Corwin Zigler at the University of Texas at Austin.

    Turn-down service

    In their new study, the team looked for patterns between periods of wind power generation and the activity of fossil-fuel-based power plants, to see how regional electricity markets adjusted the output of power plants in response to influxes of renewable energy.

    “One of the technical challenges, and the contribution of this work, is trying to identify which are the power plants that respond to this increasing wind power,” Qiu notes.

    To do so, the researchers compared two historical datasets from the period between 2011 and 2017: an hour-by-hour record of energy output from wind turbines across the country, and a detailed record of emissions measurements from every fossil-fuel-based power plant in the U.S. The datasets spanned seven major regional electricity markets, each providing energy to one or more states.

    “California and New York are each their own market, whereas the New England market covers around seven states, and the Midwest covers more,” Qiu explains. “We also cover about 95 percent of all the wind power in the U.S.”

    In general, they observed that, in times when wind power was available, markets adjusted by essentially scaling back the power output of natural gas and sub-bituminous coal-fired power plants. They noted that the plants that were turned down were likely chosen for cost-saving reasons, as certain plants were less costly to turn down than others.

    The team then used a sophisticated atmospheric chemistry model to simulate the wind patterns and chemical transport of emissions across the country, and determined where and at what concentrations the emissions generated fine particulates and ozone — two pollutants that are known to damage air quality and human health. Finally, the researchers mapped the general demographic populations across the country, based on U.S. census data, and applied a standard epidemiological approach to calculate a population’s health cost as a result of their pollution exposure.
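    The epidemiological step can be sketched in a few lines. The risk coefficient, baseline mortality rate, and dollar value below are illustrative placeholders, not the study's actual parameters; the log-linear concentration-response form, however, is the standard shape used in this kind of health-impact analysis:

```python
# Sketch of a standard health-impact calculation: mortality attributable to a
# change in annual PM2.5 exposure, monetized with a value of statistical life.
# All parameter values are illustrative placeholders, not the study's.
import math

def health_cost(population, delta_pm25,
                baseline_mortality=0.008,  # annual deaths per person (illustrative)
                beta=0.006,                # log-linear risk per ug/m3 (illustrative)
                vsl=9e6):                  # value of a statistical life, USD (illustrative)
    """Monetized mortality burden of an increase of delta_pm25 ug/m3."""
    # Log-linear concentration-response: relative risk = exp(beta * delta)
    attributable_fraction = 1.0 - math.exp(-beta * delta_pm25)
    deaths = population * baseline_mortality * attributable_fraction
    return deaths * vsl

# A hypothetical community of 1 million people whose annual average PM2.5
# falls by 2 ug/m3 when wind power displaces a nearby fossil-fuel plant:
benefit = health_cost(1_000_000, 2.0)
print(f"avoided health cost: ${benefit/1e6:.0f}M")
```

    Running the same calculation per community, with pollutant concentrations taken from the atmospheric model, is what lets the benefits be broken down demographically.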

    This analysis revealed that, in the year 2014, a general cost-saving approach to displacing fossil-fuel-based energy in times of wind energy resulted in $2 billion in health benefits, or savings, across the country. A smaller share of these benefits went to disadvantaged populations, such as communities of color and low-income communities, though this disparity varied by state.

    “It’s a more complex story than we initially thought,” Qiu says. “Certain population groups are exposed to a higher level of air pollution, and those would be low-income people and racial minority groups. What we see is, developing wind power could reduce this gap in certain states but further increase it in other states, depending on which fossil-fuel plants are displaced.”

    Tweaking power

    The researchers then examined how the pattern of emissions and the associated health benefits would change if they prioritized turning down different fossil-fuel-based plants in times of wind-generated power. They tweaked the emissions data to reflect several alternative scenarios: one in which the most health-damaging, polluting power plants are turned down first; and two other scenarios in which plants producing the most sulfur dioxide and carbon dioxide respectively, are first to reduce their output.
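    The difference between the scenarios comes down to the order in which plants are turned down. Here is a minimal sketch, with invented plant data and an invented per-plant capacity cap, of how the same amount of wind energy can displace very different amounts of health damage depending on that ordering:

```python
# Sketch of the displacement logic behind the scenarios: the same wind energy
# is absorbed by turning plants down in different orders. Plant data invented.
plants = [
    # (name, marginal cost $/MWh, health damage $/MWh)
    ("gas_A",  30, 10),
    ("coal_B", 25, 90),
    ("coal_C", 28, 60),
    ("gas_D",  35,  5),
]

def displaced_damage(wind_mwh, key):
    """Health damage avoided when wind_mwh displaces plants in sorted order,
    assuming each plant can shed at most 100 MWh (illustrative cap)."""
    avoided, remaining = 0.0, wind_mwh
    for name, cost, damage in sorted(plants, key=key):
        shed = min(remaining, 100.0)
        avoided += shed * damage
        remaining -= shed
        if remaining <= 0:
            break
    return avoided

wind = 200.0  # MWh of wind to absorb
cost_first   = displaced_damage(wind, key=lambda p: -p[1])  # priciest-to-run first
health_first = displaced_damage(wind, key=lambda p: -p[2])  # most damaging first
print(cost_first, health_first)
```

    In this toy example the health-first ordering avoids ten times the damage of the cost-saving ordering; the study found roughly a fourfold gain nationwide.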

    They found that while each scenario increased health benefits overall, and the first scenario in particular could quadruple health benefits, the original disparity persisted: Communities of color and low-income communities still experienced smaller health benefits than more well-off communities.

    “We got to the end of the road and said, there’s no way we can address this disparity by being smarter in deciding which plants to displace,” Selin says.

    Nevertheless, the study can help identify ways to improve the health of the general population, says Julian Marshall, a professor of environmental engineering at the University of Washington.

    “The detailed information provided by the scenarios in this paper can offer a roadmap to electricity-grid operators and to state air-quality regulators regarding which power plants are highly damaging to human health and also are likely to noticeably reduce emissions if wind-generated electricity increases,” says Marshall, who was not involved in the study.

    “One of the things that makes me optimistic about this area is, there’s a lot more attention to environmental justice and equity issues,” Selin concludes. “Our role is to figure out the strategies that are most impactful in addressing those challenges.”

    This work was supported, in part, by the U.S. Environmental Protection Agency, and by the National Institutes of Health.

  • Earth can regulate its own temperature over millennia, new study finds

    The Earth’s climate has undergone some big changes, from global volcanism to planet-cooling ice ages and dramatic shifts in solar radiation. And yet life, for the last 3.7 billion years, has kept on beating.

    Now, a study by MIT researchers in Science Advances confirms that the planet harbors a “stabilizing feedback” mechanism that acts over hundreds of thousands of years to pull the climate back from the brink, keeping global temperatures within a steady, habitable range.

    Just how does it accomplish this? A likely mechanism is “silicate weathering” — a geological process in which the slow, steady weathering of silicate rocks drives chemical reactions that ultimately draw carbon dioxide out of the atmosphere and into ocean sediments, trapping the gas in rock.

    Scientists have long suspected that silicate weathering plays a major role in regulating the Earth’s carbon cycle. The mechanism of silicate weathering could provide a geologically constant force in keeping carbon dioxide — and global temperatures — in check. But there’s never been direct evidence for the continual operation of such a feedback, until now.

    The new findings are based on a study of paleoclimate data that record changes in average global temperatures over the last 66 million years. The MIT team applied a mathematical analysis to see whether the data revealed any patterns characteristic of stabilizing phenomena that reined in global temperatures on a geologic timescale.

    They found that indeed there appears to be a consistent pattern in which the Earth’s temperature swings are dampened over timescales of hundreds of thousands of years. The duration of this effect is similar to the timescales over which silicate weathering is predicted to act.

    The results are the first to use actual data to confirm the existence of a stabilizing feedback, the mechanism of which is likely silicate weathering. This stabilizing feedback would explain how the Earth has remained habitable through dramatic climate events in the geologic past.

    “On the one hand, it’s good because we know that today’s global warming will eventually be canceled out through this stabilizing feedback,” says Constantin Arnscheidt, a graduate student in MIT’s Department of Earth, Atmospheric and Planetary Sciences (EAPS). “But on the other hand, it will take hundreds of thousands of years to happen, so not fast enough to solve our present-day issues.”

    The study is co-authored by Arnscheidt and Daniel Rothman, professor of geophysics at MIT.

    Stability in data

    Scientists have previously seen hints of a climate-stabilizing effect in the Earth’s carbon cycle: Chemical analyses of ancient rocks have shown that the flux of carbon in and out of Earth’s surface environment has remained relatively balanced, even through dramatic swings in global temperature. Furthermore, models of silicate weathering predict that the process should have some stabilizing effect on the global climate. And finally, the fact of the Earth’s enduring habitability points to some inherent, geologic check on extreme temperature swings.

    “You have a planet whose climate was subjected to so many dramatic external changes. Why did life survive all this time? One argument is that we need some sort of stabilizing mechanism to keep temperatures suitable for life,” Arnscheidt says. “But it’s never been demonstrated from data that such a mechanism has consistently controlled Earth’s climate.”

    Arnscheidt and Rothman sought to confirm whether a stabilizing feedback has indeed been at work, by looking at data of global temperature fluctuations through geologic history. They worked with a range of global temperature records compiled by other scientists, from the chemical composition of ancient marine fossils and shells, as well as preserved Antarctic ice cores.

    “This whole study is only possible because there have been great advances in improving the resolution of these deep-sea temperature records,” Arnscheidt notes. “Now we have data going back 66 million years, with data points at most thousands of years apart.”

    Speeding to a stop

    To the data, the team applied the mathematical theory of stochastic differential equations, which is commonly used to reveal patterns in widely fluctuating datasets.

    “We realized this theory makes predictions for what you would expect Earth’s temperature history to look like if there had been feedbacks acting on certain timescales,” Arnscheidt explains.

    Using this approach, the team analyzed the history of average global temperatures over the last 66 million years, considering the entire period over different timescales, such as tens of thousands of years versus hundreds of thousands, to see whether any patterns of stabilizing feedback emerged within each timescale.

    “To some extent, it’s like your car is speeding down the street, and when you put on the brakes, you slide for a long time before you stop,” Rothman says. “There’s a timescale over which frictional resistance, or a stabilizing feedback, kicks in, when the system returns to a steady state.”

    Without stabilizing feedbacks, fluctuations of global temperature should grow with timescale. But the team’s analysis revealed a regime in which fluctuations did not grow, implying that a stabilizing mechanism reined in the climate before fluctuations became too extreme. The timescale for this stabilizing effect — hundreds of thousands of years — coincides with what scientists predict for silicate weathering.
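    The diagnostic can be illustrated with the simplest possible stochastic models (an illustration of the idea, not the authors' actual analysis): a pure random walk has fluctuations that keep growing with timescale, while an Ornstein-Uhlenbeck process, a random walk with a restoring feedback, has fluctuations that plateau once the timescale exceeds the feedback time:

```python
# Illustration of the diagnostic: without feedback (random walk), fluctuations
# grow with timescale; with a stabilizing feedback (Ornstein-Uhlenbeck
# process), they plateau beyond the feedback timescale 1/theta.
import random, statistics

random.seed(0)

def simulate(theta, n=200_000, dt=1.0, sigma=1.0):
    """Euler-Maruyama integration of dT = -theta*T dt + sigma dW."""
    T, path = 0.0, []
    for _ in range(n):
        T += -theta * T * dt + sigma * dt**0.5 * random.gauss(0, 1)
        path.append(T)
    return path

def rms_fluctuation(path, lag):
    """Typical temperature change over a given timescale, estimated as the
    standard deviation of non-overlapping increments of length `lag`."""
    diffs = [path[i + lag] - path[i] for i in range(0, len(path) - lag, lag)]
    return statistics.pstdev(diffs)

walk = simulate(theta=0.0)   # no feedback
ou   = simulate(theta=0.01)  # feedback timescale 1/theta = 100 steps

for lag in (10, 100, 1000):
    print(lag, round(rms_fluctuation(walk, lag), 1), round(rms_fluctuation(ou, lag), 1))
```

    The random walk's fluctuations keep growing roughly as the square root of the timescale, while the feedback model's flatten out past its feedback time, which is the kind of signature the team searched for in the temperature records.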

    Interestingly, Arnscheidt and Rothman found that on longer timescales, the data did not reveal any stabilizing feedbacks. That is, there doesn’t appear to be any recurring pull-back of global temperatures on timescales longer than a million years. Over these longer timescales, then, what has kept global temperatures in check?

    “There’s an idea that chance may have played a major role in determining why, after more than 3 billion years, life still exists,” Rothman offers.

    In other words, as the Earth’s temperature fluctuates over longer stretches, those fluctuations may simply happen to stay small enough, in a geologic sense, for a stabilizing feedback such as silicate weathering to periodically keep the climate in check, and, more to the point, within a habitable zone.

    “There are two camps: Some say random chance is a good enough explanation, and others say there must be a stabilizing feedback,” Arnscheidt says. “We’re able to show, directly from data, that the answer is probably somewhere in between. In other words, there was some stabilization, but pure luck likely also played a role in keeping Earth continuously habitable.”

    This research was supported, in part, by a MathWorks fellowship and the National Science Foundation.

  • Study finds natural sources of air pollution exceed air quality guidelines in many regions

    Alongside climate change, air pollution is one of the biggest environmental threats to human health. Tiny particles known as particulate matter or PM2.5 (named for their diameter of just 2.5 micrometers or less) are a particularly hazardous type of pollutant. These particles are produced from a variety of sources, including wildfires and the burning of fossil fuels, and can enter our bloodstream, travel deep into our lungs, and cause respiratory and cardiovascular damage. Exposure to particulate matter is responsible for millions of premature deaths globally every year.

    In response to the growing body of evidence on the detrimental effects of PM2.5, the World Health Organization (WHO) recently updated its air quality guidelines, lowering its recommended annual PM2.5 exposure guideline by 50 percent, from 10 micrograms per cubic meter (μg/m³) to 5 μg/m³. These updated guidelines signal an aggressive push to regulate and reduce anthropogenic emissions in order to improve global air quality.

    A new study by researchers in the MIT Department of Civil and Environmental Engineering explores whether the updated air quality guideline of 5 μg/m³ is realistically attainable across different regions of the world, particularly if anthropogenic emissions are aggressively reduced.

    The first question the researchers wanted to investigate was to what degree moving to a no-fossil-fuel future would help different regions meet this new air quality guideline.

    “The answer we found is that eliminating fossil-fuel emissions would improve air quality around the world, but while this would help some regions come into compliance with the WHO guidelines, for many other regions high contributions from natural sources would impede their ability to meet that target,” says senior author Colette Heald, the Germeshausen Professor in the MIT departments of Civil and Environmental Engineering, and Earth, Atmospheric and Planetary Sciences. 

    The study by Heald, Professor Jesse Kroll, and graduate students Sidhant Pai and Therese Carter, published June 6 in the journal Environmental Science and Technology Letters, finds that over 90 percent of the global population is currently exposed to average annual concentrations that are higher than the recommended guideline. The authors go on to demonstrate that over 50 percent of the world’s population would still be exposed to PM2.5 concentrations that exceed the new air quality guidelines, even in the absence of all anthropogenic emissions.

    This is due to the large natural sources of particulate matter — dust, sea salt, and organics from vegetation — that still exist in the atmosphere when anthropogenic emissions are removed from the air. 

    “If you live in parts of India or northern Africa that are exposed to large amounts of fine dust, it can be challenging to reduce PM2.5 exposures below the new guideline,” says Sidhant Pai, co-lead author and graduate student. “This study challenges us to rethink the value of different emissions abatement controls across different regions and suggests the need for a new generation of air quality metrics that can enable targeted decision-making.”

    The researchers conducted a series of model simulations to explore the viability of achieving the updated PM2.5 guidelines worldwide under different emissions reduction scenarios, using 2019 as a representative baseline year. 

    Their model simulations used a suite of different anthropogenic sources that could be turned on and off to study the contribution of a particular source. For instance, the researchers conducted a simulation that turned off all human-based emissions in order to determine the amount of PM2.5 pollution that could be attributed to natural and fire sources. By analyzing the chemical composition of the PM2.5 aerosol in the atmosphere (e.g., dust, sulfate, and black carbon), the researchers were also able to get a more accurate understanding of the most important PM2.5 sources in a particular region. For example, elevated PM2.5 concentrations in the Amazon were shown to predominantly consist of carbon-containing aerosols from sources like deforestation fires. Conversely, nitrogen-containing aerosols were prominent in Northern Europe, with large contributions from vehicles and fertilizer usage. The two regions would thus require very different policies and methods to improve their air quality. 
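    This switch-sources-off approach is a zero-out sensitivity analysis, and its logic can be sketched in a few lines. The source groups and concentrations below are invented round numbers, and a real chemical-transport model is nonlinear, so per-source contributions need not sum exactly:

```python
# Sketch of "zero-out" attribution: run the model with all sources on, then
# rerun with a source group switched off; the difference is that group's
# contribution. Values and the additive toy model are invented.

# Illustrative annual-mean PM2.5 contributions (ug/m3) for one dusty region
SOURCES = {"dust": 12.0, "sea_salt": 1.5, "vegetation": 3.0,
           "fossil_fuel": 18.0, "fires": 6.0}
ANTHROPOGENIC = {"fossil_fuel"}

def simulate_pm25(off=()):
    """Toy 'model run' with the given source groups switched off."""
    return sum(v for k, v in SOURCES.items() if k not in off)

baseline = simulate_pm25()
no_anthro = simulate_pm25(off=ANTHROPOGENIC)

print(f"baseline: {baseline} ug/m3")           # all sources on
print(f"no-anthropogenic: {no_anthro} ug/m3")  # natural + fire sources only
print(f"meets 5 ug/m3 guideline without fossil fuels? {no_anthro <= 5.0}")
```

    In this hypothetical region, natural and fire sources alone leave concentrations well above the guideline, mirroring the study's finding for dust-dominated areas.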

    “Analyzing particulate pollution across individual chemical species allows for mitigation and adaptation decisions that are specific to the region, as opposed to a one-size-fits-all approach, which can be challenging to execute without an understanding of the underlying importance of different sources,” says Pai. 

    When the WHO air quality guidelines were last updated in 2005, they had a significant impact on environmental policies. Scientists could look at an area that was not in compliance and suggest high-level solutions to improve the region’s air quality. But as the guidelines have tightened, globally-applicable solutions to manage and improve air quality are no longer as evident. 

    “Another benefit of speciating is that some of the particles have different toxicity properties that are correlated to health outcomes,” says Therese Carter, co-lead author and graduate student. “It’s an important area of research that this work can help motivate. Being able to separate out that piece of the puzzle can provide epidemiologists with more insights on the different toxicity levels and the impact of specific particles on human health.”

    The authors view these new findings as an opportunity to expand and iterate on the current guidelines.  

    “Routine and global measurements of the chemical composition of PM2.5 would give policymakers information on what interventions would most effectively improve air quality in any given location,” says Jesse Kroll, a professor in the MIT departments of Civil and Environmental Engineering and Chemical Engineering. “But it would also provide us with new insights into how different chemical species in PM2.5 affect human health.”

    “I hope that as we learn more about the health impacts of these different particles, our work and that of the broader atmospheric chemistry community can help inform strategies to reduce the pollutants that are most harmful to human health,” adds Heald.

  • Microbes and minerals may have set off Earth’s oxygenation

    For the first 2 billion years of Earth’s history, there was barely any oxygen in the air. While some microbes were photosynthesizing by the latter part of this period, oxygen had not yet accumulated at levels that would impact the global biosphere.

    But somewhere around 2.3 billion years ago, this stable, low-oxygen equilibrium shifted, and oxygen began building up in the atmosphere, eventually reaching the life-sustaining levels we breathe today. This rapid infusion is known as the Great Oxygenation Event, or GOE. What triggered the event and pulled the planet out of its low-oxygen funk is one of the great mysteries of science.

    A new hypothesis, proposed by MIT scientists, suggests that oxygen finally started accumulating in the atmosphere thanks to interactions between certain marine microbes and minerals in ocean sediments. These interactions helped prevent oxygen from being consumed, setting off a self-amplifying process where more and more oxygen was made available to accumulate in the atmosphere.

    The scientists have laid out their hypothesis using mathematical and evolutionary analyses, showing that there were indeed microbes that existed before the GOE and evolved the ability to interact with sediment in the way that the researchers have proposed.

    Their study, appearing today in Nature Communications, is the first to connect the co-evolution of microbes and minerals to Earth’s oxygenation.

    “Probably the most important biogeochemical change in the history of the planet was oxygenation of the atmosphere,” says study author Daniel Rothman, professor of geophysics in MIT’s Department of Earth, Atmospheric, and Planetary Sciences (EAPS). “We show how the interactions of microbes, minerals, and the geochemical environment acted in concert to increase oxygen in the atmosphere.”

    The study’s co-authors include lead author Haitao Shang, a former MIT graduate student, and Gregory Fournier, associate professor of geobiology in EAPS.

    A step up

    Today’s oxygen levels in the atmosphere are a stable balance between processes that produce oxygen and those that consume it. Prior to the GOE, the atmosphere maintained a different kind of equilibrium, with producers and consumers of oxygen in balance, but in a way that didn’t leave much extra oxygen for the atmosphere.

    What could have pushed the planet out of one stable, oxygen-deficient state to another stable, oxygen-rich state?

    “If you look at Earth’s history, it appears there were two jumps, where you went from a steady state of low oxygen to a steady state of much higher oxygen, once in the Paleoproterozoic, once in the Neoproterozoic,” Fournier notes. “These jumps couldn’t have been because of a gradual increase in excess oxygen. There had to have been some feedback loop that caused this step-change in stability.”

    He and his colleagues wondered whether such a positive feedback loop could have come from a process in the ocean that made some organic carbon unavailable to its consumers. Organic carbon is mainly consumed through oxidation, usually accompanied by the consumption of oxygen — a process by which microbes in the ocean use oxygen to break down organic matter, such as detritus that has settled in sediment. The team wondered: Could there have been some process by which the presence of oxygen stimulated its further accumulation?

    Shang and Rothman worked out a mathematical model that made the following prediction: If microbes possessed the ability to only partially oxidize organic matter, the partially-oxidized matter, or “POOM,” would effectively become “sticky,” and chemically bind to minerals in sediment in a way that would protect the material from further oxidation. The oxygen that would otherwise have been consumed to fully degrade the material would instead be free to build up in the atmosphere. This process, they found, could serve as a positive feedback, providing a natural pump to push the atmosphere into a new, high-oxygen equilibrium.
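    The essence of such a feedback can be caricatured in a few lines. This is a deliberately minimal sketch with invented parameters, not Shang and Rothman's model: oxygen is produced at a constant rate and consumed by oxidizing organic carbon, but the protected (POOM-bound) fraction of that carbon rises with oxygen, so above a threshold the sink collapses and oxygen runs away to a new, higher equilibrium:

```python
# Minimal caricature (not the authors' model) of the proposed positive
# feedback. All parameter values are invented for illustration.

def protected_fraction(o2, k_half=5.0, n=4, s_max=0.99):
    """Fraction of organic carbon shielded from oxidation by POOM-mineral
    binding; rises with oxygen and saturates at s_max."""
    return s_max * o2**n / (o2**n + k_half**n)

def run(o2_0, production=1.0, k_ox=1.0, dt=0.1, t_end=2000.0):
    """Forward-Euler integration of dO2/dt = production - sink(O2), where the
    sink is oxidation of the unprotected organic carbon."""
    o2 = o2_0
    for _ in range(int(t_end / dt)):
        sink = k_ox * o2 * (1.0 - protected_fraction(o2))
        o2 += (production - sink) * dt
    return o2

low  = run(o2_0=2.0)   # starts below the threshold: settles in a low-O2 state
high = run(o2_0=12.0)  # starts above it: runs away to a high-O2 state
print(round(low, 2), round(high, 1))
```

    The model is bistable: the same production rate supports both a low-oxygen and a high-oxygen steady state, which is the kind of step-change in stability Fournier describes.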

    “That led us to ask, is there a microbial metabolism out there that produced POOM?” Fournier says.

    In the genes

    To answer this, the team searched through the scientific literature and identified a group of microbes that partially oxidizes organic matter in the deep ocean today. These microbes belong to the bacterial group SAR202, and their partial oxidation is carried out through an enzyme, Baeyer-Villiger monooxygenase, or BVMO.

    The team carried out a phylogenetic analysis to see how far back the microbe, and the gene for the enzyme, could be traced. They found that the bacteria did indeed have ancestors dating back before the GOE, and that the gene for the enzyme could be traced across various microbial species, as far back as pre-GOE times.

    What’s more, they found that the gene’s diversification, or the number of species that acquired the gene, increased significantly during times when the atmosphere experienced spikes in oxygenation: once during the GOE in the Paleoproterozoic, and again in the Neoproterozoic.

    “We found some temporal correlations between diversification of POOM-producing genes, and the oxygen levels in the atmosphere,” Shang says. “That supports our overall theory.”

    To confirm this hypothesis will require far more follow-up, from experiments in the lab to surveys in the field, and everything in between. With their new study, the team has introduced a new suspect in the age-old case of what oxygenated Earth’s atmosphere.

    “Proposing a novel method, and showing evidence for its plausibility, is the first but important step,” Fournier says. “We’ve identified this as a theory worthy of study.”

    This work was supported in part by the mTerra Catalyst Fund and the National Science Foundation.

  • Study: Ice flow is more sensitive to stress than previously thought

    The rate of glacier ice flow is more sensitive to stress than previously calculated, according to a new study by MIT researchers that upends a decades-old equation used to describe ice flow.

    Stress in this case refers to the forces acting on Antarctic glaciers, which are primarily influenced by gravity that drags the ice down toward lower elevations. Viscous glacier ice flows “really similarly to honey,” explains Joanna Millstein, a PhD student in the Glacier Dynamics and Remote Sensing Group and lead author of the study. “If you squeeze honey in the center of a piece of toast, and it piles up there before oozing outward, that’s the exact same motion that’s happening for ice.”

    The revision to the equation proposed by Millstein and her colleagues should improve models for making predictions about the ice flow of glaciers. This could help glaciologists predict how Antarctic ice flow might contribute to future sea level rise, although Millstein said the equation change is unlikely to raise estimates of sea level rise beyond the maximum levels already predicted under climate change models.

    “Almost all our uncertainties about sea level rise coming from Antarctica have to do with the physics of ice flow, though, so this will hopefully be a constraint on that uncertainty,” she says.

    Other authors on the paper, published in Communications Earth &amp; Environment, include Brent Minchew, the Cecil and Ida Green Career Development Professor in MIT’s Department of Earth, Atmospheric, and Planetary Sciences, and Samuel Pegler, a university academic fellow at the University of Leeds.

    Benefits of big data

    The equation in question, called Glen’s Flow Law, is the most widely used equation to describe viscous ice flow. It was developed in 1958 by British scientist J.W. Glen, one of the few glaciologists working on the physics of ice flow in the 1950s, according to Millstein.

    With relatively few scientists working in the field, and given the remoteness and inaccessibility of most large glacier ice sheets, there had been few attempts to calibrate Glen’s Flow Law outside the lab. In the new study, Millstein and her colleagues took advantage of a wealth of satellite imagery over Antarctic ice shelves, the floating extensions of the continent’s ice sheet, to revise the stress exponent of the flow law.
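    Glen's Flow Law relates strain rate to driving stress as a power law, strain_rate = A · stress^n, with the classical lab-derived exponent n = 3. A quick sketch of why the exponent matters (the rate factor and stress values are illustrative, and n = 4 stands in for the higher sensitivity the study argues for):

```python
# Glen's Flow Law: strain_rate = A * stress**n. A higher stress exponent n
# makes ice flow far more responsive to the same stress increase. The rate
# factor and stresses are illustrative; n = 3 is the classical exponent and
# n = 4 represents a higher, satellite-calibrated sensitivity.

def glen_strain_rate(stress_kpa, n, rate_factor=1e-7):
    """Strain rate (1/yr) under Glen's Flow Law (illustrative rate factor)."""
    return rate_factor * stress_kpa**n

for n in (3, 4):
    slow = glen_strain_rate(100.0, n)  # 100 kPa driving stress
    fast = glen_strain_rate(150.0, n)  # 50 percent higher stress
    print(f"n={n}: strain rate rises by a factor of {fast/slow:.2f}")
```

    A 50 percent stress increase speeds up flow about 3.4-fold at n = 3 but roughly 5-fold at n = 4, which is why revising the exponent changes predictions for the fastest-changing ice shelves.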

    “In 2002, this major ice shelf [Larsen B] collapsed in Antarctica, and all we have from that collapse is two satellite images that are a month apart,” she says. “Now, over that same area we can get [imagery] every six days.”

    The new analysis shows that “the ice flow in the most dynamic, fastest-changing regions of Antarctica — the ice shelves, which basically hold back and hug the interior of the continental ice — is more sensitive to stress than commonly assumed,” Millstein says. She’s optimistic that the growing record of satellite data will help capture rapid changes on Antarctica in the future, providing insights into the underlying physical processes of glaciers.   

    But stress isn’t the only thing that affects ice flow, the researchers note. Other parts of the flow law equation represent differences in temperature, ice grain size and orientation, and impurities and water contained in the ice — all of which can alter flow velocity. Factors like temperature could be especially important in understanding how ice flow impacts sea level rise in the future, Millstein says.

    Cracking under strain

    Millstein and colleagues are also studying the mechanics of ice sheet collapse, which involves different physical models than those used to understand the ice flow problem. “The cracking and breaking of ice is what we’re working on now, using strain rate observations,” Millstein says.

    The researchers use InSAR, radar images of the Earth’s surface collected by satellites, to observe deformations of the ice sheets that can be used to make precise measurements of strain. By observing areas of ice with high strain rates, they hope to better understand the rate at which crevasses and rifts propagate to trigger collapse.

    The research was supported by the National Science Foundation.

    Study reveals chemical link between wildfire smoke and ozone depletion

    The Australian wildfires in 2019 and 2020 were historic for how far and fast they spread, and for how long and powerfully they burned. All told, the devastating “Black Summer” fires blazed across more than 43 million acres of land, and killed or displaced nearly 3 billion animals. The fires also injected over 1 million tons of smoke particles into the atmosphere, reaching up to 35 kilometers above Earth’s surface — a mass and reach comparable to that of an erupting volcano.

    Now, atmospheric chemists at MIT have found that the smoke from those fires set off chemical reactions in the stratosphere that contributed to the destruction of ozone, which shields the Earth from incoming ultraviolet radiation. The team’s study, appearing this week in the Proceedings of the National Academy of Sciences, is the first to establish a chemical link between wildfire smoke and ozone depletion.

    In March 2020, shortly after the fires subsided, the team observed a sharp drop in nitrogen dioxide in the stratosphere, which is the first step in a chemical cascade that is known to end in ozone depletion. The researchers found that this drop in nitrogen dioxide directly correlates with the amount of smoke that the fires released into the stratosphere. They estimate that this smoke-induced chemistry depleted the column of ozone by 1 percent.

    To put this in context, they note that the phaseout of ozone-depleting gases under a worldwide agreement to stop their production has led to about a 1 percent ozone recovery from earlier ozone decreases over the past 10 years — meaning that the wildfires canceled those hard-won diplomatic gains for a short period. If future wildfires grow stronger and more frequent, as they are predicted to do with climate change, ozone’s projected recovery could be delayed by years. 

    “The Australian fires look like the biggest event so far, but as the world continues to warm, there is every reason to think these fires will become more frequent and more intense,” says lead author Susan Solomon, the Lee and Geraldine Martin Professor of Environmental Studies at MIT. “It’s another wakeup call, just as the Antarctic ozone hole was, in the sense of showing how bad things could actually be.”

    The study’s co-authors include Kane Stone, a research scientist in MIT’s Department of Earth, Atmospheric, and Planetary Sciences, along with collaborators at multiple institutions including the University of Saskatchewan, Jinan University, the National Center for Atmospheric Research, and the University of Colorado at Boulder.

    Chemical trace

    Massive wildfires are known to generate pyrocumulonimbus — towering clouds of smoke that can reach into the stratosphere, the layer of the atmosphere that lies between about 15 and 50 kilometers above the Earth’s surface. The smoke from Australia’s wildfires reached well into the stratosphere, as high as 35 kilometers.

    In 2021, Solomon’s co-author, Pengfei Yu at Jinan University, carried out a separate study of the fires’ impacts and found that the accumulated smoke warmed parts of the stratosphere by as much as 2 degrees Celsius — a warming that persisted for six months. The study also found hints of ozone destruction in the Southern Hemisphere following the fires.

    Solomon wondered whether smoke from the fires could have depleted ozone through a chemistry similar to that of volcanic aerosols. Major volcanic eruptions can also reach into the stratosphere, and in 1989, Solomon discovered that the particles in these eruptions can destroy ozone through a series of chemical reactions. As the particles form in the atmosphere, they gather moisture on their surfaces. Once wet, the particles can react with chemicals circulating in the stratosphere, including dinitrogen pentoxide, converting it into nitric acid.

    Normally, dinitrogen pentoxide is broken apart by sunlight to form various nitrogen species, including nitrogen dioxide, a compound that binds with chlorine-containing chemicals in the stratosphere. When volcanic particles convert dinitrogen pentoxide into nitric acid instead, nitrogen dioxide drops, and the chlorine compounds take another path, morphing into chlorine monoxide, the main human-made agent that destroys ozone.

    “This chemistry, once you get past that point, is well-established,” Solomon says. “Once you have less nitrogen dioxide, you have to have more chlorine monoxide, and that will deplete ozone.”
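    The chain Solomon describes can be summarized with a few standard stratospheric reactions (a textbook simplification; the full chemistry involves many more steps):

    ```latex
    \begin{align*}
    \mathrm{N_2O_5 + H_2O} &\xrightarrow{\text{wet particle surface}} \mathrm{2\,HNO_3}
        && \text{(removes the NO$_2$ precursor)} \\
    \mathrm{ClO + NO_2 + M} &\rightarrow \mathrm{ClONO_2 + M}
        && \text{(suppressed when NO$_2$ drops)} \\
    \mathrm{Cl + O_3} &\rightarrow \mathrm{ClO + O_2}
        && \text{(active chlorine destroys ozone)}
    \end{align*}
    ```

    With less nitrogen dioxide available, less chlorine is sequestered in the reservoir compound chlorine nitrate, leaving more of it in the active, ozone-destroying form.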

    Cloud injection

    In the new study, Solomon and her colleagues looked at how concentrations of nitrogen dioxide in the stratosphere changed following the Australian fires. If these concentrations dropped significantly, it would signal that wildfire smoke depletes ozone through the same chemical reactions as some volcanic eruptions.

    The team looked to observations of nitrogen dioxide taken by three independent satellites that have surveyed the Southern Hemisphere for varying lengths of time. They compared each satellite’s record in the months and years leading up to and following the Australian fires. All three records showed a significant drop in nitrogen dioxide in March 2020. For one satellite’s record, the drop represented a record low among observations spanning the last 20 years.

    To check that the nitrogen dioxide decrease was a direct chemical effect of the fires’ smoke, the researchers carried out atmospheric simulations using a global, three-dimensional model that simulates hundreds of chemical reactions in the atmosphere, from the surface on up through the stratosphere.

    The team injected a cloud of smoke particles into the model, simulating what was observed from the Australian wildfires. They assumed that the particles, like volcanic aerosols, gathered moisture. They then ran the model multiple times and compared the results to simulations without the smoke cloud.

    In every simulation incorporating wildfire smoke, the team found that as the amount of smoke particles increased in the stratosphere, concentrations of nitrogen dioxide decreased, matching the observations of the three satellites.

    “The behavior we saw, of more and more aerosols, and less and less nitrogen dioxide, in both the model and the data, is a fantastic fingerprint,” Solomon says. “It’s the first time that science has established a chemical mechanism linking wildfire smoke to ozone depletion. It may only be one chemical mechanism among several, but it’s clearly there. It tells us these particles are wet and they had to have caused some ozone depletion.”
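    The anticorrelation “fingerprint” Solomon describes can be quantified with a simple correlation and regression. The numbers below are purely illustrative, not from the study:

    ```python
    import numpy as np

    # Hypothetical monthly means for a Southern Hemisphere latitude band:
    # stratospheric aerosol loading (arbitrary units) and the NO2 column
    # anomaly (percent relative to climatology). Illustrative values only.
    aerosol  = np.array([0.2, 0.5, 1.1, 1.8, 2.4, 2.9])
    no2_anom = np.array([-1.0, -2.5, -5.2, -8.1, -11.0, -13.4])

    # A strongly negative correlation and slope express the
    # "more aerosol, less NO2" relationship seen in model and data.
    r = np.corrcoef(aerosol, no2_anom)[0, 1]
    slope, intercept = np.polyfit(aerosol, no2_anom, 1)
    print(f"r = {r:.3f}, slope = {slope:.2f} % per unit aerosol")
    ```

    In the actual analysis this comparison is made against three independent satellite records and an ensemble of chemistry-model runs, not a single synthetic series.
    
    
    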

    She and her collaborators are looking into other reactions triggered by wildfire smoke that might further contribute to stripping ozone. For the time being, the major driver of ozone depletion remains chlorofluorocarbons, or CFCs — chemicals such as old refrigerants that have been banned under the Montreal Protocol, though they continue to linger in the stratosphere. But as global warming leads to stronger, more frequent wildfires, their smoke could have a serious, lasting impact on ozone.

    “Wildfire smoke is a toxic brew of organic compounds that are complex beasts,” Solomon says. “And I’m afraid ozone is getting pummeled by a whole series of reactions that we are now furiously working to unravel.”

    This research was supported in part by the National Science Foundation and NASA.

    Understanding air pollution from space

    Climate change and air pollution are interlocking crises that threaten human health. Reducing emissions of some air pollutants can help achieve climate goals, and some climate mitigation efforts can in turn improve air quality.

    One part of MIT Professor Arlene Fiore’s research program is to investigate the fundamental science in understanding air pollutants — how long they persist and move through our environment to affect air quality.

    “We need to understand the conditions under which pollutants, such as ozone, form. How much ozone is formed locally and how much is transported long distances?” says Fiore, who notes that Asian air pollution can be transported across the Pacific Ocean to North America. “We need to think about processes spanning local to global dimensions.”

    Fiore, the Peter H. Stone and Paola Malanotte Stone Professor in Earth, Atmospheric and Planetary Sciences, analyzes data from on-the-ground readings and from satellites, along with models, to better understand the chemistry and behavior of air pollutants — which ultimately can inform mitigation strategies and policy setting.

    A global concern

    At the United Nations’ most recent climate change conference, COP26, air quality management was a topic discussed over two days of presentations.

    “Breathing is vital. It’s life. But for the vast majority of people on this planet right now, the air that they breathe is not giving life, but cutting it short,” said Sarah Vogel, senior vice president for health at the Environmental Defense Fund, at the COP26 session.

    “We need to confront this twin challenge now through both a climate and clean air lens, of targeting those pollutants that both warm the air and harm our health.”

    Earlier this year, the World Health Organization (WHO) updated the global air quality guidelines it had issued 15 years earlier for six key pollutants, including ozone (O3), nitrogen dioxide (NO2), sulfur dioxide (SO2), and carbon monoxide (CO). The new guidelines are more stringent, based on what the WHO describes as the “quality and quantity of evidence” of how these pollutants affect human health. The WHO estimates that roughly 7 million premature deaths each year are attributable to the joint effects of outdoor and household air pollution.

    “We’ve had all these health-motivated reductions of aerosol and ozone precursor emissions. What are the implications for the climate system, both locally but also around the globe? How does air quality respond to climate change? We study these two-way interactions between air pollution and the climate system,” says Fiore.

    But fundamental science is still required to understand how gases, such as ozone and nitrogen dioxide, linger and move throughout the troposphere — the lowermost layer of our atmosphere, containing the air we breathe.

    “We care about ozone in the air we’re breathing where we live at the Earth’s surface,” says Fiore. “Ozone reacts with biological tissue, and can be damaging to plants and human lungs. Even if you’re a healthy adult, if you’re out running hard during an ozone smog event, you might feel an extra weight on your lungs.”

    Telltale signs from space

    Ozone is not emitted directly, but instead forms through chemical reactions driven by sunlight interacting with nitrogen oxides — pollutants released in large part from burning fossil fuels — and volatile organic compounds. However, current satellite instruments cannot sense ground-level ozone.
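    In outline, the photochemistry works like this (a textbook simplification, not from the article): sunlight splits nitrogen dioxide, the freed oxygen atom combines with molecular oxygen to make ozone, and the oxidation products of volatile organic compounds recycle NO back to NO2 so the cycle yields net ozone:

    ```latex
    \begin{align*}
    \mathrm{NO_2 + h\nu} &\rightarrow \mathrm{NO + O} \\
    \mathrm{O + O_2 + M} &\rightarrow \mathrm{O_3 + M} \\
    \mathrm{RO_2 + NO} &\rightarrow \mathrm{RO + NO_2}
        && \text{(RO$_2$ from VOC oxidation)}
    \end{align*}
    ```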

    “We can’t retrieve surface- or even near-surface ozone from space,” says Fiore of the satellite data, “although the anticipated launch of a new instrument looks promising for new advances in retrieving lower-tropospheric ozone.” Instead, scientists can look at signatures from other gas emissions to get a sense of ozone formation. “Nitrogen dioxide and formaldehyde are a heavy focus of our research because they serve as proxies for two of the key ingredients that go on to form ozone in the atmosphere.”

    To understand ozone formation via these precursor pollutants, scientists have gathered data for more than two decades using spectrometer instruments aboard satellites that measure sunlight in ultraviolet and visible wavelengths that interact with these pollutants in the Earth’s atmosphere — known as solar backscatter radiation.

    Satellites, such as NASA’s Aura, carry instruments like the Ozone Monitoring Instrument (OMI). OMI, along with European-launched satellites such as the Global Ozone Monitoring Experiment (GOME) and the Scanning Imaging Absorption spectroMeter for Atmospheric CartograpHY (SCIAMACHY), and the newest generation TROPOspheric Monitoring instrument (TROPOMI), all orbit the Earth, collecting data during daylight hours when sunlight is interacting with the atmosphere over a particular location.

    In a recent paper from Fiore’s group, former graduate student Xiaomeng Jin (now a postdoc at the University of California at Berkeley) demonstrated that she could bring together and “beat down the noise in the data,” as Fiore says, to identify trends in ozone formation chemistry over several U.S. metropolitan areas that “are consistent with our on-the-ground understanding from in situ ozone measurements.”

    “This finding implies that we can use these records to learn about changes in surface ozone chemistry in places where we lack on-the-ground monitoring,” says Fiore. Extracting these signals by stringing together satellite data — OMI, GOME, and SCIAMACHY — to produce a two-decade record required reconciling the instruments’ differing orbit days, times, and fields of view on the ground, or spatial resolutions. 
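    One widely used way to diagnose ozone formation chemistry from these two proxies is the formaldehyde-to-NO2 column ratio: low ratios indicate VOC-limited (NOx-saturated) conditions, high ratios indicate NOx-limited conditions. A minimal sketch, with hypothetical column values and illustrative thresholds (published studies tune the cutoffs per region):

    ```python
    import numpy as np

    def ozone_regime(hcho_col, no2_col, low=1.0, high=2.0):
        """Classify ozone-production chemistry from the formaldehyde/NO2
        column ratio. Thresholds `low` and `high` are illustrative only."""
        ratio = hcho_col / no2_col
        return np.where(ratio < low, "VOC-limited",
                        np.where(ratio > high, "NOx-limited", "transitional"))

    # Hypothetical satellite column densities (molecules/cm^2) over a city.
    hcho = np.array([8e15, 1.2e16, 6e15, 1.5e16])
    no2  = np.array([9e15, 7e15,  8e15, 5e15])
    print(ozone_regime(hcho, no2))
    ```

    A shift in a city’s ratio over the two-decade record is the kind of trend in surface ozone chemistry the satellite data can reveal where no ground monitors exist.
    
    
    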

    Currently, spectrometer instruments aboard satellites are retrieving data once per day. However, newer instruments, such as the Geostationary Environment Monitoring Spectrometer launched in February 2020 by the National Institute of Environmental Research in the Ministry of Environment of South Korea, will monitor a particular region continuously, providing much more data in real time.

    Over North America, the Tropospheric Emissions: Monitoring of Pollution (TEMPO) collaboration between NASA and the Smithsonian Astrophysical Observatory, led by Kelly Chance of Harvard University, will provide not only a stationary view of the atmospheric chemistry over the continent, but also a finer-resolution view — with the instrument recording pollution data from only a few square miles per pixel (with an anticipated launch in 2022).

    “What we’re very excited about is the opportunity to have continuous coverage where we get hourly measurements that allow us to follow pollution from morning rush hour through the course of the day and see how plumes of pollution are evolving in real time,” says Fiore.

    Data for the people

    Providing Earth-observing data to people in addition to scientists — namely environmental managers, city planners, and other government officials — is the goal for the NASA Health and Air Quality Applied Sciences Team (HAQAST).

    Since 2016, Fiore has been part of HAQAST, including collaborative “tiger teams” — projects that bring together scientists, nongovernment entities, and government officials — to bring data to bear on real issues.

    For example, in 2017, Fiore led a tiger team that provided guidance to state air management agencies on how satellite data can be incorporated into state implementation plans (SIPs). “Submission of a SIP is required for any state with a region in non-attainment of U.S. National Ambient Air Quality Standards to demonstrate their approach to achieving compliance with the standard,” says Fiore. “What we found is that small tweaks in, for example, the metrics we use to convey the science findings, can go a long way to making the science more usable, especially when there are detailed policy frameworks in place that must be followed.”

    Now, in 2021, Fiore is part of two tiger teams announced by HAQAST in late September. One team is looking at data to address environmental justice issues, by providing data to assess communities disproportionately affected by environmental health risks. Such information can be used to estimate the benefits of governmental investments in environmental improvements for disproportionately burdened communities. The other team is looking at urban emissions of nitrogen oxides to try to better quantify and communicate uncertainties in the estimates of anthropogenic sources of pollution.

    “For our HAQAST work, we’re looking at not just the estimate of the exposure to air pollutants, or in other words their concentrations,” says Fiore, “but how confident are we in our exposure estimates, which in turn affect our understanding of the public health burden due to exposure. We have stakeholder partners at the New York Department of Health who will pair exposure datasets with health data to help prioritize decisions around public health.

    “I enjoy working with stakeholders who have questions that require science to answer and can make a difference in their decisions,” Fiore says.