More stories

  • Study reveals chemical link between wildfire smoke and ozone depletion

    The Australian wildfires in 2019 and 2020 were historic for how far and fast they spread, and for how long and powerfully they burned. All told, the devastating “Black Summer” fires blazed across more than 43 million acres of land, and killed or displaced nearly 3 billion animals. The fires also injected over 1 million tons of smoke particles into the atmosphere, reaching up to 35 kilometers above Earth’s surface — a mass and reach comparable to that of an erupting volcano.

    Now, atmospheric chemists at MIT have found that the smoke from those fires set off chemical reactions in the stratosphere that contributed to the destruction of ozone, which shields the Earth from incoming ultraviolet radiation. The team’s study, appearing this week in the Proceedings of the National Academy of Sciences, is the first to establish a chemical link between wildfire smoke and ozone depletion.

    In March 2020, shortly after the fires subsided, the team observed a sharp drop in nitrogen dioxide in the stratosphere, which is the first step in a chemical cascade that is known to end in ozone depletion. The researchers found that this drop in nitrogen dioxide directly correlates with the amount of smoke that the fires released into the stratosphere. They estimate that this smoke-induced chemistry depleted the column of ozone by 1 percent.

    To put this in context, they note that the phaseout of ozone-depleting gases under a worldwide agreement to stop their production has led to about a 1 percent ozone recovery from earlier ozone decreases over the past 10 years — meaning that the wildfires canceled those hard-won diplomatic gains for a short period. If future wildfires grow stronger and more frequent, as they are predicted to do with climate change, ozone’s projected recovery could be delayed by years. 

    “The Australian fires look like the biggest event so far, but as the world continues to warm, there is every reason to think these fires will become more frequent and more intense,” says lead author Susan Solomon, the Lee and Geraldine Martin Professor of Environmental Studies at MIT. “It’s another wakeup call, just as the Antarctic ozone hole was, in the sense of showing how bad things could actually be.”

    The study’s co-authors include Kane Stone, a research scientist in MIT’s Department of Earth, Atmospheric, and Planetary Sciences, along with collaborators at multiple institutions including the University of Saskatchewan, Jinan University, the National Center for Atmospheric Research, and the University of Colorado at Boulder.

    Chemical trace

    Massive wildfires are known to generate pyrocumulonimbus — towering clouds of smoke that can reach into the stratosphere, the layer of the atmosphere that lies between about 15 and 50 kilometers above the Earth’s surface. The smoke from Australia’s wildfires reached well into the stratosphere, as high as 35 kilometers.

    In 2021, Solomon’s co-author, Pengfei Yu at Jinan University, carried out a separate study of the fires’ impacts and found that the accumulated smoke warmed parts of the stratosphere by as much as 2 degrees Celsius — a warming that persisted for six months. The study also found hints of ozone destruction in the Southern Hemisphere following the fires.

    Solomon wondered whether smoke from the fires could have depleted ozone through a chemistry similar to that of volcanic aerosols. Major volcanic eruptions can also reach into the stratosphere, and in 1989, Solomon discovered that the particles in these eruptions can destroy ozone through a series of chemical reactions. As the particles form in the atmosphere, they gather moisture on their surfaces. Once wet, the particles can react with circulating chemicals in the stratosphere, including dinitrogen pentoxide, which is converted on the particles’ surfaces into nitric acid.

    Normally, dinitrogen pentoxide is broken apart by sunlight to form various nitrogen species, including nitrogen dioxide, a compound that binds with chlorine-containing chemicals in the stratosphere. When volcanic particles instead convert dinitrogen pentoxide into nitric acid, nitrogen dioxide drops, and the chlorine compounds take another path, morphing into chlorine monoxide, the main human-made agent that destroys ozone.

    “This chemistry, once you get past that point, is well-established,” Solomon says. “Once you have less nitrogen dioxide, you have to have more chlorine monoxide, and that will deplete ozone.”
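    For readers who want the cascade spelled out, the reactions behind this description can be sketched as follows. This is a simplified scheme drawn from standard stratospheric chemistry, not equations reproduced from the new paper:

```latex
% Simplified textbook scheme (not taken from the paper itself):
% 1. Hydrolysis of dinitrogen pentoxide on wet particle surfaces removes reactive nitrogen:
\mathrm{N_2O_5 + H_2O\,(particle\ surface) \longrightarrow 2\,HNO_3}
% 2. With less NO2 available, less chlorine is tied up in the ClONO2 reservoir:
\mathrm{ClO + NO_2 + M \longrightarrow ClONO_2 + M}
% 3. So more chlorine remains active as ClO and takes part in catalytic ozone loss, e.g.:
\mathrm{Cl + O_3 \longrightarrow ClO + O_2}
```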

    Cloud injection

    In the new study, Solomon and her colleagues looked at how concentrations of nitrogen dioxide in the stratosphere changed following the Australian fires. If these concentrations dropped significantly, it would signal that wildfire smoke depletes ozone through the same chemical reactions as some volcanic eruptions.

    The team looked to observations of nitrogen dioxide taken by three independent satellites that have surveyed the Southern Hemisphere for varying lengths of time. They compared each satellite’s record in the months and years leading up to and following the Australian fires. All three records showed a significant drop in nitrogen dioxide in March 2020. For one satellite’s record, the drop represented a record low among observations spanning the last 20 years.

    To check that the nitrogen dioxide decrease was a direct chemical effect of the fires’ smoke, the researchers carried out atmospheric simulations using a global, three-dimensional model that simulates hundreds of chemical reactions in the atmosphere, from the surface on up through the stratosphere.

    The team injected a cloud of smoke particles into the model, simulating what was observed from the Australian wildfires. They assumed that the particles, like volcanic aerosols, gathered moisture. They then ran the model multiple times and compared the results to simulations without the smoke cloud.

    In every simulation incorporating wildfire smoke, the team found that as the amount of smoke particles increased in the stratosphere, concentrations of nitrogen dioxide decreased, matching the observations of the three satellites.
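    The underlying comparison is essentially a correlation between stratospheric smoke loading and the resulting nitrogen dioxide. A minimal sketch of that kind of check, using made-up numbers rather than the study’s model output or satellite retrievals, might look like this:

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical stratospheric smoke loadings for several model runs (arbitrary
# units) and the NO2 column each run produced (also arbitrary, invented values).
aerosol_loading = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5])
model_no2 = np.array([1.00, 0.93, 0.87, 0.80, 0.74, 0.69])

r, p = pearsonr(aerosol_loading, model_no2)
print(f"correlation between smoke loading and NO2: r = {r:.2f} (p = {p:.3g})")
# A strongly negative r, seen in both model runs and satellite records, is the
# kind of "fingerprint" the article describes: more smoke, less nitrogen dioxide.
```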

    “The behavior we saw, of more and more aerosols, and less and less nitrogen dioxide, in both the model and the data, is a fantastic fingerprint,” Solomon says. “It’s the first time that science has established a chemical mechanism linking wildfire smoke to ozone depletion. It may only be one chemical mechanism among several, but it’s clearly there. It tells us these particles are wet and they had to have caused some ozone depletion.”

    She and her collaborators are looking into other reactions triggered by wildfire smoke that might further contribute to stripping ozone. For the time being, the major driver of ozone depletion remains chlorofluorocarbons, or CFCs — chemicals such as old refrigerants that have been banned under the Montreal Protocol, though they continue to linger in the stratosphere. But as global warming leads to stronger, more frequent wildfires, their smoke could have a serious, lasting impact on ozone.

    “Wildfire smoke is a toxic brew of organic compounds that are complex beasts,” Solomon says. “And I’m afraid ozone is getting pummeled by a whole series of reactions that we are now furiously working to unravel.”

    This research was supported in part by the National Science Foundation and NASA.

  • Tuning in to invisible waves on the JET tokamak

    Research scientist Alex Tinguely is readjusting to Cambridge and Boston.

    As a postdoc with the Plasma Science and Fusion Center (PSFC), the MIT graduate spent the last two years in Oxford, England, a city he recalls can be traversed entirely “in the time it takes to walk from MIT to Harvard.” With its ancient stone walls, cathedrals, cobblestone streets, and winding paths, that small city was his home base for a big project: JET, a tokamak that is currently the largest operating magnetic fusion energy experiment in the world.

    Located at the Culham Center for Fusion Energy (CCFE), part of the U.K. Atomic Energy Authority, this key research center of the European Fusion Program has recently announced historic success. Using a 50-50 deuterium-tritium fuel mixture for the first time since 1997, JET established a fusion power record of 10 megawatts output over five seconds. It produced 59 megajoules of fusion energy, more than doubling the 22 megajoule record it set in 1997. As a member of the JET Team, Tinguely has overseen the measurement and instrumentation systems (diagnostics) contributed by the MIT group.

    A lucky chance

    The postdoctoral opportunity arose just as Tinguely was graduating with a PhD in physics from MIT. Managed by Professor Miklos Porkolab as the principal investigator for over 20 years, this postdoctoral program has prepared multiple young researchers for careers in fusion facilities around the world. The collaborative research provided Tinguely the chance to work on a fusion device that would be adding tritium to the usual deuterium fuel.

    Fusion, the process that fuels the sun and other stars, could provide a long-term source of carbon-free power on Earth, if it can be harnessed. For decades researchers have tried to create an artificial star in a doughnut-shaped bottle, or “tokamak,” using magnetic fields to keep the turbulent plasma fuel confined and away from the walls of its container long enough for fusion to occur.

    In his graduate student days at MIT, Tinguely worked on the PSFC’s Alcator C-Mod tokamak, now decommissioned, which, like most magnetic fusion devices, used deuterium to create the plasmas for experiments. JET, since beginning operation in 1983, has done the same, later joining a small number of facilities that added tritium, a radioactive isotope of hydrogen. While this addition increases the amount of fusion, it also creates much more radiation and activation.

    Tinguely considers himself fortunate to have been placed at JET.

    “There aren’t that many operating tokamaks in the U.S. right now,” says Tinguely, “not to mention one that would be running deuterium-tritium (DT), which hasn’t been run for over 20 years, and which would be making some really important measurements. I got a very lucky spot where I was an MIT postdoc, but I lived in Oxford, working on a very international project.”

    Strumming magnetic field lines

    The measurements that interest Tinguely are of low-frequency electromagnetic waves in tokamak plasmas. Tinguely uses an antenna diagnostic developed by MIT, EPFL Swiss Plasma Center, and CCFE to probe the so-called Alfvén eigenmodes when they are stable, before the energetic alpha particles produced by DT fusion plasmas can drive them toward instability.

    What makes MIT’s “Alfvén Eigenmode Active Diagnostic” essential is that without it researchers cannot see, or measure, stable eigenmodes. Unstable modes show up clearly as magnetic fluctuations in the data, but stable waves are invisible without prompting from the antenna. These measurements help researchers understand the physics of Alfvén waves and their potential for degrading fusion performance, providing insights that will be increasingly important for future DT fusion devices.

    Tinguely likens the diagnostic to fingers on guitar strings.

    “The magnetic field lines in the tokamak are like guitar strings. If you have nothing to give energy to the strings — or give energy to the waves of the magnetic field lines — they just sit there, they don’t do anything. The energetic plasma particles can essentially ‘play the guitar strings,’ strum the magnetic field lines of the plasma, and that’s when you can see the waves in your plasma. But if the energetic particle drive of the waves is not strong enough you won’t see them, so you need to come along and ‘pluck the strings’ with our antenna. And that’s how you learn some information about the waves.”
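    In signal-processing terms, the antenna sweeps its drive frequency and the analysis looks for the resonant response of an otherwise invisible, damped mode. The toy example below illustrates the idea with a generic driven, damped oscillator; the numbers are invented, and this is a sketch of the concept, not the actual JET antenna analysis:

```python
import numpy as np

f0, gamma = 150e3, 3e3                 # hypothetical mode frequency (Hz) and damping rate (Hz)
w0, g = 2 * np.pi * f0, 2 * np.pi * gamma

drive_f = np.linspace(100e3, 200e3, 2001)                        # antenna frequency sweep (Hz)
w = 2 * np.pi * drive_f
response = 1.0 / np.sqrt((w0**2 - w**2) ** 2 + (g * w) ** 2)     # driven, damped oscillator amplitude

peak = np.argmax(response)                                       # resonance: the "plucked string" rings loudest here
half_power = response[peak] / np.sqrt(2)
band = drive_f[response >= half_power]                           # half-power width tracks the damping
print(f"inferred mode frequency ~ {drive_f[peak] / 1e3:.1f} kHz")
print(f"inferred damping (half-power width) ~ {(band[-1] - band[0]) / 1e3:.1f} kHz")
```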

    Much of Tinguely’s experience on JET took place during the Covid-19 pandemic, when off-site operation and analysis were the norm. However, because the MIT diagnostic needed to be physically turned on and off, someone from Tinguely’s team needed to be on site twice a day, a routine that became even less convenient when tritium was introduced.

    “When you have deuterium and tritium, you produce a lot of neutrons. So, some of the buildings became off-limits during operation, which meant they had to be turned on really early in the morning, like 6:30 a.m., and then turned off very late at night, around 10:30 p.m.”

    Looking to the future

    Now a research scientist at the PSFC, Tinguely continues to work at JET remotely. He sometimes wishes he could again ride that train from Oxford to Culham — which he fondly remembers for its clean, comfortable efficiency — to see work colleagues and to visit local friends. The life he created for himself in England included practice and performance with the 125-year-old Oxford Bach Choir, as well as weekly dinner service at The Gatehouse, a facility that offers free support for the local homeless and low-income communities.

    “Being back is exciting too,” he says. “It’s fun to see how things have changed, how people and projects have grown, what new opportunities have arrived.”

    He refers specifically to a project that is beginning to take up more of his time: SPARC, the tokamak the PSFC supports in collaboration with Commonwealth Fusion Systems. Designed to use deuterium-tritium to make net fusion gains, SPARC will be able to use the latest research on JET to advantage. Tinguely is already exploring how his expertise with Alfvén eigenmodes can support the experiment.

    “I actually had an opportunity to do my PhD — or DPhil as they would call it — at Oxford University, but I went to MIT for grad school instead,” Tinguely reveals. “So, this is almost like closure, in a sense. I got to have my Oxford experience in the end, just in a different way, and have the MIT experience too.”

    He adds, “And I see myself being here at MIT for some time.”

  • More sensitive X-ray imaging

    Scintillators are materials that emit light when bombarded with high-energy particles or X-rays. In medical or dental X-ray systems, they convert incoming X-ray radiation into visible light that can then be captured using film or photosensors. They’re also used for night-vision systems and for research, such as in particle detectors or electron microscopes.

    Researchers at MIT have now shown how one could improve the efficiency of scintillators by at least tenfold, and perhaps even a hundredfold, by changing the material’s surface to create certain nanoscale configurations, such as arrays of wave-like ridges. While past attempts to develop more efficient scintillators have focused on finding new materials, the new approach could in principle work with any of the existing materials.

    Though it will require more time and effort to integrate their scintillators into existing X-ray machines, the team believes that this method might lead to improvements in medical diagnostic X-rays or CT scans, to reduce dose exposure and improve image quality. In other applications, such as X-ray inspection of manufactured parts for quality control, the new scintillators could enable inspections with higher accuracy or at faster speeds.

    The findings are described today in the journal Science, in a paper by MIT doctoral students Charles Roques-Carmes and Nicholas Rivera; MIT professors Marin Soljacic, Steven Johnson, and John Joannopoulos; and 10 others.

    While scintillators have been in use for some 70 years, much of the research in the field has focused on developing new materials that produce brighter or faster light emissions. The new approach instead applies advances in nanotechnology to existing materials. By creating patterns in scintillator materials at a length scale comparable to the wavelengths of the light being emitted, the team found that it was possible to dramatically change the material’s optical properties.

    To make what they coined “nanophotonic scintillators,” Roques-Carmes says, “you can directly make patterns inside the scintillators, or you can glue on another material that would have holes on the nanoscale. The specifics depend on the exact structure and material.” For this research, the team took a scintillator and made holes spaced apart by roughly one optical wavelength, or about 500 nanometers (billionths of a meter).

    “The key to what we’re doing is a general theory and framework we have developed,” Rivera says. This allows the researchers to calculate the scintillation levels that would be produced by any arbitrary configuration of nanophotonic structures. The scintillation process itself involves a series of steps, making it complicated to unravel. The framework the team developed involves integrating three different types of physics, Roques-Carmes says. Using this system they have found a good match between their predictions and the results of their subsequent experiments.

    The experiments showed a tenfold improvement in emission from the treated scintillator. “So, this is something that might translate into applications for medical imaging, which are optical photon-starved, meaning the conversion of X-rays to optical light limits the image quality. [In medical imaging,] you do not want to irradiate your patients with too much of the X-rays, especially for routine screening, and especially for young patients as well,” Roques-Carmes says.

    “We believe that this will open a new field of research in nanophotonics,” he adds. “You can use a lot of the existing work and research that has been done in the field of nanophotonics to improve significantly on existing materials that scintillate.”

    “The research presented in this paper is hugely significant,” says Rajiv Gupta, chief of neuroradiology at Massachusetts General Hospital and an associate professor at Harvard Medical School, who was not associated with this work. “Nearly all detectors used in the $100 billion [medical X-ray] industry are indirect detectors,” which is the type of detector the new findings apply to, he says. “Everything that I use in my clinical practice today is based on this principle. This paper improves the efficiency of this process by 10 times. If this claim is even partially true, say the improvement is two times instead of 10 times, it would be transformative for the field!”

    Soljacic says that while their experiments proved a tenfold improvement in emission could be achieved in particular systems, by further fine-tuning the design of the nanoscale patterning, “we also show that you can get up to 100 times [improvement] in certain scintillator systems, and we believe we also have a path toward making it even better.”

    Soljacic points out that in other areas of nanophotonics, a field that deals with how light interacts with materials that are structured at the nanometer scale, the development of computational simulations has enabled rapid, substantial improvements, for example in the development of solar cells and LEDs. The new models this team developed for scintillating materials could facilitate similar leaps in this technology, he says.

    Nanophotonics techniques “give you the ultimate power of tailoring and enhancing the behavior of light,” Soljacic says. “But until now, this promise, this ability to do this with scintillation was unreachable because modeling the scintillation was very challenging. Now, this work for the first time opens up this field of scintillation, fully opens it, for the application of nanophotonics techniques.” More generally, the team believes that the combination of nanophotonics and scintillators might ultimately enable higher resolution, reduced X-ray dose, and energy-resolved X-ray imaging.

    This work is “very original and excellent,” says Eli Yablonovitch, a professor of Electrical Engineering and Computer Sciences at the University of California at Berkeley, who was not associated with this research. “New scintillator concepts are very important in medical imaging and in basic research.”

    Yablonovitch adds that while the concept still needs to be proven in a practical device, “After years of research on photonic crystals in optical communication and other fields, it’s long overdue that photonic crystals should be applied to scintillators, which are of great practical importance yet have been overlooked” until this work.

    The research team included Ali Ghorashi, Steven Kooi, Yi Yang, Zin Lin, Justin Beroz, Aviram Massuda, Jamison Sloan, and Nicolas Romeo at MIT; Yang Yu at Raith America, Inc.; and Ido Kaminer at Technion in Israel. The work was supported, in part, by the U.S. Army Research Office and the U.S. Army Research Laboratory through the Institute for Soldier Nanotechnologies, by the Air Force Office of Scientific Research, and by a Mathworks Engineering Fellowship.

  • Advancing public understanding of sea-level rise

    Museum exhibits can be a unique way to communicate science concepts and information. Recently, MIT faculty have served as sounding boards for curators at the Museum of Science, Boston, a close neighbor of the MIT campus.

    In January, Professor Emerita Paola Malanotte-Rizzoli and Cecil and Ida Green Professor Raffaele Ferrari of the Department of Earth, Atmospheric and Planetary Sciences (EAPS) visited the museum to view the newly opened pilot exhibit, “Resilient Venice: Adapting to Climate Change.”

    When Malanotte-Rizzoli was asked to contribute her expertise on the efforts in Venice, Italy, to mitigate flood damage, she was more than willing to offer her knowledge. “I love Venice. It is fun to tell people all of the challenges which you see the lagoon has … how much must be done to preserve, not only the city, but the environment, the islands and buildings,” she says.

    The installation is the second Museum of Science exhibit to be developed in recent years in consultation with EAPS scientists. In December 2020, “Arctic Adventure: Exploring with Technology” opened with the help of Cecil and Ida Green Career Development Professor Brent Minchew, who lent his expertise in geophysics and glaciology to the project. But for Malanotte-Rizzoli, the new exhibit hits a little closer to home.

    “My house is there,” Malanotte-Rizzoli excitedly pointed out on the exhibit’s aerial view of Venice, which includes a view above St. Mark’s Square and some of the surrounding city.

    “Resilient Venice” focuses on Malanotte-Rizzoli’s hometown, a city known for flooding. Built on a group of islands in the Venetian Lagoon, Venice has always experienced flooding, but climate change has brought unprecedented tide levels, causing billions of dollars in damage and even two deaths in the flood of 2019.

    The dark exhibit hall is lined with immersive images created by Iconem, a startup whose mission is digital preservation of endangered World Heritage Sites. The firm took detailed 3D scans and images of Venice to put together the displays and video.

    The video on which Malanotte-Rizzoli pointed to her home shows the potential sea level rise by 2100 if action isn’t taken. It shows the entrance to St. Mark’s Basilica completely submerged in water; she compares it to the disaster movie “The Day After Tomorrow.”

    The MOSE system

    Between critiques of the choice of music (“that’s not very Venice-inspired,” joked Ferrari, who is also Italian) and bits of conversation exchanged in Italian, the two scientists do what scientists do: discuss technicalities.

    Ferrari pointed to a model of a gate system and asked Malanotte-Rizzoli if the hydraulic jump seen in the model is present in the MOSE system; she confirmed it is not.

    This is the part of the exhibit that Malanotte-Rizzoli was consulted on. One of the plans Venice has implemented to address the flooding is the MOSE system — short for Modulo Sperimentale Elettromeccanico, or the Experimental Electromechanical Module. The MOSE is a system of flood barriers designed to protect the city from extremely high tides. Construction began in 2003, and its first successful operation happened on Oct. 3, 2020, when it prevented a tide 53 inches above normal from flooding the city.

    The barriers are made of a series of gates, each 66-98 feet in length and 66 feet wide, which sit in chambers built into the sea floor when not in use to allow boats and wildlife to travel between the ocean and lagoon. The gates are filled with water to keep them submerged; when activated, air is pumped into them, pushing out the water and allowing them to rise. The entire process takes 30 minutes to complete, and half that time to return to the sea floor.
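    The physics of the rise is simple buoyancy: a gate lifts once the seawater it displaces outweighs the structure plus whatever water ballast remains inside. The sketch below illustrates that condition with placeholder figures, since the article gives only the gates’ length and width, not their mass or thickness:

```python
RHO_SEAWATER = 1025.0                  # kg/m^3

# Hypothetical gate: a hollow box roughly 25 m x 20 m x 4 m (placeholder figures)
gate_volume   = 25.0 * 20.0 * 4.0      # m^3 of seawater displaced when the gate is full of air
steel_mass    = 3.0e5                  # kg, assumed structural mass
water_ballast = 1.8e6                  # kg, assumed water filling the gate while it rests on the sea floor

def is_buoyant(ballast_kg: float) -> bool:
    """The gate rises once the displaced seawater outweighs structure plus remaining ballast."""
    displaced_mass = RHO_SEAWATER * gate_volume
    return displaced_mass > steel_mass + ballast_kg

print(is_buoyant(water_ballast))   # False: fully ballasted, the gate stays in its chamber
print(is_buoyant(0.0))             # True: with the water pushed out by air, the gate swings up
```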

    The top of the gates in the MOSE come out of the water completely and are individually controlled so that sections can remain open to allow ships to pass through. In the model, the gate remains partially submerged, and as the high-velocity water passes over it into an area of low velocity, it creates a small rise of water before it falls over the edge of the barrier, creating a hydraulic jump.

    But Malanotte-Rizzoli joked that only scientists will care about that; otherwise, the model does a good job demonstrating how the MOSE gates rise and fall.

    The MOSE system is only one of many measures taken to mitigate the rising water levels in Venice and to protect the lagoon and the surrounding area, and this is an important point for Malanotte-Rizzoli, who worked on the project from 1995 to 2013.

    “It is not the MOSE or,” she emphasized. “It is the MOSE and.” Other complementary plans have been implemented to reduce harm both to economic sectors such as shipping and tourism and to the wildlife that lives in the lagoon.

    Beyond barriers

    There’s more to protecting Venice than navigating flooded streets — it’s not just “putting on rainboots,” as Malanotte-Rizzoli put it.

    “It’s destroying the walls,” she said, pointing out the corrosive effects of water on a model building, which emphasizes the damage to architecture caused by the unusually high flood levels. “People don’t think about this.” The exhibit also emphasizes the economic cost to businesses by having visitors take down and rebuild a flood barrier for a gelato shop as the water level rises and falls.

    Malanotte-Rizzoli gave the exhibit her seal of approval, but the Venice section is only a small portion of what the finished exhibit will look like. The current plan involves expanding it to include a few other World Heritage Sites.

    “How do we make people care about a site that they haven’t been to?” asked Julia Tate, the project manager of touring exhibits and exhibit production at the museum. She said it’s easy to start with a city like Venice, since it’s a popular tourist destination, but trickier to get people to care about a site they may never have visited, such as Easter Island, which is just as much at risk. The plan is to incorporate a few more sites before turning it into a traveling exhibit that will end by asking visitors to think about climate change in their own towns.

    “We want them to think about solutions and how to do better,” said Tate. Hope is the alternative message: It’s not too late to act.

    Malanotte-Rizzoli thinks it’s important for Bostonians to see their own city in Venice, as Boston is also at risk from sea level rise. The history of Boston reminds Malanotte-Rizzoli of her hometown and is one of the reasons she was willing to emigrate. The history encompassed in Boston makes the need for preservation even more important.

    “Those things that cannot be replaced, they must be respected in the process of preservation,” she said. “Modern things and engineering can be done even in a city which is so fragile, so delicate.”

  • Understanding air pollution from space

    Climate change and air pollution are interlocking crises that threaten human health. Reducing emissions of some air pollutants can help achieve climate goals, and some climate mitigation efforts can in turn improve air quality.

    One part of MIT Professor Arlene Fiore’s research program investigates the fundamental science of air pollutants — how long they persist and how they move through our environment to affect air quality.

    “We need to understand the conditions under which pollutants, such as ozone, form. How much ozone is formed locally and how much is transported long distances?” says Fiore, who notes that Asian air pollution can be transported across the Pacific Ocean to North America. “We need to think about processes spanning local to global dimensions.”

    Fiore, the Peter H. Stone and Paola Malanotte Stone Professor in Earth, Atmospheric and Planetary Sciences, analyzes data from on-the-ground readings and from satellites, along with models, to better understand the chemistry and behavior of air pollutants — which ultimately can inform mitigation strategies and policy setting.

    A global concern

    At the United Nations’ most recent climate change conference, COP26, air quality management was a topic discussed over two days of presentations.

    “Breathing is vital. It’s life. But for the vast majority of people on this planet right now, the air that they breathe is not giving life, but cutting it short,” said Sarah Vogel, senior vice president for health at the Environmental Defense Fund, at the COP26 session.

    “We need to confront this twin challenge now through both a climate and clean air lens, of targeting those pollutants that both warm the air and harm our health.”

    Earlier this year, the World Health Organization (WHO) updated the global air quality guidelines it had issued 15 years earlier for six key pollutants, including ozone (O3), nitrogen dioxide (NO2), sulfur dioxide (SO2), and carbon monoxide (CO). The new guidelines are more stringent, based on what the WHO stated is the “quality and quantity of evidence” of how these pollutants affect human health. The WHO estimates that roughly 7 million premature deaths each year are attributable to the joint effects of ambient and household air pollution.

    “We’ve had all these health-motivated reductions of aerosol and ozone precursor emissions. What are the implications for the climate system, both locally but also around the globe? How does air quality respond to climate change? We study these two-way interactions between air pollution and the climate system,” says Fiore.

    But fundamental science is still required to understand how gases, such as ozone and nitrogen dioxide, linger and move throughout the troposphere — the lowermost layer of our atmosphere, containing the air we breathe.

    “We care about ozone in the air we’re breathing where we live at the Earth’s surface,” says Fiore. “Ozone reacts with biological tissue, and can be damaging to plants and human lungs. Even if you’re a healthy adult, if you’re out running hard during an ozone smog event, you might feel an extra weight on your lungs.”

    Telltale signs from space

    Ozone is not emitted directly, but instead forms through chemical reactions catalyzed by radiation from the sun interacting with nitrogen oxides — pollutants released in large part from burning fossil fuels — and volatile organic compounds. However, current satellite instruments cannot sense ground-level ozone.

    “We can’t retrieve surface- or even near-surface ozone from space,” says Fiore of the satellite data, “although the anticipated launch of a new instrument looks promising for new advances in retrieving lower-tropospheric ozone.” Instead, scientists can look at signatures from other gas emissions to get a sense of ozone formation. “Nitrogen dioxide and formaldehyde are a heavy focus of our research because they serve as proxies for two of the key ingredients that go on to form ozone in the atmosphere.”
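    For context, the basic photochemistry linking these precursors to ozone can be written in a few textbook reactions. This is a general simplification, not chemistry specific to this research:

```latex
% Simplified tropospheric ozone photochemistry (textbook scheme):
% NO2 photolysis releases an oxygen atom that forms ozone,
\mathrm{NO_2 + h\nu \longrightarrow NO + O}
\mathrm{O + O_2 + M \longrightarrow O_3 + M}
% while VOC oxidation (with formaldehyde as a common intermediate) produces
% peroxy radicals that convert NO back to NO2 without consuming ozone,
% sustaining the cycle:
\mathrm{RO_2 + NO \longrightarrow RO + NO_2}
```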

    To understand ozone formation via these precursor pollutants, scientists have gathered data for more than two decades using spectrometer instruments aboard satellites that measure sunlight in ultraviolet and visible wavelengths that interact with these pollutants in the Earth’s atmosphere — known as solar backscatter radiation.

    Satellites, such as NASA’s Aura, carry instruments like the Ozone Monitoring Instrument (OMI). OMI, along with European-launched satellites such as the Global Ozone Monitoring Experiment (GOME) and the SCanning Imaging Absorption spectroMeter for Atmospheric CHartographY (SCIAMACHY), and the newest-generation TROPOspheric Monitoring Instrument (TROPOMI), all orbit the Earth, collecting data during daylight hours when sunlight is interacting with the atmosphere over a particular location.

    In a recent paper from Fiore’s group, former graduate student Xiaomeng Jin (now a postdoc at the University of California at Berkeley) demonstrated that she could bring together and “beat down the noise in the data,” as Fiore says, to identify trends in ozone formation chemistry over several U.S. metropolitan areas that “are consistent with our on-the-ground understanding from in situ ozone measurements.”

    “This finding implies that we can use these records to learn about changes in surface ozone chemistry in places where we lack on-the-ground monitoring,” says Fiore. Extracting these signals by stringing together satellite data — OMI, GOME, and SCIAMACHY — to produce a two-decade record required reconciling the instruments’ differing orbit days, times, and fields of view on the ground, or spatial resolutions. 
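    Conceptually, that reconciliation means putting records with different sampling and calibration onto a common grid before joining them. The sketch below shows the flavor of such a merge with two invented records; it is not the actual OMI/GOME/SCIAMACHY processing used in the study:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Instrument A: roughly 3-day repeat sampling, 1996-2002 (made-up values)
a_times = pd.date_range("1996-01-01", "2002-12-31", freq="3D")
rec_a = pd.Series(1.0 + 0.05 * rng.standard_normal(len(a_times)), index=a_times)

# Instrument B: daily sampling, 2004-2016, with a calibration offset (made-up)
b_times = pd.date_range("2004-01-01", "2016-12-31", freq="D")
rec_b = pd.Series(1.1 + 0.05 * rng.standard_normal(len(b_times)), index=b_times)

# Put both on a common monthly grid, then remove each instrument's own mean so
# the merged record reflects relative changes rather than absolute calibration.
monthly_a = rec_a.resample("MS").mean()
monthly_b = rec_b.resample("MS").mean()
merged = pd.concat([monthly_a - monthly_a.mean(), monthly_b - monthly_b.mean()])
print(merged.head(), merged.tail(), sep="\n")
```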

    Currently, spectrometer instruments aboard satellites are retrieving data once per day. However, newer instruments, such as the Geostationary Environment Monitoring Spectrometer launched in February 2020 by the National Institute of Environmental Research in the Ministry of Environment of South Korea, will monitor a particular region continuously, providing much more data in real time.

    Over North America, the Tropospheric Emissions: Monitoring of Pollution (TEMPO) collaboration between NASA and the Smithsonian Astrophysical Observatory, led by Kelly Chance of Harvard University, will provide not only a stationary view of the atmospheric chemistry over the continent, but also a finer-resolution view — with the instrument recording pollution data from only a few square miles per pixel (with an anticipated launch in 2022).

    “What we’re very excited about is the opportunity to have continuous coverage where we get hourly measurements that allow us to follow pollution from morning rush hour through the course of the day and see how plumes of pollution are evolving in real time,” says Fiore.

    Data for the people

    Providing Earth-observing data to people in addition to scientists — namely environmental managers, city planners, and other government officials — is the goal for the NASA Health and Air Quality Applied Sciences Team (HAQAST).

    Since 2016, Fiore has been part of HAQAST, including collaborative “tiger teams” — projects that bring together scientists, nongovernment entities, and government officials — to bring data to bear on real issues.

    For example, in 2017, Fiore led a tiger team that provided guidance to state air management agencies on how satellite data can be incorporated into state implementation plans (SIPs). “Submission of a SIP is required for any state with a region in non-attainment of U.S. National Ambient Air Quality Standards to demonstrate their approach to achieving compliance with the standard,” says Fiore. “What we found is that small tweaks in, for example, the metrics we use to convey the science findings, can go a long way to making the science more usable, especially when there are detailed policy frameworks in place that must be followed.”

    Now, in 2021, Fiore is part of two tiger teams announced by HAQAST in late September. One team is looking at data to address environmental justice issues, by providing data to assess communities disproportionately affected by environmental health risks. Such information can be used to estimate the benefits of governmental investments in environmental improvements for disproportionately burdened communities. The other team is looking at urban emissions of nitrogen oxides to try to better quantify and communicate uncertainties in the estimates of anthropogenic sources of pollution.

    “For our HAQAST work, we’re looking at not just the estimate of the exposure to air pollutants, or in other words their concentrations,” says Fiore, “but how confident are we in our exposure estimates, which in turn affect our understanding of the public health burden due to exposure. We have stakeholder partners at the New York Department of Health who will pair exposure datasets with health data to help prioritize decisions around public health.

    “I enjoy working with stakeholders who have questions that require science to answer and can make a difference in their decisions,” Fiore says.

  • Seeing the plasma edge of fusion experiments in new ways with artificial intelligence

    To make fusion energy a viable resource for the world’s energy grid, researchers need to understand the turbulent motion of plasmas: a mix of ions and electrons swirling around in reactor vessels. The plasma particles, following magnetic field lines in toroidal chambers known as tokamaks, must be confined long enough for fusion devices to produce significant gains in net energy, a challenge when the hot edge of the plasma (over 1 million degrees Celsius) is just centimeters away from the much cooler solid walls of the vessel.

    Abhilash Mathews, a PhD candidate in the Department of Nuclear Science and Engineering working at MIT’s Plasma Science and Fusion Center (PSFC), believes this plasma edge to be a particularly rich source of unanswered questions. A turbulent boundary, it is central to understanding plasma confinement, fueling, and the potentially damaging heat fluxes that can strike material surfaces — factors that impact fusion reactor designs.

    To better understand edge conditions, scientists focus on modeling turbulence at this boundary using numerical simulations that will help predict the plasma’s behavior. However, “first principles” simulations of this region are among the most challenging and time-consuming computations in fusion research. Progress could be accelerated if researchers could develop “reduced” computer models that run much faster, but with quantified levels of accuracy.

    For decades, tokamak physicists have regularly used a reduced “two-fluid theory” rather than higher-fidelity models to simulate boundary plasmas in experiments, despite uncertainty about its accuracy. In a pair of recent publications, Mathews begins directly testing the accuracy of this reduced plasma turbulence model in a new way: he combines physics with machine learning.

    “A successful theory is supposed to predict what you’re going to observe,” explains Mathews, “for example, the temperature, the density, the electric potential, the flows. And it’s the relationships between these variables that fundamentally define a turbulence theory. What our work essentially examines is the dynamic relationship between two of these variables: the turbulent electric field and the electron pressure.”

    In the first paper, published in Physical Review E, Mathews employs a novel deep-learning technique that uses artificial neural networks to build representations of the equations governing the reduced fluid theory. With this framework, he demonstrates a way to compute the turbulent electric field from an electron pressure fluctuation in the plasma consistent with the reduced fluid theory. Models commonly used to relate the electric field to pressure break down when applied to turbulent plasmas, but this one is robust even to noisy pressure measurements.
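    The general idea, stripped of the actual drift-reduced fluid equations and neural networks, is to infer an unobserved field from a noisy observed one by requiring the pair to satisfy an assumed model relation. The toy sketch below uses a deliberately simple stand-in relation (E = -dp/dx) plus a smoothness penalty; it illustrates the concept of physics-constrained inference, not the method in the paper:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
x = np.linspace(0.0, 2 * np.pi, 100)
dx = x[1] - x[0]

p_true = np.sin(x) + 0.3 * np.sin(3 * x)                   # stand-in "electron pressure" profile
p_obs = p_true + 0.05 * rng.standard_normal(x.size)        # noisy measurement of it

def objective(E):
    model_residual = E + np.gradient(p_obs, dx)            # enforce the assumed relation E = -dp/dx
    roughness = np.diff(E, 2) / dx**2                      # penalize jagged, noise-chasing solutions
    return np.sum(model_residual**2) + 1e-4 * np.sum(roughness**2)

result = minimize(objective, np.zeros_like(x), method="L-BFGS-B")
E_inferred = result.x
E_true = -(np.cos(x) + 0.9 * np.cos(3 * x))                # exact field for the noise-free pressure
print("rms error of inferred field:", np.sqrt(np.mean((E_inferred - E_true) ** 2)))
```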

    In the second paper, published in Physics of Plasmas, Mathews further investigates this connection, contrasting it against higher-fidelity turbulence simulations. This first-of-its-kind comparison of turbulence across models has previously been difficult — if not impossible — to evaluate precisely. Mathews finds that in plasmas relevant to existing fusion devices, the reduced fluid model’s predicted turbulent fields are consistent with high-fidelity calculations. In this sense, the reduced turbulence theory works. But to fully validate it, “one should check every connection between every variable,” says Mathews.

    Mathews’ advisor, Principal Research Scientist Jerry Hughes, notes that plasma turbulence is notoriously difficult to simulate, more so than the familiar turbulence seen in air and water. “This work shows that, under the right set of conditions, physics-informed machine-learning techniques can paint a very full picture of the rapidly fluctuating edge plasma, beginning from a limited set of observations. I’m excited to see how we can apply this to new experiments, in which we essentially never observe every quantity we want.”

    These physics-informed deep-learning methods pave new ways in testing old theories and expanding what can be observed from new experiments. David Hatch, a research scientist at the Institute for Fusion Studies at the University of Texas at Austin, believes these applications are the start of a promising new technique.

    “Abhi’s work is a major achievement with the potential for broad application,” he says. “For example, given limited diagnostic measurements of a specific plasma quantity, physics-informed machine learning could infer additional plasma quantities in a nearby domain, thereby augmenting the information provided by a given diagnostic. The technique also opens new strategies for model validation.”

    Mathews sees exciting research ahead.

    “Translating these techniques into fusion experiments for real edge plasmas is one goal we have in sight, and work is currently underway,” he says. “But this is just the beginning.”

    Mathews was supported in this work by the Manson Benedict Fellowship, Natural Sciences and Engineering Research Council of Canada, and U.S. Department of Energy Office of Science under the Fusion Energy Sciences program.

  • Predator interactions chiefly determine where Prochlorococcus thrive

    Prochlorococcus are the smallest and most abundant photosynthesizing organisms on the planet. A single Prochlorococcus cell is dwarfed by a human red blood cell, yet globally the microbes number in the octillions and are responsible for a large fraction of the world’s oxygen production as they turn sunlight into energy.

    Prochlorococcus can be found in the ocean’s warm surface waters, and their population drops off dramatically in regions closer to the poles. Scientists have assumed that, as with many marine species, Prochlorococcus’ range is set by temperature: The colder the waters, the less likely the microbes are to live there.

    But MIT scientists have found that where the microbe lives is not determined primarily by temperature. While Prochlorococcus populations do drop off in colder waters, it’s a relationship with a shared predator, and not temperature, that sets the microbe’s range. These findings, published today in the Proceedings of the National Academy of Sciences, could help scientists predict how the microbes’ populations will shift with climate change.

    “People assume that if the ocean warms up, Prochlorococcus will move poleward. And that may be true, but not for the reason they’re predicting,” says study co-author Stephanie Dutkiewicz, senior research scientist in MIT’s Department of Earth, Atmospheric and Planetary Sciences (EAPS). “So, temperature is a bit of a red herring.”

    Dutkiewicz’s co-authors on the study are lead author and EAPS Research Scientist Christopher Follett, EAPS Professor Mick Follows, François Ribalet and Virginia Armbrust of the University of Washington, and Emily Zakem and David Caron of the University of Southern California.

    Temperature’s collapse

    While temperature is thought to set the range of Prochlorococcus and other phytoplankton in the ocean, Follett, Dutkiewicz, and their colleagues noticed a curious dissonance in the data.

    The team examined observations from several research cruises that sailed through the northeast Pacific Ocean in 2003, 2016, and 2017. Each vessel traversed different latitudes, sampling waters continuously and measuring concentrations of various species of bacteria and phytoplankton, including Prochlorococcus. 

    The MIT team used the publicly archived cruise data to map out the locations where Prochlorococcus noticeably decreased or collapsed, along with each location’s ocean temperature. Surprisingly, they found that Prochlorococcus’ collapse occurred in regions of widely varying temperatures, ranging from around 13 to 18 degrees Celsius. Curiously, the upper end of this range has been shown in lab experiments to be suitable conditions for Prochlorococcus to grow and thrive.

    “Temperature itself was not able to explain where we saw these drop-offs,” Follett says.

    Follett was also working out an alternate idea related to Prochlorococcus and nutrient supply. As a byproduct of its photosynthesis, the microbe produces carbohydrate — an essential nutrient for heterotrophic bacteria, which are single-celled organisms that do not photosynthesize but live off the organic matter produced by phytoplankton.

    “Somewhere along the way, I wondered, what would happen if this food source Prochlorococcus was producing increased? What if we took that knob and spun it?” Follett says.

    In other words, how would the balance of Prochlorococcus and bacteria shift if the bacteria’s food increased as a result of, say, an increase in other carbohydrate-producing phytoplankton? The team also wondered: If the bacteria in question were about the same size as Prochlorococcus, the two would likely share a common grazer, or predator. How would the grazer’s population also shift with a change in carbohydrate supply?

    “Then we went to the whiteboard and started writing down equations and solving them for various cases, and realized that as soon as you reach an environment where other species add carbohydrates to the mix, bacteria and grazers grow up and annihilate Prochlorococcus,” Dutkiewicz says.
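    A toy version of that whiteboard exercise is an apparent-competition model: two prey (Prochlorococcus and bacteria) sharing one grazer, with the bacteria’s growth rate tied to carbohydrate supply. The sketch below is illustrative only and uses invented parameters, not the equations from the PNAS paper:

```python
from scipy.integrate import solve_ivp

def ecosystem(t, y, mu_b):
    """Toy apparent-competition model: prey P and B share the grazer Z."""
    P, B, Z = y
    mu_p, K, g, eff, mort = 1.0, 1.0, 1.0, 0.3, 0.15   # invented rates
    dP = mu_p * P * (1 - P / K) - g * P * Z            # Prochlorococcus: growth minus grazing
    dB = mu_b * B * (1 - B / K) - g * B * Z            # bacteria: growth set by carbohydrate supply
    dZ = eff * g * (P + B) * Z - mort * Z              # shared grazer feeds on both prey
    return [dP, dB, dZ]

y0 = [0.5, 0.1, 0.1]
for label, mu_b in [("low carbohydrate supply", 0.8), ("high carbohydrate supply", 3.0)]:
    sol = solve_ivp(ecosystem, (0, 500), y0, args=(mu_b,), rtol=1e-8)
    # With ample carbohydrate, bacteria feed a larger grazer population, and the
    # extra grazing pressure drives Prochlorococcus toward zero.
    print(f"{label}: final Prochlorococcus abundance ~ {sol.y[0, -1]:.3f}")
```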

    Nutrient shift

    To test this idea, the researchers employed simulations of ocean circulation and marine ecosystem interactions. The team ran the MITgcm, a general circulation model that simulates, in this case, the ocean currents and regions of upwelling waters around the world. They overlaid a biogeochemistry model that simulates how nutrients are redistributed in the ocean. To all of this, they linked a complex ecosystem model that simulates the interactions between many different species of bacteria and phytoplankton, including Prochlorococcus.

    When they ran the simulations without incorporating a representation of bacteria, they found that Prochlorococcus persisted all the way to the poles, contrary to theory and observations. When they added in the equations outlining the relationship between the microbe, bacteria, and a shared predator, Prochlorococcus’ range shifted away from the poles, matching the observations of the original research cruises.

    In particular, the team observed that Prochlorococcus thrived in waters with very low nutrient levels, and where it is the dominant source of food for bacteria. These waters also happen to be warm, and Prochlorococcus and bacteria live in balance, along with their shared predator. But in more nutrient-rich environments, such as polar regions, where cold water and nutrients are upwelled from the deep ocean, many more species of phytoplankton can thrive. Bacteria can then feast and grow on more food sources, and in turn feed and sustain more of their shared predator. Prochlorococcus, unable to keep up, is quickly decimated.

    The results show that a relationship with a shared predator, and not temperature, sets Prochlorococcus’ range. Incorporating this mechanism into models will be crucial in predicting how the microbe — and possibly other marine species — will shift with climate change.

    “Prochlorococcus is a big harbinger of changes in the global ocean,” Dutkiewicz says. “If its range expands, that’s a canary — a sign that things have changed in the ocean by a great deal.”

    “There are reasons to believe its range will expand with a warming world,” Follett adds. “But we have to understand the physical mechanisms that set these ranges. And predictions just based on temperature will not be correct.”

  • Scientists build new atlas of ocean’s oxygen-starved waters

    Life is teeming nearly everywhere in the oceans, except in certain pockets where oxygen naturally plummets and waters become unlivable for most aerobic organisms. These desolate pools are “oxygen-deficient zones,” or ODZs. And though they make up less than 1 percent of the ocean’s total volume, they are a significant source of nitrous oxide, a potent greenhouse gas. Their boundaries can also limit the extent of fisheries and marine ecosystems.

    Now MIT scientists have generated the most detailed, three-dimensional “atlas” of the largest ODZs in the world. The new atlas provides high-resolution maps of the two major, oxygen-starved bodies of water in the tropical Pacific. These maps reveal the volume, extent, and varying depths of each ODZ, along with fine-scale features, such as ribbons of oxygenated water that intrude into otherwise depleted zones.

    The team used a new method to process over 40 years’ worth of ocean data, comprising nearly 15 million measurements taken by many research cruises and autonomous robots deployed across the tropical Pacific. The researchers compiled and then analyzed this vast and fine-grained dataset to generate maps of oxygen-deficient zones at various depths, similar to the many slices of a three-dimensional scan.

    From these maps, the researchers estimated the total volume of the two major ODZs in the tropical Pacific, more precisely than previous efforts. The first zone, which stretches out from the coast of South America, measures about 600,000 cubic kilometers — roughly the volume of water that would fill 240 billion Olympic-sized pools. The second zone, off the coast of Central America, is roughly three times larger.
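    The pool comparison checks out, assuming the standard roughly 2,500-cubic-meter Olympic pool (50 m by 25 m by 2 m):

```latex
6 \times 10^{5}\ \mathrm{km^{3}} = 6 \times 10^{14}\ \mathrm{m^{3}},
\qquad
\frac{6 \times 10^{14}\ \mathrm{m^{3}}}{2.5 \times 10^{3}\ \mathrm{m^{3}\ per\ pool}}
\approx 2.4 \times 10^{11} \approx 240\ \text{billion pools}.
```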

    The atlas serves as a reference for where ODZs lie today. The team hopes scientists can add to this atlas with continued measurements, to better track changes in these zones and predict how they may shift as the climate warms.

    “It’s broadly expected that the oceans will lose oxygen as the climate gets warmer. But the situation is more complicated in the tropics where there are large oxygen-deficient zones,” says Jarek Kwiecinski ’21, who developed the atlas along with Andrew Babbin, the Cecil and Ida Green Career Development Professor in MIT’s Department of Earth, Atmospheric and Planetary Sciences. “It’s important to create a detailed map of these zones so we have a point of comparison for future change.”

    The team’s study appears today in the journal Global Biogeochemical Cycles.

    Airing out artifacts

    Oxygen-deficient zones are large, persistent regions of the ocean that occur naturally, as a consequence of marine microbes gobbling up sinking phytoplankton along with all the available oxygen in the surroundings. These zones happen to lie in regions that miss passing ocean currents, which would normally replenish regions with oxygenated water. As a result, ODZs are locations of relatively permanent, oxygen-depleted waters, and can exist at mid-ocean depths of between roughly 35 and 1,000 meters below the surface. For some perspective, the oceans on average run about 4,000 meters deep.

    Over the last 40 years, research cruises have explored these regions by dropping bottles down to various depths and hauling up seawater that scientists then measure for oxygen.

    “But there are a lot of artifacts that come from a bottle measurement when you’re trying to measure truly zero oxygen,” Babbin says. “All the plastic that we deploy at depth is full of oxygen that can leach out into the sample. When all is said and done, that artificial oxygen inflates the ocean’s true value.”

    Rather than rely on measurements from bottle samples, the team looked at data from sensors attached to the outside of the bottles or integrated with robotic platforms that can change their buoyancy to measure water at different depths. These sensors measure a variety of signals, including changes in electrical currents or the intensity of light emitted by a photosensitive dye to estimate the amount of oxygen dissolved in water. In contrast to seawater samples that represent a single discrete depth, the sensors record signals continuously as they descend through the water column.

    Scientists have attempted to use these sensor data to estimate the true value of oxygen concentrations in ODZs, but have found it incredibly tricky to convert these signals accurately, particularly at concentrations approaching zero.

    “We took a very different approach, using measurements not to look at their true value, but rather how that value changes within the water column,” Kwiecinski says. “That way we can identify anoxic waters, regardless of what a specific sensor says.”

    Bottoming out

    The team reasoned that, if sensors showed a constant, unchanging value of oxygen in a continuous, vertical section of the ocean, regardless of the true value, then it would likely be a sign that oxygen had bottomed out, and that the section was part of an oxygen-deficient zone.

    The researchers brought together nearly 15 million sensor measurements collected over 40 years by various research cruises and robotic floats, and mapped the regions where oxygen did not change with depth.
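    In pseudocode terms, the method scans each vertical profile for stretches where the sensor signal stops changing with depth. The sketch below shows that idea on an invented profile; the tolerance and layer-thickness cutoff are placeholders, not the study’s actual algorithm or thresholds:

```python
import numpy as np

depth = np.arange(0.0, 1000.0, 10.0)         # meters (hypothetical cast, 10 m bins)
# Invented profile: oxygen-rich surface water, a flat "bottomed-out" signal
# between roughly 300 and 700 m, and a slow increase below.
oxygen_signal = np.interp(depth, [0, 150, 300, 700, 1000], [230, 60, 3.1, 3.1, 40])

tolerance = 0.2                              # assumed sensor noise (arbitrary units)
change_per_bin = np.abs(np.gradient(oxygen_signal, depth)) * 10.0
flat = change_per_bin < tolerance            # signal not changing with depth

# Report contiguous flat stretches thicker than 50 m as candidate anoxic layers.
edges = np.flatnonzero(np.diff(flat.astype(int)))
bounds = np.concatenate(([0], edges + 1, [flat.size]))
for start, stop in zip(bounds[:-1], bounds[1:]):
    if flat[start] and depth[stop - 1] - depth[start] >= 50:
        print(f"candidate anoxic layer: {depth[start]:.0f}-{depth[stop - 1]:.0f} m")
```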

    “We can now see how the distribution of anoxic water in the Pacific changes in three dimensions,” Babbin says. 

    The team mapped the boundaries, volume, and shape of two major ODZs in the tropical Pacific, one in the Northern Hemisphere, and the other in the Southern Hemisphere. They were also able to see fine details within each zone. For instance, oxygen-depleted waters are “thicker,” or more concentrated towards the middle, and appear to thin out toward the edges of each zone.

    “We could also see gaps, where it looks like big bites were taken out of anoxic waters at shallow depths,” Babbin says. “There’s some mechanism bringing oxygen into this region, making it oxygenated compared to the water around it.”

    Such observations of the tropical Pacific’s oxygen-deficient zones are more detailed than what’s been measured to date.

    “How the borders of these ODZs are shaped, and how far they extend, could not be previously resolved,” Babbin says. “Now we have a better idea of how these two zones compare in terms of areal extent and depth.”

    “This gives you a sketch of what could be happening,” Kwiecinski says. “There’s a lot more one can do with this data compilation to understand how the ocean’s oxygen supply is controlled.”

    This research is supported, in part, by the Simons Foundation.