More stories

  • Understanding air pollution from space

    Climate change and air pollution are interlocking crises that threaten human health. Reducing emissions of some air pollutants can help achieve climate goals, and some climate mitigation efforts can in turn improve air quality.

    One part of MIT Professor Arlene Fiore’s research program is to investigate the fundamental science in understanding air pollutants — how long they persist and move through our environment to affect air quality.

    “We need to understand the conditions under which pollutants, such as ozone, form. How much ozone is formed locally and how much is transported long distances?” says Fiore, who notes that Asian air pollution can be transported across the Pacific Ocean to North America. “We need to think about processes spanning local to global dimensions.”

    Fiore, the Peter H. Stone and Paola Malanotte Stone Professor in Earth, Atmospheric and Planetary Sciences, analyzes data from on-the-ground readings and from satellites, along with models, to better understand the chemistry and behavior of air pollutants — which ultimately can inform mitigation strategies and policy setting.

    A global concern

    At the United Nations’ most recent climate change conference, COP26, air quality management was a topic discussed over two days of presentations.

    “Breathing is vital. It’s life. But for the vast majority of people on this planet right now, the air that they breathe is not giving life, but cutting it short,” said Sarah Vogel, senior vice president for health at the Environmental Defense Fund, at the COP26 session.

    “We need to confront this twin challenge now through both a climate and clean air lens, of targeting those pollutants that warm both the air and harm our health.”

    Earlier this year, the World Health Organization (WHO) updated the global air quality guidelines it had issued 15 years earlier for six key pollutants, including ozone (O3), nitrogen dioxide (NO2), sulfur dioxide (SO2), and carbon monoxide (CO). The new guidelines are more stringent, based on what the WHO called the “quality and quantity of evidence” of how these pollutants affect human health. The WHO estimates that roughly 7 million premature deaths each year are attributable to the joint effects of air pollution.

    “We’ve had all these health-motivated reductions of aerosol and ozone precursor emissions. What are the implications for the climate system, both locally but also around the globe? How does air quality respond to climate change? We study these two-way interactions between air pollution and the climate system,” says Fiore.

    But fundamental science is still required to understand how gases, such as ozone and nitrogen dioxide, linger and move throughout the troposphere — the lowermost layer of our atmosphere, containing the air we breathe.

    “We care about ozone in the air we’re breathing where we live at the Earth’s surface,” says Fiore. “Ozone reacts with biological tissue, and can be damaging to plants and human lungs. Even if you’re a healthy adult, if you’re out running hard during an ozone smog event, you might feel an extra weight on your lungs.”

    Telltale signs from space

    Ozone is not emitted directly, but instead forms through chemical reactions catalyzed by radiation from the sun interacting with nitrogen oxides — pollutants released in large part from burning fossil fuels — and volatile organic compounds. However, current satellite instruments cannot sense ground-level ozone.

    “We can’t retrieve surface- or even near-surface ozone from space,” says Fiore of the satellite data, “although the anticipated launch of a new instrument looks promising for new advances in retrieving lower-tropospheric ozone.” Instead, scientists can look at signatures from other gas emissions to get a sense of ozone formation. “Nitrogen dioxide and formaldehyde are a heavy focus of our research because they serve as proxies for two of the key ingredients that go on to form ozone in the atmosphere.”

    To understand ozone formation via these precursor pollutants, scientists have gathered data for more than two decades using spectrometer instruments aboard satellites that measure sunlight in ultraviolet and visible wavelengths that interact with these pollutants in the Earth’s atmosphere — known as solar backscatter radiation.

    Satellites, such as NASA’s Aura, carry instruments like the Ozone Monitoring Instrument (OMI). OMI, along with European-launched instruments such as the Global Ozone Monitoring Experiment (GOME) and the SCanning Imaging Absorption spectroMeter for Atmospheric CHartographY (SCIAMACHY), and the newest-generation TROPOspheric Monitoring Instrument (TROPOMI), all orbit the Earth, collecting data during daylight hours when sunlight is interacting with the atmosphere over a particular location.

    In a recent paper from Fiore’s group, former graduate student Xiaomeng Jin (now a postdoc at the University of California at Berkeley), demonstrated that she could bring together and “beat down the noise in the data,” as Fiore says, to identify trends in ozone formation chemistry over several U.S. metropolitan areas that “are consistent with our on-the-ground understanding from in situ ozone measurements.”

    “This finding implies that we can use these records to learn about changes in surface ozone chemistry in places where we lack on-the-ground monitoring,” says Fiore. Extracting these signals by stringing together satellite data — OMI, GOME, and SCIAMACHY — to produce a two-decade record required reconciling the instruments’ differing orbit days, times, and fields of view on the ground, or spatial resolutions. 
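Stitching records from instruments with different footprints and overpass schedules onto one long time series is, at its core, a regridding-and-averaging problem. The sketch below is only an illustration of that idea with invented data and a simple block-averaging scheme; it is not the group's actual retrieval or harmonization pipeline:

```python
import numpy as np

def regrid_coarsen(field, factor):
    """Block-average a fine-resolution 2-D field into coarser cells so
    records from instruments with different footprints share one grid."""
    ny, nx = field.shape
    return field.reshape(ny // factor, factor, nx // factor, factor).mean(axis=(1, 3))

def merge_records(records):
    """Average overlapping maps from several instruments, ignoring
    periods an instrument did not observe (NaNs). Averaging N independent
    retrievals shrinks the noise roughly as 1/sqrt(N)."""
    stacked = np.stack(records)          # (instrument, time, lat, lon)
    return np.nanmean(stacked, axis=0)

# toy example: two "instruments" viewing the same truth at different resolutions
rng = np.random.default_rng(0)
truth = np.ones((4, 4))
fine = truth.repeat(2, axis=0).repeat(2, axis=1) + rng.normal(0, 0.1, (8, 8))
coarse = truth + rng.normal(0, 0.1, (4, 4))

common = merge_records([regrid_coarsen(fine, 2)[None], coarse[None]])
```

The merged field sits on the coarser common grid, with noise reduced relative to either input.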

    Currently, spectrometer instruments aboard satellites are retrieving data once per day. However, newer instruments, such as the Geostationary Environment Monitoring Spectrometer launched in February 2020 by the National Institute of Environmental Research in the Ministry of Environment of South Korea, will monitor a particular region continuously, providing much more data in real time.

    Over North America, the Tropospheric Emissions: Monitoring of Pollution (TEMPO) collaboration between NASA and the Smithsonian Astrophysical Observatory, led by Kelly Chance of Harvard University, will provide not only a stationary view of the atmospheric chemistry over the continent, but also a finer-resolution view — with the instrument recording pollution data from only a few square miles per pixel (with an anticipated launch in 2022).

    “What we’re very excited about is the opportunity to have continuous coverage where we get hourly measurements that allow us to follow pollution from morning rush hour through the course of the day and see how plumes of pollution are evolving in real time,” says Fiore.

    Data for the people

    Providing Earth-observing data to people in addition to scientists — namely environmental managers, city planners, and other government officials — is the goal for the NASA Health and Air Quality Applied Sciences Team (HAQAST).

    Since 2016, Fiore has been part of HAQAST, including collaborative “tiger teams” — projects that bring together scientists, nongovernment entities, and government officials — to bring data to bear on real issues.

    For example, in 2017, Fiore led a tiger team that provided guidance to state air management agencies on how satellite data can be incorporated into state implementation plans (SIPs). “Submission of a SIP is required for any state with a region in non-attainment of U.S. National Ambient Air Quality Standards to demonstrate their approach to achieving compliance with the standard,” says Fiore. “What we found is that small tweaks in, for example, the metrics we use to convey the science findings, can go a long way to making the science more usable, especially when there are detailed policy frameworks in place that must be followed.”

    Now, in 2021, Fiore is part of two tiger teams announced by HAQAST in late September. One team is addressing environmental justice issues by providing data to assess communities disproportionately affected by environmental health risks. Such information can be used to estimate the benefits of governmental investments in environmental improvements for disproportionately burdened communities. The other team is looking at urban emissions of nitrogen oxides to try to better quantify and communicate uncertainties in the estimates of anthropogenic sources of pollution.

    “For our HAQAST work, we’re looking at not just the estimate of the exposure to air pollutants, or in other words their concentrations,” says Fiore, “but how confident are we in our exposure estimates, which in turn affect our understanding of the public health burden due to exposure. We have stakeholder partners at the New York Department of Health who will pair exposure datasets with health data to help prioritize decisions around public health.

    “I enjoy working with stakeholders who have questions that require science to answer and can make a difference in their decisions,” Fiore says.

  • Predator interactions chiefly determine where Prochlorococcus thrive

    Prochlorococcus are the smallest and most abundant photosynthesizing organisms on the planet. A single Prochlorococcus cell is dwarfed by a human red blood cell, yet globally the microbes number in the octillions and are responsible for a large fraction of the world’s oxygen production as they turn sunlight into energy.

    Prochlorococcus can be found in the ocean’s warm surface waters, and their population drops off dramatically in regions closer to the poles. Scientists have assumed that, as with many marine species, Prochlorococcus’ range is set by temperature: The colder the waters, the less likely the microbes are to live there.

    But MIT scientists have found that where the microbe lives is not determined primarily by temperature. While Prochlorococcus populations do drop off in colder waters, it’s a relationship with a shared predator, and not temperature, that sets the microbe’s range. These findings, published today in the Proceedings of the National Academy of Sciences, could help scientists predict how the microbes’ populations will shift with climate change.

    “People assume that if the ocean warms up, Prochlorococcus will move poleward. And that may be true, but not for the reason they’re predicting,” says study co-author Stephanie Dutkiewicz, senior research scientist in MIT’s Department of Earth, Atmospheric and Planetary Sciences (EAPS). “So, temperature is a bit of a red herring.”

    Dutkiewicz’s co-authors on the study are lead author and EAPS Research Scientist Christopher Follett, EAPS Professor Mick Follows, François Ribalet and Virginia Armbrust of the University of Washington, and Emily Zakem and David Caron of the University of Southern California.

    Temperature’s collapse

    While temperature is thought to set the range of Prochlorococcus and other phytoplankton in the ocean, Follett, Dutkiewicz, and their colleagues noticed a curious dissonance in the data.

    The team examined observations from several research cruises that sailed through the northeast Pacific Ocean in 2003, 2016, and 2017. Each vessel traversed different latitudes, sampling waters continuously and measuring concentrations of various species of bacteria and phytoplankton, including Prochlorococcus. 

    The MIT team used the publicly archived cruise data to map out the locations where Prochlorococcus noticeably decreased or collapsed, along with each location’s ocean temperature. Surprisingly, they found that Prochlorococcus’ collapse occurred in regions of widely varying temperatures, ranging from around 13 to 18 degrees Celsius. Curiously, the upper end of this range has been shown in lab experiments to be suitable conditions for Prochlorococcus to grow and thrive.

    “Temperature itself was not able to explain where we saw these drop-offs,” Follett says.

    Follett was also working out an alternate idea related to Prochlorococcus and nutrient supply. As a byproduct of its photosynthesis, the microbe produces carbohydrate — an essential nutrient for heterotrophic bacteria, which are single-celled organisms that do not photosynthesize but live off the organic matter produced by phytoplankton.

    “Somewhere along the way, I wondered, what would happen if this food source Prochlorococcus was producing increased? What if we took that knob and spun it?” Follett says.

    In other words, how would the balance of Prochlorococcus and bacteria shift if the bacteria’s food increased as a result of, say, an increase in other carbohydrate-producing phytoplankton? The team also wondered: If the bacteria in question were about the same size as Prochlorococcus, the two would likely share a common grazer, or predator. How would the grazer’s population also shift with a change in carbohydrate supply?

    “Then we went to the whiteboard and started writing down equations and solving them for various cases, and realized that as soon as you reach an environment where other species add carbohydrates to the mix, bacteria and grazers grow up and annihilate Prochlorococcus,” Dutkiewicz says.
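The whiteboard argument can be sketched with a toy steady-state model of Prochlorococcus (P), heterotrophic bacteria (B), and a shared grazer (Z): at the grazer's equilibrium, the total prey it sustains is capped, so any external carbohydrate supply S shifts that cap toward bacteria at Prochlorococcus's expense. All parameter values and the specific functional forms below are invented for illustration; they are not taken from the paper:

```python
# Hypothetical minimal model, not the study's actual equations.
mu = 0.5    # Prochlorococcus growth rate (1/day)
y  = 0.4    # bacterial growth per unit carbohydrate
c  = 1.0    # carbohydrate produced per unit Prochlorococcus
g  = 1.0    # grazing rate
a  = 0.3    # grazer growth efficiency
m  = 0.2    # grazer mortality

# Steady-state conditions:
#   grazer:   a*g*(P + B) = m   -> total prey P + B is capped at K
#   P:        mu = g*Z          -> grazing exactly balances P's growth
#   B:        y*(c*P + S) = g*Z = mu  ->  P* = (mu/y - S) / c
K = m / (a * g)   # fixed total prey the shared grazer allows

def equilibrium_P(S):
    """Equilibrium Prochlorococcus as external carbohydrate S increases."""
    return max((mu / y - S) / c, 0.0)

for S in [0.0, 0.5, 1.0, 1.5]:
    print(f"S = {S}: P* = {equilibrium_P(S)}, B* = {K - equilibrium_P(S):.2f}")
```

As S grows, P* falls linearly and hits zero: extra carbohydrate feeds the bacteria, the bacteria feed the grazer, and the grazer wipes out Prochlorococcus — the annihilation the researchers describe.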

    Nutrient shift

    To test this idea, the researchers employed simulations of ocean circulation and marine ecosystem interactions. The team ran the MITgcm, a general circulation model that simulates, in this case, the ocean currents and regions of upwelling waters around the world. They overlaid a biogeochemistry model that simulates how nutrients are redistributed in the ocean. To all of this, they linked a complex ecosystem model that simulates the interactions between many different species of bacteria and phytoplankton, including Prochlorococcus.

    When they ran the simulations without incorporating a representation of bacteria, they found that Prochlorococcus persisted all the way to the poles, contrary to theory and observations. When they added in the equations outlining the relationship between the microbe, bacteria, and a shared predator, Prochlorococcus’ range shifted away from the poles, matching the observations of the original research cruises.

    In particular, the team observed that Prochlorococcus thrived in waters with very low nutrient levels, and where it is the dominant source of food for bacteria. These waters also happen to be warm, and Prochlorococcus and bacteria live in balance, along with their shared predator. But in more nutrient-rich environments, such as polar regions, where cold water and nutrients are upwelled from the deep ocean, many more species of phytoplankton can thrive. The bacteria can then feast and grow on more food sources, in turn feeding and growing more of their shared predator. Prochlorococcus, unable to keep up, is quickly decimated.

    The results show that a relationship with a shared predator, and not temperature, sets Prochlorococcus’ range. Incorporating this mechanism into models will be crucial in predicting how the microbe — and possibly other marine species — will shift with climate change.

    “Prochlorococcus is a big harbinger of changes in the global ocean,” Dutkiewicz says. “If its range expands, that’s a canary — a sign that things have changed in the ocean by a great deal.”

    “There are reasons to believe its range will expand with a warming world,” Follett adds. “But we have to understand the physical mechanisms that set these ranges. And predictions just based on temperature will not be correct.”

  • Scientists build new atlas of ocean’s oxygen-starved waters

    Life is teeming nearly everywhere in the oceans, except in certain pockets where oxygen naturally plummets and waters become unlivable for most aerobic organisms. These desolate pools are “oxygen-deficient zones,” or ODZs. And though they make up less than 1 percent of the ocean’s total volume, they are a significant source of nitrous oxide, a potent greenhouse gas. Their boundaries can also limit the extent of fisheries and marine ecosystems.

    Now MIT scientists have generated the most detailed, three-dimensional “atlas” of the largest ODZs in the world. The new atlas provides high-resolution maps of the two major, oxygen-starved bodies of water in the tropical Pacific. These maps reveal the volume, extent, and varying depths of each ODZ, along with fine-scale features, such as ribbons of oxygenated water that intrude into otherwise depleted zones.

    The team used a new method to process over 40 years’ worth of ocean data, comprising nearly 15 million measurements taken by many research cruises and autonomous robots deployed across the tropical Pacific. The researchers compiled and then analyzed this vast and fine-grained dataset to generate maps of oxygen-deficient zones at various depths, similar to the many slices of a three-dimensional scan.

    From these maps, the researchers estimated the total volume of the two major ODZs in the tropical Pacific, more precisely than previous efforts. The first zone, which stretches out from the coast of South America, measures about 600,000 cubic kilometers — roughly the volume of water that would fill 240 billion Olympic-sized pools. The second zone, off the coast of Central America, is roughly three times larger.
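The pool equivalence is straightforward to verify, taking the nominal Olympic pool as 50 × 25 × 2 meters:

```python
# Sanity-check the comparison in the text:
# an Olympic pool is nominally 50 m x 25 m x 2 m = 2,500 cubic meters.
odz_km3 = 600_000
odz_m3 = odz_km3 * 1e9        # 1 km^3 = 1e9 m^3
pool_m3 = 50 * 25 * 2
pools = odz_m3 / pool_m3
print(f"{pools:.2e} pools")   # 2.40e+11, i.e. about 240 billion
```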

    The atlas serves as a reference for where ODZs lie today. The team hopes scientists can add to this atlas with continued measurements, to better track changes in these zones and predict how they may shift as the climate warms.

    “It’s broadly expected that the oceans will lose oxygen as the climate gets warmer. But the situation is more complicated in the tropics where there are large oxygen-deficient zones,” says Jarek Kwiecinski ’21, who developed the atlas along with Andrew Babbin, the Cecil and Ida Green Career Development Professor in MIT’s Department of Earth, Atmospheric and Planetary Sciences. “It’s important to create a detailed map of these zones so we have a point of comparison for future change.”

    The team’s study appears today in the journal Global Biogeochemical Cycles.

    Airing out artifacts

    Oxygen-deficient zones are large, persistent regions of the ocean that occur naturally, as a consequence of marine microbes gobbling up sinking phytoplankton along with all the available oxygen in the surroundings. These zones lie in regions bypassed by the ocean currents that would normally replenish them with oxygenated water. As a result, ODZs are locations of relatively permanent, oxygen-depleted waters, and can exist at mid-ocean depths of between roughly 35 and 1,000 meters below the surface. For some perspective, the oceans on average run about 4,000 meters deep.

    Over the last 40 years, research cruises have explored these regions by dropping bottles down to various depths and hauling up seawater that scientists then measure for oxygen.

    “But there are a lot of artifacts that come from a bottle measurement when you’re trying to measure truly zero oxygen,” Babbin says. “All the plastic that we deploy at depth is full of oxygen that can leach out into the sample. When all is said and done, that artificial oxygen inflates the ocean’s true value.”

    Rather than rely on measurements from bottle samples, the team looked at data from sensors attached to the outside of the bottles or integrated with robotic platforms that can change their buoyancy to measure water at different depths. These sensors measure a variety of signals, including changes in electrical currents or the intensity of light emitted by a photosensitive dye to estimate the amount of oxygen dissolved in water. In contrast to seawater samples that represent a single discrete depth, the sensors record signals continuously as they descend through the water column.

    Scientists have attempted to use these sensor data to estimate the true value of oxygen concentrations in ODZs, but have found it incredibly tricky to convert these signals accurately, particularly at concentrations approaching zero.

    “We took a very different approach, using measurements not to look at their true value, but rather how that value changes within the water column,” Kwiecinski says. “That way we can identify anoxic waters, regardless of what a specific sensor says.”

    Bottoming out

    The team reasoned that, if sensors showed a constant, unchanging value of oxygen in a continuous, vertical section of the ocean, regardless of the true value, then it would likely be a sign that oxygen had bottomed out, and that the section was part of an oxygen-deficient zone.

    The researchers brought together nearly 15 million sensor measurements collected over 40 years by various research cruises and robotic floats, and mapped the regions where oxygen did not change with depth.
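The detection rule described above — flag continuous vertical stretches where the sensor signal stops changing — can be sketched in a few lines. This is a simplified illustration on synthetic data (the tolerance, run length, and profile are all invented), not the study's actual algorithm:

```python
import numpy as np

def anoxic_mask(depth, oxygen, tol=1e-3, min_span=5):
    """Flag depths where the signal stops changing: a long run of
    near-zero vertical gradient marks oxygen bottoming out, whatever
    the sensor's (possibly biased) absolute reading is."""
    grad = np.abs(np.gradient(oxygen, depth))
    flat = grad < tol
    mask = np.zeros_like(flat)
    start = None
    # keep only runs of "flat" at least min_span points long
    for i, f in enumerate(np.append(flat, False)):
        if f and start is None:
            start = i
        elif not f and start is not None:
            if i - start >= min_span:
                mask[start:i] = True
            start = None
    return mask

# synthetic profile: oxygen decays with depth, then flattens at a
# small sensor offset (a biased "zero" reading of 2, not true zero)
depth = np.linspace(0, 1000, 201)
oxy = np.maximum(200 * np.exp(-depth / 120), 2.0)
mask = anoxic_mask(depth, oxy)
```

Only the flat deep section is flagged, even though the sensor never reads exactly zero there — mirroring the idea of identifying anoxic waters "regardless of what a specific sensor says."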

    “We can now see how the distribution of anoxic water in the Pacific changes in three dimensions,” Babbin says. 

    The team mapped the boundaries, volume, and shape of two major ODZs in the tropical Pacific, one in the Northern Hemisphere, and the other in the Southern Hemisphere. They were also able to see fine details within each zone. For instance, oxygen-depleted waters are “thicker,” or more concentrated towards the middle, and appear to thin out toward the edges of each zone.

    “We could also see gaps, where it looks like big bites were taken out of anoxic waters at shallow depths,” Babbin says. “There’s some mechanism bringing oxygen into this region, making it oxygenated compared to the water around it.”

    Such observations of the tropical Pacific’s oxygen-deficient zones are more detailed than what’s been measured to date.

    “How the borders of these ODZs are shaped, and how far they extend, could not be previously resolved,” Babbin says. “Now we have a better idea of how these two zones compare in terms of areal extent and depth.”

    “This gives you a sketch of what could be happening,” Kwiecinski says. “There’s a lot more one can do with this data compilation to understand how the ocean’s oxygen supply is controlled.”

    This research is supported, in part, by the Simons Foundation.

  • Climate modeling confirms historical records showing rise in hurricane activity

    When forecasting how storms may change in the future, it helps to know something about their past. Judging from historical records dating back to the 1850s, hurricanes in the North Atlantic have become more frequent over the last 150 years.

    However, scientists have questioned whether this upward trend is a reflection of reality, or simply an artifact of lopsided record-keeping. If 19th-century storm trackers had access to 21st-century technology, would they have recorded more storms? This inherent uncertainty has kept scientists from relying on storm records, and the patterns within them, for clues to how climate influences storms.

    A new MIT study published today in Nature Communications has used climate modeling, rather than storm records, to reconstruct the history of hurricanes and tropical cyclones around the world. The study finds that North Atlantic hurricanes have indeed increased in frequency over the last 150 years, similar to what historical records have shown.

    In particular, major hurricanes, and hurricanes in general, are more frequent today than in the past. And those that make landfall appear to have grown more powerful, carrying more destructive potential.

    Curiously, while the North Atlantic has seen an overall increase in storm activity, the same trend was not observed in the rest of the world. The study found that the frequency of tropical cyclones globally has not changed significantly in the last 150 years.

    “The evidence does point, as the original historical record did, to long-term increases in North Atlantic hurricane activity, but no significant changes in global hurricane activity,” says study author Kerry Emanuel, the Cecil and Ida Green Professor of Atmospheric Science in MIT’s Department of Earth, Atmospheric, and Planetary Sciences. “It certainly will change the interpretation of climate’s effects on hurricanes — that it’s really the regionality of the climate, and that something happened to the North Atlantic that’s different from the rest of the globe. It may have been caused by global warming, which is not necessarily globally uniform.”

    Chance encounters

    The most comprehensive record of tropical cyclones is compiled in a database known as the International Best Track Archive for Climate Stewardship (IBTrACS). This historical record includes modern measurements from satellites and aircraft that date back to the 1940s. The database’s older records are based on reports from ships and islands that happened to be in a storm’s path. These earlier records date back to 1851, and overall the database shows an increase in North Atlantic storm activity over the last 150 years.

    “Nobody disagrees that that’s what the historical record shows,” Emanuel says. “On the other hand, most sensible people don’t really trust the historical record that far back in time.”

    Recently, scientists have used a statistical approach to identify storms that the historical record may have missed. To do so, they consulted all the digitally reconstructed shipping routes in the Atlantic over the last 150 years and mapped these routes over modern-day hurricane tracks. They then estimated the chance that a ship would encounter or entirely miss a hurricane’s presence. This analysis found a significant number of early storms were likely missed in the historical record. Accounting for these missed storms, they concluded that there was a chance that storm activity had not changed over the last 150 years.
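The logic of that statistical approach can be illustrated with a toy Monte Carlo calculation. Here shipping lanes are idealized as fixed latitudes and storms as random points with a fixed detection radius; all numbers are invented for illustration and have no connection to the actual reconstructed routes:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical setup: three digitized shipping lanes at fixed latitudes,
# and a radius (in degrees) within which a passing ship would report a storm.
route_lats = np.array([20.0, 30.0, 40.0])
storm_radius = 2.0

# draw random storm latitudes across the basin
storms = rng.uniform(10, 50, size=10_000)

# a storm is "detected" if any route passes within its radius
detected = (np.abs(storms[:, None] - route_lats) < storm_radius).any(axis=1)
miss_rate = 1 - detected.mean()
print(f"estimated fraction of storms missed: {miss_rate:.2f}")
```

With these made-up numbers the lanes cover 12 of 40 degrees of latitude, so roughly 70 percent of storms go unrecorded — the same kind of undercount the statistical studies tried to correct for, and the correction that Emanuel argues is itself sensitive to which routes and tracks are assumed.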

    But Emanuel points out that hurricane paths in the 19th century may have looked different from today’s tracks. What’s more, the scientists may have missed key shipping routes in their analysis, as older routes have not yet been digitized.

    “All we know is, if there had been a change (in storm activity), it would not have been detectable, using digitized ship records,” Emanuel says. “So I thought, there’s an opportunity to do better, by not using historical data at all.”

    Seeding storms

    Instead, he estimated past hurricane activity using dynamical downscaling — a technique that his group developed and has applied over the last 15 years to study climate’s effect on hurricanes. The technique starts with a coarse global climate simulation and embeds within this model a finer-resolution model that simulates features as small as hurricanes. The combined models are then fed with real-world measurements of atmospheric and ocean conditions. Emanuel then scatters the realistic simulation with hurricane “seeds” and runs the simulation forward in time to see which seeds bloom into full-blown storms.

    For the new study, Emanuel embedded a hurricane model into a climate “reanalysis” — a type of climate model that combines observations from the past with climate simulations to generate accurate reconstructions of past weather patterns and climate conditions. He used a particular subset of climate reanalyses that only accounts for observations collected from the surface — for instance from ships, which have recorded weather conditions and sea surface temperatures consistently since the 1850s, as opposed to from satellites, which only began systematic monitoring in the 1970s.

    “We chose to use this approach to avoid any artificial trends brought about by the introduction of progressively different observations,” Emanuel explains.

    He ran an embedded hurricane model on three different climate reanalyses, simulating tropical cyclones around the world over the past 150 years. Across all three models, he observed “unequivocal increases” in North Atlantic hurricane activity.

    “There’s been this quite large increase in activity in the Atlantic since the mid-19th century, which I didn’t expect to see,” Emanuel says.

    Within this overall rise in storm activity, he also observed a “hurricane drought” — a period during the 1970s and 80s when the number of yearly hurricanes momentarily dropped. This pause in storm activity can also be seen in historical records, and Emanuel’s group proposes a cause: sulfate aerosols, which were byproducts of fossil fuel combustion, likely set off a cascade of climate effects that cooled the North Atlantic and temporarily suppressed hurricane formation.

    “The general trend over the last 150 years was increasing storm activity, interrupted by this hurricane drought,” Emanuel notes. “And at this point, we’re more confident of why there was a hurricane drought than why there is an ongoing, long-term increase in activity that began in the 19th century. That is still a mystery, and it bears on the question of how global warming might affect future Atlantic hurricanes.”

    This research was supported, in part, by the National Science Foundation.

  • Nanograins make for a seismic shift

    In Earth’s crust, tectonic blocks slide and grind past each other like enormous ships loosed from anchor. Earthquakes are generated along these fault zones when enough stress builds for a block to stick, then suddenly slip.

    These slips can be aided by several factors that reduce friction within a fault zone, such as hotter temperatures or pressurized gases that can separate blocks like pucks on an air-hockey table. The decreasing friction enables one tectonic block to accelerate against the other until it runs out of energy. Seismologists have long believed this kind of frictional instability can explain how all crustal earthquakes start. But that might not be the whole story.

    In a study published today in Nature Communications, scientists Hongyu Sun and Matej Pec, from MIT’s Department of Earth, Atmospheric and Planetary Sciences (EAPS), find that ultra-fine-grained crystals within fault zones can behave like low-viscosity fluids. The finding offers an alternative explanation for the instability that leads to crustal earthquakes. It also suggests a link between quakes in the crust and other types of temblors that occur deep in the Earth.

    Nanograins are commonly found in rocks from seismic environments along the smooth surface of “fault mirrors.” These polished, reflective rock faces betray the slipping, sliding forces of past earthquakes. However, it was unclear whether the crystals caused quakes or were merely formed by them.

    To better characterize how these crystals behaved within a fault, the researchers used a planetary ball milling machine to pulverize granite rocks into particles resembling those found in nature. Like a super-powered washing machine filled with ceramic balls, the machine pounded the rock until all its crystals were about 100 nanometers in width, each grain 1/2,000 the size of an average grain of sand.

    After packing the nanopowder into postage-stamp-sized cylinders jacketed in gold, the researchers then subjected the material to stresses and heat, creating laboratory miniatures of real fault zones. This process enabled them to isolate the effect of the crystals from the complexity of other factors involved in an actual earthquake.

    The researchers report that the crystals were extremely weak when shearing was initiated — an order of magnitude weaker than more common microcrystals. But the nanocrystals became significantly stronger when the deformation rate was accelerated. Pec, professor of geophysics and the Victor P. Starr Career Development Chair, compares this characteristic, called “rate-strengthening,” to stirring honey in a jar. Stirring the honey slowly is easy, but becomes more difficult the faster you stir.

    The experiment suggests something similar happens in fault zones. As tectonic blocks accelerate past each other, the crystals gum things up between them like honey stirred in a seismic pot.

    Sun, the study’s lead author and EAPS graduate student, explains that their finding runs counter to the dominant frictional weakening theory of how earthquakes start. That theory predicts that material on a fault zone’s surfaces weakens as the fault blocks accelerate, so friction should decrease. The nanocrystals did just the opposite. However, the crystals’ intrinsic weakness could mean that when enough of them accumulate within a fault, they can give way, causing an earthquake.

    “We don’t totally disagree with the old theorem, but our study really opens new doors to explain the mechanisms of how earthquakes happen in the crust,” Sun says.

    The finding also suggests a previously unrecognized link between earthquakes in the crust and the earthquakes that rumble hundreds of kilometers beneath the surface, where the same tectonic dynamics aren’t at play. That deep, there are no tectonic blocks to grind against each other, and even if there were, the immense pressure would prevent the type of quakes observed in the crust, which require some dilatancy and void creation.

    “We know that earthquakes happen all the way down to really big depths where this motion along a frictional fault is basically impossible,” says Pec. “And so clearly, there must be different processes that allow for these earthquakes to happen.”

    Possible mechanisms for these deep-Earth tremors include “phase transitions,” which occur due to atomic rearrangement in minerals and are accompanied by a volume change, and other kinds of metamorphic reactions, such as dehydration of water-bearing minerals, in which the released fluid is pumped through pores and destabilizes a fault. These mechanisms are all characterized by a weak, rate-strengthening layer.

    If weak, rate-strengthening nanocrystals are abundant in the deep Earth, they could present another possible mechanism, says Pec. “Maybe crustal earthquakes are not a completely different beast than the deeper earthquakes. Maybe they have something in common.”

  • Scientists project increased risk to water supplies in South Africa this century

    In 2018, Cape Town, South Africa’s second most populous city, came very close to running out of water as the multi-year “Day Zero” drought depleted its reservoirs. Since then, researchers at Stanford University determined that climate change had made this extreme drought five to six times more likely, and warned that many more Day Zero events could occur in regions with similar climates in the future. A better understanding of likely surface air temperature and precipitation trends in South Africa and other dry, populated areas around the world in the coming decades could empower decision-makers to pursue science-based climate mitigation and adaptation measures designed to reduce the risk of future Day Zero events.

    Toward that end, researchers at the MIT Joint Program on the Science and Policy of Global Change, International Food Policy Research Institute, and CGIAR have produced modeled projections of 21st-century changes in seasonal surface air temperature and precipitation for South Africa that systematically and comprehensively account for uncertainties in how Earth and socioeconomic systems behave and co-evolve. Presented in a study in the journal Climatic Change, these projections show how temperature and precipitation over three sub-national regions — western, central, and eastern South Africa — are likely to change under a wide range of global climate mitigation policy scenarios.

    In a business-as-usual global climate policy scenario in which no emissions or climate targets are set or met, the projections show that for all three regions, there’s a greater-than-50-percent likelihood that mid-century temperatures will increase by three times the current climate’s range of variability. But the risk of these mid-century temperature increases is effectively eliminated under more aggressive climate targets.

    The business-as-usual projections indicate that the risk of decreased precipitation levels in western and central South Africa is three to four times higher than the risk of increased precipitation levels. Under a global climate mitigation policy designed to cap global warming at 1.5 degrees Celsius by 2100, the risk of precipitation changes within South Africa toward the end of the century (2065-74) is similar to the risk during the 2030s in the business-as-usual scenario.

    Rising risks of substantially reduced precipitation levels throughout this century under a business-as-usual scenario suggest increased reliance and stress on the widespread water-efficiency measures established in the aftermath of the Day Zero drought. But a 1.5 C global climate mitigation policy would delay these risks by 30 years, giving South Africa ample lead time to prepare for and adapt to them.

    “Our analysis provides risk-based evidence on the benefits of climate mitigation policies as well as unavoidable climate impacts that will need to be addressed through adaptive measures,” says MIT Joint Program Deputy Director C. Adam Schlosser, the lead author of the study. “Global action to limit human-induced warming could give South Africa enough time to secure sufficient water supplies to sustain its population. Otherwise, anticipated climate shifts by the middle of the next decade may well make Day-Zero situations more common.”

    This study is part of an ongoing effort to assess the risks that climate change poses for South Africa’s agricultural, economic, energy and infrastructure sectors.

  • MIT collaborates with Biogen on three-year, $7 million initiative to address climate, health, and equity

    MIT and Biogen have announced a collaboration aimed at accelerating the science of, and action on, climate change to improve human health. The collaboration is supported by a three-year, $7 million commitment from the company and the Biogen Foundation. The biotechnology company, headquartered in Cambridge, Massachusetts’ Kendall Square, discovers and develops therapies for people living with serious neurological diseases.

    “We have long believed it is imperative for Biogen to make the fight against climate change central to our long-term corporate responsibility commitments. Through this collaboration with MIT, we aim to identify and share innovative climate solutions that will deliver co-benefits for both health and equity,” says Michel Vounatsos, CEO of Biogen. “We are also proud to support the MIT Museum, which promises to make world-class science and education accessible to all, and honor Biogen co-founder Phillip A. Sharp with a dedication inside the museum that recognizes his contributions to its development.”

    Biogen and the Biogen Foundation are supporting research and programs across a range of areas at MIT.

    Advancing climate, health, and equity

    The first such effort involves new work within the MIT Joint Program on the Science and Policy of Global Change to establish a state-of-the-art integrated model of climate and health aimed at identifying targets that deliver climate and health co-benefits.

    “Evidence suggests that not all climate-related actions deliver equal health benefits, yet policymakers, planners, and stakeholders traditionally lack the tools to consider how decisions in one arena impact the other,” says C. Adam Schlosser, deputy director of the MIT Joint Program. “Biogen’s collaboration with the MIT Joint Program — and its support of a new distinguished Biogen Fellow who will develop the new climate/health model — will accelerate our efforts to provide decision-makers with these tools.”

    Biogen is also supporting the MIT Technology and Policy Program’s Research to Policy Engagement Initiative, which aims to make human health a key new consideration in decision-making on the best pathways to address the global climate crisis, and to bridge the knowledge-to-action gap by connecting policymakers, researchers, and diverse stakeholders. As part of this work, Biogen is underwriting a distinguished Biogen Fellow to advance new research on climate, health, and equity.

    “Our work with Biogen has allowed us to make progress on key questions that matter to human health and well-being under climate change,” says Noelle Eckley Selin, who directs the MIT Technology and Policy Program and is a professor in the MIT Institute for Data, Systems, and Society and the Department of Earth, Atmospheric and Planetary Sciences. “Further, their support of the Research to Policy Engagement Initiative helps all of our research become more effective in making change.”

    In addition, Biogen has joined 13 other companies in the MIT Climate and Sustainability Consortium (MCSC), which is supporting faculty and student research and developing impact pathways that present a range of actionable steps that companies can take — within and across industries — to advance progress toward climate targets.

    “Biogen joining the MIT Climate and Sustainability Consortium represents our commitment to working with member companies across a diverse range of industries, an approach that aims to drive changes swift and broad enough to match the scale of the climate challenge,” says Jeremy Gregory, executive director of the MCSC. “We are excited to welcome a member from the biotechnology space and look forward to harnessing Biogen’s perspectives as we continue to collaborate and work together with the MIT community in exciting and meaningful ways.”

    Making world-class science and education available to MIT Museum visitors

    Support from Biogen will honor Nobel laureate, MIT Institute professor, and Biogen co-founder Phillip A. Sharp with a named space inside the new Kendall Square location of the MIT Museum, set to open in spring 2022. Biogen also is supporting one of the museum’s opening exhibitions, “Essential MIT,” with a section focused on solving real-world problems such as climate change. It is also providing programmatic support for the museum’s Life Sciences Maker Engagement Program.

    “Phil has provided fantastic support to the MIT Museum for more than a decade as an advisory board member and now as board chair, and he has been deeply involved in plans for the new museum at Kendall Square,” says John Durant, the Mark R. Epstein (Class of 1963) Director of the museum. “Seeing his name on the wall will be a constant reminder of his key role in this development, as well as a mark of our gratitude.”

    Inspiring and empowering the next generation of scientists

    Biogen funding is also being directed to engage the next generation of scientists through support for the Biogen-MIT Biotech in Action: Virtual Lab, a program designed to foster a love of science among diverse and under-served student populations.

    Biogen’s support is part of its Healthy Climate, Healthy Lives initiative, a $250 million, 20-year commitment to eliminate fossil fuels across its operations and collaborate with renowned institutions to advance the science of climate and health and support under-served communities. Additional support is provided by the Biogen Foundation to further its long-standing focus on providing students with equitable access to outstanding science education.

  • Rover images confirm Jezero crater is an ancient Martian lake

    The first scientific analysis of images taken by NASA’s Perseverance rover has now confirmed that Mars’ Jezero crater — which today is a dry, wind-eroded depression — was once a quiet lake, fed steadily by a small river some 3.7 billion years ago.

    The images also reveal evidence that the crater endured flash floods. This flooding was energetic enough to sweep up large boulders from tens of miles upstream and deposit them into the lakebed, where the massive rocks lie today.

    The new analysis, published today in the journal Science, is based on images of the outcropping rocks inside the crater on its western side. Satellites had previously shown that this outcrop, seen from above, resembled river deltas on Earth, where layers of sediment are deposited in the shape of a fan as the river feeds into a lake.

    Perseverance’s new images, taken from inside the crater, confirm that this outcrop was indeed a river delta. Based on the sedimentary layers in the outcrop, it appears that the river delta fed into a lake that was calm for much of its existence, until a dramatic shift in climate triggered episodic flooding at or toward the end of the lake’s history.

    “If you look at these images, you’re basically staring at this epic desert landscape. It’s the most forlorn place you could ever visit,” says Benjamin Weiss, professor of planetary sciences in MIT’s Department of Earth, Atmospheric and Planetary Sciences and a member of the analysis team. “There’s not a drop of water anywhere, and yet, here we have evidence of a very different past. Something very profound happened in the planet’s history.”

    As the rover explores the crater, scientists hope to uncover more clues to its climatic evolution. Now that they have confirmed the crater was once a lake environment, they believe its sediments could hold traces of ancient aqueous life. In its mission going forward, Perseverance will look for locations to collect and preserve sediments. These samples will eventually be returned to Earth, where scientists can probe them for Martian biosignatures.

    “We now have the opportunity to look for fossils,” says team member Tanja Bosak, associate professor of geobiology at MIT. “It will take some time to get to the rocks that we really hope to sample for signs of life. So, it’s a marathon, with a lot of potential.”

    Tilted beds

    On Feb. 18, 2021, the Perseverance rover landed on the floor of Jezero crater, a little more than a mile away from its western fan-shaped outcrop. In the first three months, the vehicle remained stationary as NASA engineers performed remote checks of the rover’s many instruments.

    During this time, two of Perseverance’s cameras, Mastcam-Z and the SuperCam Remote Micro-Imager (RMI), captured images of their surroundings, including long-distance photos of the outcrop’s edge and a formation known as Kodiak butte, a smaller outcrop that planetary geologists surmise may have once been connected to the main fan-shaped outcrop but has since partially eroded.

    Once the rover downlinked images to Earth, NASA’s Perseverance science team processed and combined the images, and was able to observe distinct beds of sediment along Kodiak butte in surprisingly high resolution. The researchers measured each layer’s thickness, slope, and lateral extent, finding that the sediment must have been deposited by flowing water into a lake, rather than by wind, sheet-like floods, or other geologic processes.

    The rover also captured similar tilted sediment beds along the main outcrop. These images, together with those of Kodiak, confirm that the fan-shaped formation was indeed an ancient delta and that this delta fed into an ancient Martian lake.

    “Without driving anywhere, the rover was able to solve one of the big unknowns, which was that this crater was once a lake,” Weiss says. “Until we actually landed there and confirmed it was a lake, it was always a question.”

    Boulder flow

    When the researchers took a closer look at images of the main outcrop, they noticed large boulders and cobbles embedded in the youngest, topmost layers of the delta. Some boulders measured up to 1 meter across and were estimated to weigh several tons. These massive rocks, the team concluded, must have come from outside the crater, likely from bedrock on the crater rim or from 40 or more miles upstream.

    Judging from their current location and dimensions, the team says the boulders were carried downstream and into the lakebed by a flash flood that flowed at up to 9 meters per second and moved up to 3,000 cubic meters of water per second.
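The two reported peak figures imply a flow cross-section through the relation Q = v × A. A back-of-the-envelope sketch; the channel-geometry reading at the end is an illustration, not a result from the study:

```python
# Back-of-the-envelope check on the reported flood figures.
peak_speed = 9.0         # m/s, reported peak flow velocity
peak_discharge = 3000.0  # m^3/s, reported peak volumetric discharge

# Discharge Q = velocity v * cross-sectional area A, so A = Q / v.
cross_section = peak_discharge / peak_speed
print(f"Implied flow cross-section: ~{cross_section:.0f} m^2")
# Illustration: a channel ~100 m wide at that cross-section would run a few
# meters deep.
```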

    “You need energetic flood conditions to carry rocks that big and heavy,” Weiss says. “It’s a special thing that may be indicative of a fundamental change in the local hydrology or perhaps the regional climate on Mars.”

    Because the huge rocks lie in the upper layers of the delta, they represent the most recently deposited material. The boulders sit atop layers of older, much finer sediment. This stratification, the researchers say, indicates that for much of its existence, the ancient lake was filled by a gently flowing river. Fine sediments — and possibly organic material — drifted down the river, and settled into a gradual, sloping delta.

    However, the crater later experienced sudden flash floods that deposited large boulders onto the delta. The lake then dried up, and over billions of years wind eroded the landscape, leaving the crater we see today.

    The cause of this climate turnaround is unknown, although Weiss says the delta’s boulders may hold some answers.

    “The most surprising thing that’s come out of these images is the potential opportunity to catch the time when this crater transitioned from an Earth-like habitable environment, to this desolate landscape wasteland we see now,” he says. “These boulder beds may be records of this transition, and we haven’t seen this in other places on Mars.”

    This research was supported, in part, by NASA.