More stories

  • Smarter regulation of global shipping emissions could improve air quality and health outcomes

    Emissions from shipping activities around the world account for nearly 3 percent of total human-caused greenhouse gas emissions, and could increase by up to 50 percent by 2050, making them an important and often overlooked target for global climate mitigation. At the same time, shipping-related emissions of additional pollutants, particularly nitrogen and sulfur oxides, pose a significant threat to global health, as they degrade air quality enough to cause premature deaths.

    The main source of shipping emissions is the combustion of heavy fuel oil in large diesel engines, which disperses pollutants into the air over coastal areas. The nitrogen and sulfur oxides emitted from these engines contribute to the formation of PM2.5, airborne particulates with diameters of up to 2.5 micrometers that are linked to respiratory and cardiovascular diseases. Previous studies have estimated that PM2.5 from shipping emissions contributes to about 60,000 cardiopulmonary and lung cancer deaths each year, and that IMO 2020, an international policy that caps engine fuel sulfur content at 0.5 percent, could reduce PM2.5 concentrations enough to lower annual premature mortality by 34 percent.

    Global shipping emissions arise from both domestic (between ports in the same country) and international (between ports of different countries) shipping activities, and are governed by national and international policies, respectively. Consequently, effective mitigation of the air quality and health impacts of global shipping emissions will require that policymakers quantify the relative contributions of domestic and international shipping activities to these adverse impacts in an integrated global analysis.

    A new study in the journal Environmental Research Letters provides that kind of analysis for the first time. To that end, the study’s co-authors — researchers from MIT and the Hong Kong University of Science and Technology — implement a three-step process. First, they create global shipping emission inventories for domestic and international vessels based on 2015 ship-activity records from the Automatic Identification System (AIS). Second, they apply an atmospheric chemistry and transport model to this data to calculate PM2.5 concentrations generated by that year’s domestic and international shipping activities. Finally, they apply a model that estimates mortalities attributable to these pollutant concentrations.

    The researchers find that approximately 94,000 premature deaths were associated with PM2.5 exposure due to maritime shipping in 2015 — 83 percent international and 17 percent domestic. While international shipping accounted for the vast majority of the global health impact, some regions experienced significant health burdens from domestic shipping operations. This is especially true in East Asia: In China, 44 percent of shipping-related premature deaths were attributable to domestic shipping activities.

    “By comparing the health impacts from international and domestic shipping at the global level, our study could help inform decision-makers’ efforts to coordinate shipping emissions policies across multiple scales, and thereby reduce the air quality and health impacts of these emissions more effectively,” says Yiqi Zhang, a researcher at the Hong Kong University of Science and Technology who led the study as a visiting student supported by the MIT Joint Program on the Science and Policy of Global Change.

    In addition to estimating the air-quality and health impacts of domestic and international shipping, the researchers evaluate potential health outcomes under different shipping emissions-control policies that are either currently in effect or likely to be implemented in different regions in the near future.

    They estimate about 30,000 avoided deaths per year under a scenario consistent with IMO 2020, an international regulation limiting the sulfur content in shipping fuel oil to 0.5 percent — a finding that tracks with previous studies. Further strengthening regulations on sulfur content would yield only slight improvement; limiting sulfur content to 0.1 percent reduces annual shipping-attributable PM2.5-related premature deaths by an additional 5,000. In contrast, regulating nitrogen oxides instead, through a Tier III NOx standard, would produce far greater benefits than a 0.1-percent sulfur cap, avoiding a further 33,000 deaths.
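    The relative payoff of these scenarios can be tallied with a back-of-envelope calculation. The figures below are the article's approximate estimates, not model output, and the "extra avoided" numbers are treated as incremental benefits beyond the IMO 2020 baseline.

```python
# Back-of-envelope tally of the scenario estimates reported in the study.
# All figures are the article's approximate numbers, not model output.
baseline_deaths = 94_000   # shipping-attributable premature deaths in 2015
imo2020_avoided = 30_000   # avoided per year under the 0.5 percent sulfur cap

# Incremental deaths avoided per year beyond IMO 2020 under two alternatives.
extra_avoided = {
    "tighten sulfur cap to 0.1 percent": 5_000,
    "Tier III NOx standard instead": 33_000,
}

print(f"IMO 2020 alone: ~{baseline_deaths - imo2020_avoided:,} deaths remaining/yr")
for policy, extra in extra_avoided.items():
    total = imo2020_avoided + extra
    print(f"{policy}: ~{total:,} avoided, ~{baseline_deaths - total:,} remaining/yr")
```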

    “Areas with high proportions of mortalities contributed by domestic shipping could effectively use domestic regulations to implement controls,” says study co-author Noelle Selin, a professor at MIT’s Institute for Data, Systems, and Society and Department of Earth, Atmospheric and Planetary Sciences, and a faculty affiliate of the MIT Joint Program. “For other regions where much damage comes from international vessels, further international cooperation is required to mitigate impacts.”

  • Global warming begets more warming, new paleoclimate study finds

    It is increasingly clear that the prolonged drought conditions, record-breaking heat, sustained wildfires, and frequent, more extreme storms experienced in recent years are a direct result of rising global temperatures brought on by humans’ addition of carbon dioxide to the atmosphere. And a new MIT study on extreme climate events in Earth’s ancient history suggests that today’s planet may become more volatile as it continues to warm.

    The study, appearing today in Science Advances, examines the paleoclimate record of the last 66 million years, during the Cenozoic era, which began shortly after the extinction of the dinosaurs. The scientists found that during this period, the Earth’s climate fluctuations exhibited a surprising “warming bias.” In other words, there were far more warming events — periods of prolonged global warming, lasting thousands to tens of thousands of years — than cooling events. What’s more, warming events tended to be more extreme, with greater shifts in temperature, than cooling events.

    The researchers say a possible explanation for this warming bias may lie in a “multiplier effect,” whereby a modest degree of warming — for instance from volcanoes releasing carbon dioxide into the atmosphere — naturally speeds up certain biological and chemical processes that enhance these fluctuations, leading, on average, to still more warming.

    Interestingly, the team observed that this warming bias disappeared about 5 million years ago, around the time when ice sheets started forming in the Northern Hemisphere. It’s unclear what effect the ice has had on the Earth’s response to climate shifts. But as today’s Arctic ice recedes, the new study suggests that a multiplier effect may kick back in, and the result may be a further amplification of human-induced global warming.

    “The Northern Hemisphere’s ice sheets are shrinking, and could potentially disappear as a long-term consequence of human actions,” says the study’s lead author Constantin Arnscheidt, a graduate student in MIT’s Department of Earth, Atmospheric and Planetary Sciences. “Our research suggests that this may make the Earth’s climate fundamentally more susceptible to extreme, long-term global warming events such as those seen in the geologic past.”

    Arnscheidt’s study co-author is Daniel Rothman, professor of geophysics at MIT and co-founder and co-director of MIT’s Lorenz Center.

    A volatile push

    For their analysis, the team consulted large databases of sediments containing deep-sea benthic foraminifera — single-celled organisms that have been around for hundreds of millions of years and whose hard shells are preserved in sediments. The composition of these shells is affected by ocean temperatures as the organisms grow; the shells are therefore considered a reliable proxy for the Earth’s ancient temperatures.

    For decades, scientists have analyzed the composition of these shells, collected from all over the world and dated to various time periods, to track how the Earth’s temperature has fluctuated over millions of years. 

    “When using these data to study extreme climate events, most studies have focused on individual large spikes in temperature, typically of a few degrees Celsius warming,” Arnscheidt says. “Instead, we tried to look at the overall statistics and consider all the fluctuations involved, rather than picking out the big ones.”

    The team first carried out a statistical analysis of the data and observed that, over the last 66 million years, the distribution of global temperature fluctuations didn’t resemble a standard bell curve, with symmetric tails representing an equal probability of extreme warm and extreme cool fluctuations. Instead, the curve was noticeably lopsided, skewed toward more warm than cool events. The curve also exhibited a noticeably longer tail, representing warm events that were more extreme, or of higher temperature, than the most extreme cold events.

    “This indicates there’s some sort of amplification relative to what you would otherwise have expected,” Arnscheidt says. “Everything’s pointing to something fundamental that’s causing this push, or bias toward warming events.”

    “It’s fair to say that the Earth system becomes more volatile, in a warming sense,” Rothman adds.

    A warming multiplier

    The team wondered whether this warming bias might have been a result of “multiplicative noise” in the climate-carbon cycle. Scientists have long understood that higher temperatures, up to a point, tend to speed up biological and chemical processes. Because the carbon cycle, which is a key driver of long-term climate fluctuations, is itself composed of such processes, increases in temperature may lead to larger fluctuations, biasing the system towards extreme warming events.

    In mathematics, there exists a set of equations that describes such general amplifying, or multiplicative effects. The researchers applied this multiplicative theory to their analysis to see whether the equations could predict the asymmetrical distribution, including the degree of its skew and the length of its tails.

    In the end, they found that the data, and the observed bias toward warming, could be explained by the multiplicative theory. In other words, it’s very likely that, over the last 66 million years, periods of modest warming were on average further enhanced by multiplier effects, such as the response of biological and chemical processes that further warmed the planet.
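    The intuition behind multiplicative noise can be shown with a minimal simulation. This is not the authors' model: temperature anomalies simply relax toward zero, and in the multiplicative case the noise amplitude grows with temperature, standing in for temperature-sensitive carbon-cycle feedbacks. All parameters are illustrative.

```python
# Minimal sketch (not the authors' model): state-dependent (multiplicative)
# noise skews temperature fluctuations toward warming, while purely additive
# noise yields a roughly symmetric distribution. All parameters illustrative.
import random
import statistics

random.seed(0)
K, SIGMA, A, DT = 1.0, 0.3, 0.8, 0.01  # relaxation rate, base noise, feedback strength, time step

def simulate(multiplicative, n=200_000):
    t, series = 0.0, []
    for _ in range(n):
        # Noise amplitude grows with temperature in the multiplicative case
        # (floored so it stays positive); constant in the additive case.
        amp = SIGMA * (max(0.2, 1.0 + A * t) if multiplicative else 1.0)
        t += -K * t * DT + amp * DT ** 0.5 * random.gauss(0.0, 1.0)
        series.append(t)
    return series

def skewness(xs):
    m, s = statistics.fmean(xs), statistics.pstdev(xs)
    return statistics.fmean([((x - m) / s) ** 3 for x in xs])

print(f"additive noise skew:       {skewness(simulate(False)):+.2f}")  # near zero
print(f"multiplicative noise skew: {skewness(simulate(True)):+.2f}")   # clearly positive
```

    The skewed, long-warm-tailed distribution produced by the multiplicative run mirrors the asymmetry the researchers found in the paleoclimate record.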

    As part of the study, the researchers also looked at the correlation between past warming events and changes in Earth’s orbit. Over hundreds of thousands of years, Earth’s orbit around the sun regularly becomes more or less elliptical. But scientists have wondered why many past warming events appeared to coincide with these changes, and why these events feature outsized warming compared with what the change in Earth’s orbit could have wrought on its own.

    So, Arnscheidt and Rothman incorporated the Earth’s orbital changes into the multiplicative model and their analysis of Earth’s temperature changes, and found that multiplier effects could predictably amplify, on average, the modest temperature rises due to changes in Earth’s orbit.

    “Climate warms and cools in synchrony with orbital changes, but the orbital cycles themselves would predict only modest changes in climate,” Rothman says. “But if we consider a multiplicative model, then modest warming, paired with this multiplier effect, can result in extreme events that tend to occur at the same time as these orbital changes.”

    “Humans are forcing the system in a new way,” Arnscheidt adds. “And this study is showing that, when we increase temperature, we’re likely going to interact with these natural, amplifying effects.”

    This research was supported, in part, by MIT’s School of Science.

  • A new approach to preventing human-induced earthquakes

    When humans pump large volumes of fluid into the ground, they can set off potentially damaging earthquakes, depending on the underlying geology. This has been the case in certain oil- and gas-producing regions, where wastewater, often mixed with oil, is disposed of by injecting it back into the ground — a process that has triggered sizable seismic events in recent years.

    Now MIT researchers, working with an interdisciplinary team of scientists from industry and academia, have developed a method to manage such human-induced seismicity, and have demonstrated that the technique successfully reduced the number of earthquakes occurring in an active oil field.

    Their results, appearing today in Nature, could help mitigate earthquakes caused by the oil and gas industry, not just from the injection of wastewater produced with oil, but also of wastewater produced by hydraulic fracturing, or “fracking.” The team’s approach could also help prevent quakes from other human activities, such as the filling of water reservoirs and aquifers, and the sequestration of carbon dioxide in deep geologic formations.

    “Triggered seismicity is a problem that goes way beyond producing oil,” says study lead author Bradford Hager, the Cecil and Ida Green Professor of Earth Sciences in MIT’s Department of Earth, Atmospheric and Planetary Sciences. “This is a huge problem for society that will have to be confronted if we are to safely inject carbon dioxide into the subsurface. We demonstrated the kind of study that will be necessary for doing this.”

    The study’s co-authors include Ruben Juanes, professor of civil and environmental engineering at MIT, and collaborators from the University of California at Riverside, the University of Texas at Austin, Harvard University, and Eni, a multinational oil and gas company based in Italy.

    Safe injections

    Both natural and human-induced earthquakes occur along geologic faults, or fractures between two blocks of rock in the Earth’s crust. In stable periods, the rocks on either side of a fault are held in place by the pressures generated by surrounding rocks. But when a large volume of fluid is suddenly injected at high rates, it can upset a fault’s fluid stress balance. In some cases, this sudden injection can lubricate a fault and cause rocks on either side to slip and trigger an earthquake.

    The most common source of such fluid injections is from the oil and gas industry’s disposal of wastewater that is brought up along with oil. Field operators dispose of this water through injection wells that continuously pump the water back into the ground at high pressures.

    “There’s a lot of water produced with the oil, and that water is injected into the ground, which has caused a large number of quakes,” Hager notes. “So, for a while, oil-producing regions in Oklahoma had more magnitude 3 quakes than California, because of all this wastewater that was being injected.”

    In recent years, a similar problem arose in southern Italy, where injection wells on oil fields operated by Eni triggered microseisms in an area where large naturally occurring earthquakes had previously occurred. The company, looking for ways to address the problem, sought consultation from Hager and Juanes, both leading experts in seismicity and subsurface flows.

    “This was an opportunity for us to get access to high-quality seismic data about the subsurface, and learn how to do these injections safely,” Juanes says.

    Seismic blueprint

    The team made use of detailed information, accumulated by the oil company over years of operation in the Val d’Agri oil field, a region of southern Italy that lies in a tectonically active basin. The data included information about the region’s earthquake record, dating back to the 1600s, as well as the structure of rocks and faults, and the state of the subsurface corresponding to the various injection rates of each well.

    This video shows the change in stress on the geologic faults of the Val d’Agri field from 2001 to 2019, as predicted by a new MIT-derived model. Video credit: A. Plesch (Harvard University)

    This video shows small earthquakes occurring on the Costa Molina fault within the Val d’Agri field from 2004 to 2016. Each event is shown for two years fading from an initial bright color to the final dark color. Video credit: A. Plesch (Harvard University)

    The researchers integrated these data into a coupled subsurface flow and geomechanical model, which predicts how the stresses and strains of underground structures evolve as the volume of pore fluid, such as from the injection of water, changes. They connected this model to an earthquake mechanics model in order to translate the changes in underground stress and fluid pressure into a likelihood of triggering earthquakes. They then quantified the rate of earthquakes associated with various rates of water injection, and identified scenarios that were unlikely to trigger large quakes.

    When they ran the models using data from 1993 through 2016, the predictions of seismic activity matched with the earthquake record during this period, validating their approach. They then ran the models forward in time, through the year 2025, to predict the region’s seismic response to three different injection rates: 2,000, 2,500, and 3,000 cubic meters per day. The simulations showed that large earthquakes could be avoided if operators kept injection rates at 2,000 cubic meters per day — a flow rate comparable to a small public fire hydrant.
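    The scenario-screening logic can be illustrated with a toy calculation that is far simpler than the study's coupled flow-geomechanical model: map an injection rate to a steady pore-pressure rise, then to a relative seismicity rate via a Dieterich-style exponential dependence on Coulomb stress change. Every numerical parameter below is hypothetical.

```python
# Toy scenario screen (NOT the study's coupled model): higher injection rates
# raise pore pressure, which unclamps the fault and multiplies the background
# seismicity rate. All numerical parameters are hypothetical.
import math

PRESSURE_PER_RATE = 0.0001  # MPa of pore-pressure rise per (m^3/day); hypothetical
FRICTION = 0.6              # friction coefficient on the fault
A_SIGMA = 0.1               # MPa; rate-and-state parameter times normal stress; hypothetical

def relative_seismicity_rate(rate_m3_per_day):
    """Seismicity rate relative to background for a given injection rate."""
    dp = PRESSURE_PER_RATE * rate_m3_per_day  # steady pore-pressure increase, MPa
    dcfs = FRICTION * dp                      # Coulomb stress change from unclamping, MPa
    return math.exp(dcfs / A_SIGMA)           # Dieterich-type rate multiplier

for rate in (2000, 2500, 3000):
    print(f"{rate} m^3/day -> {relative_seismicity_rate(rate):.1f}x background seismicity")
```

    Because the dependence is exponential, modest increases in injection rate can produce disproportionate increases in seismicity, which is why identifying a safe rate threshold matters.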

    Eni field operators implemented the team’s recommended rate at the oil field’s single water injection well over a 30-month period between January 2017 and June 2019. In this time, the team observed only a few tiny seismic events, which coincided with brief periods when operators went above the recommended injection rate.

    “The seismicity in the region has been very low in these two-and-a-half years, with around four quakes of magnitude 0.5, as opposed to hundreds of quakes, of up to magnitude 3, that were happening between 2006 and 2016,” Hager says.

    The results demonstrate that operators can successfully manage earthquakes by adjusting injection rates, based on the underlying geology. Juanes says the team’s modeling approach may help to prevent earthquakes related to other processes, such as the building of water reservoirs and the sequestration of carbon dioxide — as long as there is detailed information about a region’s subsurface.

    “A lot of effort needs to go into understanding the geologic setting,” says Juanes, who notes that, if carbon sequestration were carried out on depleted oil fields, “such reservoirs could have this type of history, seismic information, and geologic interpretation that you could use to build similar models for carbon sequestration. We show it’s at least possible to manage seismicity in an operational setting. And we offer a blueprint for how to do it.”

    This research was supported, in part, by Eni.