More stories

  • Microscopic defects in ice influence how massive glaciers flow, study shows

    As they seep and calve into the sea, melting glaciers and ice sheets are raising global water levels at unprecedented rates. To predict and prepare for future sea-level rise, scientists need a better understanding of how fast glaciers melt and what influences their flow.

    Now, a study by MIT scientists offers a new picture of glacier flow, based on microscopic deformation in the ice. The results show that a glacier’s flow depends strongly on how microscopic defects move through the ice.

    The researchers found they could estimate a glacier’s flow based on whether the ice is prone to microscopic defects of one kind versus another. They used this relationship between micro- and macro-scale deformation to develop a new model for how glaciers flow. With the new model, they mapped the flow of ice in locations across the Antarctic Ice Sheet.

    Contrary to conventional wisdom, they found, the ice sheet is not a monolith but instead is more varied in where and how it flows in response to warming-driven stresses. The study “dramatically alters the climate conditions under which marine ice sheets may become unstable and drive rapid rates of sea-level rise,” the researchers write in their paper.

    “This study really shows the effect of microscale processes on macroscale behavior,” says Meghana Ranganathan PhD ’22, who led the study as a graduate student in MIT’s Department of Earth, Atmospheric and Planetary Sciences (EAPS) and is now a postdoc at Georgia Tech. “These mechanisms happen at the scale of water molecules and ultimately can affect the stability of the West Antarctic Ice Sheet.”

    “Broadly speaking, glaciers are accelerating, and there are a lot of variants around that,” adds co-author and EAPS Associate Professor Brent Minchew. “This is the first study that takes a step from the laboratory to the ice sheets and starts evaluating what the stability of ice is in the natural environment. That will ultimately feed into our understanding of the probability of catastrophic sea-level rise.”

    Ranganathan and Minchew’s study appears this week in the Proceedings of the National Academy of Sciences.

    Micro flow

    Glacier flow describes the movement of ice from the peak of a glacier, or the center of an ice sheet, down to the edges, where the ice then breaks off and melts into the ocean — a normally slow process that contributes over time to raising the world’s average sea level.

    In recent years, the oceans have risen at unprecedented rates, driven by global warming and the accelerated melting of glaciers and ice sheets. While the loss of polar ice is known to be a major contributor to sea-level rise, it is also the biggest uncertainty when it comes to making predictions.

    “Part of it’s a scaling problem,” Ranganathan explains. “A lot of the fundamental mechanisms that cause ice to flow happen at a really small scale that we can’t see. We wanted to pin down exactly what these microphysical processes are that govern ice flow, which hasn’t been represented in models of sea-level change.”

    The team’s new study builds on previous experiments from the early 2000s by geologists at the University of Minnesota, who studied how small chips of ice deform when physically stressed and compressed. Their work revealed two microscopic mechanisms by which ice can flow: “dislocation creep,” where molecule-sized cracks migrate through the ice, and “grain boundary sliding,” where individual ice crystals slide against each other, causing the boundary between them to move through the ice.

    The geologists found that ice’s sensitivity to stress, or how likely it is to flow, depends on which of the two mechanisms is dominant. Specifically, ice is more sensitive to stress when microscopic defects occur via dislocation creep rather than grain boundary sliding.

    Ranganathan and Minchew realized that those findings at the microscopic level could redefine how ice flows at much larger, glacial scales.

    “Current models for sea-level rise assume a single value for the sensitivity of ice to stress and hold this value constant across an entire ice sheet,” Ranganathan explains. “What these experiments showed was that actually, there’s quite a bit of variability in ice sensitivity, due to which of these mechanisms is at play.”

    A mapping match

    For their new study, the MIT team took insights from the previous experiments and developed a model to estimate an icy region’s sensitivity to stress, which directly relates to how likely that ice is to flow. The model takes in information such as the ambient temperature, the average size of ice crystals, and the estimated mass of ice in the region, and calculates how much the ice is deforming by dislocation creep versus grain boundary sliding. Depending on which of the two mechanisms is dominant, the model then estimates the region’s sensitivity to stress.

    The scientists fed into the model actual observations from various locations across the Antarctic Ice Sheet, where others had previously recorded data such as the local height of ice, the size of ice crystals, and the ambient temperature. Based on the model’s estimates, the team generated a map of ice sensitivity to stress across the Antarctic Ice Sheet. When they compared this map to satellite and field measurements taken of the ice sheet over time, they observed a close match, suggesting that the model could be used to accurately predict how glaciers and ice sheets will flow in the future.

    “As climate change starts to thin glaciers, that could affect the sensitivity of ice to stress,” Ranganathan says. “The instabilities that we expect in Antarctica could be very different, and we can now capture those differences, using this model.”
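
    The link between the dominant mechanism and the ice’s stress sensitivity can be illustrated with a toy calculation. The Python sketch below is not the study’s calibrated model; it simply sums two power-law creep mechanisms with placeholder parameters and reports the effective stress exponent (a standard measure of stress sensitivity), which shifts from the grain-boundary-sliding value at low stress toward the dislocation-creep value at high stress.

    ```python
    import numpy as np

    # Placeholder flow-law parameters (assumed, order-of-magnitude only).
    N_DISLOCATION = 4.0    # stress exponent for dislocation creep
    N_GBS = 1.8            # stress exponent for grain boundary sliding
    P_GBS = 1.4            # grain-size exponent for grain boundary sliding
    A_DISLOCATION = 1e-7   # rate prefactor, arbitrary units
    A_GBS = 1.45e-3        # rate prefactor, arbitrary units

    def strain_rates(stress_kpa, grain_size_mm):
        """Strain-rate contributions of the two microscopic mechanisms (arbitrary units)."""
        dislocation = A_DISLOCATION * stress_kpa ** N_DISLOCATION
        sliding = A_GBS * stress_kpa ** N_GBS / grain_size_mm ** P_GBS
        return dislocation, sliding

    def effective_stress_exponent(stress_kpa, grain_size_mm, eps=1e-3):
        """Numerical d(ln total strain rate) / d(ln stress): the ice's sensitivity to stress."""
        lo = sum(strain_rates(stress_kpa * (1.0 - eps), grain_size_mm))
        hi = sum(strain_rates(stress_kpa * (1.0 + eps), grain_size_mm))
        return (np.log(hi) - np.log(lo)) / (np.log(1.0 + eps) - np.log(1.0 - eps))

    for stress in (10.0, 50.0, 200.0):  # driving stress in kPa
        disl, gbs = strain_rates(stress, grain_size_mm=2.0)
        dominant = "dislocation creep" if disl > gbs else "grain boundary sliding"
        print(f"{stress:6.0f} kPa  dominant: {dominant:22s}  "
              f"effective stress exponent: {effective_stress_exponent(stress, 2.0):.2f}")
    ```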

  • Study: Heavy snowfall and rain may contribute to some earthquakes

    When scientists look for an earthquake’s cause, their search often starts underground. As centuries of seismic studies have made clear, it’s the collision of tectonic plates and the movement of subsurface faults and fissures that primarily trigger a temblor.

    But MIT scientists have now found that certain weather events may also play a role in setting off some quakes.

    In a study appearing today in Science Advances, the researchers report that episodes of heavy snowfall and rain likely contributed to a swarm of earthquakes over the past several years in northern Japan. The study is the first to show that climate conditions could initiate some quakes.

    “We see that snowfall and other environmental loading at the surface impacts the stress state underground, and the timing of intense precipitation events is well-correlated with the start of this earthquake swarm,” says study author William Frank, an assistant professor in MIT’s Department of Earth, Atmospheric and Planetary Sciences (EAPS). “So, climate obviously has an impact on the response of the solid earth, and part of that response is earthquakes.”

    The new study focuses on a series of ongoing earthquakes in Japan’s Noto Peninsula. The team discovered that seismic activity in the region is surprisingly synchronized with certain changes in underground pressure, and that those changes are influenced by seasonal patterns of snowfall and precipitation. The scientists suspect that this new connection between quakes and climate may not be unique to Japan and could play a role in shaking up other parts of the world.

    Looking to the future, they predict that the climate’s influence on earthquakes could be more pronounced with global warming.

    “If we’re going into a climate that’s changing, with more extreme precipitation events, and we expect a redistribution of water in the atmosphere, oceans, and continents, that will change how the Earth’s crust is loaded,” Frank adds. “That will have an impact for sure, and it’s a link we could further explore.”

    The study’s lead author is former MIT research associate Qing-Yu Wang (now at Grenoble Alpes University); co-authors include EAPS postdoc Xin Cui, Yang Lu of the University of Vienna, Takashi Hirose of Tohoku University, and Kazushige Obara of the University of Tokyo.

    Seismic speed

    Since late 2020, hundreds of small earthquakes have shaken up Japan’s Noto Peninsula — a finger of land that curves north from the country’s main island into the Sea of Japan. Unlike a typical earthquake sequence, which begins as a main shock that gives way to a series of aftershocks before dying out, Noto’s seismic activity is an “earthquake swarm” — a pattern of multiple, ongoing quakes with no obvious main shock, or seismic trigger.

    The MIT team, along with their colleagues in Japan, aimed to spot any patterns in the swarm that would explain the persistent quakes. They started by looking through the Japan Meteorological Agency’s catalog of earthquakes, which provides data on seismic activity throughout the country over time. They focused on quakes in the Noto Peninsula over the last 11 years, during which the region has experienced episodic earthquake activity, including the most recent swarm.

    With seismic data from the catalog, the team counted the number of seismic events that occurred in the region over time. Prior to 2020, the timing of quakes appeared sporadic and unrelated; in late 2020, earthquakes grew more intense and clustered in time, signaling the start of the swarm, with quakes that are correlated in some way.

    The scientists then looked to a second dataset of seismic measurements taken by monitoring stations over the same 11-year period. Each station continuously records any displacement, or local shaking, that occurs. The shaking from one station to another can give scientists an idea of how fast a seismic wave travels between stations. This “seismic velocity” is related to the structure of the Earth through which the seismic wave is traveling. Wang used the station measurements to calculate the seismic velocity between every station in and around Noto over the last 11 years.

    The researchers generated an evolving picture of seismic velocity beneath the Noto Peninsula and observed a surprising pattern: In 2020, around when the earthquake swarm is thought to have begun, changes in seismic velocity appeared to be synchronized with the seasons.

    “We then had to explain why we were observing this seasonal variation,” Frank says.

    Snow pressure

    The team wondered whether environmental changes from season to season could influence the underlying structure of the Earth in a way that would set off an earthquake swarm. Specifically, they looked at how seasonal precipitation would affect the underground “pore fluid pressure” — the amount of pressure that fluids in the Earth’s cracks and fissures exert within the bedrock.

    “When it rains or snows, that adds weight, which increases pore pressure, which allows seismic waves to travel through slower,” Frank explains. “When all that weight is removed, through evaporation or runoff, all of a sudden, that pore pressure decreases and seismic waves are faster.”

    Wang and Cui developed a hydromechanical model of the Noto Peninsula to simulate the underlying pore pressure over the last 11 years in response to seasonal changes in precipitation. They fed into the model meteorological data from this same period, including measurements of daily snow, rainfall, and sea-level changes. From their model, they were able to track changes in excess pore pressure beneath the Noto Peninsula, before and during the earthquake swarm. They then compared this timeline of evolving pore pressure with their evolving picture of seismic velocity.

    “We had seismic velocity observations, and we had the model of excess pore pressure, and when we overlapped them, we saw they just fit extremely well,” Frank says.

    In particular, they found that when they included snowfall data, and especially extreme snowfall events, the fit between the model and observations was stronger than if they only considered rainfall and other events. In other words, the ongoing earthquake swarm that Noto residents have been experiencing can be explained in part by seasonal precipitation, and particularly by heavy snowfall events.

    “We can see that the timing of these earthquakes lines up extremely well with multiple times where we see intense snowfall,” Frank says. “It’s well-correlated with earthquake activity. And we think there’s a physical link between the two.”

    The researchers suspect that heavy snowfall and similar extreme precipitation could play a role in earthquakes elsewhere, though they emphasize that the primary trigger will always originate underground.

    “When we first want to understand how earthquakes work, we look to plate tectonics, because that is and will always be the number one reason why an earthquake happens,” Frank says. “But, what are the other things that could affect when and how an earthquake happens? That’s when you start to go to second-order controlling factors, and the climate is obviously one of those.”

    This research was supported, in part, by the National Science Foundation.
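
    As a concrete, miniature version of the comparison at the heart of the study — overlaying a modeled pore-pressure history on observed changes in seismic velocity — the Python sketch below correlates two synthetic stand-in time series. The seasonal cycle, noise levels, and sign convention are all assumptions; the actual analysis uses a hydromechanical model driven by meteorological data and velocity changes measured between station pairs.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    days = np.arange(11 * 365)                    # ~11 years of daily samples
    seasonal = np.sin(2 * np.pi * days / 365.25)  # idealized seasonal loading cycle

    # Synthetic stand-ins: modeled excess pore pressure and observed velocity change (dv/v).
    pore_pressure = 0.8 * seasonal + 0.1 * rng.standard_normal(days.size)
    dv_over_v = -0.6 * seasonal + 0.2 * rng.standard_normal(days.size)  # higher pressure -> slower waves

    # Pearson correlation quantifies how well the modeled pressure tracks the observations.
    r = np.corrcoef(pore_pressure, dv_over_v)[0, 1]
    print(f"correlation between modeled pore pressure and dv/v: {r:+.2f}")
    ```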

  • MIT-derived algorithm helps forecast the frequency of extreme weather

    To assess a community’s risk of extreme weather, policymakers rely first on global climate models that can be run decades, and even centuries, forward in time, but only at a coarse resolution. These models might be used to gauge, for instance, future climate conditions for the northeastern U.S., but not specifically for Boston.

    To estimate Boston’s future risk of extreme weather such as flooding, policymakers can combine a coarse model’s large-scale predictions with a finer-resolution model, tuned to estimate how often Boston is likely to experience damaging floods as the climate warms. But this risk analysis is only as accurate as the predictions from that first, coarser climate model.

    “If you get those wrong for large-scale environments, then you miss everything in terms of what extreme events will look like at smaller scales, such as over individual cities,” says Themistoklis Sapsis, the William I. Koch Professor and director of the Center for Ocean Engineering in MIT’s Department of Mechanical Engineering.

    Sapsis and his colleagues have now developed a method to “correct” the predictions from coarse climate models. By combining machine learning with dynamical systems theory, the team’s approach “nudges” a climate model’s simulations into more realistic patterns over large scales. When paired with smaller-scale models to predict specific weather events such as tropical cyclones or floods, the team’s approach produced more accurate predictions for how often specific locations will experience those events over the next few decades, compared to predictions made without the correction scheme.

    This animation shows the evolution of storms around the northern hemisphere, as a result of a high-resolution storm model, combined with the MIT team’s corrected global climate model. The simulation improves the modeling of extreme values for wind, temperature, and humidity, which typically have significant errors in coarse scale models. Credit: Courtesy of Ruby Leung and Shixuan Zhang, PNNL

    Sapsis says the new correction scheme is general in form and can be applied to any global climate model. Once corrected, the models can help to determine where and how often extreme weather will strike as global temperatures rise over the coming years. 

    “Climate change will have an effect on every aspect of human life, and every type of life on the planet, from biodiversity to food security to the economy,” Sapsis says. “If we have capabilities to know accurately how extreme weather will change, especially over specific locations, it can make a lot of difference in terms of preparation and doing the right engineering to come up with solutions. This is the method that can open the way to do that.”

    The team’s results appear today in the Journal of Advances in Modeling Earth Systems. The study’s MIT co-authors include postdoc Benedikt Barthel Sorensen and Alexis-Tzianni Charalampopoulos SM ’19, PhD ’23, with Shixuan Zhang, Bryce Harrop, and Ruby Leung of the Pacific Northwest National Laboratory in Washington state.

    Over the hood

    Today’s large-scale climate models simulate weather features such as the average temperature, humidity, and precipitation around the world, on a grid-by-grid basis. Running simulations of these models takes enormous computing power, and in order to simulate how weather features will interact and evolve over periods of decades or longer, models average out features every 100 kilometers or so.

    “It’s a very heavy computation requiring supercomputers,” Sapsis notes. “But these models still do not resolve very important processes like clouds or storms, which occur over smaller scales of a kilometer or less.”

    To improve the resolution of these coarse climate models, scientists typically have gone under the hood to try and fix a model’s underlying dynamical equations, which describe how phenomena in the atmosphere and oceans should physically interact.

    “People have tried to dissect into climate model codes that have been developed over the last 20 to 30 years, which is a nightmare, because you can lose a lot of stability in your simulation,” Sapsis explains. “What we’re doing is a completely different approach, in that we’re not trying to correct the equations but instead correct the model’s output.”

    The team’s new approach takes a model’s output, or simulation, and overlays an algorithm that nudges the simulation toward something that more closely represents real-world conditions. The algorithm is based on a machine-learning scheme that takes in data, such as past information for temperature and humidity around the world, and learns associations within the data that represent fundamental dynamics among weather features. The algorithm then uses these learned associations to correct a model’s predictions.

    “What we’re doing is trying to correct dynamics, as in how an extreme weather feature, such as the wind speeds during a Hurricane Sandy event, will look in the coarse model, versus in reality,” Sapsis says. “The method learns dynamics, and dynamics are universal. Having the correct dynamics eventually leads to correct statistics, for example, frequency of rare extreme events.”
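
    As a minimal, hypothetical sketch of the train-on-the-past, correct-the-future structure described above: learn a simple map from a biased coarse-model field to a reference field over an early window, then apply it to later output. This is not the team’s scheme — their correction combines machine learning with dynamical systems theory and nudges the simulated dynamics rather than fitting a plain regression — but it shows the workflow in miniature.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_times, n_grid = 2000, 64

    # Synthetic "reference" fields (e.g., reanalysis) and a biased, damped coarse-model output.
    truth = rng.standard_normal((n_times, n_grid)).cumsum(axis=0) * 0.05
    coarse = 0.7 * truth + 0.5 + 0.3 * rng.standard_normal(truth.shape)

    # Fit a linear correction on an early training window, then apply it to later times.
    split = n_times // 3
    X_train = np.hstack([coarse[:split], np.ones((split, 1))])       # add a bias column
    W, *_ = np.linalg.lstsq(X_train, truth[:split], rcond=None)      # least-squares map
    X_test = np.hstack([coarse[split:], np.ones((n_times - split, 1))])
    corrected = X_test @ W

    rmse = lambda a, b: float(np.sqrt(np.mean((a - b) ** 2)))
    print(f"RMSE before correction: {rmse(coarse[split:], truth[split:]):.3f}")
    print(f"RMSE after correction:  {rmse(corrected, truth[split:]):.3f}")
    ```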

    Climate correction

    As a first test of their new approach, the team used the machine-learning scheme to correct simulations produced by the Energy Exascale Earth System Model (E3SM), a climate model run by the U.S. Department of Energy that simulates climate patterns around the world at a resolution of 110 kilometers. The researchers used eight years of past data for temperature, humidity, and wind speed to train their new algorithm, which learned dynamical associations between the measured weather features and the E3SM model. They then ran the climate model forward in time for about 36 years and applied the trained algorithm to the model’s simulations. They found that the corrected version produced climate patterns that more closely matched real-world observations from the last 36 years, which were not used for training.

    “We’re not talking about huge differences in absolute terms,” Sapsis says. “An extreme event in the uncorrected simulation might be 105 degrees Fahrenheit, versus 115 degrees with our corrections. But for humans experiencing this, that is a big difference.”

    When the team then paired the corrected coarse model with a specific, finer-resolution model of tropical cyclones, they found the approach accurately reproduced the frequency of extreme storms in specific locations around the world.

    “We now have a coarse model that can get you the right frequency of events, for the present climate. It’s much more improved,” Sapsis says. “Once we correct the dynamics, this is a relevant correction, even when you have a different average global temperature, and it can be used for understanding how forest fires, flooding events, and heat waves will look in a future climate. Our ongoing work is focusing on analyzing future climate scenarios.”

    “The results are particularly impressive as the method shows promising results on E3SM, a state-of-the-art climate model,” says Pedram Hassanzadeh, an associate professor who leads the Climate Extremes Theory and Data group at the University of Chicago and was not involved with the study. “It would be interesting to see what climate change projections this framework yields once future greenhouse-gas emission scenarios are incorporated.”

    This work was supported, in part, by the U.S. Defense Advanced Research Projects Agency.

  • Gosha Geogdzhayev and Sadhana Lolla named 2024 Gates Cambridge Scholars

    This article was updated on April 23 to reflect the promotion of Gosha Geogdzhayev from alternate to winner of the Gates Cambridge Scholarship.

    MIT seniors Gosha Geogdzhayev and Sadhana Lolla have won the prestigious Gates Cambridge Scholarship, which offers students an opportunity to pursue graduate study in the field of their choice at Cambridge University in the U.K.

    Established in 2000, Gates Cambridge offers full-cost post-graduate scholarships to outstanding applicants from countries outside of the U.K. The mission of Gates Cambridge is to build a global network of future leaders committed to improving the lives of others.

    Gosha Geogdzhayev

    Originally from New York City, Geogdzhayev is a senior majoring in physics with minors in mathematics and computer science. At Cambridge, Geogdzhayev intends to pursue an MPhil in quantitative climate and environmental science. He is interested in applying these subjects to climate science and intends to spend his career developing novel statistical methods for climate prediction.

    At MIT, Geogdzhayev researches climate emulators with Professor Raffaele Ferrari’s group in the Department of Earth, Atmospheric and Planetary Sciences and is part of the “Bringing Computation to the Climate Challenge” Grand Challenges project. He is currently working on an operator-based emulator for the projection of climate extremes. Previously, Geogdzhayev studied the statistics of changing chaotic systems, work that has recently been published as a first-author paper.

    As a recipient of the National Oceanic and Atmospheric Administration (NOAA) Hollings Scholarship, Geogdzhayev has worked on bias correction methods for climate data at the NOAA Geophysical Fluid Dynamics Laboratory. He has received several other awards in the field of earth and atmospheric sciences, notably the American Meteorological Society Ward and Eileen Seguin Scholarship.

    Outside of research, Geogdzhayev enjoys writing poetry and is actively involved with his living community, Burton 1, for which he has previously served as floor chair.

    Sadhana Lolla

    Lolla, a senior from Clarksburg, Maryland, is majoring in computer science and minoring in mathematics and literature. At Cambridge, she will pursue an MPhil in technology policy.

    In the future, Lolla aims to lead conversations on deploying and developing technology for marginalized communities, such as the rural Indian village that her family calls home, while also conducting research in embodied intelligence.

    At MIT, Lolla conducts research on safe and trustworthy robotics and deep learning at the Distributed Robotics Laboratory with Professor Daniela Rus. Her research has spanned debiasing strategies for autonomous vehicles and accelerating robotic design processes. At Microsoft Research and Themis AI, she works on creating uncertainty-aware frameworks for deep learning, which has impacts across computational biology, language modeling, and robotics. She has presented her work at the Neural Information Processing Systems (NeurIPS) conference and the International Conference on Machine Learning (ICML). 

    Outside of research, Lolla leads initiatives to make computer science education more accessible globally. She is an instructor for class 6.S191 (MIT Introduction to Deep Learning), one of the largest AI courses in the world, which reaches millions of students annually. She serves as the curriculum lead for Momentum AI, the only U.S. program that teaches AI to underserved students for free, and she has taught hundreds of students in Northern Scotland as part of the MIT Global Teaching Labs program.

    Lolla was also the director for xFair, MIT’s largest student-run career fair, and is an executive board member for Next Sing, where she works to make a cappella more accessible for students across musical backgrounds. In her free time, she enjoys singing, solving crossword puzzles, and baking.

  • New tool predicts flood risk from hurricanes in a warming climate

    Coastal cities and communities will face more frequent major hurricanes with climate change in the coming years. To help prepare coastal cities against future storms, MIT scientists have developed a method to predict how much flooding a coastal community is likely to experience as hurricanes evolve over the next decades.

    When hurricanes make landfall, strong winds whip up salty ocean waters that generate storm surge in coastal regions. As the storms move over land, torrential rainfall can induce further flooding inland. When multiple flood sources such as storm surge and rainfall interact, they can compound a hurricane’s hazards, leading to significantly more flooding than would result from any one source alone. The new study introduces a physics-based method for predicting how the risk of such complex, compound flooding may evolve under a warming climate in coastal cities.

    One example of compound flooding’s impact is the aftermath from Hurricane Sandy in 2012. The storm made landfall on the East Coast of the United States as heavy winds whipped up a towering storm surge that combined with rainfall-driven flooding in some areas to cause historic and devastating floods across New York and New Jersey.

    In their study, the MIT team applied the new compound flood-modeling method to New York City to predict how climate change may influence the risk of compound flooding from Sandy-like hurricanes over the next decades.  

    They found that, in today’s climate, a Sandy-level compound flooding event will likely hit New York City every 150 years. By midcentury, a warmer climate will drive up the frequency of such flooding, to every 60 years. At the end of the century, destructive Sandy-like floods will deluge the city every 30 years — a fivefold increase compared to the present climate.
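
    Those return periods translate directly into the odds a planner might care about. Using only the numbers above and the standard assumption of independent years, a short calculation gives the chance of at least one Sandy-level compound flood within a 30-year planning horizon:

    ```python
    # Chance of at least one Sandy-level compound flood in a 30-year horizon,
    # assuming independent years and the return periods quoted in the article.
    horizon = 30
    for label, return_period in [("today", 150), ("mid-century", 60), ("end of century", 30)]:
        annual_prob = 1.0 / return_period
        at_least_one = 1.0 - (1.0 - annual_prob) ** horizon
        print(f"{label:15s} 1-in-{return_period:>3d}-year event -> "
              f"{100 * at_least_one:4.1f}% chance within {horizon} years")
    ```

    Under these assumptions, the chance rises from roughly 18 percent today to about 64 percent by the end of the century.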

    “Long-term average damages from weather hazards are usually dominated by the rare, intense events like Hurricane Sandy,” says study co-author Kerry Emanuel, professor emeritus of atmospheric science at MIT. “It is important to get these right.”

    While these are sobering projections, the researchers hope the flood forecasts can help city planners prepare and protect against future disasters. “Our methodology equips coastal city authorities and policymakers with essential tools to conduct compound flooding risk assessments from hurricanes in coastal cities at a detailed, granular level, extending to each street or building, in both current and future decades,” says study author Ali Sarhadi, a postdoc in MIT’s Department of Earth, Atmospheric and Planetary Sciences.

    The team’s open-access study appears online today in the Bulletin of the American Meteorological Society. Co-authors include Raphaël Rousseau-Rizzi at MIT’s Lorenz Center, Kyle Mandli at Columbia University, Jeffrey Neal at the University of Bristol, Michael Wiper at the Charles III University of Madrid, and Monika Feldmann at the Swiss Federal Institute of Technology Lausanne.

    The seeds of floods

    To forecast a region’s flood risk, weather modelers typically look to the past. Historical records contain measurements of previous hurricanes’ wind speeds, rainfall, and spatial extent, which scientists use to predict where and how much flooding may occur with coming storms. But Sarhadi believes these historical records are too limited and too brief to predict the risks of future hurricanes.

    “Even if we had lengthy historical records, they wouldn’t be a good guide for future risks because of climate change,” he says. “Climate change is changing the structural characteristics, frequency, intensity, and movement of hurricanes, and we cannot rely on the past.”

    Sarhadi and his colleagues instead looked to predict a region’s risk of hurricane flooding in a changing climate using a physics-based risk assessment methodology. They first paired simulations of hurricane activity with coupled ocean and atmospheric models over time. With the hurricane simulations, developed originally by Emanuel, the researchers virtually scatter tens of thousands of “seeds” of hurricanes into a simulated climate. Most seeds dissipate, while a few grow into category-level storms, depending on the conditions of the ocean and atmosphere.

    When the team drives these hurricane simulations with climate models of ocean and atmospheric conditions under certain global temperature projections, they can see how hurricanes change, for instance in terms of intensity, frequency, and size, under past, current, and future climate conditions.

    The team then sought to precisely predict the level and degree of compound flooding from future hurricanes in coastal cities. The researchers first used rainfall models to simulate rain intensity for a large number of simulated hurricanes, then applied numerical models to hydraulically translate that rainfall intensity into flooding on the ground as the hurricanes make landfall, given information about a region such as its surface and topographic characteristics. They also simulated the same hurricanes’ storm surges, using hydrodynamic models to translate hurricanes’ maximum wind speed and sea level pressure into surge height in coastal areas. The simulations further assessed the propagation of ocean waters into coastal areas, causing coastal flooding.

    Then, the team developed a numerical hydrodynamic model to predict how two sources of hurricane-induced flooding, such as storm surge and rain-driven flooding, would simultaneously interact through time and space, as simulated hurricanes make landfall in coastal regions such as New York City, in both current and future climates.  

    “There’s a complex, nonlinear hydrodynamic interaction between saltwater surge-driven flooding and freshwater rainfall-driven flooding, that forms compound flooding that a lot of existing methods ignore,” Sarhadi says. “As a result, they underestimate the risk of compound flooding.”
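
    The final step of such a pipeline — turning a large set of simulated storms into a return period — can be sketched with a toy event set, as below. The landfall rate and flood depths here are random placeholders; in the study they come from the coupled rainfall, surge, and hydrodynamic simulations described above.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    years_simulated = 3000
    storms_per_year = 0.34                          # assumed local landfall rate (placeholder)
    n_events = rng.poisson(storms_per_year * years_simulated)
    peak_depth_m = rng.lognormal(mean=-0.5, sigma=0.8, size=n_events)  # placeholder flood depths

    def return_period(threshold_m):
        """Average years between events whose peak flood depth exceeds the threshold."""
        exceedances_per_year = np.sum(peak_depth_m > threshold_m) / years_simulated
        return np.inf if exceedances_per_year == 0 else 1.0 / exceedances_per_year

    for threshold in (1.0, 2.0, 3.0):
        print(f"flood depth > {threshold:.1f} m: roughly a 1-in-{return_period(threshold):.0f}-year event")
    ```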

    Amplified risk

    With their flood-forecasting method in place, the team applied it to a specific test case: New York City. They used the multipronged method to predict the city’s risk of compound flooding from hurricanes, and more specifically from Sandy-like hurricanes, in present and future climates. Their simulations showed that the city’s odds of experiencing Sandy-like flooding will increase significantly over the next decades as the climate warms, from once every 150 years in the current climate, to every 60 years by 2050, and every 30 years by 2099.

    Interestingly, they found that much of this increase in risk has less to do with how hurricanes themselves will change with warming climates than with how sea levels will increase around the world.

    “In future decades, we will experience sea level rise in coastal areas, and we also incorporated that effect into our models to see how much that would increase the risk of compound flooding,” Sarhadi explains. “And in fact, we see sea level rise is playing a major role in amplifying the risk of compound flooding from hurricanes in New York City.”

    The team’s methodology can be applied to any coastal city to assess the risk of compound flooding from hurricanes and extratropical storms. With this approach, Sarhadi hopes decision-makers can make informed decisions regarding the implementation of adaptive measures, such as reinforcing coastal defenses to enhance infrastructure and community resilience.

    “Another aspect highlighting the urgency of our research is the projected 25 percent increase in coastal populations by midcentury, leading to heightened exposure to damaging storms,” Sarhadi says. “Additionally, we have trillions of dollars in assets situated in coastal flood-prone areas, necessitating proactive strategies to reduce damages from compound flooding from hurricanes under a warming climate.”

    This research was supported, in part, by Homesite Insurance.

  • A mineral produced by plate tectonics has a global cooling effect, study finds

    MIT geologists have found that a clay mineral on the seafloor, called smectite, has a surprisingly powerful ability to sequester carbon over millions of years.

    Under a microscope, a single grain of the clay resembles the folds of an accordion. These folds are known to be effective traps for organic carbon.

    Now, the MIT team has shown that the carbon-trapping clays are a product of plate tectonics: When oceanic crust crushes against a continental plate, it can bring rocks to the surface that, over time, can weather into minerals including smectite. Eventually, the clay sediment settles back in the ocean, where the minerals trap bits of dead organisms in their microscopic folds. This keeps the organic carbon from being consumed by microbes and expelled back into the atmosphere as carbon dioxide.

    Over millions of years, smectite can have a global effect, helping to cool the entire planet. Through a series of analyses, the researchers showed that smectite was likely produced after several major tectonic events over the last 500 million years. During each tectonic event, the clays trapped enough carbon to cool the Earth and induce the subsequent ice age.

    The findings are the first to show that plate tectonics can trigger ice ages through the production of carbon-trapping smectite.

    These clays can be found in certain tectonically active regions today, and the scientists believe that smectite continues to sequester carbon, providing a natural, albeit slow-acting, buffer against humans’ climate-warming activities.

    “The influence of these unassuming clay minerals has wide-ranging implications for the habitability of planets,” says Joshua Murray, a graduate student in MIT’s Department of Earth, Atmospheric, and Planetary Sciences. “There may even be a modern application for these clays in offsetting some of the carbon that humanity has placed into the atmosphere.”

    Murray and Oliver Jagoutz, professor of geology at MIT, have published their findings today in Nature Geoscience.

    A clear and present clay

    The new study follows up on the team’s previous work, which showed that each of the Earth’s major ice ages was likely triggered by a tectonic event in the tropics. The researchers found that each of these tectonic events exposed ocean rocks called ophiolites to the atmosphere. They put forth the idea that, when a tectonic collision occurs in a tropical region, ophiolites can undergo certain weathering effects, such as exposure to wind, rain, and chemical interactions, that transform the rocks into various minerals, including clays.

    “Those clay minerals, depending on the kinds you create, influence the climate in different ways,” Murray explains.

    At the time, it was unclear which minerals could come out of this weathering effect, and whether and how these minerals could directly contribute to cooling the planet. So, while it appeared there was a link between plate tectonics and ice ages, the exact mechanism by which one could trigger the other was still in question.

    With the new study, the team looked to see whether their proposed tectonic tropical weathering process would produce carbon-trapping minerals, and in quantities that would be sufficient to trigger a global ice age.

    The team first looked through the geologic literature and compiled data on the ways in which major magmatic minerals weather over time, and on the types of clay minerals this weathering can produce. They then worked these measurements into a weathering simulation of different rock types that are known to be exposed in tectonic collisions.

    “Then we look at what happens to these rock types when they break down due to weathering and the influence of a tropical environment, and what minerals form as a result,” Jagoutz says.

    Next, they plugged each weathered, “end-product” mineral into a simulation of the Earth’s carbon cycle to see what effect a given mineral might have, either by interacting with organic carbon, such as bits of dead organisms, or with inorganic carbon, in the form of carbon dioxide in the atmosphere.

    From these analyses, one mineral had a clear presence and effect: smectite. Not only was the clay a naturally weathered product of tropical tectonics, it was also highly effective at trapping organic carbon. In theory, smectite seemed like a solid connection between tectonics and ice ages.

    But were enough of the clays actually present to trigger the previous four ice ages? Ideally, researchers should confirm this by finding smectite in ancient rock layers dating back to each global cooling period.

    “Unfortunately, as clays are buried by other sediments, they get cooked a bit, so we can’t measure them directly,” Murray says. “But we can look for their fingerprints.”

    A slow build

    The team reasoned that, as smectites are a product of ophiolites, these ocean rocks also bear characteristic elements such as nickel and chromium, which would be preserved in ancient sediments. If smectites were present in the past, nickel and chromium should be as well.

    To test this idea, the team looked through a database containing thousands of oceanic sedimentary rocks that were deposited over the last 500 million years. Over this time period, the Earth experienced four separate ice ages. Looking at rocks around each of these periods, the researchers observed large spikes of nickel and chromium, and inferred from this that smectite must also have been present.

    By their estimates, the clay mineral could have increased the preservation of organic carbon by less than one-tenth of a percent. In absolute terms, this is a minuscule amount. But over millions of years, they calculated that the clay’s accumulated, sequestered carbon was enough to trigger each of the four major ice ages.
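
    The scale of that effect can be illustrated with a back-of-envelope calculation. Only the “less than one-tenth of a percent” figure comes from the study; the burial flux and duration below are assumed, order-of-magnitude placeholders.

    ```python
    # A tiny fractional boost to organic-carbon burial, sustained for millions of
    # years, adds up to a large carbon reservoir. All inputs except the 0.1 percent
    # figure are assumed placeholders.
    burial_flux_gt_per_yr = 0.15   # assumed global organic-carbon burial rate, Gt C per year
    extra_fraction = 0.001         # "less than one-tenth of a percent" boost (from the study)
    duration_yr = 5e6              # assumed duration of enhanced burial, in years

    extra_carbon_gt = burial_flux_gt_per_yr * extra_fraction * duration_yr
    print(f"extra buried carbon: ~{extra_carbon_gt:.0f} Gt C over {duration_yr:.0e} years")
    # ~750 Gt C with these placeholders, comparable in size to the preindustrial
    # atmospheric carbon reservoir.
    ```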

    “We found that you really don’t need much of this material to have a huge effect on the climate,” Jagoutz says.

    “These clays also have probably contributed some of the Earth’s cooling in the last 3 to 5 million years, before humans got involved,” Murray adds. “In the absence of humans, these clays are probably making a difference to the climate. It’s just such a slow process.”

    “Jagoutz and Murray’s work is a nice demonstration of how important it is to consider all biotic and physical components of the global carbon cycle,” says Lee Kump, a professor of geosciences at Penn State University, who was not involved with the study. “Feedbacks among all these components control atmospheric greenhouse gas concentrations on all time scales, from the annual rise and fall of atmospheric carbon dioxide levels to the swings from icehouse to greenhouse over millions of years.”

    Could smectites be harnessed intentionally to further bring down the world’s carbon emissions? Murray sees some potential, for instance to shore up carbon reservoirs such as regions of permafrost. Warming temperatures are predicted to melt permafrost and expose long-buried organic carbon. If smectites could be applied to these regions, the clays could prevent this exposed carbon from escaping into and further warming the atmosphere.

    “If you want to understand how nature works, you have to understand it on the mineral and grain scale,” Jagoutz says. “And this is also the way forward for us to find solutions for this climatic catastrophe. If you study these natural processes, there’s a good chance you will stumble on something that will be actually useful.”

    This research was funded, in part, by the National Science Foundation.

  • Explained: The 1.5 C climate benchmark

    The summer of 2023 has been a season of weather extremes.

    In June, uncontrolled wildfires ripped through parts of Canada, sending smoke into the U.S. and setting off air quality alerts in dozens of downwind states. In July, the world set the hottest global temperature on record, which it held for three days in a row, then broke again on day four.

    From July into August, unrelenting heat blanketed large parts of Europe, Asia, and the U.S., while India faced a torrential monsoon season, and heavy rains flooded regions in the northeastern U.S. And most recently, whipped up by high winds and dry vegetation, a historic wildfire tore through Maui, devastating an entire town.

    These extreme weather events are mainly a consequence of climate change driven by humans’ continued burning of coal, oil, and natural gas. Climate scientists agree that extreme weather such as what people experienced this summer will likely grow more frequent and intense in the coming years unless something is done, on a persistent and planet-wide scale, to rein in global temperatures.

    Just how much reining-in are they talking about? The number that is internationally agreed upon is 1.5 degrees Celsius. To prevent worsening and potentially irreversible effects of climate change, the world’s average temperature should not exceed that of preindustrial times by more than 1.5 degrees Celsius (2.7 degrees Fahrenheit).

    As more regions around the world face extreme weather, it’s worth taking stock of the 1.5-degree bar, where the planet stands in relation to this threshold, and what can be done at the global, regional, and personal level, to “keep 1.5 alive.”

    Why 1.5 C?

    In 2015, in response to the growing urgency of climate impacts, nearly every country in the world signed onto the Paris Agreement, a landmark international treaty under which 195 nations pledged to hold the Earth’s temperature to “well below 2 degrees Celsius above pre-industrial levels,” and going further, aim to “limit the temperature increase to 1.5 degrees Celsius above pre-industrial levels.”

    The treaty did not define a particular preindustrial period, though scientists generally consider the years from 1850 to 1900 to be a reliable reference; this time predates humans’ use of fossil fuels and is also the earliest period when global observations of land and sea temperatures are available. During this period, the average global temperature, while swinging up and down in certain years, generally hovered around 13.5 degrees Celsius, or 56.3 degrees Fahrenheit.

    The treaty was informed by a fact-finding report, which concluded that even global warming of 1.5 degrees Celsius above the preindustrial average, over an extended, decades-long period, would lead to high risks for “some regions and vulnerable ecosystems.” The recommendation, then, was to set the 1.5 degrees Celsius limit as a “defense line” — if the world can keep below this line, it potentially could avoid the more extreme and irreversible climate effects that would occur with a 2 degrees Celsius increase, and for some places, an even smaller increase than that.

    But, as many regions are experiencing today, keeping below the 1.5 line is no guarantee of avoiding extreme, global warming effects.

    “There is nothing magical about the 1.5 number, other than that is an agreed aspirational target. Keeping at 1.4 is better than 1.5, and 1.3 is better than 1.4, and so on,” says Sergey Paltsev, deputy director of MIT’s Joint Program on the Science and Policy of Global Change. “The science does not tell us that if, for example, the temperature increase is 1.51 degrees Celsius, then it would definitely be the end of the world. Similarly, if the temperature would stay at 1.49 degrees increase, it does not mean that we will eliminate all impacts of climate change. What is known: The lower the target for an increase in temperature, the lower the risks of climate impacts.”

    How close are we to 1.5 C?

    In 2022, the average global temperature was about 1.15 degrees Celsius above preindustrial levels. According to the World Meteorological Organization (WMO), the cyclical weather phenomenon La Niña recently contributed to temporarily cooling and dampening the effects of human-induced climate change. La Niña lasted for three years and ended around March of 2023.

    In May, the WMO issued a report that projected a significant likelihood (66 percent) that the world would exceed the 1.5 degrees Celsius threshold in the next four years. This breach would likely be driven by human-induced climate change, combined with a warming El Niño — a cyclical weather phenomenon that temporarily heats up ocean regions and pushes global temperatures higher.

    This summer, an El Niño is currently underway, and the event typically raises global temperatures in the year after it sets in, which in this case would be in 2024. The WMO predicts that, for each of the next four years, the global average temperature is likely to swing between 1.1 and 1.8 degrees Celsius above preindustrial levels.

    Though there is a good chance the world will get hotter than the 1.5-degree limit as a result of El Niño, the breach would be temporary and, for now, would not mean a failure of the Paris Agreement, which aims to keep global temperatures below the 1.5-degree limit over the long term (averaged over several decades rather than a single year).

    “But we should not forget that this is a global average, and there are variations regionally and seasonally,” says Elfatih Eltahir, the H.M. King Bhumibol Professor and Professor of Civil and Environmental Engineering at MIT. “This year, we had extreme conditions around the world, even though we haven’t reached the 1.5 C threshold. So, even if we control the average at a global magnitude, we are going to see events that are extreme, because of climate change.”

    More than a number

    To hold the planet’s long-term average temperature to below the 1.5-degree threshold, the world will have to reach net zero emissions by the year 2050, according to the Intergovernmental Panel on Climate Change (IPCC). This means that, in terms of the emissions released by the burning of coal, oil, and natural gas, the entire world will have to remove as much as it puts into the atmosphere.

    “In terms of innovations, we need all of them — even those that may seem quite exotic at this point: fusion, direct air capture, and others,” Paltsev says.

    The task of curbing emissions in time is particularly daunting for the United States, which generates the most carbon dioxide emissions of any country in the world.

    “The U.S.’s burning of fossil fuels and consumption of energy is just way above the rest of the world. That’s a persistent problem,” Eltahir says. “And the national statistics are an aggregate of what a lot of individuals are doing.”

    At an individual level, there are things that can be done to help bring down one’s personal emissions, and potentially chip away at rising global temperatures.

    “We are consumers of products that either embody greenhouse gases, such as meat, clothes, computers, and homes, or we are directly responsible for emitting greenhouse gases, such as when we use cars, airplanes, electricity, and air conditioners,” Paltsev says. “Our everyday choices affect the amount of emissions that are added to the atmosphere.”

    But to compel people to change their emissions, it may be less about a number, and more about a feeling.

    “To get people to act, my hypothesis is, you need to reach them not just by convincing them to be good citizens and saying it’s good for the world to keep below 1.5 degrees, but showing how they individually will be impacted,” says Eltahir, who specializes in the study of regional climates, focusing on how climate change impacts the water cycle and the frequency of extreme weather such as heat waves.

    “True climate progress requires a dramatic change in how the human system gets its energy,” Paltsev says. “It is a huge undertaking. Are you ready personally to make sacrifices and to change the way of your life? If one gets an honest answer to that question, it would help to understand why true climate progress is so difficult to achieve.”

  • Study: The ocean’s color is changing as a consequence of climate change

    The ocean’s color has changed significantly over the last 20 years, and the global trend is likely a consequence of human-induced climate change, report scientists at MIT, the National Oceanography Center in the U.K., and elsewhere.  

    In a study appearing today in Nature, the team writes that they have detected changes in ocean color over the past two decades that cannot be explained by natural, year-to-year variability alone. These color shifts, though subtle to the human eye, have occurred over 56 percent of the world’s oceans — an expanse that is larger than the total land area on Earth.

    In particular, the researchers found that tropical ocean regions near the equator have become steadily greener over time. The shift in ocean color indicates that ecosystems within the surface ocean must also be changing, as the color of the ocean is a literal reflection of the organisms and materials in its waters.

    At this point, the researchers cannot say how exactly marine ecosystems are changing to reflect the shifting color. But they are pretty sure of one thing: Human-induced climate change is likely the driver.

    “I’ve been running simulations that have been telling me for years that these changes in ocean color are going to happen,” says study co-author Stephanie Dutkiewicz, senior research scientist in MIT’s Department of Earth, Atmospheric and Planetary Sciences and the Center for Global Change Science. “To actually see it happening for real is not surprising, but frightening. And these changes are consistent with man-induced changes to our climate.”

    “This gives additional evidence of how human activities are affecting life on Earth over a huge spatial extent,” adds lead author B. B. Cael PhD ’19 of the National Oceanography Center in Southampton, U.K. “It’s another way that humans are affecting the biosphere.”

    The study’s co-authors also include Stephanie Henson of the National Oceanography Center, Kelsey Bisson at Oregon State University, and Emmanuel Boss of the University of Maine.

    Above the noise

    The ocean’s color is a visual product of whatever lies within its upper layers. Generally, waters that are deep blue reflect very little life, whereas greener waters indicate the presence of ecosystems, and mainly phytoplankton — plant-like microbes that are abundant in the upper ocean and that contain the green pigment chlorophyll. The pigment helps plankton harvest sunlight, which they use to capture carbon dioxide from the atmosphere and convert it into sugars.

    Phytoplankton are the foundation of the marine food web that sustains progressively more complex organisms, on up to krill, fish, and seabirds and marine mammals. Phytoplankton are also a powerful muscle in the ocean’s ability to capture and store carbon dioxide. Scientists are therefore keen to monitor phytoplankton across the surface oceans and to see how these essential communities might respond to climate change. To do so, scientists have tracked changes in chlorophyll, based on the ratio of how much blue versus green light is reflected from the ocean surface, which can be monitored from space.
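
    The band-ratio idea in that last sentence can be written compactly: chlorophyll is estimated from a polynomial in the log of a blue-to-green reflectance ratio. The general form below is standard for ocean-color algorithms, but the coefficients are illustrative placeholders, not the operational values used for MODIS products.

    ```python
    import numpy as np

    COEFFS = [0.3, -2.5, 1.0, -0.5]   # a0..a3, illustrative placeholders only

    def chlorophyll_mg_m3(rrs_blue, rrs_green):
        """Estimate chlorophyll (mg per cubic meter) from remote-sensing reflectances."""
        x = np.log10(rrs_blue / rrs_green)
        log_chl = sum(a * x ** i for i, a in enumerate(COEFFS))
        return 10.0 ** log_chl

    # Bluer water (high blue-to-green ratio) maps to low chlorophyll; greener water maps higher.
    print(chlorophyll_mg_m3(rrs_blue=0.008, rrs_green=0.002))   # clear, blue water
    print(chlorophyll_mg_m3(rrs_blue=0.004, rrs_green=0.004))   # greener water
    ```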

    But around a decade ago, Henson, who is a co-author of the current study, published a paper with others, which showed that, if scientists were tracking chlorophyll alone, it would take at least 30 years of continuous monitoring to detect any trend that was driven specifically by climate change. The reason, the team argued, was that the large, natural variations in chlorophyll from year to year would overwhelm any anthropogenic influence on chlorophyll concentrations. It would therefore take several decades to pick out a meaningful, climate-change-driven signal amid the normal noise.

    In 2019, Dutkiewicz and her colleagues published a separate paper, showing through a new model that the natural variation in other ocean colors is much smaller compared to that of chlorophyll. Therefore, any signal of climate-change-driven changes should be easier to detect over the smaller, normal variations of other ocean colors. They predicted that such changes should be apparent within 20, rather than 30 years of monitoring.

    “So I thought, doesn’t it make sense to look for a trend in all these other colors, rather than in chlorophyll alone?” Cael says. “It’s worth looking at the whole spectrum, rather than just trying to estimate one number from bits of the spectrum.”

    The power of seven

    In the current study, Cael and the team analyzed measurements of ocean color taken by the Moderate Resolution Imaging Spectroradiometer (MODIS) aboard the Aqua satellite, which has been monitoring ocean color for 21 years. MODIS takes measurements in seven visible wavelengths, including the two colors researchers traditionally use to estimate chlorophyll.

    The differences in color that the satellite picks up are too subtle for human eyes to differentiate. Much of the ocean appears blue to our eye, whereas the true color may contain a mix of subtler wavelengths, from blue to green and even red.

    Cael carried out a statistical analysis using all seven ocean colors measured by the satellite from 2002 to 2022 together. He first looked at how much the seven colors changed from region to region during a given year, which gave him an idea of their natural variations. He then zoomed out to see how these annual variations in ocean color changed over a longer stretch of two decades. This analysis turned up a clear trend, above the normal year-to-year variability.
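
    A minimal version of that statistical test looks like the sketch below: fit a linear trend at every location and wavelength over a 20-year record, then flag places where the accumulated change stands out from the year-to-year spread. The synthetic data and the two-sigma threshold are assumptions for illustration, not the study’s actual analysis.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n_years, n_locations, n_bands = 20, 500, 7
    years = np.arange(n_years)

    # Synthetic annual-mean reflectances: a small per-location trend plus interannual noise.
    trend_per_year = rng.normal(0.0, 0.002, size=(n_locations, n_bands))
    reflectance = trend_per_year * years[:, None, None] \
        + rng.normal(0.0, 0.01, size=(n_years, n_locations, n_bands))

    # Least-squares slope at every (location, band), compared against interannual variability.
    x = years - years.mean()
    slopes = np.tensordot(x, reflectance - reflectance.mean(axis=0), axes=(0, 0)) / (x @ x)
    total_change = np.abs(slopes) * n_years          # accumulated change over the record
    noise_level = reflectance.std(axis=0)            # year-to-year spread
    detectable = total_change > 2.0 * noise_level    # assumed detection threshold

    frac = detectable.any(axis=1).mean()             # locations with a trend in any band
    print(f"fraction of locations with a detectable trend: {100 * frac:.0f}%")
    ```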

    To see whether this trend is related to climate change, he then looked to Dutkiewicz’s model from 2019. This model simulated the Earth’s oceans under two scenarios: one with the addition of greenhouse gases, and the other without it. The greenhouse-gas model predicted that a significant trend should show up within 20 years and that this trend should cause changes to ocean color in about 50 percent of the world’s surface oceans — almost exactly what Cael found in his analysis of real-world satellite data.

    “This suggests that the trends we observe are not a random variation in the Earth system,” Cael says. “This is consistent with anthropogenic climate change.”

    The team’s results show that monitoring ocean colors beyond chlorophyll could give scientists a clearer, faster way to detect climate-change-driven changes to marine ecosystems.

    “The color of the oceans has changed,” Dutkiewicz says. “And we can’t say how. But we can say that changes in color reflect changes in plankton communities, that will impact everything that feeds on plankton. It will also change how much the ocean will take up carbon, because different types of plankton have different abilities to do that. So, we hope people take this seriously. It’s not only models that are predicting these changes will happen. We can now see it happening, and the ocean is changing.”

    This research was supported, in part, by NASA.