More stories

  • Making climate models relevant for local decision-makers

    Climate models are a key technology in predicting the impacts of climate change. By running simulations of the Earth’s climate, scientists and policymakers can estimate conditions like sea level rise, flooding, and rising temperatures, and make decisions about how to appropriately respond. But current climate models struggle to provide this information quickly or affordably enough to be useful on smaller scales, such as the size of a city.

    Now, authors of a new open-access paper published in the Journal of Advances in Modeling Earth Systems have found a method to leverage machine learning to utilize the benefits of current climate models, while reducing the computational costs needed to run them. “It turns the traditional wisdom on its head,” says Sai Ravela, a principal research scientist in MIT’s Department of Earth, Atmospheric and Planetary Sciences (EAPS) who wrote the paper with EAPS postdoc Anamitra Saha.

    Traditional wisdom

    In climate modeling, downscaling is the process of using a global climate model with coarse resolution to generate finer details over smaller regions. Imagine a digital picture: A global model is a large picture of the world with a low number of pixels. To downscale, you zoom in on just the section of the photo you want to look at — for example, Boston. But because the original picture was low resolution, the new version is blurry; it doesn’t give enough detail to be particularly useful.

    “If you go from coarse resolution to fine resolution, you have to add information somehow,” explains Saha. Downscaling attempts to add that information back in by filling in the missing pixels. “That addition of information can happen two ways: Either it can come from theory, or it can come from data.”

    Conventional downscaling often involves using models built on physics (such as the process of air rising, cooling, and condensing, or the landscape of the area), and supplementing them with statistical data taken from historical observations. But this method is computationally taxing: It takes a lot of time and computing power to run, while also being expensive.

    A little bit of both

    In their new paper, Saha and Ravela have figured out a way to add the data another way. They’ve employed a technique in machine learning called adversarial learning. It uses two machines: One generates data to go into our photo, while the other machine judges the sample by comparing it to actual data. If it thinks the image is fake, then the first machine has to try again until it convinces the second machine. The end goal of the process is to create super-resolution data. (A schematic code sketch of this adversarial setup appears at the end of this story.)

    Using machine learning techniques like adversarial learning is not a new idea in climate modeling; where it currently struggles is in its inability to handle large amounts of basic physics, like conservation laws. The researchers discovered that simplifying the physics going in and supplementing it with statistics from the historical data was enough to generate the results they needed.

    “If you augment machine learning with some information from the statistics and simplified physics both, then suddenly, it’s magical,” says Ravela. He and Saha started by estimating extreme rainfall amounts, removing more complex physics equations and focusing on water vapor and land topography. They then generated general rainfall patterns for mountainous Denver and flat Chicago alike, applying historical accounts to correct the output. “It’s giving us extremes, like the physics does, at a much lower cost. And it’s giving us similar speeds to statistics, but at much higher resolution.”

    Another unexpected benefit of the results was how little training data was needed. “The fact that only a little bit of physics and a little bit of statistics was enough to improve the performance of the ML [machine learning] model … was actually not obvious from the beginning,” says Saha. It only takes a few hours to train, and can produce results in minutes, an improvement over the months other models take to run.

    Quantifying risk quickly

    Being able to run the models quickly and often is a key requirement for stakeholders such as insurance companies and local policymakers. Ravela gives the example of Bangladesh: By seeing how extreme weather events will impact the country, decisions about which crops should be grown or where populations should migrate can be made considering a very broad range of conditions and uncertainties as soon as possible.

    “We can’t wait months or years to be able to quantify this risk,” he says. “You need to look out way into the future and at a large number of uncertainties to be able to say what might be a good decision.”

    While the current model only looks at extreme precipitation, training it to examine other critical events, such as tropical storms, winds, and temperature, is the next step of the project. With a more robust model, Ravela is hoping to apply it to other places like Boston and Puerto Rico as part of a Climate Grand Challenges project.

    “We’re very excited both by the methodology that we put together, as well as the potential applications that it could lead to,” he says.
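
    The adversarial setup described in this story can be illustrated with a short, self-contained sketch. The code below is not the authors' model: the tiny network sizes, the random tensors standing in for coarse rainfall fields and topography, and the upscaling factor are all assumptions made purely for illustration, written here in PyTorch.

```python
# Minimal sketch of adversarial super-resolution downscaling (illustrative only).
# Real inputs would be coarse-model fields plus simplified physics (e.g., water
# vapor, topography); here random tensors stand in for those fields.
import torch
import torch.nn as nn

UPSCALE = 4  # assumed coarse -> fine resolution factor

generator = nn.Sequential(
    nn.Conv2d(2, 32, 3, padding=1), nn.ReLU(),            # 2 channels: coarse rain + topography
    nn.Upsample(scale_factor=UPSCALE, mode="bilinear"),
    nn.Conv2d(32, 1, 3, padding=1),                        # fine-resolution rainfall
)
discriminator = nn.Sequential(
    nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.ReLU(),
    nn.Flatten(), nn.LazyLinear(1),                        # real/fake score
)

g_opt = torch.optim.Adam(generator.parameters(), lr=1e-4)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=1e-4)
bce = nn.BCEWithLogitsLoss()

for step in range(100):
    coarse = torch.rand(8, 2, 16, 16)       # stand-in for coarse fields + physics channel
    real_fine = torch.rand(8, 1, 64, 64)     # stand-in for observed high-resolution rainfall

    # Discriminator: learn to separate real fine-scale fields from generated ones.
    fake_fine = generator(coarse).detach()
    d_loss = bce(discriminator(real_fine), torch.ones(8, 1)) + \
             bce(discriminator(fake_fine), torch.zeros(8, 1))
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # Generator: try to fool the discriminator into accepting its output as real.
    g_loss = bce(discriminator(generator(coarse)), torch.ones(8, 1))
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()
```

    In the approach described above, the generator would be conditioned on simplified physical fields and corrected with historical statistics rather than on random tensors; the loop here only shows the generator/discriminator game itself.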

  • Study: Heavy snowfall and rain may contribute to some earthquakes

    When scientists look for an earthquake’s cause, their search often starts underground. As centuries of seismic studies have made clear, it’s the collision of tectonic plates and the movement of subsurface faults and fissures that primarily trigger a temblor.

    But MIT scientists have now found that certain weather events may also play a role in setting off some quakes.

    In a study appearing today in Science Advances, the researchers report that episodes of heavy snowfall and rain likely contributed to a swarm of earthquakes over the past several years in northern Japan. The study is the first to show that climate conditions could initiate some quakes.

    “We see that snowfall and other environmental loading at the surface impacts the stress state underground, and the timing of intense precipitation events is well-correlated with the start of this earthquake swarm,” says study author William Frank, an assistant professor in MIT’s Department of Earth, Atmospheric and Planetary Sciences (EAPS). “So, climate obviously has an impact on the response of the solid earth, and part of that response is earthquakes.”

    The new study focuses on a series of ongoing earthquakes in Japan’s Noto Peninsula. The team discovered that seismic activity in the region is surprisingly synchronized with certain changes in underground pressure, and that those changes are influenced by seasonal patterns of snowfall and precipitation. The scientists suspect that this new connection between quakes and climate may not be unique to Japan and could play a role in shaking up other parts of the world.

    Looking to the future, they predict that the climate’s influence on earthquakes could be more pronounced with global warming.

    “If we’re going into a climate that’s changing, with more extreme precipitation events, and we expect a redistribution of water in the atmosphere, oceans, and continents, that will change how the Earth’s crust is loaded,” Frank adds. “That will have an impact for sure, and it’s a link we could further explore.”

    The study’s lead author is former MIT research associate Qing-Yu Wang (now at Grenoble Alpes University); co-authors include EAPS postdoc Xin Cui, Yang Lu of the University of Vienna, Takashi Hirose of Tohoku University, and Kazushige Obara of the University of Tokyo.

    Seismic speed

    Since late 2020, hundreds of small earthquakes have shaken up Japan’s Noto Peninsula — a finger of land that curves north from the country’s main island into the Sea of Japan. Unlike a typical earthquake sequence, which begins with a main shock that gives way to a series of aftershocks before dying out, Noto’s seismic activity is an “earthquake swarm” — a pattern of multiple, ongoing quakes with no obvious main shock, or seismic trigger.

    The MIT team, along with their colleagues in Japan, aimed to spot any patterns in the swarm that would explain the persistent quakes. They started by looking through the Japanese Meteorological Agency’s catalog of earthquakes, which provides data on seismic activity throughout the country over time. They focused on quakes in the Noto Peninsula over the last 11 years, during which the region has experienced episodic earthquake activity, including the most recent swarm.

    With seismic data from the catalog, the team counted the number of seismic events that occurred in the region over time. Prior to 2020, the timing of quakes appeared sporadic and unrelated; starting in late 2020, earthquakes grew more intense and clustered in time, signaling the start of the swarm, with quakes that were correlated in some way.

    The scientists then looked to a second dataset of seismic measurements taken by monitoring stations over the same 11-year period. Each station continuously records any displacement, or local shaking, that occurs. The shaking recorded from one station to another can give scientists an idea of how fast a seismic wave travels between stations. This “seismic velocity” is related to the structure of the Earth through which the seismic wave is traveling. Wang used the station measurements to calculate the seismic velocity between every station in and around Noto over the last 11 years.

    The researchers generated an evolving picture of seismic velocity beneath the Noto Peninsula and observed a surprising pattern: In 2020, around when the earthquake swarm is thought to have begun, changes in seismic velocity appeared to be synchronized with the seasons.

    “We then had to explain why we were observing this seasonal variation,” Frank says.

    Snow pressure

    The team wondered whether environmental changes from season to season could influence the underlying structure of the Earth in a way that would set off an earthquake swarm. Specifically, they looked at how seasonal precipitation would affect the underground “pore fluid pressure” — the amount of pressure that fluids in the Earth’s cracks and fissures exert within the bedrock.

    “When it rains or snows, that adds weight, which increases pore pressure, which allows seismic waves to travel through slower,” Frank explains. “When all that weight is removed, through evaporation or runoff, all of a sudden, that pore pressure decreases and seismic waves are faster.”

    Wang and Cui developed a hydromechanical model of the Noto Peninsula to simulate the underlying pore pressure over the last 11 years in response to seasonal changes in precipitation. They fed into the model meteorological data from this same period, including measurements of daily snow, rainfall, and sea-level changes. From their model, they were able to track changes in excess pore pressure beneath the Noto Peninsula, before and during the earthquake swarm. They then compared this timeline of evolving pore pressure with their evolving picture of seismic velocity.

    “We had seismic velocity observations, and we had the model of excess pore pressure, and when we overlapped them, we saw they just fit extremely well,” Frank says. (A toy version of this comparison is sketched at the end of this story.)

    In particular, they found that when they included snowfall data, and especially extreme snowfall events, the fit between the model and observations was stronger than if they only considered rainfall and other events. In other words, the ongoing earthquake swarm that Noto residents have been experiencing can be explained in part by seasonal precipitation, and particularly by heavy snowfall events.

    “We can see that the timing of these earthquakes lines up extremely well with multiple times where we see intense snowfall,” Frank says. “It’s well-correlated with earthquake activity. And we think there’s a physical link between the two.”

    The researchers suspect that heavy snowfall and similar extreme precipitation could play a role in earthquakes elsewhere, though they emphasize that the primary trigger will always originate underground.

    “When we first want to understand how earthquakes work, we look to plate tectonics, because that is and will always be the number one reason why an earthquake happens,” Frank says. “But what are the other things that could affect when and how an earthquake happens? That’s when you start to go to second-order controlling factors, and the climate is obviously one of those.”

    This research was supported, in part, by the National Science Foundation.
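
    The comparison at the heart of this result, overlaying modeled excess pore pressure on observed changes in seismic velocity, can be imitated with a toy calculation. Everything below is synthetic and simplified (the seasonal loading curve, the exponential pore-pressure response, the noise levels); it illustrates only the kind of correlation the team looked for, not their actual hydromechanical model.

```python
# Toy comparison of modeled pore pressure vs. observed seismic-velocity change (dv/v).
# Synthetic series stand in for the Noto Peninsula observations; illustration only.
import numpy as np

rng = np.random.default_rng(0)
days = np.arange(4000)                                   # roughly 11 years of daily samples

# Seasonal precipitation load (snow + rain), heavier in winter, plus noise.
precip_load = 1.0 + np.sin(2 * np.pi * days / 365.25 - np.pi / 2)
precip_load += 0.3 * rng.random(days.size)

# Crude pore-pressure response: surface loading diffuses in with a lag
# (simple exponential smoothing with an assumed 30-day time scale).
tau = 30.0
pore_pressure = np.zeros_like(precip_load)
for t in range(1, days.size):
    pore_pressure[t] = pore_pressure[t - 1] + (precip_load[t] - pore_pressure[t - 1]) / tau

# Higher pore pressure -> slower seismic waves (negative dv/v), plus measurement noise.
dv_v = -0.01 * (pore_pressure - pore_pressure.mean()) + 0.002 * rng.standard_normal(days.size)

# Overlap the two series: a strong (negative) correlation supports the physical link.
corr = np.corrcoef(pore_pressure, dv_v)[0, 1]
print(f"correlation between modeled pore pressure and dv/v: {corr:.2f}")
```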

  • An AI dataset carves new paths to tornado detection

    The return of spring in the Northern Hemisphere touches off tornado season. A tornado’s twisting funnel of dust and debris seems an unmistakable sight. But that sight can be obscured to radar, the tool of meteorologists. It’s hard to know exactly when a tornado has formed, or even why.

    A new dataset could hold answers. It contains radar returns from thousands of tornadoes that have hit the United States in the past 10 years. In the dataset, storms that spawned tornadoes sit alongside other severe storms, some with nearly identical conditions, that never did. MIT Lincoln Laboratory researchers who curated the dataset, called TorNet, have now released it open source. They hope to enable breakthroughs in detecting one of nature’s most mysterious and violent phenomena.

    “A lot of progress is driven by easily available, benchmark datasets. We hope TorNet will lay a foundation for machine learning algorithms to both detect and predict tornadoes,” says Mark Veillette, the project’s co-principal investigator with James Kurdzo. Both researchers work in the Air Traffic Control Systems Group. 

    Along with the dataset, the team is releasing models trained on it. The models show promise for machine learning’s ability to spot a twister. Building on this work could open new frontiers for forecasters, helping them provide more accurate warnings that might save lives. 

    Swirling uncertainty

    About 1,200 tornadoes occur in the United States every year, causing millions to billions of dollars in economic damage and claiming 71 lives on average. Last year, one unusually long-lasting tornado killed 17 people and injured at least 165 others along a 59-mile path in Mississippi.  

    Yet tornadoes are notoriously difficult to forecast because scientists don’t have a clear picture of why they form. “We can see two storms that look identical, and one will produce a tornado and one won’t. We don’t fully understand it,” Kurdzo says.

    A tornado’s basic ingredients are thunderstorms with instability caused by rapidly rising warm air and wind shear that causes rotation. Weather radar is the primary tool used to monitor these conditions. But tornadoes lie too low to be detected, even when moderately close to the radar. As the radar beam with a given tilt angle travels farther from the antenna, it gets higher above the ground, mostly seeing reflections from rain and hail carried in the “mesocyclone,” the storm’s broad, rotating updraft. A mesocyclone doesn’t always produce a tornado.
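
    The geometry behind that blind spot can be made concrete with the standard beam-height approximation used in radar meteorology, which assumes an effective Earth radius of 4/3 the true radius to account for atmospheric refraction. The elevation angle and ranges in the sketch below are example values, not tied to any particular radar site.

```python
# Height of a radar beam above the radar, using the standard 4/3 effective
# Earth-radius approximation from radar meteorology. Elevation angle and ranges
# here are example values only.
import math

EARTH_RADIUS_KM = 6371.0
EFFECTIVE_RADIUS_KM = (4.0 / 3.0) * EARTH_RADIUS_KM  # accounts for atmospheric refraction

def beam_height_km(range_km: float, elevation_deg: float) -> float:
    """Beam centerline height above the radar antenna at a given slant range."""
    theta = math.radians(elevation_deg)
    return math.sqrt(
        range_km**2
        + EFFECTIVE_RADIUS_KM**2
        + 2.0 * range_km * EFFECTIVE_RADIUS_KM * math.sin(theta)
    ) - EFFECTIVE_RADIUS_KM

# The lowest operational tilt of many weather radars is about 0.5 degrees.
for r in (25, 50, 100, 150, 200):
    print(f"{r:>3} km range -> beam ~{beam_height_km(r, 0.5):.2f} km above the radar")
```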

    With this limited view, forecasters must decide whether or not to issue a tornado warning. They often err on the side of caution. As a result, the rate of false alarms for tornado warnings is more than 70 percent. “That can lead to boy-who-cried-wolf syndrome,” Kurdzo says.  

    In recent years, researchers have turned to machine learning to better detect and predict tornadoes. However, raw datasets and models have not always been accessible to the broader community, stifling progress. TorNet is filling this gap.

    The dataset contains more than 200,000 radar images, 13,587 of which depict tornadoes. The rest of the images are non-tornadic, taken from storms in one of two categories: randomly selected severe storms or false-alarm storms (those that led a forecaster to issue a warning but that didn’t produce a tornado).

    Each sample of a storm or tornado comprises two sets of six radar images. The two sets correspond to different radar sweep angles. The six images portray different radar data products, such as reflectivity (showing precipitation intensity) or radial velocity (indicating if winds are moving toward or away from the radar).
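
    To make that sample layout concrete, the snippet below sketches one way such a sample might be represented in memory. The product names, array shape, and patch size are assumptions chosen for illustration; the actual TorNet release defines its own file format, variable names, and access code.

```python
# Hypothetical in-memory layout for one TorNet-style sample: two radar sweep
# angles x six radar products, each an image patch. Illustration only; consult
# the real TorNet release for its actual format.
import numpy as np

PRODUCTS = [
    "reflectivity",              # precipitation intensity
    "radial_velocity",           # motion toward/away from the radar
    "spectrum_width",
    "differential_reflectivity",
    "specific_differential_phase",
    "correlation_coefficient",
]
SWEEPS = ["lower_tilt", "upper_tilt"]
PATCH = (120, 240)               # assumed image size in (range, azimuth) pixels

sample = {
    "data": np.zeros((len(SWEEPS), len(PRODUCTS), *PATCH), dtype=np.float32),
    "label": 1,                  # 1 = tornadic, 0 = non-tornadic (random storm or false alarm)
}

# Example: grab the reflectivity image from the lower sweep for plotting or training.
refl_low = sample["data"][SWEEPS.index("lower_tilt"), PRODUCTS.index("reflectivity")]
print(sample["data"].shape, refl_low.shape)
```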

    A challenge in curating the dataset was first finding tornadoes. Within the corpus of weather radar data, tornadoes are extremely rare events. The team then had to balance those tornado samples with difficult non-tornado samples. If the dataset were too easy, say by comparing tornadoes to snowstorms, an algorithm trained on the data would likely over-classify storms as tornadic.

    “What’s beautiful about a true benchmark dataset is that we’re all working with the same data, with the same level of difficulty, and can compare results,” Veillette says. “It also makes meteorology more accessible to data scientists, and vice versa. It becomes easier for these two parties to work on a common problem.”

    Both researchers represent the progress that can come from cross-collaboration. Veillette is a mathematician and algorithm developer who has long been fascinated by tornadoes. Kurdzo is a meteorologist by training and a signal processing expert. In grad school, he chased tornadoes with custom-built mobile radars, collecting data to analyze in new ways.

    “This dataset also means that a grad student doesn’t have to spend a year or two building a dataset. They can jump right into their research,” Kurdzo says.

    This project was funded by Lincoln Laboratory’s Climate Change Initiative, which aims to leverage the laboratory’s diverse technical strengths to help address climate problems threatening human health and global security.

    Chasing answers with deep learning

    Using the dataset, the researchers developed baseline artificial intelligence (AI) models. They were particularly eager to apply deep learning, a form of machine learning that excels at processing visual data. On its own, deep learning can extract features (key observations that an algorithm uses to make a decision) from images across a dataset. Other machine learning approaches require humans to first manually label features. 

    “We wanted to see if deep learning could rediscover what people normally look for in tornadoes and even identify new things that typically aren’t searched for by forecasters,” Veillette says.

    The results are promising. Their deep learning model performed similarly to or better than all tornado-detecting algorithms known in the literature. The trained algorithm correctly classified 50 percent of weaker EF-1 tornadoes and over 85 percent of tornadoes rated EF-2 or higher, which are the most devastating and costly of these storms.
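
    Those figures are per-category detection rates. The toy calculation below shows how such statistics, probability of detection by EF rating plus an overall false-alarm ratio, can be computed from a set of predictions; the random labels and the made-up classifier are placeholders, not the TorNet models or results.

```python
# Probability of detection (POD) by EF rating and overall false-alarm ratio (FAR),
# computed from toy predictions. Numbers are random stand-ins, not TorNet results.
import numpy as np

rng = np.random.default_rng(1)
n = 1000
ef_rating = rng.integers(-1, 5, size=n)      # -1 = non-tornadic sample, 0-4 = EF rating
truth = (ef_rating >= 0).astype(int)         # 1 if a tornado actually occurred
pred = np.where(rng.random(n) < 0.8, truth, 1 - truth)   # toy classifier, ~80% accurate

for ef in range(5):
    mask = ef_rating == ef
    if mask.any():
        pod = pred[mask].mean()              # fraction of EF-`ef` tornadoes detected
        print(f"EF-{ef}: POD = {pod:.2f} ({mask.sum()} cases)")

hits = np.sum((pred == 1) & (truth == 1))
false_alarms = np.sum((pred == 1) & (truth == 0))
far = false_alarms / max(hits + false_alarms, 1)
print(f"False-alarm ratio: {far:.2f}")
```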

    They also evaluated two other types of machine-learning models, and one traditional model to compare against. The source code and parameters of all these models are freely available. The models and dataset are also described in a paper submitted to a journal of the American Meteorological Society (AMS). Veillette presented this work at the AMS Annual Meeting in January.

    “The biggest reason for putting our models out there is for the community to improve upon them and do other great things,” Kurdzo says. “The best solution could be a deep learning model, or someone might find that a non-deep learning model is actually better.”

    TorNet could be useful in the weather community for other uses too, such as for conducting large-scale case studies on storms. It could also be augmented with other data sources, like satellite imagery or lightning maps. Fusing multiple types of data could improve the accuracy of machine learning models.

    Taking steps toward operations

    On top of detecting tornadoes, Kurdzo hopes that models might help unravel the science of why they form.

    “As scientists, we see all these precursors to tornadoes — an increase in low-level rotation, a hook echo in reflectivity data, specific differential phase (KDP) foot and differential reflectivity (ZDR) arcs. But how do they all go together? And are there physical manifestations we don’t know about?” he asks.

    Teasing out those answers might be possible with explainable AI. Explainable AI refers to methods that allow a model to provide its reasoning, in a format understandable to humans, of why it came to a certain decision. In this case, these explanations might reveal physical processes that happen before tornadoes. This knowledge could help train forecasters, and models, to recognize the signs sooner. 

    “None of this technology is ever meant to replace a forecaster. But perhaps someday it could guide forecasters’ eyes in complex situations, and give a visual warning to an area predicted to have tornadic activity,” Kurdzo says.

    Such assistance could be especially useful as radar technology improves and future networks potentially grow denser. Data refresh rates in a next-generation radar network are expected to increase from every five minutes to approximately one minute, perhaps faster than forecasters can interpret the new information. Because deep learning can process huge amounts of data quickly, it could be well-suited for monitoring radar returns in real time, alongside humans. Tornadoes can form and disappear in minutes.

    But the path to an operational algorithm is a long road, especially in safety-critical situations, Veillette says. “I think the forecaster community is still, understandably, skeptical of machine learning. One way to establish trust and transparency is to have public benchmark datasets like this one. It’s a first step.”

    The next steps, the team hopes, will be taken by researchers across the world who are inspired by the dataset and energized to build their own algorithms. Those algorithms will in turn go into test beds, where they’ll eventually be shown to forecasters, to start a process of transitioning into operations.

    In the end, the path could circle back to trust.

    “We may never get more than a 10- to 15-minute tornado warning using these tools. But if we could lower the false-alarm rate, we could start to make headway with public perception,” Kurdzo says. “People are going to use those warnings to take the action they need to save their lives.”

  • MIT-derived algorithm helps forecast the frequency of extreme weather

    To assess a community’s risk of extreme weather, policymakers rely first on global climate models that can be run decades, and even centuries, forward in time, but only at a coarse resolution. These models might be used to gauge, for instance, future climate conditions for the northeastern U.S., but not specifically for Boston.

    To estimate Boston’s future risk of extreme weather such as flooding, policymakers can combine a coarse model’s large-scale predictions with a finer-resolution model, tuned to estimate how often Boston is likely to experience damaging floods as the climate warms. But this risk analysis is only as accurate as the predictions from that first, coarser climate model.

    “If you get those wrong for large-scale environments, then you miss everything in terms of what extreme events will look like at smaller scales, such as over individual cities,” says Themistoklis Sapsis, the William I. Koch Professor and director of the Center for Ocean Engineering in MIT’s Department of Mechanical Engineering.

    Sapsis and his colleagues have now developed a method to “correct” the predictions from coarse climate models. By combining machine learning with dynamical systems theory, the team’s approach “nudges” a climate model’s simulations into more realistic patterns over large scales. When paired with smaller-scale models to predict specific weather events such as tropical cyclones or floods, the team’s approach produced more accurate predictions for how often specific locations will experience those events over the next few decades, compared to predictions made without the correction scheme.

    This animation shows the evolution of storms around the northern hemisphere, as a result of a high-resolution storm model, combined with the MIT team’s corrected global climate model. The simulation improves the modeling of extreme values for wind, temperature, and humidity, which typically have significant errors in coarse scale models. Credit: Courtesy of Ruby Leung and Shixuan Zhang, PNNL

    Sapsis says the new correction scheme is general in form and can be applied to any global climate model. Once corrected, the models can help to determine where and how often extreme weather will strike as global temperatures rise over the coming years. 

    “Climate change will have an effect on every aspect of human life, and every type of life on the planet, from biodiversity to food security to the economy,” Sapsis says. “If we have capabilities to know accurately how extreme weather will change, especially over specific locations, it can make a lot of difference in terms of preparation and doing the right engineering to come up with solutions. This is the method that can open the way to do that.”

    The team’s results appear today in the Journal of Advances in Modeling Earth Systems. The study’s MIT co-authors include postdoc Benedikt Barthel Sorensen and Alexis-Tzianni Charalampopoulos SM ’19, PhD ’23, with Shixuan Zhang, Bryce Harrop, and Ruby Leung of the Pacific Northwest National Laboratory in Washington state.

    Over the hood

    Today’s large-scale climate models simulate weather features such as the average temperature, humidity, and precipitation around the world, on a grid-by-grid basis. Running simulations of these models takes enormous computing power, and in order to simulate how weather features will interact and evolve over periods of decades or longer, models average out features every 100 kilometers or so.

    “It’s a very heavy computation requiring supercomputers,” Sapsis notes. “But these models still do not resolve very important processes like clouds or storms, which occur over smaller scales of a kilometer or less.”

    To improve the resolution of these coarse climate models, scientists typically have gone under the hood to try and fix a model’s underlying dynamical equations, which describe how phenomena in the atmosphere and oceans should physically interact.

    “People have tried to dissect into climate model codes that have been developed over the last 20 to 30 years, which is a nightmare, because you can lose a lot of stability in your simulation,” Sapsis explains. “What we’re doing is a completely different approach, in that we’re not trying to correct the equations but instead correct the model’s output.”

    The team’s new approach takes a model’s output, or simulation, and overlays an algorithm that nudges the simulation toward something that more closely represents real-world conditions. The algorithm is based on a machine-learning scheme that takes in data, such as past information for temperature and humidity around the world, and learns associations within the data that represent fundamental dynamics among weather features. The algorithm then uses these learned associations to correct a model’s predictions.

    “What we’re doing is trying to correct dynamics, as in how an extreme weather feature, such as the windspeeds during a Hurricane Sandy event, will look like in the coarse model, versus in reality,” Sapsis says. “The method learns dynamics, and dynamics are universal. Having the correct dynamics eventually leads to correct statistics, for example, frequency of rare extreme events.”
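
    In spirit, the correction is a learned map from coarse-model output to observation-like fields, applied after the simulation rather than inside the model's equations. The sketch below fits a simple ridge-regression correction on synthetic stand-in data; the study's actual scheme is a far more sophisticated machine-learning and dynamical-systems method, so the linear model, toy grid, and random fields here are assumptions made purely for illustration.

```python
# Post-hoc correction of coarse climate-model output: learn a map from simulated
# fields to reference (observation-like) fields, then apply it to new simulations.
# Synthetic data and a linear model stand in for the study's actual ML scheme.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(42)
n_train, n_future, n_grid = 2920, 1000, 32 * 32   # e.g., 8 years of daily fields on a toy grid

truth_train = rng.standard_normal((n_train, n_grid))
# Toy error model: the coarse model damps variability and carries a constant bias.
coarse_train = 0.7 * truth_train + 0.5 + 0.3 * rng.standard_normal((n_train, n_grid))

corrector = Ridge(alpha=1.0)
corrector.fit(coarse_train, truth_train)           # learn the coarse -> reference map

# Apply the learned correction to a new (e.g., future) coarse simulation.
coarse_future = 0.7 * rng.standard_normal((n_future, n_grid)) + 0.5
corrected_future = corrector.predict(coarse_future)

print("mean bias before correction:", float(coarse_train.mean() - truth_train.mean()))
print("mean bias after correction: ", float((corrector.predict(coarse_train) - truth_train).mean()))
```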

    Climate correction

    As a first test of their new approach, the team used the machine-learning scheme to correct simulations produced by the Energy Exascale Earth System Model (E3SM), a climate model run by the U.S. Department of Energy that simulates climate patterns around the world at a resolution of 110 kilometers. The researchers used eight years of past data for temperature, humidity, and wind speed to train their new algorithm, which learned dynamical associations between the measured weather features and the E3SM model. They then ran the climate model forward in time for about 36 years and applied the trained algorithm to the model’s simulations. They found that the corrected version produced climate patterns that more closely matched real-world observations from the last 36 years, which were not used for training.

    “We’re not talking about huge differences in absolute terms,” Sapsis says. “An extreme event in the uncorrected simulation might be 105 degrees Fahrenheit, versus 115 degrees with our corrections. But for humans experiencing this, that is a big difference.”

    When the team then paired the corrected coarse model with a specific, finer-resolution model of tropical cyclones, they found the approach accurately reproduced the frequency of extreme storms in specific locations around the world.

    “We now have a coarse model that can get you the right frequency of events, for the present climate. It’s much more improved,” Sapsis says. “Once we correct the dynamics, this is a relevant correction, even when you have a different average global temperature, and it can be used for understanding how forest fires, flooding events, and heat waves will look in a future climate. Our ongoing work is focusing on analyzing future climate scenarios.”

    “The results are particularly impressive as the method shows promising results on E3SM, a state-of-the-art climate model,” says Pedram Hassanzadeh, an associate professor who leads the Climate Extremes Theory and Data group at the University of Chicago and was not involved with the study. “It would be interesting to see what climate change projections this framework yields once future greenhouse-gas emission scenarios are incorporated.”

    This work was supported, in part, by the U.S. Defense Advanced Research Projects Agency.

  • New tool predicts flood risk from hurricanes in a warming climate

    Coastal cities and communities will face more frequent major hurricanes with climate change in the coming years. To help prepare coastal cities against future storms, MIT scientists have developed a method to predict how much flooding a coastal community is likely to experience as hurricanes evolve over the next decades.

    When hurricanes make landfall, strong winds whip up salty ocean waters that generate storm surge in coastal regions. As the storms move over land, torrential rainfall can induce further flooding inland. When multiple flood sources such as storm surge and rainfall interact, they can compound a hurricane’s hazards, leading to significantly more flooding than would result from any one source alone. The new study introduces a physics-based method for predicting how the risk of such complex, compound flooding may evolve under a warming climate in coastal cities.

    One example of compound flooding’s impact is the aftermath from Hurricane Sandy in 2012. The storm made landfall on the East Coast of the United States as heavy winds whipped up a towering storm surge that combined with rainfall-driven flooding in some areas to cause historic and devastating floods across New York and New Jersey.

    In their study, the MIT team applied the new compound flood-modeling method to New York City to predict how climate change may influence the risk of compound flooding from Sandy-like hurricanes over the next decades.  

    They found that, in today’s climate, a Sandy-level compound flooding event will likely hit New York City every 150 years. By midcentury, a warmer climate will drive up the frequency of such flooding, to every 60 years. At the end of the century, destructive Sandy-like floods will deluge the city every 30 years — a fivefold increase compared to the present climate.
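
    Those return periods can be translated into plain probabilities with standard return-period arithmetic. The short calculation below uses the frequencies quoted in the study; the 30-year horizon is chosen here only as an illustrative planning window.

```python
# Convert return periods into annual exceedance probabilities and the chance of
# at least one Sandy-like compound flood over a 30-year horizon (illustrative window).
return_periods = {"today": 150, "midcentury": 60, "end of century": 30}
horizon_years = 30

for era, period in return_periods.items():
    p_annual = 1.0 / period
    p_horizon = 1.0 - (1.0 - p_annual) ** horizon_years
    print(f"{era:>16}: {p_annual:.1%} chance per year, "
          f"{p_horizon:.0%} chance of at least one event in {horizon_years} years")
```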

    “Long-term average damages from weather hazards are usually dominated by the rare, intense events like Hurricane Sandy,” says study co-author Kerry Emanuel, professor emeritus of atmospheric science at MIT. “It is important to get these right.”

    While these are sobering projections, the researchers hope the flood forecasts can help city planners prepare and protect against future disasters. “Our methodology equips coastal city authorities and policymakers with essential tools to conduct compound flooding risk assessments from hurricanes in coastal cities at a detailed, granular level, extending to each street or building, in both current and future decades,” says study author Ali Sarhadi, a postdoc in MIT’s Department of Earth, Atmospheric and Planetary Sciences.

    The team’s open-access study appears online today in the Bulletin of the American Meteorological Society. Co-authors include Raphaël Rousseau-Rizzi at MIT’s Lorenz Center, Kyle Mandli at Columbia University, Jeffrey Neal at the University of Bristol, Michael Wiper at the Charles III University of Madrid, and Monika Feldmann at the Swiss Federal Institute of Technology Lausanne.

    The seeds of floods

    To forecast a region’s flood risk, weather modelers typically look to the past. Historical records contain measurements of previous hurricanes’ wind speeds, rainfall, and spatial extent, which scientists use to predict where and how much flooding may occur with coming storms. But Sarhadi believes these historical records are too limited and too brief to predict future hurricanes’ risks.

    “Even if we had lengthy historical records, they wouldn’t be a good guide for future risks because of climate change,” he says. “Climate change is changing the structural characteristics, frequency, intensity, and movement of hurricanes, and we cannot rely on the past.”

    Sarhadi and his colleagues instead looked to predict a region’s risk of hurricane flooding in a changing climate using a physics-based risk assessment methodology. They first paired simulations of hurricane activity with coupled ocean and atmospheric models over time. With the hurricane simulations, developed originally by Emanuel, the researchers virtually scatter tens of thousands of “seeds” of hurricanes into a simulated climate. Most seeds dissipate, while a few grow into category-level storms, depending on the conditions of the ocean and atmosphere.
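
    At heart, the seeding approach is a Monte Carlo filter: scatter many weak proto-storms into a simulated environment and keep only those the environment allows to intensify. The toy sketch below captures that selection logic with an invented intensification rule; the real downscaling technique uses physically based intensity and track models driven by the climate simulation.

```python
# Toy Monte Carlo "seeding": scatter many weak proto-storms, intensify each with a
# crude environment-dependent rule, and keep those reaching hurricane strength.
# The intensification rule below is invented for illustration only.
import numpy as np

rng = np.random.default_rng(7)
n_seeds = 50_000

# Stand-ins for environmental conditions at each seed location.
sea_surface_temp = rng.uniform(24.0, 31.0, n_seeds)   # degrees C
wind_shear = rng.uniform(0.0, 25.0, n_seeds)          # m/s

# Crude rule: storms strengthen over warm water and weaken under strong shear.
peak_wind = (15.0 + 8.0 * (sea_surface_temp - 26.0)
             - 2.0 * wind_shear
             + 10.0 * rng.standard_normal(n_seeds))

hurricanes = peak_wind >= 33.0                         # hurricane-force threshold, m/s
print(f"{hurricanes.sum()} of {n_seeds} seeds ({hurricanes.mean():.1%}) "
      f"intensify into hurricane-strength storms in this toy environment")
```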

    When the team drives these hurricane simulations with climate models of ocean and atmospheric conditions under certain global temperature projections, they can see how hurricanes change, for instance in terms of intensity, frequency, and size, under past, current, and future climate conditions.

    The team then sought to precisely predict the level and degree of compound flooding from future hurricanes in coastal cities. The researchers first used rainfall models to simulate rain intensity for a large number of simulated hurricanes, then applied numerical models to hydraulically translate that rainfall intensity into flooding on the ground as the hurricanes make landfall, given information about a region such as its surface and topography characteristics. They also simulated the same hurricanes’ storm surges, using hydrodynamic models to translate hurricanes’ maximum wind speed and sea level pressure into surge height in coastal areas. The simulation further assessed the propagation of ocean waters into coastal areas, causing coastal flooding.

    Then, the team developed a numerical hydrodynamic model to predict how the two sources of hurricane-induced flooding, storm surge and rain-driven flooding, would simultaneously interact through time and space, as simulated hurricanes make landfall in coastal regions such as New York City, in both current and future climates.

    “There’s a complex, nonlinear hydrodynamic interaction between saltwater surge-driven flooding and freshwater rainfall-driven flooding, that forms compound flooding that a lot of existing methods ignore,” Sarhadi says. “As a result, they underestimate the risk of compound flooding.”

    Amplified risk

    With their flood-forecasting method in place, the team applied it to a specific test case: New York City. They used the multipronged method to predict the city’s risk of compound flooding from hurricanes, and more specifically from Sandy-like hurricanes, in present and future climates. Their simulations showed that the city’s odds of experiencing Sandy-like flooding will increase significantly over the next decades as the climate warms, from once every 150 years in the current climate, to every 60 years by 2050, and every 30 years by 2099.

    Interestingly, they found that much of this increase in risk has less to do with how hurricanes themselves will change in warming climates than with how sea levels will increase around the world.

    “In future decades, we will experience sea level rise in coastal areas, and we also incorporated that effect into our models to see how much that would increase the risk of compound flooding,” Sarhadi explains. “And in fact, we see sea level rise is playing a major role in amplifying the risk of compound flooding from hurricanes in New York City.”

    The team’s methodology can be applied to any coastal city to assess the risk of compound flooding from hurricanes and extratropical storms. With this approach, Sarhadi hopes decision-makers can make informed decisions regarding the implementation of adaptive measures, such as reinforcing coastal defenses to enhance infrastructure and community resilience.

    “Another aspect highlighting the urgency of our research is the projected 25 percent increase in coastal populations by midcentury, leading to heightened exposure to damaging storms,” Sarhadi says. “Additionally, we have trillions of dollars in assets situated in coastal flood-prone areas, necessitating proactive strategies to reduce damages from compound flooding from hurricanes under a warming climate.”

    This research was supported, in part, by Homesite Insurance.

  • 3 Questions: How are cities managing record-setting temperatures?

    July 2023 was the hottest month globally since humans began keeping records. People all over the U.S. experienced punishingly high temperatures this summer. In Phoenix, there were a record-setting 31 consecutive days with a high temperature of 110 degrees Fahrenheit or more. July was the hottest month on record in Miami. A scan of high temperatures around the country often yielded some startlingly high numbers: Dallas, 110 F; Reno, 108 F; Salt Lake City, 106 F; Portland, 105 F.

    Climate change is a global and national crisis that cannot be solved by city governments alone, but cities suffering from it can try to enact new policies to reduce emissions and adapt to its effects. MIT’s David Hsu, an associate professor of urban and environmental planning, is an expert on metropolitan and regional climate policy. In one 2017 paper, Hsu and some colleagues estimated how 11 major U.S. cities could best reduce their carbon dioxide emissions, through energy-efficient home construction and retrofitting, improvements in vehicle gas mileage, more housing density, robust transit systems, and more. As we near the end of this historically hot summer, MIT News talked to Hsu about what cities are now doing in response to record heat, and the possibilities for new policy measures.

    Q: We’ve had record-setting temperatures in many cities across the U.S. this summer. Dealing with climate change certainly isn’t just the responsibility of those cities, but what have they been doing to make a difference, to the extent they can?

    A: I think this is a very top-of-mind question because even 10 or 15 years ago, we talked about adapting to a changed climate future, which seemed further off. But literally every week this summer we can refer to [dramatic] things that are already happening, clearly linked to climate change, and are going to get worse. We had wildfire smoke in the Northeast and throughout the Eastern Seaboard in June, this tragic wildfire in Hawaii that led to more deaths than any other wildfire in the U.S., [plus record high temperatures]. A lot of city leaders face climate challenges they thought were maybe 20 or 30 years in the future, and didn’t expect to see happen with this severity and intensity.

    One thing you’re seeing is changes in governance. A lot of cities have recently appointed a chief heat officer. Miami and Phoenix have them now, and this is someone responsible for coordinating response to heat waves, which turn out to be one of the biggest killers among climatological effects. There is an increasing realization not only among local governments, but insurance companies and the building industry, that flooding is going to affect many places. We have already seen flooding in the seaport area in Boston, the most recently built part of our city. In some sense just the realization among local governments, insurers, building owners, and residents, that some risks are here and now, already is changing how people think about those risks.

    Q: To what extent does a city being active about climate change at least signal to everyone, at the state or national level, that we have to do more? At the same time, some states are reacting against cities that are trying to institute climate initiatives and trying to prevent clean energy advances. What is possible at this point?

    A: We have this very large, heterogeneous and polarized country, and we have differences between states and within states in how they’re approaching climate change. You’ve got some cities trying to enact things like natural gas bans, or trying to limit greenhouse gas emissions, with some state governments trying to preempt them entirely. I think cities have a role in showing leadership. But one thing I harp on, having worked in city government myself, is that sometimes in cities we can be complacent. While we pride ourselves on being centers of innovation and less per-capita emissions — we’re using less than rural areas, and you’ll see people celebrating New York City as the greenest in the world — cities are responsible for consumption that produces a majority of emissions in most countries. If we’re going to decarbonize society, we have to get to zero altogether, and that requires cities to act much more aggressively.

    There is not only a pessimistic narrative. With the Inflation Reduction Act, which is rapidly accelerating the production of renewable energy, you see many of those subsidies going to build new manufacturing in red states. There’s a possibility people will see there are plenty of better paying, less dangerous jobs in [clean energy]. People don’t like monopolies wherever they live, so even places people consider fairly conservative would like local control [of energy], and that might mean greener jobs and lower prices. Yes, there is a doomscrolling loop of thinking polarization is insurmountable, but I feel surprisingly optimistic sometimes.

    Large parts of the Midwest, even in places people think of as being more conservative, have chosen to build a lot of wind energy, partly because it’s profitable. Historically, some farmers were self-reliant and had wind power before the electrical grid came. Even now in some places where people don’t want to address climate change, they’re more than happy to have wind power.

    Q: You’ve published work on which cities can pursue which policies to reduce emissions the most: better housing construction, more transit, more fuel-efficient vehicles, possibly higher housing density, and more. The exact recipe varies from place to place. But what are the common threads people can think about?

    A: It’s important to think about what the status quo is, and what we should be preparing for. The status quo simply doesn’t serve large parts of the population right now. Heat risk, flooding, and wildfires all disproportionately affect populations that are already vulnerable. If you’re elderly, or lack access to mobility, information, or warnings, you probably have a lower chance of surviving a wildfire. Many people do not have high-quality housing, and may be more exposed to heat or smoke. We know the climate has already changed, and is going to change more, but we have failed to prepare for foreseeable changes that are already here. Lots of things that are climate-related but not only about climate change, like affordable housing, transportation, energy access for everyone so they can have services like cooking and the internet — those are things that we can change going forward. The hopeful message is: Cities are always changing and being built, so we should make them better. The urgent message is: We shouldn’t accept the status quo.

  • Explained: The 1.5 C climate benchmark

    The summer of 2023 has been a season of weather extremes.

    In June, uncontrolled wildfires ripped through parts of Canada, sending smoke into the U.S. and setting off air quality alerts in dozens of downwind states. In July, the world set the hottest global temperature on record, which it held for three days in a row, then broke again on day four.

    From July into August, unrelenting heat blanketed large parts of Europe, Asia, and the U.S., while India faced a torrential monsoon season, and heavy rains flooded regions in the northeastern U.S. And most recently, whipped up by high winds and dry vegetation, a historic wildfire tore through Maui, devastating an entire town.

    These extreme weather events are mainly a consequence of climate change driven by humans’ continued burning of coal, oil, and natural gas. Climate scientists agree that extreme weather such as what people experienced this summer will likely grow more frequent and intense in the coming years unless something is done, on a persistent and planet-wide scale, to rein in global temperatures.

    Just how much reining-in are they talking about? The number that is internationally agreed upon is 1.5 degrees Celsius. To prevent worsening and potentially irreversible effects of climate change, the world’s average temperature should not exceed that of preindustrial times by more than 1.5 degrees Celsius (2.7 degrees Fahrenheit).

    As more regions around the world face extreme weather, it’s worth taking stock of the 1.5-degree bar, where the planet stands in relation to this threshold, and what can be done at the global, regional, and personal level, to “keep 1.5 alive.”

    Why 1.5 C?

    In 2015, in response to the growing urgency of climate impacts, nearly every country in the world signed onto the Paris Agreement, a landmark international treaty under which 195 nations pledged to hold the Earth’s temperature to “well below 2 degrees Celsius above pre-industrial levels,” and going further, aim to “limit the temperature increase to 1.5 degrees Celsius above pre-industrial levels.”

    The treaty did not define a particular preindustrial period, though scientists generally consider the years from 1850 to 1900 to be a reliable reference; this time largely predates the widespread burning of fossil fuels and is also the earliest period when global observations of land and sea temperatures are available. During this period, the average global temperature, while swinging up and down in certain years, generally hovered around 13.5 degrees Celsius, or 56.3 degrees Fahrenheit.

    The treaty was informed by a fact-finding report, which concluded that even global warming of 1.5 degrees Celsius above the preindustrial average, sustained over an extended, decades-long period, would lead to high risks for “some regions and vulnerable ecosystems.” The recommendation, then, was to set the 1.5 degrees Celsius limit as a “defense line” — if the world can keep below this line, it potentially could avoid the more extreme and irreversible climate effects that would occur with a 2 degrees Celsius increase, and for some places, with an even smaller increase than that.

    But, as many regions are experiencing today, keeping below the 1.5 line is no guarantee of avoiding extreme global warming effects.

    “There is nothing magical about the 1.5 number, other than that is an agreed aspirational target. Keeping at 1.4 is better than 1.5, and 1.3 is better than 1.4, and so on,” says Sergey Paltsev, deputy director of MIT’s Joint Program on the Science and Policy of Global Change. “The science does not tell us that if, for example, the temperature increase is 1.51 degrees Celsius, then it would definitely be the end of the world. Similarly, if the temperature would stay at 1.49 degrees increase, it does not mean that we will eliminate all impacts of climate change. What is known: The lower the target for an increase in temperature, the lower the risks of climate impacts.”

    How close are we to 1.5 C?

    In 2022, the average global temperature was about 1.15 degrees Celsius above preindustrial levels. According to the World Meteorological Organization (WMO), the cyclical weather phenomenon La Niña recently contributed to temporarily cooling and dampening the effects of human-induced climate change. La Niña lasted for three years and ended around March of 2023.

    In May, the WMO issued a report that projected a significant likelihood (66 percent) that the world would exceed the 1.5 degrees Celsius threshold in the next four years. This breach would likely be driven by human-induced climate change, combined with a warming El Niño — a cyclical weather phenomenon that temporarily heats up ocean regions and pushes global temperatures higher.

    This summer, an El Niño is currently underway, and the event typically raises global temperatures in the year after it sets in, which in this case would be in 2024. The WMO predicts that, for each of the next four years, the global average temperature is likely to swing between 1.1 and 1.8 degrees Celsius above preindustrial levels.

    Though there is a good chance the world will get hotter than the 1.5-degree limit as a result of El Niño, the breach would be temporary and, for now, would not count as a failure of the Paris Agreement, which aims to keep global temperatures below the 1.5-degree limit over the long term (averaged over several decades rather than a single year).

    “But we should not forget that this is a global average, and there are variations regionally and seasonally,” says Elfatih Eltahir, the H.M. King Bhumibol Professor and Professor of Civil and Environmental Engineering at MIT. “This year, we had extreme conditions around the world, even though we haven’t reached the 1.5 C threshold. So, even if we control the average at a global magnitude, we are going to see events that are extreme, because of climate change.”

    More than a number

    To hold the planet’s long-term average temperature to below the 1.5-degree threshold, the world will have to reach net zero emissions by the year 2050, according to the Intergovernmental Panel on Climate Change (IPCC). This means that, in terms of the emissions released by the burning of coal, oil, and natural gas, the entire world will have to remove as much as it puts into the atmosphere.

    “In terms of innovations, we need all of them — even those that may seem quite exotic at this point: fusion, direct air capture, and others,” Paltsev says.

    The task of curbing emissions in time is particularly daunting for the United States, which has generated more cumulative carbon dioxide emissions than any other country in the world.

    “The U.S.’s burning of fossil fuels and consumption of energy is just way above the rest of the world. That’s a persistent problem,” Eltahir says. “And the national statistics are an aggregate of what a lot of individuals are doing.”

    At an individual level, there are things that can be done to help bring down one’s personal emissions, and potentially chip away at rising global temperatures.

    “We are consumers of products that either embody greenhouse gases, such as meat, clothes, computers, and homes, or we are directly responsible for emitting greenhouse gases, such as when we use cars, airplanes, electricity, and air conditioners,” Paltsev says. “Our everyday choices affect the amount of emissions that are added to the atmosphere.”

    But to compel people to change their emissions, it may be less about a number, and more about a feeling.

    “To get people to act, my hypothesis is, you need to reach them not just by convincing them to be good citizens and saying it’s good for the world to keep below 1.5 degrees, but showing how they individually will be impacted,” says Eltahir, who specializes in the study of regional climates, focusing on how climate change impacts the water cycle and the frequency of extreme weather such as heat waves.

    “True climate progress requires a dramatic change in how the human system gets its energy,” Paltsev says. “It is a huge undertaking. Are you ready personally to make sacrifices and to change the way of your life? If one gets an honest answer to that question, it would help to understand why true climate progress is so difficult to achieve.”

  • Preparing Colombia’s cities for life amid changing forests

    It was an uncharacteristically sunny morning as Marcela Angel MCP ’18, flanked by a drone pilot from the Boston engineering firm AirWorks and a data collection team from the Colombian regional environmental agency Corpoamazonia, climbed a hill in the Andes Mountains of southwest Colombia. The area’s usual mountain cloud cover — one of the major challenges to working with satellite imagery or flying UAVs (unpiloted aerial vehicles, or drones) in the Pacific highlands of the Amazon — would roll through in the hours to come. But for now, her team had chosen a good day to hike out for their first flight.

    Angel is used to long travel for her research. Raised in Bogotá, she maintained strong ties to Colombia throughout her master’s program in the MIT Department of Urban Studies and Planning (DUSP). Her graduate thesis, examining Bogotá’s management of its public green space, took her regularly back to her hometown, exploring how the city could offer residents more equal access to the clean air, flood protection and day-to-day health and social benefits provided by parks and trees.

    But the hill she was hiking this morning, outside the remote city of Mocoa, had taken an especially long time to climb: five years building relationships with the community of Mocoa and the Colombian government, recruiting project partners, and navigating the bureaucracy of bringing UAVs into the country.

    Now, her team finally unwrapped their first, knee-high drone from its tarp and set it carefully in the grass. Under the gathering gray clouds, the buzz of its rotors joined the hum of insects in the trees, and the machine at last took to the skies.

    From Colombia to Cambridge

    “I actually grew up on the last street before the eastern mountains reserve,” Angel says of her childhood in Bogotá. “I’ve always been at that border between city and nature.” This idea, that urban areas are married to the ecosystems around them, would inform Angel’s whole education and career.

    Before coming to MIT, she studied architecture at Bogotá’s Los Andes University; for her graduation project she proposed a plan to resettle an informal neighborhood on Bogotá’s outskirts to minimize environmental risks to its residents. Among her projects at MIT was an initiative to spatially analyze Bogotá’s tree canopy, providing data for the city to plan a tree-planting program as a strategy to give vulnerable populations in the city more access to nature.

    And she was naturally intrigued when Colombia’s former minister of environment and sustainable development came to MIT in 2017 to give a guest presentation to the DUSP master’s program. The minister, Luis Gilberto Murillo (now the Colombian ambassador to the United States), introduced the students to the challenges triggered by a recent disaster in the city of Mocoa, on the border between the lowland Amazon and the Andes Mountains. Unprecedented rainstorms had destabilized the surrounding forests, and that April a devastating flood and landslide had killed hundreds of people and destroyed entire neighborhoods. And as climate change contributed to growing rainfall in the region, the risks of more landslide events were rising.

    Murillo provided useful insights into how city planning decisions had contributed to the crisis. But he also asked for MIT’s support addressing future landslide risks in the area. Angel and Juan Camilo Osorio, a PhD candidate at DUSP, decided to take up the challenge, and in January 2018 and 2019, a research delegation from MIT traveled to Colombia for a newly created graduate course.

    Returning once again to Bogotá, Angel interviewed government agencies and nonprofits to understand the state of landslide monitoring and public policy. In Mocoa, further interviews and a series of workshops helped clarify what locals needed most and what MIT could provide: better information on where and when landslides might strike, and a process to increase risk awareness and involve traditionally marginalized groups in decision-making processes around that risk.

    Over the coming year, a core team formed to put the insights from this trip into action, including Angel, Osorio, postdoc Norhan Bayomi of the MIT Environmental Solutions Initiative (ESI) and MIT Professor John Fernández, director of the ESI and one of Angel’s mentors at DUSP. After a second visit to Mocoa that brought into the fold Indigenous groups, environmental agencies, and the national army, a plan was formed: MIT would partner with Corpoamazonia and build a network of community researchers to deploy and test drone technology and machine learning models to monitor the mountain forests for both landslide risks and signs of forest health, while implementing a participatory planning process with residents.

    “What our projects aim to do is give the communities new tools to continue protecting and restoring the forest,” says Angel, “and support new and inclusive development models, even in the face of new challenges.”

    Lifelines for the climate

    The goal of tropical forest conservation is an urgent one. As forests are cut down, their trees and soils release carbon they have stored over millennia, adding huge amounts of heat-trapping carbon dioxide to the atmosphere. Deforestation, mainly in the tropics, is now estimated to contribute more to climate change than any country besides the United States and China — and once lost, tropical forests are exceptionally hard to restore.

    “Tropical forests should be a natural way to slow and reverse climate change,” says Angel. “And they can be. But today, we are reaching critical tipping points where it is just the opposite.”

    This became the motivating force for Angel’s career after her graduation. In 2019, Fernández invited her to join the ESI and lead a new Natural Climate Solutions Program, with the Mocoa project as its first centerpiece. She quickly mobilized the partners to raise funding for the project from the Global Environmental Facility and the CAF Development Bank of Latin America and the Caribbean, and recruited additional partners including MIT Lincoln Laboratory, AirWorks, and the Pratt Institute, where Osorio had become an assistant professor. She hired machine learning specialists from MIT to begin designing the UAVs’ data processing, and helped assemble a local research network in Mocoa to increase risk awareness, promote community participation, and better understand what information city officials and community groups needed for city planning and conservation.

    “This is the amazing thing about MIT,” she says. “When you study a problem here, you’re not just playing in a sandbox. Everyone I’ve worked with is motivated by the complexity of the technical challenge and the opportunity for meaningful engagement in Mocoa, and hopefully in many more places besides.”

    At the same time, Angel created opportunities for the next generation of MIT graduate students to follow in her footsteps. With Fernández and Bayomi, she created a new course, 4.S23 (Biodiversity and Cities), in which students traveled to Colombia to develop urban planning strategies for the cities of Quibdó and Leticia, located in carbon-rich and biodiverse areas. The course has been taught twice, with Professor Gabriella Carolini joining the teaching team for spring 2023, and has already led to a student report to city officials in Quibdó recommending ways to enhance biodiversity and adapt to climate change as the city grows, a multi-stakeholder partnership to train local youth and implement a citizen-led biodiversity survey, and a seed grant from the MIT Climate and Sustainability Consortium to begin providing both cities detailed data on their tree cover derived from satellite images.

    “These regions face serious threats, especially on a warming planet, but many of the solutions for climate change, biodiversity conservation, and environmental equity in the region go hand-in-hand,” Angel says. “When you design a city to use fewer resources, to contribute less to climate change, it also causes less pressure on the environment around it. When you design a city for equity and quality of life, you’re giving attention to its green spaces and what they can provide for people and as habitat for other species. When you protect and restore forests, you’re protecting local bioeconomies.”

    Bringing the data home

    Meanwhile, in Mocoa, Angel’s original vision is taking flight. With the team’s test flights behind them, they can now begin creating digital models of the surrounding area. Regular drone flights and soil samples will fill in changing information about trees, water, and local geology, allowing the project’s machine learning specialists to identify warning signs for future landslides and extreme weather events.

    More importantly, there is now an established network of local community researchers and leaders ready to make use of this information. With feedback from their Mocoan partners, Angel’s team has built a prototype of the online platform they will use to share their UAV data; they’re now letting Mocoa residents take it for a test drive and suggest how it can be made more user-friendly.

    Her visit this January also paved the way for new projects that will tie the Environmental Solutions Initiative more tightly to Mocoa. With her project partners, Angel is exploring developing a course to teach local students how to use UAVs like the ones her team is flying. She is also considering expanded efforts to collect the kind of informal knowledge of Mocoa, on the local ecology and culture, that people everywhere use in making their city planning and emergency response decisions, but that is rarely codified and included in scientific risk analyses.

    It’s a great deal of work to offer this one community the tools to adapt successfully to climate change. But even with all the robotics and machine learning models in the world, this close, slow-unfolding engagement, grounded in trust and community inclusion, is what it takes to truly prepare people to confront profound changes in their city and environment.

    “Protecting natural carbon sinks is a global socio-environmental challenge, and one where it is not enough for MIT to just contribute to the knowledge base or develop a new technology,” says Angel. “But we can help mobilize decision-makers and nontraditional actors, and design more inclusive and technology-enhanced processes, to make this easier for the people who have lifelong stakes in these ecosystems. That is the vision.”