More stories

  • Study: Marshes provide cost-effective coastal protection

    Images of coastal houses being carried off into the sea by eroding coastlines and powerful storm surges are becoming more commonplace as climate change brings rising sea levels coupled with more powerful storms. In the U.S. alone, coastal storms caused $165 billion in losses in 2022.

    Now, a study from MIT shows that protecting and enhancing salt marshes in front of protective seawalls can significantly help protect some coastlines, at a cost that makes this approach reasonable to implement.

    The new findings are reported in the journal Communications Earth and Environment, in a paper by MIT graduate student Ernie I. H. Lee and professor of civil and environmental engineering Heidi Nepf. The study, Nepf says, shows that restoring coastal marshes “is not just something that would be nice to do, but it’s actually economically justifiable.” Among other things, the researchers found that the wave-attenuating effect of a salt marsh means the seawall behind it can be built significantly lower, reducing construction cost while still providing as much protection from storms.

    “One of the other exciting things that the study really brings to light,” Nepf says, “is that you don’t need a huge marsh to get a good effect. It could be a relatively short marsh, just tens of meters wide, that can give you benefit.” That makes her hopeful, she says, that this information might be applied in places where planners may have thought saving a smaller marsh was not worth the expense. “We show that it can make enough of a difference to be financially viable,” she says.

    While other studies have previously shown the benefits of natural marshes in attenuating damaging storms, Lee says that such studies “mainly focus on landscapes that have a wide marsh on the order of hundreds of meters. But we want to show that it also applies in urban settings where not as much marshland is available, especially since in these places existing gray infrastructure (seawalls) tends to already be in place.”

    The study was based on computer modeling of waves propagating over different shore profiles, using the morphology of various salt marsh plants — the height and stiffness of the plants, and their spatial density — rather than an empirical drag coefficient. “It’s a physically based model of plant-wave interaction, which allowed us to look at the influence of plant species and changes in morphology across seasons,” without having to go out and calibrate the vegetation drag coefficient with field measurements for each different condition, Nepf says.

    The researchers based their benefit-cost analysis on a simple metric: To protect a certain length of shoreline, how much could the height of a given seawall be reduced if it were accompanied by a given amount of marsh? Other ways of assessing the value, such as including the value of real estate that might be damaged by a given amount of flooding, “vary a lot depending on how you value the assets if a flood happens,” Lee says. “We use a more concrete value to quantify the benefits of salt marshes, which is the equivalent height of seawall you would need to deliver the same protection value.”

    They used models of a variety of plants, reflecting differences in height and stiffness across different seasons. They found a twofold variation in the plants’ effectiveness at attenuating waves, but all provided a useful benefit.

    To demonstrate the details in a real-world example and help validate the simulations, Nepf and Lee studied local salt marshes in Salem, Massachusetts, where projects are already underway to try to restore marshes that had been degraded. Including the specific example provides a template for others, Nepf says. In Salem, their model showed that a healthy salt marsh could offset the need for an additional 1.7 meters (about 5.5 feet) of seawall height, based on satisfying a rate of wave overtopping set for the safety of pedestrians.

    However, the real-world data needed to model a marsh, including maps of salt marsh species, plant height, and shoots per bed area, are “very labor-intensive” to put together, Nepf says. Lee is now developing a method to use drone imaging and machine learning to facilitate this mapmaking. Nepf says this will enable researchers or planners to evaluate a given area of marshland and ask, “How much is this marsh worth in terms of its ability to reduce flooding?”

    The White House Office of Information and Regulatory Affairs recently released guidance for assessing the value of ecosystem services in the planning of federal projects, Nepf explains. “But in many scenarios, it lacks specific methods for quantifying value, and this study is meeting that need,” she says.

    The Federal Emergency Management Agency also has a benefit-cost analysis (BCA) toolkit, Lee notes. “They have guidelines on how to quantify each of the environmental services, and one of the novelties of this paper is quantifying the cost and the protection value of marshes. This is one of the applications that policymakers can consider on how to quantify the environmental service values of marshes,” he says.

    The software, which environmental engineers can apply to specific sites, has been made available online for free on GitHub. “It’s a one-dimensional model accessible by a standard consulting firm,” Nepf says.

    “This paper presents a practical tool for translating the wave attenuation capabilities of marshes into economic values, which could assist decision-makers in the adaptation of marshes for nature-based coastal defense,” says Xiaoxia Zhang, a professor at Shenzhen University in China who was not involved in this work. “The results indicate that salt marshes are not only environmentally beneficial but also cost-effective.”

    The study “is a very important and crucial step to quantifying the protective value of marshes,” adds Bas Borsje, an associate professor of nature-based flood protection at the University of Twente in the Netherlands, who was not associated with this work. “The most important step missing at the moment is how to translate our findings to the decision makers. This is the first time I’m aware of that decision-makers are quantitatively informed on the protection value of salt marshes.”

    Lee received support for this work from the Schoettler Scholarship Fund, administered by the MIT Department of Civil and Environmental Engineering.

  • MIT-derived algorithm helps forecast the frequency of extreme weather

    To assess a community’s risk of extreme weather, policymakers rely first on global climate models that can be run decades, and even centuries, forward in time, but only at a coarse resolution. These models might be used to gauge, for instance, future climate conditions for the northeastern U.S., but not specifically for Boston.

    To estimate Boston’s future risk of extreme weather such as flooding, policymakers can combine a coarse model’s large-scale predictions with a finer-resolution model, tuned to estimate how often Boston is likely to experience damaging floods as the climate warms. But this risk analysis is only as accurate as the predictions from that first, coarser climate model.

    “If you get those wrong for large-scale environments, then you miss everything in terms of what extreme events will look like at smaller scales, such as over individual cities,” says Themistoklis Sapsis, the William I. Koch Professor and director of the Center for Ocean Engineering in MIT’s Department of Mechanical Engineering.

    Sapsis and his colleagues have now developed a method to “correct” the predictions from coarse climate models. By combining machine learning with dynamical systems theory, the team’s approach “nudges” a climate model’s simulations into more realistic patterns over large scales. When paired with smaller-scale models to predict specific weather events such as tropical cyclones or floods, the team’s approach produced more accurate predictions for how often specific locations will experience those events over the next few decades, compared to predictions made without the correction scheme.

    This animation shows the evolution of storms around the northern hemisphere, as a result of a high-resolution storm model, combined with the MIT team’s corrected global climate model. The simulation improves the modeling of extreme values for wind, temperature, and humidity, which typically have significant errors in coarse scale models. Credit: Courtesy of Ruby Leung and Shixuan Zhang, PNNL

    Sapsis says the new correction scheme is general in form and can be applied to any global climate model. Once corrected, the models can help to determine where and how often extreme weather will strike as global temperatures rise over the coming years. 

    “Climate change will have an effect on every aspect of human life, and every type of life on the planet, from biodiversity to food security to the economy,” Sapsis says. “If we have capabilities to know accurately how extreme weather will change, especially over specific locations, it can make a lot of difference in terms of preparation and doing the right engineering to come up with solutions. This is the method that can open the way to do that.”

    The team’s results appear today in the Journal of Advances in Modeling Earth Systems. The study’s MIT co-authors include postdoc Benedikt Barthel Sorensen and Alexis-Tzianni Charalampopoulos SM ’19, PhD ’23, with Shixuan Zhang, Bryce Harrop, and Ruby Leung of the Pacific Northwest National Laboratory in Washington state.

    Over the hood

    Today’s large-scale climate models simulate weather features such as the average temperature, humidity, and precipitation around the world, on a grid-by-grid basis. Running simulations of these models takes enormous computing power, and in order to simulate how weather features will interact and evolve over periods of decades or longer, models average out features every 100 kilometers or so.

    “It’s a very heavy computation requiring supercomputers,” Sapsis notes. “But these models still do not resolve very important processes like clouds or storms, which occur over smaller scales of a kilometer or less.”
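
As a rough illustration of what averaging over roughly 100-kilometer cells does to small-scale extremes, here is a toy coarse-graining of a synthetic 1-kilometer field (the grid sizes and random field are illustrative, not real model output):

```python
import numpy as np

# Synthetic "fine" field on a 1 km grid (stand-in for, e.g., temperature).
fine = np.random.default_rng(1).standard_normal((400, 400))

# Average 100 x 100 blocks, mimicking a ~100 km model resolution.
block = 100
coarse = fine.reshape(4, block, 4, block).mean(axis=(1, 3))

print(coarse.shape)  # (4, 4)
# Storm-scale extremes are averaged away: the coarse-grid maximum
# is far smaller than the fine-grid maximum.
print(coarse.max() < fine.max())
```

This is why features like clouds and storms, which live below the grid scale, cannot be resolved directly by coarse models.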

    To improve the resolution of these coarse climate models, scientists typically have gone under the hood to try to fix a model’s underlying dynamical equations, which describe how phenomena in the atmosphere and oceans should physically interact.

    “People have tried to dissect into climate model codes that have been developed over the last 20 to 30 years, which is a nightmare, because you can lose a lot of stability in your simulation,” Sapsis explains. “What we’re doing is a completely different approach, in that we’re not trying to correct the equations but instead correct the model’s output.”

    The team’s new approach takes a model’s output, or simulation, and overlays an algorithm that nudges the simulation toward something that more closely represents real-world conditions. The algorithm is based on a machine-learning scheme that takes in data, such as past information for temperature and humidity around the world, and learns associations within the data that represent fundamental dynamics among weather features. The algorithm then uses these learned associations to correct a model’s predictions.
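
The team's actual scheme combines machine learning with dynamical-systems theory; as a much-simplified sketch of the core idea (correcting a model's output toward reference data, rather than editing its equations), one can fit even a per-gridpoint linear map on synthetic stand-in data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins: "truth" plays the role of observations, and
# "model" is a biased, noisy coarse-model output of the same field
# (e.g., temperature anomalies) at 100 grid points over 500 time steps.
truth = rng.standard_normal((500, 100))
model = 0.8 * truth + 0.3 + 0.1 * rng.standard_normal((500, 100))

# Fit a per-gridpoint linear correction y ~ a*x + b on a training split.
train, test = slice(0, 400), slice(400, 500)
X, Y = model[train], truth[train]
a = ((X * Y).mean(0) - X.mean(0) * Y.mean(0)) / X.var(0)
b = Y.mean(0) - a * X.mean(0)

# Apply the learned correction to held-out simulation output.
corrected = a * model[test] + b
err_raw = np.abs(model[test] - truth[test]).mean()
err_corr = np.abs(corrected - truth[test]).mean()
print(err_corr < err_raw)  # the corrected output is closer to "truth"
```

The published method learns far richer, dynamics-aware associations than this linear map, but the workflow is the same: train on past data, then apply the learned correction to new simulations.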

    “What we’re doing is trying to correct dynamics, as in how an extreme weather feature, such as the windspeeds during a Hurricane Sandy event, will look like in the coarse model, versus in reality,” Sapsis says. “The method learns dynamics, and dynamics are universal. Having the correct dynamics eventually leads to correct statistics, for example, frequency of rare extreme events.”

    Climate correction

    As a first test of their new approach, the team used the machine-learning scheme to correct simulations produced by the Energy Exascale Earth System Model (E3SM), a climate model run by the U.S. Department of Energy that simulates climate patterns around the world at a resolution of 110 kilometers. The researchers used eight years of past data for temperature, humidity, and wind speed to train their new algorithm, which learned dynamical associations between the measured weather features and the E3SM model. They then ran the climate model forward in time for about 36 years and applied the trained algorithm to the model’s simulations. They found that the corrected version produced climate patterns that more closely matched real-world observations from the last 36 years, which had not been used for training.

    “We’re not talking about huge differences in absolute terms,” Sapsis says. “An extreme event in the uncorrected simulation might be 105 degrees Fahrenheit, versus 115 degrees with our corrections. But for humans experiencing this, that is a big difference.”

    When the team then paired the corrected coarse model with a specific, finer-resolution model of tropical cyclones, they found the approach accurately reproduced the frequency of extreme storms in specific locations around the world.

    “We now have a coarse model that can get you the right frequency of events, for the present climate. It’s much more improved,” Sapsis says. “Once we correct the dynamics, this is a relevant correction, even when you have a different average global temperature, and it can be used for understanding how forest fires, flooding events, and heat waves will look in a future climate. Our ongoing work is focusing on analyzing future climate scenarios.”

    “The results are particularly impressive as the method shows promising results on E3SM, a state-of-the-art climate model,” says Pedram Hassanzadeh, an associate professor who leads the Climate Extremes Theory and Data group at the University of Chicago and was not involved with the study. “It would be interesting to see what climate change projections this framework yields once future greenhouse-gas emission scenarios are incorporated.”

    This work was supported, in part, by the U.S. Defense Advanced Research Projects Agency.

  • New tool predicts flood risk from hurricanes in a warming climate

    Coastal cities and communities will face more frequent major hurricanes with climate change in the coming years. To help prepare coastal cities against future storms, MIT scientists have developed a method to predict how much flooding a coastal community is likely to experience as hurricanes evolve over the next decades.

    When hurricanes make landfall, strong winds whip up salty ocean waters that generate storm surge in coastal regions. As the storms move over land, torrential rainfall can induce further flooding inland. When multiple flood sources such as storm surge and rainfall interact, they can compound a hurricane’s hazards, leading to significantly more flooding than would result from any one source alone. The new study introduces a physics-based method for predicting how the risk of such complex, compound flooding may evolve under a warming climate in coastal cities.

    One example of compound flooding’s impact is the aftermath from Hurricane Sandy in 2012. The storm made landfall on the East Coast of the United States as heavy winds whipped up a towering storm surge that combined with rainfall-driven flooding in some areas to cause historic and devastating floods across New York and New Jersey.

    In their study, the MIT team applied the new compound flood-modeling method to New York City to predict how climate change may influence the risk of compound flooding from Sandy-like hurricanes over the next decades.  

    They found that, in today’s climate, a Sandy-level compound flooding event will likely hit New York City every 150 years. By midcentury, a warmer climate will drive up the frequency of such flooding, to every 60 years. At the end of the century, destructive Sandy-like floods will deluge the city every 30 years — a fivefold increase compared to the present climate.

    “Long-term average damages from weather hazards are usually dominated by the rare, intense events like Hurricane Sandy,” says study co-author Kerry Emanuel, professor emeritus of atmospheric science at MIT. “It is important to get these right.”

    While these are sobering projections, the researchers hope the flood forecasts can help city planners prepare and protect against future disasters. “Our methodology equips coastal city authorities and policymakers with essential tools to conduct compound flooding risk assessments from hurricanes in coastal cities at a detailed, granular level, extending to each street or building, in both current and future decades,” says study author Ali Sarhadi, a postdoc in MIT’s Department of Earth, Atmospheric and Planetary Sciences.

    The team’s open-access study appears online today in the Bulletin of the American Meteorological Society. Co-authors include Raphaël Rousseau-Rizzi at MIT’s Lorenz Center, Kyle Mandli at Columbia University, Jeffrey Neal at the University of Bristol, Michael Wiper at the Charles III University of Madrid, and Monika Feldmann at the Swiss Federal Institute of Technology Lausanne.

    The seeds of floods

    To forecast a region’s flood risk, weather modelers typically look to the past. Historical records contain measurements of previous hurricanes’ wind speeds, rainfall, and spatial extent, which scientists use to predict where and how much flooding may occur with coming storms. But Sarhadi believes the brevity and other limitations of these historical records make them insufficient for predicting future hurricanes’ risks.

    “Even if we had lengthy historical records, they wouldn’t be a good guide for future risks because of climate change,” he says. “Climate change is changing the structural characteristics, frequency, intensity, and movement of hurricanes, and we cannot rely on the past.”

    Sarhadi and his colleagues instead looked to predict a region’s risk of hurricane flooding in a changing climate using a physics-based risk assessment methodology. They first paired simulations of hurricane activity with coupled ocean and atmospheric models over time. With the hurricane simulations, developed originally by Emanuel, the researchers virtually scatter tens of thousands of “seeds” of hurricanes into a simulated climate. Most seeds dissipate, while a few grow into category-level storms, depending on the conditions of the ocean and atmosphere.

    When the team drives these hurricane simulations with climate models of ocean and atmospheric conditions under certain global temperature projections, they can see how hurricanes change, for instance in terms of intensity, frequency, and size, under past, current, and future climate conditions.

    The team then sought to precisely predict the level and degree of compound flooding from future hurricanes in coastal cities. The researchers first used rainfall models to simulate rain intensity for a large number of simulated hurricanes, then applied numerical models to hydraulically translate that rainfall intensity into flooding on the ground as hurricanes make landfall, given information about a region’s surface and topography characteristics. They also simulated the same hurricanes’ storm surges, using hydrodynamic models to translate hurricanes’ maximum wind speed and sea level pressure into surge height in coastal areas. The simulations further assessed the propagation of ocean waters into coastal areas, causing coastal flooding.

    Then, the team developed a numerical hydrodynamic model to predict how two sources of hurricane-induced flooding, such as storm surge and rain-driven flooding, would simultaneously interact through time and space, as simulated hurricanes make landfall in coastal regions such as New York City, in both current and future climates.  

    “There’s a complex, nonlinear hydrodynamic interaction between saltwater surge-driven flooding and freshwater rainfall-driven flooding, that forms compound flooding that a lot of existing methods ignore,” Sarhadi says. “As a result, they underestimate the risk of compound flooding.”

    Amplified risk

    With their flood-forecasting method in place, the team applied it to a specific test case: New York City. They used the multipronged method to predict the city’s risk of compound flooding from hurricanes, and more specifically from Sandy-like hurricanes, in present and future climates. Their simulations showed that the city’s odds of experiencing Sandy-like flooding will increase significantly over the next decades as the climate warms, from once every 150 years in the current climate, to every 60 years by 2050, and every 30 years by 2099.

    Interestingly, they found that much of this increase in risk has less to do with how hurricanes themselves will change in a warming climate than with how sea levels will rise around the world.

    “In future decades, we will experience sea level rise in coastal areas, and we also incorporated that effect into our models to see how much that would increase the risk of compound flooding,” Sarhadi explains. “And in fact, we see sea level rise is playing a major role in amplifying the risk of compound flooding from hurricanes in New York City.”

    The team’s methodology can be applied to any coastal city to assess the risk of compound flooding from hurricanes and extratropical storms. With this approach, Sarhadi hopes decision-makers can make informed decisions regarding the implementation of adaptive measures, such as reinforcing coastal defenses to enhance infrastructure and community resilience.

    “Another aspect highlighting the urgency of our research is the projected 25 percent increase in coastal populations by midcentury, leading to heightened exposure to damaging storms,” Sarhadi says. “Additionally, we have trillions of dollars in assets situated in coastal flood-prone areas, necessitating proactive strategies to reduce damages from compound flooding from hurricanes under a warming climate.”

    This research was supported, in part, by Homesite Insurance.

  • Q&A: Steven Gonzalez on Indigenous futurist science fiction

    Steven Gonzalez is a PhD candidate in the MIT Doctoral Program in History, Anthropology, Science, Technology, and Society (HASTS), where he researches the environmental impacts of cloud computing and data centers in the United States, Iceland, and Puerto Rico. He is also an author. Writing under the name E.G. Condé, he recently published his first book, “Sordidez,” described as an “Indigenous futurist science fiction novella set in Puerto Rico and the Yucatán.” Set in the near future, it follows the survivors of civil war and climate disaster, led by protagonist Vero Diaz, as they reclaim their Indigenous heritage and heal their lands.

    In this Q&A, Gonzalez describes the book’s themes, its inspirations, and its connection to research, people, and classes at MIT.

    Q: Where did the inspiration for this story come from?

    A: I actually began my time at MIT in September of 2017, when Hurricane María struck. It was a really difficult time for me at the Institute, starting a PhD program. And it’s MIT, so there’s a lot of pressure. I was still navigating the new institutional space and trying to understand my place in it. But I had a lot of people at the Institute who were extremely supportive during that time. I had family members in Puerto Rico who were stranded as a result of the hurricane, who I didn’t hear from for a very long time — who I feared dead. It was a very, very chaotic, confusing, and emotionally turbulent time for me, and also incredibly difficult to be trying to be present in a PhD program for the first semester. Karen Gardner, our administrator, was incredibly supportive in that. Also the folks at the MIT Association of Puerto Ricans, who hosted fundraisers and linked students with counseling resources. But the trauma of the hurricane stayed with me, along with the images I saw of the aftermath, specifically in the town where my grandmother’s house was, where I spent summers living as a child. To me, it was the greenest place that I had ever known, and it looked like somebody had torched the entire landscape. It was traumatizing to see that image. But that seeded the idea: is there a way to burn without fire? There’s climate change, but there’s also climate terror. And so that became one of the premises the book explores: geoengineering, but also the flip side of geoengineering and terraforming, which is, of course, climate terror. And in a way, we could frame what’s been happening with the fossil fuel industry as a form of climate terror, as well. So for me, this all began right when I started at MIT, these dual tracks of thought.

    Q: What do you see as the core themes of your novella?

    A: One major theme is rebuilding. As I said, this story was very influenced by the trauma of Hurricane María and the incredibly inspiring accounts from family members, from people in Puerto Rico that I know, of regular people stepping up when the government — both federal and local — essentially abandoned them. There were so many failures of governance. But people stepped up and did what they could to help each other, to help neighbors. Neighbors cleared trees from roads. They banded together to do this. They pooled resources, to run generators so that everyone in the same street could have food that day. They would share medical supplies like insulin and things that were scarce. This was incredibly inspiring for me. And a huge theme of the book is rebuilding in the aftermath of a fictive hurricane, which I call Teddy, named after President Theodore Roosevelt, where Puerto Rico’s journey began as a U.S. commonwealth or a colony.

    Healing is also a huge theme. Healing in the sense of this story was also somewhat critical of Puerto Rican culture. And it’s refracted through my own experience as a queer person navigating the space of Puerto Rico as a very kind of religious and traditional place and a very complex place at that. The main character, Vero, is a trans man. This is a person who’s transitioned and has felt a lot of alienation and as a result of his gender transition, a lot of people don’t accept him and don’t accept his identity or who he is even though he’s incredibly helpful in this rebuilding effort to the point where he’s, in some ways, a leader, if not the leader. And it becomes, in a way, about healing from the trauma of rejection too. And of course, Vero, but other characters who have gone through various traumas that I think are very much shared across Latin America, the Latin American experiences of assimilation, for instance. Latin America is a very complex place. We have Spanish as our language, that is our kind of lingua franca. But there are many Indigenous languages that people speak that have been not valued or people who speak them or use them are actively punished. And there’s this deep trauma of losing language. And in the case of Puerto Rico, the Indigenous language of the Taínos has been destroyed by colonialism. The story is about rebuilding that language and healing and “becoming.” In some ways, it’s about re-indigenization. And then the last part, as I said, healing, reconstruction, but also transformation and metamorphosis. And becoming Taíno. Again, what does that mean? What does it mean to be an Indigenous Caribbean in the future? And so that’s one of the central themes of the story.

    Q: How does the novella intersect with the work you’re doing as a PhD candidate in HASTS?

    A: My research on cloud computing is very much about climate change. It’s pitched within the context of climate change and understanding how our digital ecosystem contributes to not only global warming, but things like desertification. As a social scientist, that’s what I study. My studies of infrastructure are also directly referenced in the book in a lot of ways. For instance, the now collapsed Arecibo Ionosphere Observatory, where some of my pandemic fieldwork occurred, is a setting in the book. And also, I am an anthropologist. I am Puerto Rican. I draw both from my personal experience and my anthropological lens to make a story that I think is very multicultural and multilingual. It’s set in Puerto Rico, but the other half is set in the Yucatán Peninsula in what we’ll call the former Maya world. And there’s a lot of intersections between the two settings. And that goes back to the deeper Indigenous history. Some people are calling this Indigenous futurism because it references the Taínos, who are the Indigenous people of Puerto Rico, but also the Mayas, and many different Maya groups that are throughout the Yucatán Peninsula, but also present-day Guatemala and Honduras. And the story is about exchange between these two worlds. As someone trained as an anthropologist, it’s a really difficult task to kind of pull that off. And I think that my training has really, really helped me achieve that.

    Q: Are there any examples of ways being among the MIT community while writing this book influenced and, in some ways, made this project possible?

    A: I relied on many of my colleagues for support. There’s some sign language in the book. In Puerto Rico, there’s a big tradition of sign language. There’s a version of American Sign Language called LSPR that’s only found in Puerto Rico. And that’s something I’ve been aware of ever since I was a kid. But I’m not fluent in sign language or deaf communities and their culture. I got a lot of help from Timothy Loh, who’s in the HASTS program, who was extremely helpful to steer me towards sensitivity readers in the deaf community in his networks. My advisor, Stefan Helmreich, is very much a science fiction person in a lot of ways. His research is on ocean waves, the history and anthropology of biology. He’s done ethnography in deep-sea submersibles. He’s always kind of thinking in a science fictional lens. And he allowed me, for one of my qualifying exam lists, to mesh science fiction with social theory. And that was also a way that I felt very supported by the Institute. In my coursework, I also took a few science fiction courses in other departments. I worked with Shariann Lewitt, who actually read the first version of the story. I workshopped it in her 21W.759 (Writing Science Fiction) class, and got some really amazing feedback that led to what is now a publication and a dream fulfilled in so many ways. She took me under her wing and really believed in this book.

  • Kerry Emanuel: A climate scientist and meteorologist in the eye of the storm

    Kerry Emanuel once joked that whenever he retired, he would start a “hurricane safari” so other people could experience what it’s like to fly into the eye of a hurricane.

    “All of a sudden, the turbulence stops, the sun comes out, bright sunshine, and it’s amazingly calm. And you’re in this grand stadium [of clouds miles high],” he says. “It’s quite an experience.”

    While the hurricane safari is unlikely to come to fruition — “You can’t just conjure up a hurricane,” he explains — Emanuel, a world-leading expert on links between hurricanes and climate change, is retiring from teaching in the Department of Earth, Atmospheric and Planetary Sciences (EAPS) at MIT after a more than 40-year career.

    Best known for his foundational contributions to the science of tropical cyclones, climate, and links between them, Emanuel has also been a prominent voice in public debates on climate change, and what we should do about it.

    “Kerry has had an enormous effect on the world through the students and junior scientists he has trained,” says William Boos PhD ’08, an atmospheric scientist at the University of California at Berkeley. “He’s a brilliant enough scientist and theoretician that he didn’t need any of us to accomplish what he has, but he genuinely cares about educating new generations of scientists and helping to launch their careers.”

    In recognition of Emanuel’s teaching career and contributions to science, a symposium was held in his honor at MIT on June 21 and 22, organized by several of his former students and collaborators, including Boos. Research presented at the symposium focused on the many fields influenced by Emanuel’s more than 200 published research papers — on everything from forecasting the risks posed by tropical cyclones to understanding how rainfall is produced by continent-sized patterns of atmospheric circulation.

    Emanuel’s career observing perturbations of Earth’s atmosphere started earlier than he can remember. “According to my older brother, from the age of 2, I would crawl to the window whenever there was a thunderstorm,” he says. At first, those were the rolling thunderheads of the Midwest where he grew up, then it was the edges of hurricanes during a few teenage years in Florida. Eventually, he would find himself watching from the very eye of the storm, both physically and mathematically.

    Emanuel attended MIT both as an undergraduate studying Earth and planetary sciences, and for his PhD in meteorology, writing a dissertation on thunderstorms that form ahead of cold fronts. Within the department, he worked with some of the central figures of modern meteorology such as Jule Charney, Fred Sanders, and Edward Lorenz — the founder of chaos theory.

    After receiving his PhD in 1978, Emanuel joined the faculty of the University of California at Los Angeles. During this period, he also took a semester sabbatical to measure the wind speeds of tornadoes in Texas and Oklahoma by filming them. After three years, he returned to MIT and joined the Department of Meteorology in 1981. Two years later, the department merged with Earth and Planetary Sciences to form EAPS as it is known today, and where Emanuel has remained ever since.

    At MIT, he shifted scales. The thunderstorms and tornadoes that had been the focus of Emanuel’s research up to then were local atmospheric phenomena, or “mesoscale” in the language of meteorologists. The larger, “synoptic scale” storms that are hurricanes blew into Emanuel’s research when, as a young faculty member, he was asked to teach a class in tropical meteorology; while preparing for the class, Emanuel found that his notes on hurricanes from graduate school no longer made sense.

    “I realized I didn’t understand them because they couldn’t have been correct,” he says. “And so I set out to try to find a much better theoretical formulation for hurricanes.”

    He soon made two important contributions. In 1986, his paper “An Air-Sea Interaction Theory for Tropical Cyclones. Part I: Steady-State Maintenance” developed a new theory for the upper limit of hurricane intensity under given atmospheric and ocean conditions. This work in turn raised even larger-scale questions. “That upper bound had to be dependent on climate, and it was likely to go up if we were to warm the climate,” Emanuel says — a phenomenon he explored in another paper, “The Dependence of Hurricane Intensity on Climate,” which showed how warming sea surface temperatures and the accompanying changes in atmospheric conditions would make hurricanes more destructive.
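    The flavor of such an intensity bound can be illustrated with a back-of-the-envelope calculation. The sketch below uses the standard potential-intensity form associated with this line of theory, v_max² = (Ck/Cd) · (Ts − To)/To · (k0* − k); the exchange-coefficient ratio and the air-sea enthalpy disequilibrium are illustrative assumptions chosen for demonstration, not values from the papers.

    ```python
    import math

    def potential_intensity(sst_k, outflow_temp_k, ck_over_cd, enthalpy_deficit_j_per_kg):
        """Rough potential-intensity estimate (m/s) in the Emanuel-style form:

        v_max^2 = (Ck/Cd) * (Ts - To)/To * (k0* - k)

        Ts: sea surface temperature, To: outflow temperature (both in K),
        Ck/Cd: ratio of enthalpy to drag exchange coefficients,
        (k0* - k): air-sea enthalpy disequilibrium in J/kg.
        """
        efficiency = (sst_k - outflow_temp_k) / outflow_temp_k  # Carnot-like factor
        v_squared = ck_over_cd * efficiency * enthalpy_deficit_j_per_kg
        return math.sqrt(max(v_squared, 0.0))

    # Illustrative tropical values (all assumed): 29 C ocean, 200 K outflow,
    # Ck/Cd ~ 0.9, ~15 kJ/kg enthalpy disequilibrium.
    v = potential_intensity(sst_k=302.0, outflow_temp_k=200.0,
                            ck_over_cd=0.9, enthalpy_deficit_j_per_kg=15_000.0)
    print(f"potential intensity ≈ {v:.0f} m/s")
    ```

    Raising the sea surface temperature in this toy calculation raises both the efficiency factor and, in reality, the disequilibrium term, which is the sense in which the bound climbs in a warming climate.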

    “In my view, this is among the most remarkable achievements in theoretical geophysics,” says Adam Sobel PhD ’98, an atmospheric scientist at Columbia University who got to know Emanuel after he graduated and became interested in tropical meteorology. “From first principles, using only pencil-and-paper analysis and physical reasoning, he derives a quantitative bound on hurricane intensity that has held up well over decades of comparison to observations” and underpins current methods of predicting hurricane intensity and how it changes with climate.

    This and diverse subsequent work led to numerous honors, including membership in the American Philosophical Society, the National Academy of Sciences, and the American Academy of Arts and Sciences.

    Emanuel’s research was never confined to academic circles, however; when politicians and industry leaders voiced loud opposition to the idea that human-caused climate change posed a threat, he spoke up.

    “I felt kind of a duty to try to counter that,” says Emanuel. “I thought it was an interesting challenge to see if you could go out and convince what some people call climate deniers, skeptics, that this was a serious risk and we had to treat it as such.”

    In addition to many public lectures and media appearances discussing climate change, Emanuel penned a book for general audiences titled “What We Know About Climate Change,” as well as a widely read primer on climate change and risk assessment aimed at business leaders.

    “Kerry has an unmatched physical understanding of tropical climate phenomena,” says Emanuel’s colleague, Susan Solomon, the Lee and Geraldine Martin Professor of Environmental Studies at EAPS. “But he’s also a great communicator and has generously given his time to public outreach. His book ‘What We Know About Climate Change’ is a beautiful piece of work that is readily understandable and has captivated many a non-expert reader.”

    Along with a number of other prominent climate scientists, Emanuel also began advocating for expanding nuclear power as the most rapid path to decarbonizing the world’s energy systems.

    “I think the impediment to nuclear is largely irrational in the United States,” he says. “So, I’ve been trying to fight that just like I’ve been trying to fight climate denial.”

    One lesson Emanuel has taken from his public work on climate change is that skeptical audiences often respond better to issues framed in positive terms than to doom and gloom; he’s found emphasizing the potential benefits rather than the sacrifices involved in the energy transition can engage otherwise wary audiences.

    “It’s really not opposition to science, per se,” he says. “It’s fear of the societal changes they think are required to do something about it.”

    He has also worked to raise awareness about how insurance companies significantly underestimate climate risks in their policies, in particular by basing hurricane risk on unreliable historical data. One recent practical result has been a project by the First Street Foundation to assess the true flood risk of every property in the United States using hurricane models Emanuel developed.

    “I think it’s transformative,” Emanuel says of the project with First Street. “That may prove to be the most substantive research I’ve done.”

    Though Emanuel is retiring from teaching, he has no plans to stop working. “When I say ‘retire’ it’s in quotes,” he says. In 2011, Emanuel and Professor of Geophysics Daniel Rothman founded the Lorenz Center, a climate research center at MIT in honor of Emanuel’s mentor and friend Edward Lorenz. Emanuel will continue to participate in work at the center, which aims to counter what Emanuel describes as a trend away from “curiosity-driven” work in climate science.

    “Even if there were no such thing as global warming, [climate science] would still be a really, really exciting field,” says Emanuel. “There’s so much to understand about climate, about the climates of the past, about the climates of other planets.”

    In addition to work with the Lorenz Center, he’s become interested once again in tornadoes and severe local storms, and understanding whether climate also controls such local phenomena. He’s also involved in two of MIT’s Climate Grand Challenges projects focused on translating climate hazards to explicit financial and health risks — what will bring the dangers of climate change home to people, he says, is for the public to understand more concrete risks, like agricultural failure, water shortages, electricity shortages, and severe weather events. Capturing that will drive the next few years of his work.

    “I’m going to be stepping up research in some respects,” he says, now living full-time at his home in Maine.

    Of course, “retiring” does mean a bit more free time for new pursuits, like learning a language or an instrument, and “rediscovering the art of sailing,” says Emanuel. He’s looking forward to those days on the water, whatever storms are to come.

  • Looking forward to forecast the risks of a changing climate

    On April 11, MIT announced five multiyear flagship projects in the first-ever Climate Grand Challenges, a new initiative to tackle complex climate problems and deliver breakthrough solutions to the world as quickly as possible. This article is the third in a five-part series highlighting the most promising concepts to emerge from the competition, and the interdisciplinary research teams behind them.

    Extreme weather events that were once considered rare have become noticeably less so, from intensifying hurricane activity in the North Atlantic to wildfires generating massive clouds of ozone-damaging smoke. But current climate models are ill-equipped to estimate the risk that these increasingly extreme events pose — and without adequate modeling, governments are unable to take the precautions needed to protect their communities.

    Paul O’Gorman, professor in the MIT Department of Earth, Atmospheric and Planetary Sciences (EAPS), studies this trend by examining how a changing climate affects the atmosphere and incorporating what he learns into climate models to improve their accuracy. One particular focus for O’Gorman has been changes in extreme precipitation and in the midlatitude storms that hit areas like New England.

    “These extreme events are having a lot of impact, but they’re also difficult to model or study,” he says. Seeing the pressing need for better climate models that can be used to develop preparedness plans and climate change mitigation strategies, O’Gorman and collaborators Kerry Emanuel, the Cecil and Ida Green Professor of Atmospheric Science in EAPS, and Miho Mazereeuw, associate professor in MIT’s Department of Architecture, are leading an interdisciplinary group of scientists, engineers, and designers to tackle this problem with their MIT Climate Grand Challenges flagship project, “Preparing for a new world of weather and climate extremes.”

    “We know already from observations and from climate model predictions that weather and climate extremes are changing and will change more,” O’Gorman says. “The grand challenge is preparing for those changing extremes.”

    Their proposal is one of five flagship projects recently announced by the MIT Climate Grand Challenges initiative — an Institute-wide effort catalyzing novel research and engineering innovations to address the climate crisis. Selected from a field of almost 100 submissions, the team will receive additional funding and exposure to help accelerate and scale their project goals. Other MIT collaborators on the proposal include researchers from the School of Engineering, the School of Architecture and Planning, the Office of Sustainability, the Center for Global Change Science, and the Institute for Data, Systems and Society.

    Weather risk modeling

    Fifteen years ago, Kerry Emanuel developed a simple hurricane model. It was based on physics equations rather than statistics, and it could run in real time, making it useful for risk assessment. Emanuel wondered whether similar models could be used for long-term risk assessment of other phenomena, such as changes in extreme weather due to climate change.

    “I discovered, somewhat to my surprise and dismay, that almost all extant estimates of long-term weather risks in the United States are based not on physical models, but on historical statistics of the hazards,” says Emanuel. “The problem with relying on historical records is that they’re too short; while they can help estimate common events, they don’t contain enough information to make predictions for more rare events.”

    Another limitation of weather risk models that rely heavily on statistics: they carry a built-in assumption that the climate is static.

    “Historical records rely on the climate at the time they were recorded; they can’t say anything about how hurricanes grow in a warmer climate,” says Emanuel. The models rely on fixed relationships between events; they assume that hurricane activity will stay the same, even while science is showing that warmer temperatures will most likely push typical hurricane activity beyond the tropics and into a much wider band of latitudes.

    The flagship project’s goal is to eliminate this reliance on the historical record by emphasizing physical principles (e.g., the laws of thermodynamics and fluid mechanics) in next-generation models. The difficulty is the sheer number of variables that have to be included. Not only are there planetary-scale systems to consider, such as the global circulation of the atmosphere, but there are also small-scale, extremely localized events, like thunderstorms, that influence predictive outcomes.

    Trying to compute all of these at once is costly and time-consuming — and the results often can’t tell you the risk in a specific location. But there is a way to correct for this: “What’s done is to use a global model, and then use a method called downscaling, which tries to infer what would happen on very small scales that aren’t properly resolved by the global model,” explains O’Gorman. The team hopes to improve downscaling techniques so that they can be used to calculate the risk of very rare but impactful weather events.

    Global climate models, or general circulation models (GCMs), Emanuel explains, are constructed a bit like a jungle gym. Like the playground bars, the Earth is sectioned into an interconnected three-dimensional framework — but with nodes spaced roughly 100 to 200 kilometers apart. Each node comprises a set of computations for characteristics like wind, rainfall, atmospheric pressure, and temperature within its bounds, and the outputs of each node are connected to those of its neighbors. This framework is useful for building a big-picture view of Earth’s climate system, but if you tried to zoom in on a specific location — like, say, Miami or Mumbai — the nodes are too far apart to support predictions specific to those areas.

    Scientists work around this problem by using downscaling. They use the same blueprint of the jungle gym, but within the nodes they weave a mesh of smaller features, incorporating equations for things like topography and vegetation or regional meteorological models to fill in the blanks. By creating a finer mesh over smaller areas they can predict local effects without needing to run the entire global model.
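    The grid-nesting bookkeeping behind this idea can be sketched in a few lines. The toy below is a hypothetical illustration, not production GCM code: it bilinearly interpolates the four coarse-node values of one 200-km cell onto a 25-km local mesh. Real downscaling also injects regional physics (topography, vegetation, meteorology) into the finer mesh, which this sketch omits.

    ```python
    def bilinear(x, y, f00, f10, f01, f11):
        """Bilinear interpolation on the unit square; x, y in [0, 1]."""
        return (f00 * (1 - x) * (1 - y) + f10 * x * (1 - y)
                + f01 * (1 - x) * y + f11 * x * y)

    def downscale_cell(corner_values, coarse_km=200, fine_km=25):
        """Fill a fine mesh inside one coarse grid cell from its corner values."""
        f00, f10, f01, f11 = corner_values
        n = coarse_km // fine_km  # fine intervals per side
        fine = []
        for j in range(n + 1):
            row = [bilinear(i / n, j / n, f00, f10, f01, f11) for i in range(n + 1)]
            fine.append(row)
        return fine

    # Coarse-node temperatures (deg C) at the four corners of one 200-km cell
    mesh = downscale_cell((26.0, 28.0, 25.0, 27.0))
    print(len(mesh), "x", len(mesh[0]), "fine-mesh points")  # 9 x 9
    print(f"centre estimate: {mesh[4][4]:.1f} C")
    ```

    The trade-off discussed below — cost versus resolution — shows up even here: halving `fine_km` quadruples the number of points that must be computed per cell.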

    Of course, even this finer-resolution solution has its trade-offs. Nesting models within models can give a clearer picture of what’s happening in a specific region, but crunching all that data at once remains a computing challenge. The price is paid in expense and time, or in predictions limited to shorter windows: where GCMs can be run over decades or centuries, a particularly complex local model may be restricted to timescales of just a few years at a time.

    “I’m afraid that most of the downscaling at present is brute force, but I think there’s room to do it in better ways,” says Emanuel, who sees the problem of finding new and novel methods of achieving this goal as an intellectual challenge. “I hope that through the Grand Challenges project we might be able to get students, postdocs, and others interested in doing this in a very creative way.”

    Adapting to weather extremes for cities and renewable energy

    Improving climate modeling is more than a scientific exercise in creativity, however. There’s a very real application for models that can accurately forecast risk in localized regions.

    Another problem is that progress in climate modeling has not kept up with the need for climate mitigation plans, especially in some of the most vulnerable communities around the globe.

    “It is critical for stakeholders to have access to this data for their own decision-making process. Every community is composed of a diverse population with diverse needs, and each locality is affected by extreme weather events in unique ways,” says Mazereeuw, the director of the MIT Urban Risk Lab. 

    A key piece of the team’s project is building on partnerships the Urban Risk Lab has developed with several cities to test their models once they have a usable product up and running. The cities were selected based on their vulnerability to increasing extreme weather events, such as tropical cyclones in Broward County, Florida, and Toa Baja, Puerto Rico, and extratropical storms in Boston, Massachusetts, and Cape Town, South Africa.

    In their proposal, the team outlines a variety of deliverables the cities can ultimately use in their climate change preparations, such as online interactive platforms and workshops with stakeholders — local governments, developers, nonprofits, and residents — to learn directly which tools their communities need. With that input, the cities can craft plans for different regional scenarios, such as sea-level rise or heat waves, and develop the adaptation strategies for infrastructure that will be most effective and efficient for them.

    “We are acutely aware of the inequity of resources both in mitigating impacts and recovering from disasters. Working with diverse communities through workshops allows us to engage a lot of people, listen, discuss, and collaboratively design solutions,” says Mazereeuw.

    By the end of five years, the team is hoping that they’ll have better risk assessment and preparedness tool kits, not just for the cities that they’re partnering with, but for others as well.

    “MIT is well-positioned to make progress in this area,” says O’Gorman, “and I think it’s an important problem where we can make a difference.”

  • Climate modeling confirms historical records showing rise in hurricane activity

    When forecasting how storms may change in the future, it helps to know something about their past. Judging from historical records dating back to the 1850s, hurricanes in the North Atlantic have become more frequent over the last 150 years.

    However, scientists have questioned whether this upward trend is a reflection of reality, or simply an artifact of lopsided record-keeping. If 19th-century storm trackers had access to 21st-century technology, would they have recorded more storms? This inherent uncertainty has kept scientists from relying on storm records, and the patterns within them, for clues to how climate influences storms.

    A new MIT study published today in Nature Communications has used climate modeling, rather than storm records, to reconstruct the history of hurricanes and tropical cyclones around the world. The study finds that North Atlantic hurricanes have indeed increased in frequency over the last 150 years, similar to what historical records have shown.

    In particular, major hurricanes, and hurricanes in general, are more frequent today than in the past. And those that make landfall appear to have grown more powerful, carrying more destructive potential.

    Curiously, while the North Atlantic has seen an overall increase in storm activity, the same trend was not observed in the rest of the world. The study found that the frequency of tropical cyclones globally has not changed significantly in the last 150 years.

    “The evidence does point, as the original historical record did, to long-term increases in North Atlantic hurricane activity, but no significant changes in global hurricane activity,” says study author Kerry Emanuel, the Cecil and Ida Green Professor of Atmospheric Science in MIT’s Department of Earth, Atmospheric, and Planetary Sciences. “It certainly will change the interpretation of climate’s effects on hurricanes — that it’s really the regionality of the climate, and that something happened to the North Atlantic that’s different from the rest of the globe. It may have been caused by global warming, which is not necessarily globally uniform.”

    Chance encounters

    The most comprehensive record of tropical cyclones is compiled in a database known as the International Best Track Archive for Climate Stewardship (IBTrACS). This historical record includes modern measurements from satellites and aircraft that date back to the 1940s. The database’s older records are based on reports from ships and islands that happened to be in a storm’s path. These earlier records date back to 1851, and overall the database shows an increase in North Atlantic storm activity over the last 150 years.

    “Nobody disagrees that that’s what the historical record shows,” Emanuel says. “On the other hand, most sensible people don’t really trust the historical record that far back in time.”

    Recently, scientists have used a statistical approach to identify storms that the historical record may have missed. To do so, they consulted all the digitally reconstructed shipping routes in the Atlantic over the last 150 years and mapped these routes over modern-day hurricane tracks. They then estimated the chance that a ship would encounter or entirely miss a hurricane’s presence. This analysis found a significant number of early storms were likely missed in the historical record. Accounting for these missed storms, they concluded that there was a chance that storm activity had not changed over the last 150 years.
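    The detection-probability logic behind that correction can be mimicked with a toy Monte Carlo. Everything here is an illustrative assumption — random storm longitudes, a handful of fixed shipping-lane longitudes, and a simple distance threshold standing in for "a ship encountered the storm" — rather than the actual reconstruction method.

    ```python
    import random

    def fraction_detected(n_storms, lane_longitudes, detection_radius_deg, seed=0):
        """Estimate the fraction of storms passing close enough to a
        shipping lane to be recorded, under toy assumptions."""
        rng = random.Random(seed)
        detected = 0
        for _ in range(n_storms):
            storm_lon = rng.uniform(-80.0, -20.0)  # random Atlantic longitude
            if any(abs(storm_lon - lane) <= detection_radius_deg
                   for lane in lane_longitudes):
                detected += 1
        return detected / n_storms

    # Sparse 19th-century lanes vs. denser later coverage (assumed positions)
    few_lanes = fraction_detected(100_000, [-70.0, -40.0], 5.0)
    many_lanes = fraction_detected(
        100_000, [-75.0, -65.0, -55.0, -45.0, -35.0, -25.0], 5.0)
    print(f"sparse lanes detect ~{few_lanes:.0%}, dense lanes ~{many_lanes:.0%}")
    ```

    The point of the exercise is the comparison: with sparse lanes a large share of storms is never recorded, so a constant true storm rate can still produce an apparent upward trend as coverage improves.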

    But Emanuel points out that hurricane paths in the 19th century may have looked different from today’s tracks. What’s more, the scientists may have missed key shipping routes in their analysis, as older routes have not yet been digitized.

    “All we know is, if there had been a change (in storm activity), it would not have been detectable, using digitized ship records,” Emanuel says. “So I thought, there’s an opportunity to do better, by not using historical data at all.”

    Seeding storms

    Instead, he estimated past hurricane activity using dynamical downscaling — a technique that his group developed and has applied over the last 15 years to study climate’s effect on hurricanes. The technique starts with a coarse global climate simulation and embeds within this model a finer-resolution model that simulates features as small as hurricanes. The combined models are then fed with real-world measurements of atmospheric and ocean conditions. Emanuel then scatters the realistic simulation with hurricane “seeds” and runs the simulation forward in time to see which seeds bloom into full-blown storms.
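    The seeding step can be caricatured in a few lines. This is a hypothetical sketch only: it collapses the environment each seed experiences into a single "favorability" number, whereas the real technique integrates a physics-based hurricane model along each seed's track through the reanalysis fields.

    ```python
    import random

    def storms_per_year(n_seeds, favorability, survival_scale=0.1, seed=0):
        """Scatter seeds; each matures into a storm with a probability
        proportional to an environmental favorability index (toy model)."""
        rng = random.Random(seed)
        count = 0
        for _ in range(n_seeds):
            environment = rng.random()  # local conditions this seed encounters
            p_survive = min(1.0, survival_scale * favorability * environment)
            if rng.random() < p_survive:
                count += 1
        return count

    cool = storms_per_year(10_000, favorability=1.0)
    warm = storms_per_year(10_000, favorability=1.3)  # more favorable ocean state
    print(f"cool climate: {cool} storms, warm climate: {warm} storms")
    ```

    Because the same random seeds are drawn in both runs, the difference in counts isolates the effect of the favorability index — loosely analogous to running identical seed distributions through different climate states.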

    For the new study, Emanuel embedded a hurricane model into a climate “reanalysis” — a type of climate model that combines observations from the past with climate simulations to generate accurate reconstructions of past weather patterns and climate conditions. He used a particular subset of climate reanalyses that only accounts for observations collected from the surface — for instance from ships, which have recorded weather conditions and sea surface temperatures consistently since the 1850s, as opposed to from satellites, which only began systematic monitoring in the 1970s.

    “We chose to use this approach to avoid any artificial trends brought about by the introduction of progressively different observations,” Emanuel explains.

    He ran an embedded hurricane model on three different climate reanalyses, simulating tropical cyclones around the world over the past 150 years. Across all three models, he observed “unequivocal increases” in North Atlantic hurricane activity.

    “There’s been this quite large increase in activity in the Atlantic since the mid-19th century, which I didn’t expect to see,” Emanuel says.

    Within this overall rise in storm activity, he also observed a “hurricane drought” — a period during the 1970s and 80s when the number of yearly hurricanes momentarily dropped. This pause in storm activity can also be seen in historical records, and Emanuel’s group proposes a cause: sulfate aerosols, which were byproducts of fossil fuel combustion, likely set off a cascade of climate effects that cooled the North Atlantic and temporarily suppressed hurricane formation.

    “The general trend over the last 150 years was increasing storm activity, interrupted by this hurricane drought,” Emanuel notes. “And at this point, we’re more confident of why there was a hurricane drought than why there is an ongoing, long-term increase in activity that began in the 19th century. That is still a mystery, and it bears on the question of how global warming might affect future Atlantic hurricanes.”

    This research was supported, in part, by the National Science Foundation.

  • How marsh grass protects shorelines

    Marsh plants, which are ubiquitous along the world’s shorelines, can play a major role in mitigating the damage to coastlines as sea levels rise and storm surges increase. Now, a new MIT study provides greater detail about how these protective benefits work under real-world conditions shaped by waves and currents.

    The study combined laboratory experiments using simulated plants in a large wave tank along with mathematical modeling. It appears in the journal Physical Review Fluids, in a paper by former MIT visiting doctoral student Xiaoxia Zhang, now a postdoc at Dalian University of Technology, and professor of civil and environmental engineering Heidi Nepf.

    It’s already clear that coastal marsh plants provide significant protection from surges and devastating storms. For example, it has been estimated that the damage caused by Hurricane Sandy was reduced by $625 million thanks to the damping of wave energy provided by extensive areas of marsh along the affected coasts. But the new MIT analysis incorporates details of plant morphology, such as the number and spacing of flexible leaves versus stiffer stems, and the complex interactions of currents and waves that may be coming from different directions.

    This level of detail could enable coastal restoration planners to determine the area of marsh needed to mitigate expected amounts of storm surge or sea-level rise, and to decide which types of plants to introduce to maximize protection.

    “When you go to a marsh, you often will see that the plants are arranged in zones,” says Nepf, who is the Donald and Martha Harleman Professor of Civil and Environmental Engineering. “Along the edge, you tend to have plants that are more flexible, because they are using their flexibility to reduce the wave forces they feel. In the next zone, the plants are a little more rigid and have a bit more leaves.”

    As the zones progress, the plants become stiffer, leafier, and more effective at absorbing wave energy thanks to their greater leaf area. The new modeling done in this research, which incorporated work with simulated plants in the 24-meter-long wave tank at MIT’s Parsons Lab, can enable coastal planners to take these kinds of details into account when planning protection, mitigation, or restoration projects.

    “If you put the stiffest plants at the edge, they might not survive, because they’re feeling very high wave forces. By describing why Mother Nature organizes plants in this way, we can hopefully design a more sustainable restoration,” Nepf says.

    Once established, the marsh plants provide a positive feedback cycle that helps to not only stabilize but also build up these delicate coastal lands, Zhang says. “After a few years, the marsh grasses start to trap and hold the sediment, and the elevation gets higher and higher, which might keep up with sea level rise,” she says.


    Awareness of the protective effects of marshland has been growing, Nepf says. For example, the Netherlands has been restoring lost marshland outside the dikes that surround much of the nation’s agricultural land, finding that the marsh can protect the dikes from erosion; the marsh and dikes work together much more effectively than the dikes alone at preventing flooding.

    But most such efforts so far have been largely empirical, trial-and-error plans, Nepf says. Now, they could take advantage of this modeling to know just how much marshland with what types of plants would be needed to provide the desired level of protection.

    It also provides a more quantitative way to estimate the value provided by marshes, she says. “It could allow you to more accurately say, ‘40 meters of marsh will reduce waves this much and therefore will reduce overtopping of your levee by this much.’ Someone could use that to say, ‘I’m going to save this much money over the next 10 years if I reduce flooding by maintaining this marsh.’ It might help generate some political motivation for restoration efforts.”
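    The kind of statement quoted above — so many meters of marsh reduce waves by so much — is often expressed to first order as an exponential decay of wave height with distance into vegetation. The sketch below uses that generic first-order form with placeholder decay rates chosen purely for illustration; they are not coefficients from this study.

    ```python
    import math

    def wave_height(h0_m, distance_m, decay_per_m):
        """First-order exponential decay of wave height through vegetation."""
        return h0_m * math.exp(-decay_per_m * distance_m)

    # Placeholder decay rates: flexible edge plants damp less per meter
    # than stiffer, leafier plants farther into the marsh (assumed values).
    for label, k in [("flexible edge zone", 0.01), ("stiff leafy zone", 0.03)]:
        h = wave_height(1.0, 40.0, k)
        print(f"{label}: 1.0 m wave -> {h:.2f} m after 40 m of marsh")
    ```

    In this framing, the modeling contribution described in the article amounts to predicting the decay rate from measurable plant properties (stiffness, leaf area, spacing) and wave-current conditions, instead of fitting it empirically per site.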

    Nepf herself is already trying to get some of these findings included in coastal planning processes. She serves on a practitioner panel led by Chris Esposito of the Water Institute of the Gulf, which serves the storm-battered Louisiana coastline. “We’d like to get this work into the coastal simulations that are used for large-scale restoration and coastal planning,” she says.

    “Understanding the wave damping process in real vegetation wetlands is of critical value, as it is needed in the assessment of the coastal defense value of these wetlands,” says Zhan Hu, an associate professor of marine sciences at Sun Yat-Sen University, who was not associated with this work. “The challenge, however, lies in the quantitative representation of the wave damping process, in which many factors are at play, such as plant flexibility, morphology, and coexisting currents.”

    The new study, Hu says, “neatly combines experimental findings and analytical modeling to reveal the impact of each factor in the wave damping process. … Overall, this work is a solid step forward toward a more accurate assessment of wave damping capacity of real coastal wetlands, which is needed for science-based design and management of nature-based coastal protection.”

    The work was partly supported by the National Science Foundation and the China Scholarship Council.