More stories

  •

    How climate change will impact outdoor activities in the US

    It can be hard to connect a certain amount of average global warming with one’s everyday experience, so researchers at MIT have devised a different approach to quantifying the direct impact of climate change. Instead of focusing on global averages, they came up with the concept of “outdoor days”: the number of days per year in a given location when the temperature is not too hot or cold to enjoy normal outdoor activities, such as going for a walk, playing sports, working in the garden, or dining outdoors.

    In a study published earlier this year, the researchers applied this method to compare the impact of global climate change on different countries around the world, showing that much of the global south would suffer major losses in the number of outdoor days, while some northern countries could see a slight increase. Now, they have applied the same approach to comparing the outcomes for different parts of the United States, dividing the country into nine climatic regions and finding similar results: Some states, especially Florida and other parts of the Southeast, should see a significant drop in outdoor days, while others, especially in the Northwest, should see a slight increase.

    The researchers also looked at correlations between economic activity, such as tourism trends, and changing climate conditions, and examined how changes in the number of outdoor days could have significant social and economic impacts.
    Florida’s economy, for example, is highly dependent on tourism and on people moving there for its pleasant climate; a major drop in days when it is comfortable to spend time outdoors could make the state less of a draw.

    The new findings were published this month in the journal Geophysical Research Letters, in a paper by researchers Yeon-Woo Choi and Muhammad Khalifa and professor of civil and environmental engineering Elfatih Eltahir.

    “This is something very new in our attempt to understand impacts of climate change, in addition to the changing extremes,” Choi says. It allows people to see how these global changes may affect them on a very personal level, as opposed to focusing on global temperature changes or on extreme events such as powerful hurricanes or increased wildfires. “To the best of my knowledge, nobody else takes this same approach” in quantifying the local impacts of climate change, he says. “I hope that many others will parallel our approach to better understand how climate may affect our daily lives.”

    The study looked at two different climate scenarios — one where maximum efforts are made to curb global emissions of greenhouse gases, and one “worst case” scenario where little is done and global warming continues to accelerate. They ran these two scenarios through every available global climate model, 32 in all, and the results were broadly consistent across all 32 models.

    The reality may lie somewhere in between the two extremes that were modeled, Eltahir suggests. “I don’t think we’re going to act as aggressively” as the low-emissions scenario assumes, he says, “and we may not be as careless” as the high-emissions scenario. “Maybe the reality will emerge in the middle, toward the end of the century,” he says.

    The team looked at the difference in temperatures and other conditions over various ranges of decades. The data already showed some slight differences in outdoor days between the 1961-1990 period and 1991-2020.
    The researchers then compared these most recent 30 years with the last 30 years of this century, as projected by the models, and found much greater differences ahead for some regions. The strongest effects in the modeling were seen in the Southeastern states. “It seems like climate change is going to have a significant impact on the Southeast in terms of reducing the number of outdoor days,” Eltahir says, “with implications for the quality of life of the population, and also for the attractiveness of tourism and for people who want to retire there.”

    He adds that “surprisingly, one of the regions that would benefit a little bit is the Northwest.” But the gain there is modest: an increase of about 14 percent in outdoor days projected for the last three decades of this century, compared to the period from 1976 to 2005. The Southwestern U.S., by comparison, faces an average loss of 23 percent of its outdoor days.

    The study also digs into the relationship between climate and economic activity by looking at tourism trends in U.S. National Park Service visitation data, and how those trends aligned with differences in climate conditions. “Accounting for seasonal variations, we find a clear connection between the number of outdoor days and the number of tourist visits in the United States,” Choi says.

    For much of the country, the study found, there will be little overall change in the total number of annual outdoor days, but the seasonal pattern of those days could change significantly.
    While most parts of the country now see the most outdoor days in summertime, that will shift as summers get hotter, and spring and fall will become the preferred seasons for outdoor activity. In a way, Eltahir says, “what we are talking about that will happen in the future [for most of the country] is already happening in Florida.” There, he says, “the really enjoyable time of year is in the spring and fall, and summer is not the best time of year.”

    People’s level of comfort with temperatures varies somewhat among individuals and among regions, so the researchers designed a tool, now freely available online, that allows people to set their own definitions of the lowest and highest temperatures they consider suitable for outdoor activities, and then see what the climate models predict would be the change in the number of outdoor days for their location, using their own standards of comfort. For their study, they used a widely accepted range of 10 degrees Celsius (50 degrees Fahrenheit) to 25 C (77 F), which is the “thermoneutral zone” in which the human body does not require either metabolic heat generation or evaporative cooling to maintain its core temperature — in other words, in that range there is generally no need to either shiver or sweat.

    The model mainly focuses on temperature but also allows people to include humidity or precipitation in their definition of what constitutes a comfortable outdoor day.
    The model could be extended to incorporate other variables such as air quality, but the researchers say temperature tends to be the major determinant of comfort for most people.

    Using their software tool, “if you disagree with how we define an outdoor day, you could define one for yourself, and then you’ll see what the impacts of that are on your number of outdoor days and their seasonality,” Eltahir says.

    This work was inspired by the realization, he says, that “people’s understanding of climate change is based on the assumption that climate change is something that’s going to happen sometime in the future and going to happen to someone else. It’s not going to impact them directly. And I think that contributes to the fact that we are not doing enough.”

    Instead, the concept of outdoor days “brings the concept of climate change home, brings it to personal everyday activities,” he says. “I hope that people will find that useful to bridge that gap, and provide a better understanding and appreciation of the problem. And hopefully that would help lead to sound policies that are based on science, regarding climate change.”

    The research was based on work supported by Community Jameel for the Jameel Observatory CREWSnet and by the Abdul Latif Jameel Water and Food Systems Lab at MIT.
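    The outdoor-days metric described above is simple enough to sketch in code. Below is a minimal, hypothetical illustration (the function name and sample data are invented; this is not the authors' implementation), using the study's default 10-25 C thermoneutral band and allowing custom thresholds, as their online tool does:

```python
def count_outdoor_days(daily_mean_temps_c, t_min=10.0, t_max=25.0):
    """Count days whose mean temperature falls inside the comfort band.

    Defaults to the 10-25 C thermoneutral zone used in the study;
    callers can pass their own thresholds, mirroring the researchers'
    online tool. Illustrative sketch only.
    """
    return sum(1 for t in daily_mean_temps_c if t_min <= t <= t_max)

# Invented year of daily means: cold winter, mild spring,
# hot summer, cool fall.
year = [5.0] * 90 + [18.0] * 90 + [30.0] * 90 + [12.0] * 95

print(count_outdoor_days(year))            # -> 185 (spring + fall)
print(count_outdoor_days(year, t_min=15))  # -> 90 (stricter comfort band)
```

    Passing different thresholds shows how a personal definition of comfort changes the count, which is the point of the tool's customization.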

  •

    Study evaluates impacts of summer heat in U.S. prison environments

    When summer temperatures spike, so does our vulnerability to heat-related illness or even death. For the most part, people can take measures to reduce their heat exposure by opening a window, turning up the air conditioning, or simply getting a glass of water. But for people who are incarcerated, taking such measures is often not an option. Prison populations therefore are especially vulnerable to heat exposure, due to their conditions of confinement.

    A new study by MIT researchers examines summertime heat exposure in prisons across the United States and identifies characteristics within prison facilities that can further contribute to a population’s vulnerability to summer heat.

    The study’s authors used high-spatial-resolution air temperature data to determine the daily average outdoor temperature for each of 1,614 prisons in the U.S., for every summer between 1990 and 2023. They found that the prisons exposed to the most extreme heat are located in the southwestern U.S., while the prisons with the biggest changes in summertime heat, compared to the historical record, are in the Pacific Northwest, the Northeast, and parts of the Midwest.

    Those findings are not entirely unique to prisons, as any non-prison facility or community in the same geographic locations would be exposed to similar outdoor air temperatures. But the team also looked at characteristics specific to prison facilities that could further exacerbate an incarcerated person’s vulnerability to heat exposure. They identified nine such facility-level characteristics, including highly restricted movement, poor staffing, and inadequate mental health treatment. People living and working in prisons with any one of these characteristics may face compounded risk from summertime heat.

    The team also looked at the demographics of 1,260 prisons in their study and found that the prisons with higher heat exposure on average also had higher proportions of non-white and Hispanic populations.
    The study, appearing today in the journal GeoHealth, provides policymakers and community leaders with ways to estimate, and take steps to address, a prison population’s heat risk, which they anticipate could worsen with climate change.

    “This isn’t a problem because of climate change. It’s becoming a worse problem because of climate change,” says study lead author Ufuoma Ovienmhada SM ’20, PhD ’24, a graduate of the MIT Media Lab, who recently completed her doctorate in MIT’s Department of Aeronautics and Astronautics (AeroAstro). “A lot of these prisons were not built to be comfortable or humane in the first place. Climate change is just aggravating the fact that prisons are not designed to enable incarcerated populations to moderate their own exposure to environmental risk factors such as extreme heat.”

    The study’s co-authors include Danielle Wood, MIT associate professor of media arts and sciences, and of AeroAstro; and Brent Minchew, MIT associate professor of geophysics in the Department of Earth, Atmospheric and Planetary Sciences; along with Ahmed Diongue ’24, Mia Hines-Shanks of Grinnell College, and Michael Krisch of Columbia University.

    Environmental intersections

    The new study is an extension of work carried out at the Media Lab, where Wood leads the Space Enabled research group. The group aims to advance social and environmental justice issues through the use of satellite data and other space-enabled technologies.

    The group’s motivation to look at heat exposure in prisons came in 2020 when, as co-president of MIT’s Black Graduate Student Union, Ovienmhada took part in community organizing efforts following the murder of George Floyd by Minneapolis police.

    “We started to do more organizing on campus around policing and reimagining public safety.
    Through that lens I learned more about police and prisons as interconnected systems, and came across this intersection between prisons and environmental hazards,” says Ovienmhada, who is leading an effort to map the various environmental hazards that prisons, jails, and detention centers face. “In terms of environmental hazards, extreme heat causes some of the most acute impacts for incarcerated people.”

    She, Wood, and their colleagues set out to use Earth observation data to characterize U.S. prison populations’ vulnerability, or their risk of experiencing negative impacts, from heat.

    The team first looked through a database maintained by the U.S. Department of Homeland Security that lists the location and boundaries of carceral facilities in the U.S. From the database’s more than 6,000 prisons, jails, and detention centers, the researchers highlighted 1,614 prison-specific facilities, which together incarcerate nearly 1.4 million people and employ about 337,000 staff.

    They then looked to Daymet, a detailed weather and climate database that tracks daily temperatures across the United States at a 1-kilometer resolution. For each of the 1,614 prison locations, they mapped the daily outdoor temperature for every summer from 1990 to 2023, noting that the majority of current state and federal correctional facilities in the U.S. were built by 1990.

    The team also obtained U.S. Census data on each facility’s demographic and facility-level characteristics, such as prison labor activities and conditions of confinement. One limitation the researchers acknowledge is a lack of information regarding a prison’s climate control.

    “There’s no comprehensive public resource where you can look up whether a facility has air conditioning,” Ovienmhada notes.
    “Even in facilities with air conditioning, incarcerated people may not have regular access to those cooling systems, so our measurements of outdoor air temperature may not be far off from reality.”

    Heat factors

    From their analysis, the researchers found that more than 98 percent of all prisons in the U.S. experienced at least 10 days in the summer that were hotter than every previous summer, on average, for a given location. Their analysis also revealed that the most heat-exposed prisons, and the prisons that experienced the highest temperatures on average, were mostly in the Southwestern U.S. The researchers note that, with the exception of New Mexico, the Southwest is a region with no universal air conditioning regulations in state-operated prisons.

    “States run their own prison systems, and there is no uniformity of data collection or policy regarding air conditioning,” says Wood, who notes that there is some information on cooling systems in some states and individual prison facilities, but the data are sparse overall and too inconsistent to include in the group’s nationwide study.

    While the researchers could not incorporate air conditioning data, they did consider other facility-level factors that could worsen the effects of outdoor heat. They looked through the scientific literature on heat, health impacts, and prison conditions, and focused on 17 measurable facility-level variables that contribute to heat-related health problems, including factors such as overcrowding and understaffing.

    “We know that whenever you’re in a room that has a lot of people, it’s going to feel hotter, even if there’s air conditioning in that environment,” Ovienmhada says. “Also, staffing is a huge factor. Facilities that don’t have air conditioning but still try to do heat risk-mitigation procedures might rely on staff to distribute ice or water every few hours.
    If that facility is understaffed or has neglectful staff, that may increase people’s susceptibility to hot days.”

    The study found that prisons with any of nine of the 17 variables showed statistically significantly greater heat exposure than prisons without those variables. If a prison exhibits any one of the nine variables, people’s heat risk could be worsened through the combination of elevated heat exposure and vulnerability. The variables, the researchers say, could help state regulators and activists identify prisons to prioritize for heat interventions.

    “The prison population is aging, and even if you’re not in a ‘hot state,’ every state has a responsibility to respond,” Wood emphasizes. “For instance, areas in the Northwest, where you might expect it to be temperate overall, have experienced a number of days of increasing heat risk in recent years. A few days out of the year can still be dangerous, particularly for a population with reduced agency to regulate their own exposure to heat.”

    This work was supported, in part, by NASA, the MIT Media Lab, and MIT’s Institute for Data, Systems, and Society’s Research Initiative on Combatting Systemic Racism.
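    As an illustration of the kind of comparison behind that 98 percent figure, the sketch below counts days in a recent summer that exceed the mean daily temperature of every earlier summer on record. This is one plausible reading of the metric, with invented data and function names; it is not the study's actual method:

```python
def days_hotter_than_baseline(prior_summers, recent_summer):
    """Count days in `recent_summer` hotter than every prior summer's
    mean daily temperature.

    `prior_summers` maps year -> list of daily mean temperatures (C);
    `recent_summer` is the list of daily means being evaluated.
    Illustrative sketch only, not the study's methodology.
    """
    # The hottest of the earlier summers' averages is the bar to clear.
    baseline = max(sum(days) / len(days) for days in prior_summers.values())
    return sum(1 for t in recent_summer if t > baseline)

# Invented record: two earlier 90-day summers, then a warmer one.
historical = {1990: [24.0] * 90, 2000: [25.0] * 90}
recent = [26.0] * 12 + [24.0] * 78

print(days_hotter_than_baseline(historical, recent))  # -> 12
```

    Under this reading, a location "counts" toward the 98 percent figure whenever this tally reaches 10 or more days.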

  •

    Collaborative effort supports an MIT resilient to the impacts of extreme heat

    Warmer weather can be a welcome change for many across the MIT community. But as climate impacts intensify, warm days are often becoming hot days, with increased severity and frequency. Already this summer, heat waves in June and July brought daily highs of over 90 degrees Fahrenheit. According to the Resilient Cambridge report published in 2021, from the 1970s to 2000 the Boston Logan International Airport weather station recorded an average of 10 days of 90-plus temperatures each year. Now, simulations predict that in the current time frame of 2015-44, the number of days above 90 F could be triple the 1970-2000 average.

    While the increasing heat is all but certain, how institutions like MIT will be affected, and how they respond, continues to evolve. “We know what the science is showing, but how will this heat impact the ability of MIT to fulfill its mission and support its community?” asks Brian Goldberg, assistant director of the MIT Office of Sustainability. “What will be the real feel of these temperatures on campus?” These questions and more are guiding staff, researchers, faculty, and students working collaboratively to understand these impacts and inform decisions and action plans in response.

    This work is part of developing MIT’s forthcoming Climate Resiliency and Adaptation Roadmap, which is called for in MIT’s climate action plan and is co-led by Goldberg; Laura Tenny, senior campus planner; and William Colehower, senior advisor to the vice president for campus services and stewardship. The effort is also supported by researchers in the departments of Urban Studies and Planning, Architecture, and Electrical Engineering and Computer Science (EECS), in the Urban Risk Lab and the Senseable City Lab, as well as by staff in MIT Emergency Management and Housing and Residential Services.
    The roadmap — which builds upon years of resiliency planning and research at MIT — will include an assessment of current and future conditions on campus, as well as strategies and proposed interventions to support MIT’s community and campus in the face of increasing climate impacts.

    A key piece of the resiliency puzzle

    When the City of Cambridge released its Climate Change Vulnerability Assessment in 2015, the report identified flooding and heat as the primary resiliency risks to the city. In response, Institute staff worked together with the city to create a full picture of potential flood risks to both Cambridge and the campus, with the campus effort becoming the MIT Climate Resiliency Dashboard. The dashboard, published in the MIT Sustainability DataPool, has played an important role in campus planning and resiliency efforts since its debut in 2021, but heat has been a missing piece of the tool. This is largely because for heat, unlike flooding, few data exist on building-level impacts. The original assessment from Cambridge modeled the temperature averages that could be expected in portions of the city, but measuring heat impacts down to the building level is essential because the impacts of heat can vary so greatly. “Heat also doesn’t conform to topography like flooding, making it harder to map with localized specificity,” notes Tenny. “Microclimates, humidity levels, shade or sun aspect, and other factors contribute to heat risk.”

    Collection efforts have been underway for the past three years to fill this gap in baseline data. Members of the Climate Resiliency and Adaptation Roadmap team and partners have helped build and place heat sensors to record and analyze data. The current heat sensors, shoebox-shaped devices on tripods, can be found at multiple outdoor locations on campus during the summer, capturing and recording temperatures multiple times each hour.

    “Urban environmental phenomena are hyperlocal.
    While National Weather Service readouts at locations like Logan Airport are extremely valuable, this gives us a higher-resolution understanding of the urban microclimate on our campus,” notes Sanjana Paul, a past technical associate with the Senseable City Lab and current graduate student in the Department of Urban Studies and Planning, who helps oversee data collection and analysis.

    After collection, the temperature data are analyzed and mapped. The data will soon be published in the updated Climate Resiliency Dashboard and will help inform actions through the Climate Resiliency and Adaptation Roadmap, but in the meantime the information has already provided some important insights. “There were some parts of campus that were much hotter than I expected,” explains Paul. “Some of the temperature readings across campus were regularly going over 100 degrees during heat waves. It’s a bit surprising to see three digits on a temperature reading in Cambridge.” Some strategies are also already being put into action, including planting more trees to support the urban campus forest and launching cooling locations around campus that open during days of extreme heat.

    As data gathering enters its fourth summer, partners continue to expand. The Senseable City Lab first began capturing data in 2021 using sensors placed on MIT Recycling trucks, and the Urban Risk Lab has offered community-centered temperature data collection with the help of its director, associate professor of architecture Miho Mazereeuw. More recently, students in course 6.900 (Engineering for Impact) worked to design heat sensors to aid in the data collection and grow the fleet of sensors on campus. Co-instructed by EECS senior lecturer Joe Steinmeyer and EECS professor Joel Voldman, students in the course were tasked with developing technology to solve challenges close at hand. “One of the goals of the class is to tackle real-world problems so students emerge with confidence as engineers,” explains Voldman.
    “Having them work on a challenge that is outside their comfort zone and impacts them really helps to engage and inspire them.”

    Centering on people

    While the temperature data offer one piece of the resiliency planning puzzle, knowing how these temperatures will affect community members is another. “When we look at impacts to our campus from heat, people are the focus,” explains Goldberg. “While stress on campus infrastructure is one factor we are evaluating, our primary focus is the vulnerability of people to extreme heat.” Impacts to community members can range from disrupted nights of sleep to heat-related illnesses.

    As the team looked at the data and spoke with individuals across campus, it became clear that some community members might be more vulnerable than others to the impact of extreme heat days, including grounds, janitorial, and maintenance crews who work outside; kitchen staff who work close to hot equipment; and student athletes exerting themselves on hot days. “We know that people on our campus are already experiencing these extreme heat days differently,” explains Susy Jones, senior sustainability project manager in the Office of Sustainability, who focuses on environmental and climate justice. “We need to design strategies and augment existing interventions with equity in mind, ensuring everyone on campus can fulfill their role at MIT.”

    To support those strategy decisions, the resiliency team is seeking additional input from the MIT community. One hoped-for outcome of the roadmap and dashboard is for community members to review them and offer their own insights and experiences of heat conditions on campus. “These plans need to work at the campus level and the individual level,” says Goldberg.
    “The data tells an important story, but individuals help us complete the picture.”

    A model for others

    As the dashboard update nears completion and the broader resiliency and adaptation roadmap of strategies launches, their purpose is twofold: to help MIT develop and inform plans and procedures for mitigating and addressing heat on campus, and to serve as a model for other universities and communities grappling with the same challenges. “This approach is at the center of how we operate at MIT,” explains Director of Sustainability Julie Newman. “We seek to identify solutions for our own campus in a manner that others can learn from and potentially adapt for their own resiliency and climate planning purposes. We’re also looking to align with efforts at the city and state level.” By publishing the roadmap broadly, MIT hopes other universities and municipalities can apply its lessons and processes to their own spaces.

    When the updated Climate Resiliency Dashboard and the Climate Resiliency and Adaptation Roadmap go live, they will mark the beginning of the next phase of work, rather than an end. “The dashboard is designed to present these impacts in a way everyone can understand, so people across campus can respond and help us understand what is needed for them to continue to fulfill their role at MIT,” says Goldberg. Uncertainty plays a big role in resiliency planning, and the dashboard will reflect that. “This work is not something you ever say is done,” says Goldberg. “As information and data evolve, so does our work.”
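    The sensor workflow described above, rolling sub-hourly readings up into daily extremes and counting hot days, can be sketched in a few lines. This is a hypothetical illustration (invented readings and function name, not MIT's actual pipeline), using the 90 F threshold from the Resilient Cambridge reporting:

```python
from collections import defaultdict

def hot_days(readings, threshold_f=90.0):
    """Count days whose maximum reading meets or exceeds the threshold.

    `readings` is a list of (date_string, temp_f) pairs, such as a
    sensor sampling several times per hour would produce. Illustrative
    sketch only.
    """
    daily_max = defaultdict(lambda: float("-inf"))
    for day, temp in readings:
        daily_max[day] = max(daily_max[day], temp)
    return sum(1 for t in daily_max.values() if t >= threshold_f)

# Invented readings: one day crosses 90 F, the next falls just short.
readings = [
    ("2024-07-01", 88.5), ("2024-07-01", 93.2),
    ("2024-07-02", 84.0), ("2024-07-02", 89.9),
]
print(hot_days(readings))  # -> 1
```

    Aggregating to a daily maximum first matters: counting raw readings above the threshold would over- or under-count depending on the sampling rate.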

  •

    Study: Heavy snowfall and rain may contribute to some earthquakes

    When scientists look for an earthquake’s cause, their search often starts underground. As centuries of seismic studies have made clear, it’s the collision of tectonic plates and the movement of subsurface faults and fissures that primarily trigger a temblor.

    But MIT scientists have now found that certain weather events may also play a role in setting off some quakes.

    In a study appearing today in Science Advances, the researchers report that episodes of heavy snowfall and rain likely contributed to a swarm of earthquakes over the past several years in northern Japan. The study is the first to show that climate conditions could initiate some quakes.

    “We see that snowfall and other environmental loading at the surface impacts the stress state underground, and the timing of intense precipitation events is well-correlated with the start of this earthquake swarm,” says study author William Frank, an assistant professor in MIT’s Department of Earth, Atmospheric and Planetary Sciences (EAPS). “So, climate obviously has an impact on the response of the solid earth, and part of that response is earthquakes.”

    The new study focuses on a series of ongoing earthquakes in Japan’s Noto Peninsula. The team discovered that seismic activity in the region is surprisingly synchronized with certain changes in underground pressure, and that those changes are influenced by seasonal patterns of snowfall and precipitation. The scientists suspect that this new connection between quakes and climate may not be unique to Japan and could play a role in shaking up other parts of the world.

    Looking to the future, they predict that the climate’s influence on earthquakes could be more pronounced with global warming.

    “If we’re going into a climate that’s changing, with more extreme precipitation events, and we expect a redistribution of water in the atmosphere, oceans, and continents, that will change how the Earth’s crust is loaded,” Frank adds.
    “That will have an impact for sure, and it’s a link we could further explore.”

    The study’s lead author is former MIT research associate Qing-Yu Wang (now at Grenoble Alpes University); co-authors include EAPS postdoc Xin Cui, Yang Lu of the University of Vienna, Takashi Hirose of Tohoku University, and Kazushige Obara of the University of Tokyo.

    Seismic speed

    Since late 2020, hundreds of small earthquakes have shaken up Japan’s Noto Peninsula — a finger of land that curves north from the country’s main island into the Sea of Japan. Unlike a typical earthquake sequence, which begins with a main shock that gives way to a series of aftershocks before dying out, Noto’s seismic activity is an “earthquake swarm” — a pattern of multiple, ongoing quakes with no obvious main shock or seismic trigger.

    The MIT team, along with their colleagues in Japan, aimed to spot any patterns in the swarm that would explain the persistent quakes. They started with the Japan Meteorological Agency’s catalog of earthquakes, which provides data on seismic activity throughout the country over time. They focused on quakes in the Noto Peninsula over the last 11 years, during which the region has experienced episodic earthquake activity, including the most recent swarm.

    With seismic data from the catalog, the team counted the number of seismic events that occurred in the region over time. Prior to 2020, the timing of quakes appeared sporadic and unrelated; starting in late 2020, earthquakes grew more intense and clustered in time, correlated with one another in a way that signaled the start of the swarm.

    The scientists then looked to a second dataset of seismic measurements taken by monitoring stations over the same 11-year period. Each station continuously records any displacement, or local shaking, that occurs. The shaking recorded from one station to another can give scientists an idea of how fast a seismic wave travels between stations.
    This “seismic velocity” is related to the structure of the Earth through which the seismic wave is traveling. Wang used the station measurements to calculate the seismic velocity between every station in and around Noto over the last 11 years.

    The researchers generated an evolving picture of seismic velocity beneath the Noto Peninsula and observed a surprising pattern: In 2020, around when the earthquake swarm is thought to have begun, changes in seismic velocity appeared to be synchronized with the seasons.

    “We then had to explain why we were observing this seasonal variation,” Frank says.

    Snow pressure

    The team wondered whether environmental changes from season to season could influence the underlying structure of the Earth in a way that would set off an earthquake swarm. Specifically, they looked at how seasonal precipitation would affect the underground “pore fluid pressure” — the amount of pressure that fluids in the Earth’s cracks and fissures exert within the bedrock.

    “When it rains or snows, that adds weight, which increases pore pressure, which allows seismic waves to travel through slower,” Frank explains. “When all that weight is removed, through evaporation or runoff, all of a sudden, that pore pressure decreases and seismic waves are faster.”

    Wang and Cui developed a hydromechanical model of the Noto Peninsula to simulate the underlying pore pressure over the last 11 years in response to seasonal changes in precipitation. They fed into the model meteorological data from this same period, including measurements of daily snow, rainfall, and sea-level changes. From their model, they were able to track changes in excess pore pressure beneath the Noto Peninsula, before and during the earthquake swarm.
    They then compared this timeline of evolving pore pressure with their evolving picture of seismic velocity.

    “We had seismic velocity observations, and we had the model of excess pore pressure, and when we overlapped them, we saw they just fit extremely well,” Frank says.

    In particular, they found that when they included snowfall data, and especially extreme snowfall events, the fit between the model and observations was stronger than if they only considered rainfall and other events. In other words, the ongoing earthquake swarm that Noto residents have been experiencing can be explained in part by seasonal precipitation, and particularly by heavy snowfall events.

    “We can see that the timing of these earthquakes lines up extremely well with multiple times where we see intense snowfall,” Frank says. “It’s well-correlated with earthquake activity. And we think there’s a physical link between the two.”

    The researchers suspect that heavy snowfall and similar extreme precipitation could play a role in earthquakes elsewhere, though they emphasize that the primary trigger will always originate underground.

    “When we first want to understand how earthquakes work, we look to plate tectonics, because that is and will always be the number one reason why an earthquake happens,” Frank says. “But, what are the other things that could affect when and how an earthquake happens? That’s when you start to go to second-order controlling factors, and the climate is obviously one of those.”

    This research was supported, in part, by the National Science Foundation.
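    The load-then-drain mechanism Frank describes can be caricatured numerically: surface load raises a pore-pressure proxy that decays over time, and seismic velocity dips while pressure is high. The toy model below uses invented numbers and is purely illustrative; it is not the team's hydromechanical model:

```python
def pore_pressure_series(daily_load, decay=0.95):
    """Toy pore-pressure proxy: each day's surface load (the weight of
    snow or rain) adds pressure, which then drains away exponentially.
    A caricature of load-then-drain behavior, not the study's model.
    """
    p, series = 0.0, []
    for load in daily_load:
        p = decay * p + load
        series.append(p)
    return series

# Invented loading: a five-day snowfall pulse amid dry days.
load = [0.0] * 10 + [5.0] * 5 + [0.0] * 25
pressure = pore_pressure_series(load)

# Higher pore pressure slows seismic waves, so velocity dips under load.
velocity = [1.0 - 0.01 * p for p in pressure]

print(max(pressure) > pressure[0])       # pressure peaks during the pulse
print(min(velocity) < velocity[0])       # velocity dips at the same time
```

    Overlaying series like these, as the researchers did with the real model and observations, is what reveals whether pressure and velocity move together.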

  •

    An AI dataset carves new paths to tornado detection

    The return of spring in the Northern Hemisphere touches off tornado season. A tornado’s twisting funnel of dust and debris seems an unmistakable sight. But that sight can be hidden from radar, the primary tool of meteorologists. It’s hard to know exactly when a tornado has formed, or even why.

    A new dataset could hold answers. It contains radar returns from thousands of tornadoes that have hit the United States in the past 10 years. Alongside them are returns from other severe storms, some with nearly identical conditions, that never spawned a tornado. MIT Lincoln Laboratory researchers who curated the dataset, called TorNet, have now released it open source. They hope to enable breakthroughs in detecting one of nature’s most mysterious and violent phenomena.

    “A lot of progress is driven by easily available, benchmark datasets. We hope TorNet will lay a foundation for machine learning algorithms to both detect and predict tornadoes,” says Mark Veillette, the project’s co-principal investigator with James Kurdzo. Both researchers work in the Air Traffic Control Systems Group. 

    Along with the dataset, the team is releasing models trained on it. The models show promise for machine learning’s ability to spot a twister. Building on this work could open new frontiers for forecasters, helping them provide more accurate warnings that might save lives. 

    Swirling uncertainty

    About 1,200 tornadoes occur in the United States every year, causing millions to billions of dollars in economic damage and claiming 71 lives on average. Last year, one unusually long-lasting tornado killed 17 people and injured at least 165 others along a 59-mile path in Mississippi.  

    Yet tornadoes are notoriously difficult to forecast because scientists don’t have a clear picture of why they form. “We can see two storms that look identical, and one will produce a tornado and one won’t. We don’t fully understand it,” Kurdzo says.

    A tornado’s basic ingredients are thunderstorms with instability caused by rapidly rising warm air and wind shear that causes rotation. Weather radar is the primary tool used to monitor these conditions. But tornadoes lie too low to be detected, even when moderately close to the radar. As the radar beam with a given tilt angle travels farther from the antenna, it gets higher above the ground, mostly seeing reflections from rain and hail carried in the “mesocyclone,” the storm’s broad, rotating updraft. A mesocyclone doesn’t always produce a tornado.
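The beam-height geometry explains the blind spot. Under the standard 4/3-effective-Earth-radius refraction model (a textbook radar approximation, not anything specific to TorNet), even a low radar tilt overshoots near-surface rotation at modest ranges:

```python
import math

def beam_height_km(range_km, elev_deg, earth_radius_km=6371.0):
    """Approximate height of the radar beam center above the ground,
    using the standard 4/3-effective-Earth-radius refraction model."""
    re = (4.0 / 3.0) * earth_radius_km  # effective Earth radius
    theta = math.radians(elev_deg)
    return math.sqrt(range_km**2 + re**2
                     + 2.0 * range_km * re * math.sin(theta)) - re

# At a 0.5-degree tilt, the beam center sits roughly 1.5 km above the
# ground at 100 km range, and about 4 km at 200 km -- well above where
# tornadoes form.
```

This is why radar mostly sees the mesocyclone aloft rather than the tornado at the surface.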

    With this limited view, forecasters must decide whether or not to issue a tornado warning. They often err on the side of caution. As a result, the rate of false alarms for tornado warnings is more than 70 percent. “That can lead to boy-who-cried-wolf syndrome,” Kurdzo says.  
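The false-alarm figure comes from standard warning-verification metrics. The counts below are invented, chosen only to reproduce the greater-than-70-percent false-alarm ratio the article cites:

```python
# Warning skill is commonly summarized by probability of detection (POD)
# and false-alarm ratio (FAR). These counts are hypothetical, picked to
# illustrate a FAR just above 70 percent.
hits = 30          # tornadoes that were warned
misses = 10        # tornadoes that were not warned
false_alarms = 75  # warnings issued with no tornado

pod = hits / (hits + misses)
far = false_alarms / (false_alarms + hits)

print(f"POD = {pod:.2f}, FAR = {far:.2f}")  # POD = 0.75, FAR = 0.71
```

Erring toward warnings raises POD but inflates FAR, which is the trade-off forecasters face.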

    In recent years, researchers have turned to machine learning to better detect and predict tornadoes. However, raw datasets and models have not always been accessible to the broader community, stifling progress. TorNet is filling this gap.

    The dataset contains more than 200,000 radar images, 13,587 of which depict tornadoes. The rest of the images are non-tornadic, taken from storms in one of two categories: randomly selected severe storms or false-alarm storms (those that led a forecaster to issue a warning but that didn’t produce a tornado).

    Each sample of a storm or tornado comprises two sets of six radar images. The two sets correspond to different radar sweep angles. The six images portray different radar data products, such as reflectivity (showing precipitation intensity) or radial velocity (indicating if winds are moving toward or away from the radar).
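A sample with two sweep angles and six products per sweep maps naturally onto a multi-channel image. The grid size and array layout below are illustrative assumptions, not TorNet's actual schema; consult the dataset release for the real format:

```python
import numpy as np

# Hypothetical layout of one sample: 2 sweep angles x 6 radar products
# (reflectivity, radial velocity, etc.) on a common image grid.
N_SWEEPS, N_PRODUCTS, H, W = 2, 6, 120, 240
sample = np.zeros((N_SWEEPS, N_PRODUCTS, H, W), dtype=np.float32)
label = 0  # 1 = tornadic, 0 = non-tornadic (random severe or false alarm)

# A detector would typically fold sweeps and products into channels
# for a convolutional network:
channels = sample.reshape(N_SWEEPS * N_PRODUCTS, H, W)
```

Treating the twelve images as channels lets standard image models consume a storm sample directly.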

    A challenge in curating the dataset was first finding tornadoes. Within the corpus of weather radar data, tornadoes are extremely rare events. The team then had to balance those tornado samples with difficult non-tornado samples. If the dataset were too easy, say by comparing tornadoes to snowstorms, an algorithm trained on the data would likely over-classify storms as tornadic.

    “What’s beautiful about a true benchmark dataset is that we’re all working with the same data, with the same level of difficulty, and can compare results,” Veillette says. “It also makes meteorology more accessible to data scientists, and vice versa. It becomes easier for these two parties to work on a common problem.”

    Both researchers represent the progress that can come from cross-collaboration. Veillette is a mathematician and algorithm developer who has long been fascinated by tornadoes. Kurdzo is a meteorologist by training and a signal processing expert. In grad school, he chased tornadoes with custom-built mobile radars, collecting data to analyze in new ways.

    “This dataset also means that a grad student doesn’t have to spend a year or two building a dataset. They can jump right into their research,” Kurdzo says.

    This project was funded by Lincoln Laboratory’s Climate Change Initiative, which aims to leverage the laboratory’s diverse technical strengths to help address climate problems threatening human health and global security.

    Chasing answers with deep learning

    Using the dataset, the researchers developed baseline artificial intelligence (AI) models. They were particularly eager to apply deep learning, a form of machine learning that excels at processing visual data. On its own, deep learning can extract features (key observations that an algorithm uses to make a decision) from images across a dataset. Other machine learning approaches require humans to first manually label features. 

    “We wanted to see if deep learning could rediscover what people normally look for in tornadoes and even identify new things that typically aren’t searched for by forecasters,” Veillette says.

    The results are promising. Their deep learning model performed similarly to or better than all tornado-detecting algorithms known in the literature. The trained algorithm correctly classified 50 percent of weaker EF-1 tornadoes and over 85 percent of tornadoes rated EF-2 or higher, which make up the most devastating and costly occurrences of these storms.

    They also evaluated two other types of machine-learning models, and one traditional model to compare against. The source code and parameters of all these models are freely available. The models and dataset are also described in a paper submitted to a journal of the American Meteorological Society (AMS). Veillette presented this work at the AMS Annual Meeting in January.

    “The biggest reason for putting our models out there is for the community to improve upon them and do other great things,” Kurdzo says. “The best solution could be a deep learning model, or someone might find that a non-deep learning model is actually better.”

    TorNet could be useful to the weather community in other ways too, such as for conducting large-scale case studies on storms. It could also be augmented with other data sources, like satellite imagery or lightning maps. Fusing multiple types of data could improve the accuracy of machine learning models.

    Taking steps toward operations

    On top of detecting tornadoes, Kurdzo hopes that models might help unravel the science of why they form.

    “As scientists, we see all these precursors to tornadoes — an increase in low-level rotation, a hook echo in reflectivity data, specific differential phase (KDP) foot and differential reflectivity (ZDR) arcs. But how do they all go together? And are there physical manifestations we don’t know about?” he asks.

    Teasing out those answers might be possible with explainable AI. Explainable AI refers to methods that allow a model to provide its reasoning, in a format understandable to humans, of why it came to a certain decision. In this case, these explanations might reveal physical processes that happen before tornadoes. This knowledge could help train forecasters, and models, to recognize the signs sooner. 

    “None of this technology is ever meant to replace a forecaster. But perhaps someday it could guide forecasters’ eyes in complex situations, and give a visual warning to an area predicted to have tornadic activity,” Kurdzo says.

    Such assistance could be especially useful as radar technology improves and future networks potentially grow denser. Data refresh rates in a next-generation radar network are expected to increase from every five minutes to approximately one minute, perhaps faster than forecasters can interpret the new information. Because deep learning can process huge amounts of data quickly, it could be well-suited for monitoring radar returns in real time, alongside humans. Tornadoes can form and disappear in minutes.

    But the path to an operational algorithm is a long road, especially in safety-critical situations, Veillette says. “I think the forecaster community is still, understandably, skeptical of machine learning. One way to establish trust and transparency is to have public benchmark datasets like this one. It’s a first step.”

    The next steps, the team hopes, will be taken by researchers across the world who are inspired by the dataset and energized to build their own algorithms. Those algorithms will in turn go into test beds, where they’ll eventually be shown to forecasters, to start a process of transitioning into operations.

    In the end, the path could circle back to trust.

    “We may never get more than a 10- to 15-minute tornado warning using these tools. But if we could lower the false-alarm rate, we could start to make headway with public perception,” Kurdzo says. “People are going to use those warnings to take the action they need to save their lives.”

  •

    Using deep learning to image the Earth’s planetary boundary layer

    Although the troposphere is often thought of as the closest layer of the atmosphere to the Earth’s surface, the planetary boundary layer (PBL) — the lowest layer of the troposphere — is actually the part that most significantly influences weather near the surface. In the 2018 Earth science decadal survey, the PBL was raised as an important scientific issue that has the potential to enhance storm forecasting and improve climate projections.

    “The PBL is where the surface interacts with the atmosphere, including exchanges of moisture and heat that help lead to severe weather and a changing climate,” says Adam Milstein, a technical staff member in Lincoln Laboratory’s Applied Space Systems Group. “The PBL is also where humans live, and the turbulent movement of aerosols throughout the PBL is important for air quality that influences human health.” 

    Although vital for studying weather and climate, important features of the PBL, such as its height, are difficult to resolve with current technology. In the past four years, Lincoln Laboratory staff have been studying the PBL, focusing on two different tasks: using machine learning to make 3D-scanned profiles of the atmosphere, and resolving the vertical structure of the atmosphere more clearly in order to better predict droughts.  

    This PBL-focused research effort builds on more than a decade of related work on fast, operational neural network algorithms developed by Lincoln Laboratory for NASA missions. These missions include the Time-Resolved Observations of Precipitation structure and storm Intensity with a Constellation of Smallsats (TROPICS) mission as well as Aqua, a satellite that collects data about Earth’s water cycle and observes variables such as ocean temperature, precipitation, and water vapor in the atmosphere. These algorithms retrieve temperature and humidity from the satellite instrument data and have been shown to significantly improve the accuracy and usable global coverage of the observations over previous approaches. For TROPICS, the algorithms help retrieve data that are used to characterize a storm’s rapidly evolving structures in near-real time, and for Aqua, they have helped improve forecasting models, drought monitoring, and fire prediction.

    These operational algorithms for TROPICS and Aqua are based on classic “shallow” neural networks to maximize speed and simplicity, creating a one-dimensional vertical profile for each spectral measurement collected by the instrument over each location. While this approach has improved observations of the atmosphere down to the surface overall, including the PBL, laboratory staff determined that newer “deep” learning techniques that treat the atmosphere over a region of interest as a three-dimensional image are needed to improve PBL details further.

    “We hypothesized that deep learning and artificial intelligence (AI) techniques could improve on current approaches by incorporating a better statistical representation of 3D temperature and humidity imagery of the atmosphere into the solutions,” Milstein says. “But it took a while to figure out how to create the best dataset — a mix of real and simulated data; we needed to prepare to train these techniques.”

    The team collaborated with Joseph Santanello of the NASA Goddard Space Flight Center and William Blackwell, also of the Applied Space Systems Group, in a recent NASA-funded effort showing that these retrieval algorithms can improve PBL detail, including more accurate determination of the PBL height than the previous state of the art. 

    While improved knowledge of the PBL is broadly useful for increasing understanding of climate and weather, one key application is prediction of droughts. According to a Global Drought Snapshot report released last year, droughts are a pressing planetary issue that the global community needs to address. Lack of humidity near the surface, specifically at the level of the PBL, is the leading indicator of drought. While previous studies using remote-sensing techniques have examined the humidity of soil to determine drought risk, studying the atmosphere can help predict when droughts will happen.  

    In an effort funded by Lincoln Laboratory’s Climate Change Initiative, Milstein, along with laboratory staff member Michael Pieper, is working with scientists at NASA’s Jet Propulsion Laboratory (JPL) to use neural network techniques to improve drought prediction over the continental United States. While the work builds off of existing operational work JPL has done incorporating (in part) the laboratory’s operational “shallow” neural network approach for Aqua, the team believes that this work and the PBL-focused deep learning research work can be combined to further improve the accuracy of drought prediction.

    “Lincoln Laboratory has been working with NASA for more than a decade on neural network algorithms for estimating temperature and humidity in the atmosphere from space-borne infrared and microwave instruments, including those on the Aqua spacecraft,” Milstein says. “Over that time, we have learned a lot about this problem by working with the science community, including learning about what scientific challenges remain. Our long experience working on this type of remote sensing with NASA scientists, as well as our experience with using neural network techniques, gave us a unique perspective.”

    According to Milstein, the next step for this project is to compare the deep learning results to datasets from the National Oceanic and Atmospheric Administration, NASA, and the Department of Energy collected directly in the PBL using radiosondes, a type of instrument flown on a weather balloon. “These direct measurements can be considered a kind of ‘ground truth’ to quantify the accuracy of the techniques we have developed,” Milstein says.

    This improved neural network approach holds promise to demonstrate drought prediction that can exceed the capabilities of existing indicators, Milstein says, and to be a tool that scientists can rely on for decades to come.

  •

    MIT-derived algorithm helps forecast the frequency of extreme weather

    To assess a community’s risk of extreme weather, policymakers rely first on global climate models that can be run decades, and even centuries, forward in time, but only at a coarse resolution. These models might be used to gauge, for instance, future climate conditions for the northeastern U.S., but not specifically for Boston.

    To estimate Boston’s future risk of extreme weather such as flooding, policymakers can combine a coarse model’s large-scale predictions with a finer-resolution model, tuned to estimate how often Boston is likely to experience damaging floods as the climate warms. But this risk analysis is only as accurate as the predictions from that first, coarser climate model.

    “If you get those wrong for large-scale environments, then you miss everything in terms of what extreme events will look like at smaller scales, such as over individual cities,” says Themistoklis Sapsis, the William I. Koch Professor and director of the Center for Ocean Engineering in MIT’s Department of Mechanical Engineering.

    Sapsis and his colleagues have now developed a method to “correct” the predictions from coarse climate models. By combining machine learning with dynamical systems theory, the team’s approach “nudges” a climate model’s simulations into more realistic patterns over large scales. When paired with smaller-scale models to predict specific weather events such as tropical cyclones or floods, the team’s approach produced more accurate predictions for how often specific locations will experience those events over the next few decades, compared to predictions made without the correction scheme.


    This animation shows the evolution of storms around the northern hemisphere, as a result of a high-resolution storm model, combined with the MIT team’s corrected global climate model. The simulation improves the modeling of extreme values for wind, temperature, and humidity, which typically have significant errors in coarse-scale models. Credit: Courtesy of Ruby Leung and Shixuan Zhang, PNNL

    Sapsis says the new correction scheme is general in form and can be applied to any global climate model. Once corrected, the models can help to determine where and how often extreme weather will strike as global temperatures rise over the coming years. 

    “Climate change will have an effect on every aspect of human life, and every type of life on the planet, from biodiversity to food security to the economy,” Sapsis says. “If we have capabilities to know accurately how extreme weather will change, especially over specific locations, it can make a lot of difference in terms of preparation and doing the right engineering to come up with solutions. This is the method that can open the way to do that.”

    The team’s results appear today in the Journal of Advances in Modeling Earth Systems. The study’s MIT co-authors include postdoc Benedikt Barthel Sorensen and Alexis-Tzianni Charalampopoulos SM ’19, PhD ’23, with Shixuan Zhang, Bryce Harrop, and Ruby Leung of the Pacific Northwest National Laboratory in Washington state.

    Over the hood

    Today’s large-scale climate models simulate weather features such as the average temperature, humidity, and precipitation around the world, on a grid-by-grid basis. Running simulations of these models takes enormous computing power, and in order to simulate how weather features will interact and evolve over periods of decades or longer, models average out features every 100 kilometers or so.

    “It’s a very heavy computation requiring supercomputers,” Sapsis notes. “But these models still do not resolve very important processes like clouds or storms, which occur over smaller scales of a kilometer or less.”

    To improve the resolution of these coarse climate models, scientists typically have gone under the hood to try and fix a model’s underlying dynamical equations, which describe how phenomena in the atmosphere and oceans should physically interact.

    “People have tried to dissect into climate model codes that have been developed over the last 20 to 30 years, which is a nightmare, because you can lose a lot of stability in your simulation,” Sapsis explains. “What we’re doing is a completely different approach, in that we’re not trying to correct the equations but instead correct the model’s output.”

    The team’s new approach takes a model’s output, or simulation, and overlays an algorithm that nudges the simulation toward something that more closely represents real-world conditions. The algorithm is based on a machine-learning scheme that takes in data, such as past information for temperature and humidity around the world, and learns associations within the data that represent fundamental dynamics among weather features. The algorithm then uses these learned associations to correct a model’s predictions.

    “What we’re doing is trying to correct dynamics, as in how an extreme weather feature, such as the windspeeds during a Hurricane Sandy event, will look like in the coarse model, versus in reality,” Sapsis says. “The method learns dynamics, and dynamics are universal. Having the correct dynamics eventually leads to correct statistics, for example, frequency of rare extreme events.”
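The output-correction idea can be illustrated with a toy example. The ridge regression below stands in for the team's machine-learning scheme, which is far more sophisticated; all data here are synthetic, and the linear map is only a sketch of "learn a correction from model output to observations, then apply it to new simulations":

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic setup: "observations" and a systematically biased "simulation."
n_samples, n_features = 500, 8
obs = rng.normal(size=(n_samples, n_features))
bias = np.diag(rng.uniform(0.6, 0.9, n_features))   # per-feature model bias
sim = obs @ bias + 0.1 * rng.normal(size=(n_samples, n_features))

# Learn a linear correction W minimizing ||sim @ W - obs||^2 + lam ||W||^2.
lam = 1e-2
W = np.linalg.solve(sim.T @ sim + lam * np.eye(n_features), sim.T @ obs)

err_raw = np.mean((sim - obs) ** 2)          # uncorrected model error
err_corrected = np.mean((sim @ W - obs) ** 2)  # error after correction
```

The correction is learned once from historical data and then applied to future model runs, without touching the model's internal equations.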

    Climate correction

    As a first test of their new approach, the team used the machine-learning scheme to correct simulations produced by the Energy Exascale Earth System Model (E3SM), a climate model run by the U.S. Department of Energy that simulates climate patterns around the world at a resolution of 110 kilometers. The researchers used eight years of past data for temperature, humidity, and wind speed to train their new algorithm, which learned dynamical associations between the measured weather features and the E3SM model. They then ran the climate model forward in time for about 36 years and applied the trained algorithm to the model’s simulations. They found that the corrected version produced climate patterns that more closely matched real-world observations from the last 36 years, which were not used for training.

    “We’re not talking about huge differences in absolute terms,” Sapsis says. “An extreme event in the uncorrected simulation might be 105 degrees Fahrenheit, versus 115 degrees with our corrections. But for humans experiencing this, that is a big difference.”

    When the team then paired the corrected coarse model with a specific, finer-resolution model of tropical cyclones, they found the approach accurately reproduced the frequency of extreme storms in specific locations around the world.

    “We now have a coarse model that can get you the right frequency of events, for the present climate. It’s much more improved,” Sapsis says. “Once we correct the dynamics, this is a relevant correction, even when you have a different average global temperature, and it can be used for understanding how forest fires, flooding events, and heat waves will look in a future climate. Our ongoing work is focusing on analyzing future climate scenarios.”

    “The results are particularly impressive as the method shows promising results on E3SM, a state-of-the-art climate model,” says Pedram Hassanzadeh, an associate professor who leads the Climate Extremes Theory and Data group at the University of Chicago and was not involved with the study. “It would be interesting to see what climate change projections this framework yields once future greenhouse-gas emission scenarios are incorporated.”

    This work was supported, in part, by the U.S. Defense Advanced Research Projects Agency.

  •

    Artificial reef designed by MIT engineers could protect marine life, reduce storm damage

    The beautiful, gnarled, nooked-and-crannied reefs that surround tropical islands serve as a marine refuge and natural buffer against stormy seas. But as the effects of climate change bleach and break down coral reefs around the world, and extreme weather events become more common, coastal communities are left increasingly vulnerable to frequent flooding and erosion.

    An MIT team is now hoping to fortify coastlines with “architected” reefs — sustainable, offshore structures engineered to mimic the wave-buffering effects of natural reefs while also providing pockets for fish and other marine life.

    The team’s reef design centers on a cylindrical structure surrounded by four rudder-like slats. The engineers found that when this structure stands up against a wave, it efficiently breaks the wave into turbulent jets that ultimately dissipate most of the wave’s total energy. The team has calculated that the new design could dissipate as much wave energy as existing artificial reefs, using 10 times less material.

    The researchers plan to fabricate each cylindrical structure from sustainable cement, which they would mold in a pattern of “voxels” that could be automatically assembled, and would provide pockets for fish to explore and other marine life to settle in. The cylinders could be connected to form a long, semipermeable wall, which the engineers could erect along a coastline, about half a mile from shore. Based on the team’s initial experiments with lab-scale prototypes, the architected reef could reduce the energy of incoming waves by more than 95 percent.

    “This would be like a long wave-breaker,” says Michael Triantafyllou, the Henry L. and Grace Doherty Professor in Ocean Science and Engineering in the Department of Mechanical Engineering. “If waves are 6 meters high coming toward this reef structure, they would be ultimately less than a meter high on the other side. So, this kills the impact of the waves, which could prevent erosion and flooding.”
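The quoted before-and-after heights follow from how wave energy scales with height. In deep water, energy is proportional to the square of wave height, so a fractional energy reduction maps to transmitted height as H_out = H_in × sqrt(1 − reduction). The 97 percent figure below is an illustrative value consistent with the article's "more than 95 percent":

```python
import math

def transmitted_height(h_in_m, energy_reduction):
    """Transmitted wave height given a fractional energy reduction,
    using the deep-water scaling E proportional to H squared."""
    return h_in_m * math.sqrt(1.0 - energy_reduction)

# A 6 m incoming wave with ~97 percent of its energy dissipated emerges
# at roughly 1 m, matching the quoted before/after heights.
h_out = transmitted_height(6.0, 0.97)
```

Note that a 95 percent energy reduction alone would leave a 6 m wave at about 1.3 m; getting below 1 m requires dissipating slightly more.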

    Details of the architected reef design are reported today in a study appearing in the open-access journal PNAS Nexus. Triantafyllou’s MIT co-authors are Edvard Ronglan SM ’23; graduate students Alfonso Parra Rubio, Jose del Auila Ferrandis, and Erik Strand; research scientists Patricia Maria Stathatou and Carolina Bastidas; and Professor Neil Gershenfeld, director of the Center for Bits and Atoms; along with Alexis Oliveira Da Silva at the Polytechnic Institute of Paris, Dixia Fan of Westlake University, and Jeffrey Gair Jr. of Scinetics, Inc.

    Leveraging turbulence

    Some regions have already erected artificial reefs to protect their coastlines from encroaching storms. These structures are typically sunken ships, retired oil and gas platforms, and even assembled configurations of concrete, metal, tires, and stones. However, there’s variability in the types of artificial reefs that are currently in place, and no standard for engineering such structures. What’s more, the designs that are deployed tend to have a low wave dissipation per unit volume of material used. That is, it takes a huge amount of material to break enough wave energy to adequately protect coastal communities.

    The MIT team instead looked for ways to engineer an artificial reef that would efficiently dissipate wave energy with less material, while also providing a refuge for fish living along any vulnerable coast.

    “Remember, natural coral reefs are only found in tropical waters,” says Triantafyllou, who is director of the MIT Sea Grant. “We cannot have these reefs, for instance, in Massachusetts. But architected reefs don’t depend on temperature, so they can be placed in any water, to protect more coastal areas.”

    MIT researchers test the wave-breaking performance of two artificial reef structures in the MIT Towing Tank. Credit: Courtesy of the researchers

    The new effort is the result of a collaboration between researchers in MIT Sea Grant, who developed the reef structure’s hydrodynamic design, and researchers at the Center for Bits and Atoms (CBA), who worked to make the structure modular and easy to fabricate on location. The team’s architected reef design grew out of two seemingly unrelated problems. CBA researchers were developing ultralight cellular structures for the aerospace industry, while Sea Grant researchers were assessing the performance of blowout preventers in offshore oil structures — cylindrical valves that are used to seal off oil and gas wells and prevent them from leaking.

    The team’s tests showed that the structure’s cylindrical arrangement generated a high amount of drag. In other words, the structure appeared to be especially efficient in dissipating high-force flows of oil and gas. They wondered: Could the same arrangement dissipate another type of flow, in ocean waves?

    The researchers began to play with the general structure in simulations of water flow, tweaking its dimensions and adding certain elements to see whether and how waves changed as they crashed against each simulated design. This iterative process ultimately landed on an optimized geometry: a vertical cylinder flanked by four long slats, each attached to the cylinder in a way that leaves space for water to flow through the resulting structure. They found this setup essentially breaks up any incoming wave energy, causing parts of the wave-induced flow to spiral to the sides rather than crashing ahead.

    “We’re leveraging this turbulence and these powerful jets to ultimately dissipate wave energy,” Ferrandis says.

    Standing up to storms

    Once the researchers identified an optimal wave-dissipating structure, they fabricated a laboratory-scale version of an architected reef made from a series of the cylindrical structures, which they 3D-printed from plastic. Each test cylinder measured about 1 foot wide and 4 feet tall. They assembled a number of cylinders, each spaced about a foot apart, to form a fence-like structure, which they then lowered into a wave tank at MIT. They then generated waves of various heights and measured them before and after passing through the architected reef.

    “We saw the waves reduce substantially, as the reef destroyed their energy,” Triantafyllou says.

    The team has also looked into making the structures more porous, and friendly to fish. They found that, rather than making each structure from a solid slab of plastic, they could use a more affordable and sustainable type of cement.

    “We’ve worked with biologists to test the cement we intend to use, and it’s benign to fish, and ready to go,” he adds.

    They identified an ideal pattern of “voxels,” or microstructures, that cement could be molded into, in order to fabricate the reefs while creating pockets in which fish could live. This voxel geometry resembles individual egg cartons, stacked end to end, and appears to not affect the structure’s overall wave-dissipating power.

    “These voxels still maintain a big drag while allowing fish to move inside,” Ferrandis says.

    The team is currently fabricating cement voxel structures and assembling them into a lab-scale architected reef, which they will test under various wave conditions. They envision that the voxel design could be modular, and scalable to any desired size, and easy to transport and install in various offshore locations. “Now we’re simulating actual sea patterns, and testing how these models will perform when we eventually have to deploy them,” says Anjali Sinha, a graduate student at MIT who recently joined the group.

    Going forward, the team hopes to work with beach towns in Massachusetts to test the structures on a pilot scale.

    “These test structures would not be small,” Triantafyllou emphasizes. “They would be about a mile long, and about 5 meters tall, and would cost something like 6 million dollars per mile. So it’s not cheap. But it could prevent billions of dollars in storm damage. And with climate change, protecting the coasts will become a big issue.”

    This work was funded, in part, by the U.S. Defense Advanced Research Projects Agency.