More stories

  • Empowering people to adapt on the frontlines of climate change

    On April 11, MIT announced five multiyear flagship projects in the first-ever Climate Grand Challenges, a new initiative to tackle complex climate problems and deliver breakthrough solutions to the world as quickly as possible. This article is the fifth in a five-part series highlighting the most promising concepts to emerge from the competition and the interdisciplinary research teams behind them.

    In the coastal south of Bangladesh, rice paddies that farmers could once harvest three times a year lie barren. Sea-level rise brings saltwater to the soil, ruining the staple crop. It’s one of many impacts, and inequities, of climate change. Despite producing less than 1 percent of global carbon emissions, Bangladesh is suffering more than most countries. Rising seas, heat waves, flooding, and cyclones threaten 90 million people.

    A platform being developed in a collaboration between MIT and BRAC, a Bangladesh-based global development organization, aims to inform and empower climate-threatened communities to proactively adapt to a changing future. Selected as one of five MIT Climate Grand Challenges flagship projects, the Climate Resilience Early Warning System (CREWSnet) will forecast the local impacts of climate change on people’s lives, homes, and livelihoods. These forecasts will guide BRAC’s development of climate-resiliency programs to help residents prepare for and adapt to life-altering conditions.

    “The communities that CREWSnet will focus on have done little to contribute to the problem of climate change in the first place. However, because of socioeconomic situations, they may be among the most vulnerable. We hope that by providing state-of-the-art projections and sharing them broadly with communities, and working through partners like BRAC, we can help improve the capacity of local communities to adapt to climate change, significantly,” says Elfatih Eltahir, the H.M. King Bhumibol Professor in the Department of Civil and Environmental Engineering.

    Eltahir leads the project with John Aldridge and Deborah Campbell in the Humanitarian Assistance and Disaster Relief Systems Group at Lincoln Laboratory. Additional partners across MIT include the Center for Global Change Science; the Department of Earth, Atmospheric and Planetary Sciences; the Joint Program on the Science and Policy of Global Change; and the Abdul Latif Jameel Poverty Action Lab. 

    Predicting local risks

    CREWSnet’s forecasts rely upon a sophisticated model, developed in Eltahir’s research group over the past 25 years, called the MIT Regional Climate Model. This model zooms in on climate processes at local scales, at a resolution as granular as 6 miles. In Bangladesh’s population-dense cities, a 6-mile area could encompass tens, or even hundreds, of thousands of people. The model takes into account the details of a region’s topography, land use, and coastline to predict changes in local conditions.

    When applying this model over Bangladesh, researchers found that heat waves will become more severe and more frequent over the next 30 years. In particular, wet-bulb temperatures, which indicate how effectively humans can cool down by sweating, will rise to dangerous levels rarely observed today, particularly in western, inland cities.
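
    Wet-bulb temperature is straightforward to estimate: given air temperature and relative humidity, a widely used empirical fit (Stull, 2011) recovers it to within roughly a degree at sea-level pressure. The sketch below is purely illustrative and is not part of the MIT Regional Climate Model:

    ```python
    import math

    def wet_bulb_stull(temp_c: float, rh_pct: float) -> float:
        """Approximate wet-bulb temperature (deg C) from air temperature
        and relative humidity, using Stull's (2011) empirical fit.

        Valid roughly for 5% < RH < 99% and -20 C < T < 50 C at
        standard sea-level pressure.
        """
        t, rh = temp_c, rh_pct
        return (t * math.atan(0.151977 * math.sqrt(rh + 8.313659))
                + math.atan(t + rh)
                - math.atan(rh - 1.676331)
                + 0.00391838 * rh ** 1.5 * math.atan(0.023101 * rh)
                - 4.686035)

    # A 35 C day at 75% relative humidity gives a wet-bulb temperature
    # near 31 C, already approaching levels widely considered dangerous
    # for sustained outdoor activity.
    print(round(wet_bulb_stull(35.0, 75.0), 1))
    ```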

    Such hot spots exacerbate other challenges predicted to worsen near Bangladesh’s coast. Rising sea levels and powerful cyclones are eroding and flooding coastal communities, pushing saltwater into soil and freshwater sources. This salinity intrusion is detrimental to human health, ruins drinking water supplies, and harms crops, livestock, and aquatic life that farmers and fishermen depend on for food and income.

    CREWSnet will fuse climate science with forecasting tools that predict the social and economic impacts on villages and cities. These forecasts — such as how often a crop season may fail, or how far floodwaters will reach — can steer decision-making.

    “What people need to know, whether they’re a governor or head of a household, is ‘What is going to happen in my area, and what decisions should I make for the people I’m responsible for?’ Our role is to integrate this science and technology together into a decision support system,” says Aldridge, whose group at Lincoln Laboratory specializes in this area. Most recently, they transitioned a hurricane-evacuation planning system to the U.S. government. “We know that making decisions based on climate change requires a deep level of trust. That’s why having a powerful partner like BRAC is so important,” he says.

    Testing interventions

    Established 50 years ago, just after Bangladesh’s independence, BRAC works in every district of the nation to provide social services that help people rise from extreme poverty. Today, it is one of the world’s largest nongovernmental organizations, serving 110 million people across 11 countries in Asia and Africa, but its success is cultivated locally.

    “BRAC is thrilled to partner with leading researchers at MIT to increase climate resilience in Bangladesh and provide a model that can be scaled around the globe,” says Donella Rapier, president and CEO of BRAC USA. “Locally led climate adaptation solutions that are developed in partnership with communities are urgently needed, particularly in the most vulnerable regions that are on the frontlines of climate change.”

    CREWSnet will help BRAC identify the communities most vulnerable to forecasted impacts. In these areas, it will share knowledge and launch new programs, or bolster existing ones, to improve households’ capacity to adapt.

    Many climate initiatives are already underway. One program equips homes to filter and store rainwater, as salinity intrusion makes safe drinking water hard to access. Another program is building resilient housing, able to withstand 120-mile-per-hour winds, that can double as local shelters during cyclones and flooding. Other services are helping farmers switch to different livestock or crops better suited for wetter or saltier conditions (e.g., ducks instead of chickens, or salt-tolerant rice), providing interest-free loans to enable this change.

    But adapting in place will not always be possible, for example in areas predicted to be submerged or unbearably hot by midcentury. “Bangladesh is working on identifying and developing climate-resilient cities and towns across the country, as closer-by alternative destinations as compared to moving to Dhaka, the overcrowded capital of Bangladesh,” says Campbell. “CREWSnet can help identify regions better suited for migration, and climate-resilient adaptation strategies for those regions.” At the same time, BRAC’s Climate Bridge Fund is helping to prepare cities for climate-induced migration, building up infrastructure and financial services for people who have been displaced.

    Evaluating impact

    While CREWSnet’s goal is to enable action, it cannot by itself measure the impact of those actions. The Abdul Latif Jameel Poverty Action Lab (J-PAL), a development economics program in the MIT School of Humanities, Arts, and Social Sciences, will help evaluate the effectiveness of the climate-adaptation programs.

    “We conduct randomized controlled trials, similar to medical trials, that help us understand if a program improved people’s lives,” says Claire Walsh, the project director of the King Climate Action Initiative at J-PAL. “Once CREWSnet helps BRAC implement adaptation programs, we will generate scientific evidence on their impacts, so that BRAC and CREWSnet can make a case to funders and governments to expand effective programs.”

    The team aspires to bring CREWSnet to other nations disproportionately impacted by climate change. “Our vision is to have this be a globally extensible capability,” says Campbell. CREWSnet’s name evokes another early-warning decision-support system, FEWSnet, which helped organizations address famine in eastern Africa in the 1980s. Today it is a pillar of food-security planning around the world.

    CREWSnet hopes for a similar impact in climate change planning. Selection as an MIT Climate Grand Challenges flagship project will bring the effort more funding and resources, momentum that will also help BRAC’s fundraising. The team plans to deploy CREWSnet to southwestern Bangladesh within five years.

    “The communities that we are aspiring to reach with CREWSnet are deeply aware that their lives are changing — they have been looking climate change in the eye for many years. They are incredibly resilient, creative, and talented,” says Ashley Toombs, the external affairs director for BRAC USA. “As a team, we are excited to bring this system to Bangladesh. And what we learn together, we will apply at potentially even larger scales.”

  • Looking forward to forecast the risks of a changing climate

    On April 11, MIT announced five multiyear flagship projects in the first-ever Climate Grand Challenges, a new initiative to tackle complex climate problems and deliver breakthrough solutions to the world as quickly as possible. This article is the third in a five-part series highlighting the most promising concepts to emerge from the competition, and the interdisciplinary research teams behind them.

    Extreme weather events that were once considered rare have become noticeably less so, from intensifying hurricane activity in the North Atlantic to wildfires generating massive clouds of ozone-damaging smoke. But current climate models are ill-equipped to estimate the risk that these increasingly extreme events pose — and without adequate modeling, governments are left unable to take the precautions needed to protect their communities.

    MIT Department of Earth, Atmospheric and Planetary Sciences (EAPS) Professor Paul O’Gorman researches this trend by studying how a changing climate affects the atmosphere and incorporating what he learns into climate models to improve their accuracy. One particular focus for O’Gorman has been changes in extreme precipitation and midlatitude storms that hit areas like New England.

    “These extreme events are having a lot of impact, but they’re also difficult to model or study,” he says. Seeing the pressing need for better climate models that can be used to develop preparedness plans and climate change mitigation strategies, O’Gorman and collaborators Kerry Emanuel, the Cecil and Ida Green Professor of Atmospheric Science in EAPS, and Miho Mazereeuw, associate professor in MIT’s Department of Architecture, are leading an interdisciplinary group of scientists, engineers, and designers to tackle this problem with their MIT Climate Grand Challenges flagship project, “Preparing for a new world of weather and climate extremes.”

    “We know already from observations and from climate model predictions that weather and climate extremes are changing and will change more,” O’Gorman says. “The grand challenge is preparing for those changing extremes.”

    Their proposal is one of five flagship projects recently announced by the MIT Climate Grand Challenges initiative — an Institute-wide effort catalyzing novel research and engineering innovations to address the climate crisis. Selected from a field of almost 100 submissions, the team will receive additional funding and exposure to help accelerate and scale their project goals. Other MIT collaborators on the proposal include researchers from the School of Engineering, the School of Architecture and Planning, the Office of Sustainability, the Center for Global Change Science, and the Institute for Data, Systems and Society.

    Weather risk modeling

    Fifteen years ago, Kerry Emanuel developed a simple hurricane model. It was based on physics equations, rather than statistics, and could run in real time, making it useful for risk assessment. Emanuel wondered whether similar models could be used for long-term risk assessment of other hazards, such as changes in extreme weather driven by climate change.

    “I discovered, somewhat to my surprise and dismay, that almost all extant estimates of long-term weather risks in the United States are based not on physical models, but on historical statistics of the hazards,” says Emanuel. “The problem with relying on historical records is that they’re too short; while they can help estimate common events, they don’t contain enough information to make predictions for more rare events.”
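
    Emanuel’s point can be illustrated with a toy calculation (synthetic data, not any of the team’s models). Each simulated 50-year record below comes from a “hazard” whose statistics are known exactly, yet the 500-year event inferred from any single record scatters widely:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # "True" hazard: annual-maximum wind speed following a Gumbel law
    # (location 40, scale 8, arbitrary units).
    loc, scale = 40.0, 8.0

    def gumbel_quantile(p, loc, scale):
        """Value exceeded with probability 1 - p in a given year."""
        return loc - scale * np.log(-np.log(p))

    true_500yr = gumbel_quantile(1 - 1 / 500, loc, scale)

    # Estimate the 500-year event from 1,000 independent 50-year
    # "historical records," fitting Gumbel parameters by moments.
    estimates = []
    for _ in range(1000):
        record = rng.gumbel(loc, scale, size=50)         # 50 annual maxima
        scale_hat = np.std(record) * np.sqrt(6) / np.pi  # moment estimator
        loc_hat = np.mean(record) - 0.5772 * scale_hat   # Euler-Mascheroni
        estimates.append(gumbel_quantile(1 - 1 / 500, loc_hat, scale_hat))

    print(f"true 500-year value:  {true_500yr:.1f}")
    print(f"from 50-year records: {np.mean(estimates):.1f} "
          f"+/- {np.std(estimates):.1f}")
    ```

    The spread is large even though this toy climate is perfectly stationary; a real, changing climate compounds the problem, which is exactly why the team wants physics in the models.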

    Another limitation of weather risk models that rely heavily on statistics: they carry a built-in assumption that the climate is static.

    “Historical records rely on the climate at the time they were recorded; they can’t say anything about how hurricanes grow in a warmer climate,” says Emanuel. The models rely on fixed relationships between events; they assume that hurricane activity will stay the same, even while science is showing that warmer temperatures will most likely push typical hurricane activity beyond the tropics and into a much wider band of latitudes.

    The flagship project’s goal is to eliminate this reliance on the historical record by emphasizing physical principles (e.g., the laws of thermodynamics and fluid mechanics) in next-generation models. The downside is the sheer number of variables that must be included. Not only are there planetary-scale systems to consider, such as the global circulation of the atmosphere, but there are also small-scale, extremely localized events, like thunderstorms, that influence predictive outcomes.

    Trying to compute all of these at once is costly and time-consuming — and the results often can’t tell you the risk in a specific location. But there is a way to correct for this: “What’s done is to use a global model, and then use a method called downscaling, which tries to infer what would happen on very small scales that aren’t properly resolved by the global model,” explains O’Gorman. The team hopes to improve downscaling techniques so that they can be used to calculate the risk of very rare but impactful weather events.

    Global climate models, or general circulation models (GCMs), Emanuel explains, are constructed a bit like a jungle gym. Like the playground bars, the Earth is sectioned into an interconnected three-dimensional framework — only its cells are 100 to 200 kilometers on a side. Each node comprises a set of computations for characteristics like wind, rainfall, atmospheric pressure, and temperature within its bounds; the outputs of each node are connected to those of its neighbors. This framework is useful for creating a big-picture view of Earth’s climate system, but if you tried to zoom in on a specific location — say, to see what’s happening in Miami or Mumbai — the nodes are too far apart to make predictions specific to those areas.

    Scientists work around this problem by using downscaling. They use the same blueprint of the jungle gym, but within the nodes they weave a mesh of smaller features, incorporating equations for things like topography and vegetation or regional meteorological models to fill in the blanks. By creating a finer mesh over smaller areas they can predict local effects without needing to run the entire global model.
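
    In miniature, the statistical flavor of downscaling looks like the sketch below: interpolate the coarse field onto a finer mesh, then apply a correction driven by fine-scale information the global model cannot see. Here that information is a hypothetical terrain ridge and a standard atmospheric lapse rate; real downscaling couples full regional models, but the shape of the computation is similar:

    ```python
    import numpy as np

    # Coarse "GCM" output: near-surface temperature (deg C) on a 4x4 grid
    # whose cells are on the order of 150 km across.
    coarse = np.array([[22., 21., 19., 18.],
                       [23., 22., 20., 18.],
                       [24., 23., 21., 19.],
                       [25., 24., 22., 20.]])

    def bilinear(field, y, x):
        """Bilinearly interpolate a 2-D field at fractional indices (y, x)."""
        y0, x0 = int(y), int(x)
        y1 = min(y0 + 1, field.shape[0] - 1)
        x1 = min(x0 + 1, field.shape[1] - 1)
        fy, fx = y - y0, x - x0
        top = field[y0, x0] * (1 - fx) + field[y0, x1] * fx
        bottom = field[y1, x0] * (1 - fx) + field[y1, x1] * fx
        return top * (1 - fy) + bottom * fy

    # Refine 6x, then add one "physical" effect the coarse grid misses:
    # air cools roughly 6.5 deg C per kilometer of elevation.
    LAPSE_RATE = 6.5e-3              # deg C per meter
    ny, nx = 19, 19                  # fine mesh spanning the same area
    fine = np.empty((ny, nx))
    for i in range(ny):
        for j in range(nx):
            base = bilinear(coarse, i / 6, j / 6)
            ridge_height = 800.0 * np.exp(-((j - 9) / 4.0) ** 2)  # meters
            fine[i, j] = base - LAPSE_RATE * ridge_height

    # The ridge appears as a cool band the 4x4 model could never resolve.
    print(fine.round(1))
    ```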

    Of course, even this finer-resolution solution has its trade-offs. Nesting models within models can give a clearer picture of what’s happening in a specific region, but crunching all that data at once remains a computing challenge. The price is expense, time, or predictions limited to shorter windows: where GCMs can be run over decades or centuries, a particularly complex local model may be restricted to timescales of just a few years at a time.

    “I’m afraid that most of the downscaling at present is brute force, but I think there’s room to do it in better ways,” says Emanuel, who sees the problem of finding new and novel methods of achieving this goal as an intellectual challenge. “I hope that through the Grand Challenges project we might be able to get students, postdocs, and others interested in doing this in a very creative way.”

    Adapting to weather extremes for cities and renewable energy

    Improving climate modeling is more than a scientific exercise in creativity, however. There’s a very real application for models that can accurately forecast risk in localized regions.

    The problem is that progress in climate modeling has not kept pace with the need for climate mitigation plans, especially in some of the most vulnerable communities around the globe.

    “It is critical for stakeholders to have access to this data for their own decision-making process. Every community is composed of a diverse population with diverse needs, and each locality is affected by extreme weather events in unique ways,” says Mazereeuw, the director of the MIT Urban Risk Lab. 

    A key piece of the team’s project is building on partnerships the Urban Risk Lab has developed with several cities to test their models once they have a usable product up and running. The cities were selected based on their vulnerability to increasing extreme weather events, such as tropical cyclones in Broward County, Florida, and Toa Baja, Puerto Rico, and extratropical storms in Boston, Massachusetts, and Cape Town, South Africa.

    In their proposal, the team outlines a variety of deliverables that the cities can ultimately use in their climate change preparations, such as online interactive platforms and workshops with stakeholders — local governments, developers, nonprofits, and residents — to learn directly what specific tools each community needs. With those tools, cities can craft plans for different scenarios in their region, such as sea-level rise or heat waves, and develop the adaptation strategies for infrastructure that will be most effective and efficient for them.

    “We are acutely aware of the inequity of resources both in mitigating impacts and recovering from disasters. Working with diverse communities through workshops allows us to engage a lot of people, listen, discuss, and collaboratively design solutions,” says Mazereeuw.

    By the end of five years, the team is hoping that they’ll have better risk assessment and preparedness tool kits, not just for the cities that they’re partnering with, but for others as well.

    “MIT is well-positioned to make progress in this area,” says O’Gorman, “and I think it’s an important problem where we can make a difference.”

  • Structures considered key to gene expression are surprisingly fleeting

    In human chromosomes, DNA is coated by proteins to form an exceedingly long beaded string. This “string” is folded into numerous loops, which are believed to help cells control gene expression and facilitate DNA repair, among other functions. A new study from MIT suggests that these loops are very dynamic and shorter-lived than previously thought.

    In the new study, the researchers were able to monitor the movement of one stretch of the genome in a living cell for about two hours. They saw that this stretch was fully looped for only 3 to 6 percent of the time, with the loop lasting for only about 10 to 30 minutes. The findings suggest that scientists’ current understanding of how loops influence gene expression may need to be revised, the researchers say.

    “Many models in the field have been these pictures of static loops regulating these processes. What our new paper shows is that this picture is not really correct,” says Anders Sejr Hansen, the Underwood-Prescott Career Development Assistant Professor of Biological Engineering at MIT. “We suggest that the functional state of these domains is much more dynamic.”

    Hansen is one of the senior authors of the new study, along with Leonid Mirny, a professor in MIT’s Institute for Medical Engineering and Science and the Department of Physics, and Christoph Zechner, a group leader at the Max Planck Institute of Molecular Cell Biology and Genetics in Dresden, Germany, and the Center for Systems Biology Dresden. MIT postdoc Michele Gabriele, recent Harvard University PhD recipient Hugo Brandão, and MIT graduate student Simon Grosse-Holz are the lead authors of the paper, which appears today in Science.

    Out of the loop

    Using computer simulations and experimental data, scientists including Mirny’s group at MIT have shown that loops in the genome are formed by a process called extrusion, in which a molecular motor promotes the growth of progressively larger loops. The motor stops each time it encounters a “stop sign” on DNA. The motor that extrudes such loops is a protein complex called cohesin, while the DNA-bound protein CTCF serves as the stop sign. These cohesin-mediated loops between CTCF sites were seen in previous experiments.

    However, those experiments only offered a snapshot of a moment in time, with no information on how the loops change over time. In their new study, the researchers developed techniques that allowed them to fluorescently label CTCF DNA sites so they could image the DNA loops over several hours. They also created a new computational method that can infer the looping events from the imaging data.

    “This method was crucial for us to distinguish signal from noise in our experimental data and quantify looping,” Zechner says. “We believe that such approaches will become increasingly important for biology as we continue to push the limits of detection with experiments.”
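
    The team’s actual inference method is more sophisticated than anything that fits in a few lines, but the underlying task of decoding a hidden looped/unlooped state from a noisy distance trajectory can be caricatured with a two-state hidden Markov model. Every number below (distances, state means, transition probabilities) is invented for illustration:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Cartoon data: the measured distance (microns) between two
    # fluorescently labeled CTCF sites, sampled every 30 s for 2 hours.
    # When the loop is formed, the labels sit closer together on average.
    MEAN_UNLOOPED, MEAN_LOOPED, STD = 0.50, 0.25, 0.12
    truth = np.zeros(240, dtype=int)   # 0 = unlooped, 1 = looped
    truth[100:140] = 1                 # one ~20-minute looping event
    distances = np.where(truth == 1, MEAN_LOOPED, MEAN_UNLOOPED)
    distances = distances + rng.normal(0.0, STD, truth.size)

    # Two-state HMM decoded with the Viterbi algorithm. Loops are rare
    # and short-lived, so the transitions strongly favor staying put.
    mu = np.array([MEAN_UNLOOPED, MEAN_LOOPED])
    log_trans = np.log([[0.98, 0.02],    # from unlooped
                        [0.05, 0.95]])   # from looped
    log_start = np.log([0.95, 0.05])

    def log_gauss(x, mean, sd):
        return -0.5 * ((x - mean) / sd) ** 2 - np.log(sd * np.sqrt(2 * np.pi))

    T = distances.size
    score = np.full((T, 2), -np.inf)
    backptr = np.zeros((T, 2), dtype=int)
    score[0] = log_start + log_gauss(distances[0], mu, STD)
    for t in range(1, T):
        for s in range(2):
            candidates = score[t - 1] + log_trans[:, s]
            backptr[t, s] = np.argmax(candidates)
            score[t, s] = candidates[backptr[t, s]] + log_gauss(distances[t], mu[s], STD)

    path = np.zeros(T, dtype=int)
    path[-1] = np.argmax(score[-1])
    for t in range(T - 2, -1, -1):
        path[t] = backptr[t + 1, path[t + 1]]

    print(f"inferred looped fraction: {path.mean():.2f} (true: {truth.mean():.2f})")
    ```

    Decoding should recover the single simulated looping event, and with it a small looped fraction, mirroring the kind of quantification the study reports on real trajectories.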

    The researchers used their method to image a stretch of the genome in mouse embryonic stem cells. “If we put our data in the context of one cell division cycle, which lasts about 12 hours, the fully formed loop only actually exists for about 20 to 45 minutes, or about 3 to 6 percent of the time,” Grosse-Holz says.

    “If the loop is only present for such a tiny period of the cell cycle and very short-lived, we shouldn’t think of this fully looped state as being the primary regulator of gene expression,” Hansen says. “We think we need new models for how the 3D structure of the genome regulates gene expression, DNA repair, and other functional downstream processes.”

    While fully formed loops were rare, the researchers found that partially extruded loops were present about 92 percent of the time. These smaller loops have been difficult to observe with the previous methods of detecting loops in the genome.

    “In this study, by integrating our experimental data with polymer simulations, we have now been able to quantify the relative extents of the unlooped, partially extruded, and fully looped states,” Brandão says.

    “Since these interactions are very short, but very frequent, the previous methodologies were not able to fully capture their dynamics,” Gabriele adds. “With our new technique, we can start to resolve transitions between fully looped and unlooped states.”

    The researchers hypothesize that these partial loops may play more important roles in gene regulation than fully formed loops. Strands of DNA run along each other as loops begin to form and then fall apart, and these interactions may help regulatory elements such as enhancers and gene promoters find each other.

    “More than 90 percent of the time, there are some transient loops, and presumably what’s important is having those loops that are being perpetually extruded,” Mirny says. “The process of extrusion itself may be more important than the fully looped state that only occurs for a short period of time.”

    More loops to study

    Since most of the other loops in the genome are weaker than the one the researchers studied in this paper, they suspect that many other loops will also prove to be highly transient. They now plan to use their new technique to study some of those other loops, in a variety of cell types.

    “There are about 10,000 of these loops, and we’ve looked at one,” Hansen says. “We have a lot of indirect evidence to suggest that the results would be generalizable, but we haven’t demonstrated that. Using the technology platform we’ve set up, which combines new experimental and computational methods, we can begin to approach other loops in the genome.”

    The researchers also plan to investigate the role of specific loops in disease. Many diseases, including a neurodevelopmental disorder called FOXG1 syndrome, could be linked to faulty loop dynamics. The researchers are now studying how both the normal and mutated form of the FOXG1 gene, as well as the cancer-causing gene MYC, are affected by genome loop formation.

    The research was funded by the National Institutes of Health, the National Science Foundation, the Mathers Foundation, a Pew-Stewart Cancer Research Scholar grant, the Chaires d’excellence Internationale Blaise Pascal, an American-Italian Cancer Foundation research scholarship, and the Max Planck Institute for Molecular Cell Biology and Genetics.

  • Developing electricity-powered, low-emissions alternatives to carbon-intensive industrial processes

    On April 11, 2022, MIT announced five multiyear flagship projects in the first-ever Climate Grand Challenges, a new initiative to tackle complex climate problems and deliver breakthrough solutions to the world as quickly as possible. This is the second article in a five-part series highlighting the most promising concepts to emerge from the competition, and the interdisciplinary research teams behind them.

    One of the biggest leaps that humankind could take to drastically lower greenhouse gas emissions globally would be the complete decarbonization of industry. But without finding low-cost, environmentally friendly substitutes for industrial materials, the traditional production of steel, cement, ammonia, and ethylene will continue pumping out billions of tons of carbon annually; these sectors alone are responsible for at least one third of society’s global greenhouse gas emissions. 

    A major problem is that industrial manufacturers, whose success depends on reliable, cost-efficient, and large-scale production methods, are too heavily invested in processes that have historically been powered by fossil fuels to quickly switch to new alternatives. It’s a machine that kicked on more than 100 years ago, and which MIT electrochemical engineer Yet-Ming Chiang says we can’t shut off without major disruptions to the world’s massive supply chain of these materials. What’s needed, Chiang says, is a broader, collaborative clean energy effort that takes “targeted fundamental research, all the way through to pilot demonstrations that greatly lowers the risk for adoption of new technology by industry.”

    This would be a new approach to decarbonization of industrial materials production that relies on largely unexplored but cleaner electrochemical processes. New production methods could be optimized and integrated into the industrial machine to make it run on low-cost, renewable electricity in place of fossil fuels. 

    Recognizing this, Chiang, the Kyocera Professor in the Department of Materials Science and Engineering, teamed with research collaborator Bilge Yildiz, the Breene M. Kerr Professor of Nuclear Science and Engineering and professor of materials science and engineering, with key input from Karthish Manthiram, visiting professor in the Department of Chemical Engineering, to submit a project proposal to the MIT Climate Grand Challenges. Their plan: to create an innovation hub on campus that would bring together MIT researchers individually investigating decarbonization of steel, cement, ammonia, and ethylene under one roof, combining research equipment and directly collaborating on new methods to produce these four key materials.

    Many researchers across MIT have already signed on to join the effort, including Antoine Allanore, associate professor of metallurgy, who specializes in the development of sustainable materials and manufacturing processes, and Elsa Olivetti, the Esther and Harold E. Edgerton Associate Professor in the Department of Materials Science and Engineering, who is an expert in materials economics and sustainability. Other MIT faculty currently involved include Fikile Brushett, Betar Gallant, Ahmed Ghoniem, William Green, Jeffrey Grossman, Ju Li, Yuriy Román-Leshkov, Yang Shao-Horn, Robert Stoner, Yogesh Surendranath, Timothy Swager, and Kripa Varanasi.

    “The team we brought together has the expertise needed to tackle these challenges, including electrochemistry — using electricity to decarbonize these chemical processes — and materials science and engineering, process design and scale-up, technoeconomic analysis, and system integration, which is all needed for this to go out from our labs to the field,” says Yildiz.

    Selected from a field of more than 100 proposals, their Center for Electrification and Decarbonization of Industry (CEDI) will be the first such institute worldwide dedicated to testing and scaling the most innovative and promising technologies in sustainable chemicals and materials. CEDI will work to facilitate rapid translation of lab discoveries into affordable, scalable industry solutions, with potential to offset as much as 15 percent of greenhouse gas emissions. The team estimates that some CEDI projects already underway could be commercialized within three years.

    “The real timeline is as soon as possible,” says Chiang.

    To achieve CEDI’s ambitious goals, a physical location is key, one staffed with permanent faculty as well as undergraduates, graduate students, and postdocs. Yildiz says the center’s success will depend on engaging student researchers to carry forward research addressing the biggest ongoing challenges to decarbonization of industry.

    “We are training young scientists, students, on the learned urgency of the problem,” says Yildiz. “We empower them with the skills needed, and even if an individual project does not find the implementation in the field right away, at least, we would have trained the next generation that will continue to go after them in the field.”

    Chiang’s background in electrochemistry showed him how the efficiency of cement production could benefit from adopting clean electricity sources, and Yildiz’s work on ethylene, the source of plastic and one of industry’s most valued chemicals, has revealed overlooked cost benefits to switching to electrochemical processes with less expensive starting materials. With industry partners, they hope to continue these lines of fundamental research along with Allanore, who is focused on electrifying steel production, and Manthiram, who is developing new processes for ammonia. Olivetti will focus on understanding risks and barriers to implementation. This multilateral approach aims to speed up the timeline to industry adoption of new technologies at the scale needed for global impact.

    “One of the points of emphasis in this whole center is going to be applying technoeconomic analysis of what it takes to be successful at a technical and economic level, as early in the process as possible,” says Chiang.

    The impact of large-scale industry adoption of clean energy sources in these four key areas that CEDI plans to target first would be profound, as these sectors are currently responsible for 7.5 billion tons of emissions annually. There is the potential for even greater impact on emissions as new knowledge is applied to other industrial products beyond the initial four targets of steel, cement, ammonia, and ethylene. Meanwhile, the center will stand as a hub to attract new industry, government stakeholders, and research partners to collaborate on urgently needed solutions, both newly arising and long overdue.

    When Chiang and Yildiz first met to discuss ideas for MIT Climate Grand Challenges, they decided they wanted to build a climate research center that functioned unlike any other to help pivot large industry toward decarbonization. Beyond considering how new solutions will impact industry’s bottom line, CEDI will also investigate unique synergies that could arise from the electrification of industry, like processes that would create new byproducts that could be the feedstock to other industry processes, reducing waste and increasing efficiencies in the larger system. And because industry is so good at scaling, those added benefits would be widespread, finally replacing century-old technologies with critical updates designed to improve production and markedly reduce industry’s carbon footprint sooner rather than later.

    “Everything we do, we’re going to try to do with urgency,” Chiang says. “The fundamental research will be done with urgency, and the transition to commercialization, we’re going to do with urgency.”

  • Computing our climate future

    On Monday, MIT announced five multiyear flagship projects in the first-ever Climate Grand Challenges, a new initiative to tackle complex climate problems and deliver breakthrough solutions to the world as quickly as possible. This article is the first in a five-part series highlighting the most promising concepts to emerge from the competition, and the interdisciplinary research teams behind them.

    With improvements to computer processing power and an increased understanding of the physical equations governing the Earth’s climate, scientists are continually working to refine climate models and improve their predictive power. But the tools they’re refining were originally conceived decades ago with only scientists in mind. When it comes to developing tangible climate action plans, these models remain inscrutable to the policymakers, public safety officials, civil engineers, and community organizers who need their predictive insight most.

    “What you end up having is a gap between what’s typically used in practice, and the real cutting-edge science,” says Noelle Selin, a professor in the Institute for Data, Systems and Society and the Department of Earth, Atmospheric and Planetary Sciences (EAPS), and co-lead with Professor Raffaele Ferrari on the MIT Climate Grand Challenges flagship project “Bringing Computation to the Climate Crisis.” “How can we use new computational techniques, new understandings, new ways of thinking about modeling, to really bridge that gap between state-of-the-art scientific advances and modeling, and people who are actually needing to use these models?”

    Using this as a driving question, the team won’t just be trying to refine current climate models; they’re building a new one from the ground up.

    This kind of game-changing advancement is exactly what the MIT Climate Grand Challenges is looking for, which is why the proposal has been named one of the five flagship projects in the ambitious Institute-wide program aimed at tackling the climate crisis. The proposal, which was selected from 100 submissions and was among 27 finalists, will receive additional funding and support to further their goal of reimagining the climate modeling system. It also brings together contributors from across the Institute, including the MIT Schwarzman College of Computing, the School of Engineering, and the Sloan School of Management.

    When it comes to pursuing high-impact climate solutions that communities around the world can use, “it’s great to do it at MIT,” says Ferrari, EAPS Cecil and Ida Green Professor of Oceanography. “You’re not going to find many places in the world where you have the cutting-edge climate science, the cutting-edge computer science, and the cutting-edge policy science experts that we need to work together.”

    The climate model of the future

    The proposal builds on work that Ferrari began three years ago as part of a joint project with Caltech, the Naval Postgraduate School, and NASA’s Jet Propulsion Lab. Called the Climate Modeling Alliance (CliMA), the consortium of scientists, engineers, and applied mathematicians is constructing a climate model capable of more accurately projecting future changes in critical variables, such as clouds in the atmosphere and turbulence in the ocean, with uncertainties at least half the size of those in existing models.

    To do this, however, requires a new approach. For one thing, current models are too coarse in resolution — at the 100-to-200-kilometer scale — to resolve small-scale processes like cloud cover, rainfall, and sea ice extent. But also, explains Ferrari, part of this limitation in resolution is due to the fundamental architecture of the models themselves. The languages most global climate models are coded in were first created back in the 1960s and ’70s, largely by scientists for scientists. Since then, advances in computing driven by the corporate world and computer gaming have given rise to dynamic new computer languages, powerful graphics processing units, and machine learning.

    For climate models to take full advantage of these advancements, there’s only one option: starting over with a modern, more flexible language. Written in Julia, a part of the MIT Julia Lab’s Scientific Machine Learning technology, and spearheaded by Alan Edelman, a professor of applied mathematics in MIT’s Department of Mathematics, CliMA will be able to harness far more data than the current models can handle.

    “It’s been real fun finally working with people in computer science here at MIT,” Ferrari says. “Before it was impossible, because traditional climate models are in a language their students can’t even read.”

    The result is what’s being called the “Earth digital twin,” a climate model that can simulate global conditions on a large scale. This on its own is an impressive feat, but the team wants to take this a step further with their proposal.

    “We want to take this large-scale model and create what we call an ‘emulator’ that is only predicting a set of variables of interest, but it’s been trained on the large-scale model,” Ferrari explains. Emulators are not new technology, but what is new is that these emulators, being referred to as the “Earth digital cousins,” will take advantage of machine learning.

    “Now we know how to train a model if we have enough data to train them on,” says Ferrari. Machine learning for projects like this has only become possible in recent years as more observational data become available, along with improved computer processing power. The goal is to create smaller, more localized models by training them using the Earth digital twin. Doing so will save time and money, which is key if the digital cousins are going to be usable for stakeholders, like local governments and private-sector developers.
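
    The emulator idea itself fits in a few lines. In this sketch, an “expensive” stand-in function plays the digital twin, and a small regression model trained on its outputs plays the digital cousin; the actual project would train machine-learning emulators on CliMA output rather than on a one-line formula:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def expensive_model(x):
        """Stand-in for the digital twin: imagine a slow, high-fidelity
        simulation mapping a forcing parameter to regional warming (deg C)."""
        return 1.5 * x + 0.4 * np.sin(3 * x) + 0.1 * x ** 2

    # Run the expensive model a modest number of times for training data.
    x_train = rng.uniform(0, 3, 200)
    y_train = expensive_model(x_train) + rng.normal(0, 0.05, 200)

    # Emulator: ridge regression on polynomial features. It is tiny, and
    # nearly free to evaluate compared with re-running the full model.
    degree, lam = 6, 1e-6
    A = np.vander(x_train, degree + 1)
    w = np.linalg.solve(A.T @ A + lam * np.eye(degree + 1), A.T @ y_train)

    def emulator(x):
        return np.vander(np.atleast_1d(x), degree + 1) @ w

    x_test = np.linspace(0, 3, 7)
    error = np.abs(emulator(x_test) - expensive_model(x_test))
    print(f"max emulator error at test points: {error.max():.3f}")
    ```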

    Adaptable predictions for average stakeholders

    When it comes to setting climate-informed policy, stakeholders need to understand the probability of an outcome within their own regions — in the same way that you would prepare for a hike differently if there’s a 10 percent chance of rain versus a 90 percent chance. The smaller Earth digital cousin models will be able to do things the larger model can’t do, like simulate local regions in real time and provide a wider range of probabilistic scenarios.

    “Right now, if you wanted to use output from a global climate model, you usually would have to use output that’s designed for general use,” says Selin, who is also the director of the MIT Technology and Policy Program. With the project, the team can take end-user needs into account from the very beginning while also incorporating their feedback and suggestions into the models, helping to “democratize the idea of running these climate models,” as she puts it. Doing so means building an interactive interface that eventually will give users the ability to change input values and run the new simulations in real time. The team hopes that, eventually, the Earth digital cousins could run on something as ubiquitous as a smartphone, although developments like that are currently beyond the scope of the project.

    The next thing the team will work on is building connections with stakeholders. Through participation of other MIT groups, such as the Joint Program on the Science and Policy of Global Change and the Climate and Sustainability Consortium, they hope to work closely with policymakers, public safety officials, and urban planners to give them predictive tools tailored to their needs that can provide actionable outputs important for planning. Faced with rising sea levels, for example, coastal cities could better visualize the threat and make informed decisions about infrastructure development and disaster preparedness; communities in drought-prone regions could develop long-term civil planning with an emphasis on water conservation and wildfire resistance.

    “We want to make the modeling and analysis process faster so people can get more direct and useful feedback for near-term decisions,” she says.

    The final piece of the challenge is to incentivize students now so that they can join the project and make a difference. Ferrari has already had luck garnering student interest after co-teaching a class with Edelman and seeing the enthusiasm students have about computer science and climate solutions.

    “We’re intending in this project to build a climate model of the future,” says Selin. “So it seems really appropriate that we would also train the builders of that climate model.”

  • MIT announces five flagship projects in first-ever Climate Grand Challenges competition

    MIT today announced the five flagship projects selected in its first-ever Climate Grand Challenges competition. These multiyear projects will define a dynamic research agenda focused on unraveling some of the toughest unsolved climate problems and bringing high-impact, science-based solutions to the world on an accelerated basis.

    Representing the most promising concepts to emerge from the two-year competition, the five flagship projects will receive additional funding and resources from MIT and others to develop their ideas and swiftly transform them into practical solutions at scale.

    “Climate Grand Challenges represents a whole-of-MIT drive to develop game-changing advances to confront the escalating climate crisis, in time to make a difference,” says MIT President L. Rafael Reif. “We are inspired by the creativity and boldness of the flagship ideas and by their potential to make a significant contribution to the global climate response. But given the planet-wide scale of the challenge, success depends on partnership. We are eager to work with visionary leaders in every sector to accelerate this impact-oriented research, implement serious solutions at scale, and inspire others to join us in confronting this urgent challenge for humankind.”

    Brief descriptions of the five Climate Grand Challenges flagship projects are provided below.

    Bringing Computation to the Climate Challenge

    This project leverages advances in artificial intelligence, machine learning, and data sciences to improve the accuracy of climate models and make them more useful to a variety of stakeholders — from communities to industry. The team is developing a digital twin of the Earth that harnesses more data than ever before to reduce and quantify uncertainties in climate projections.

    Research leads: Raffaele Ferrari, the Cecil and Ida Green Professor of Oceanography in the Department of Earth, Atmospheric and Planetary Sciences, and director of the Program in Atmospheres, Oceans, and Climate; and Noelle Eckley Selin, director of the Technology and Policy Program and professor with a joint appointment in the Institute for Data, Systems, and Society and the Department of Earth, Atmospheric and Planetary Sciences

    Center for Electrification and Decarbonization of Industry

    This project seeks to reinvent and electrify the processes and materials behind hard-to-decarbonize industries like steel, cement, ammonia, and ethylene production. A new innovation hub will perform targeted fundamental research and engineering with urgency, pushing the technological envelope on electricity-driven chemical transformations.

    Research leads: Yet-Ming Chiang, the Kyocera Professor of Materials Science and Engineering, and Bilge Yıldız, the Breene M. Kerr Professor in the Department of Nuclear Science and Engineering and professor in the Department of Materials Science and Engineering

    Preparing for a new world of weather and climate extremes

    This project addresses key gaps in knowledge about intensifying extreme events such as floods, hurricanes, and heat waves, and quantifies their long-term risk in a changing climate. The team is developing a scalable climate-change adaptation toolkit to help vulnerable communities and low-carbon energy providers prepare for these extreme weather events.

    Research leads: Kerry Emanuel, the Cecil and Ida Green Professor of Atmospheric Science in the Department of Earth, Atmospheric and Planetary Sciences and co-director of the MIT Lorenz Center; Miho Mazereeuw, associate professor of architecture and urbanism in the Department of Architecture and director of the Urban Risk Lab; and Paul O’Gorman, professor in the Program in Atmospheres, Oceans, and Climate in the Department of Earth, Atmospheric and Planetary Sciences

    The Climate Resilience Early Warning System

    The CREWSnet project seeks to reinvent climate change adaptation with a novel forecasting system that empowers underserved communities to interpret local climate risk, proactively plan for their futures incorporating resilience strategies, and minimize losses. CREWSnet will initially be demonstrated in southwestern Bangladesh, serving as a model for similarly threatened regions around the world.

    Research leads: John Aldridge, assistant leader of the Humanitarian Assistance and Disaster Relief Systems Group at MIT Lincoln Laboratory, and Elfatih Eltahir, the H.M. King Bhumibol Professor of Hydrology and Climate in the Department of Civil and Environmental Engineering

    Revolutionizing agriculture with low-emissions, resilient crops

    This project works to revolutionize the agricultural sector with climate-resilient crops and fertilizers that have the ability to dramatically reduce greenhouse gas emissions from food production.

    Research lead: Christopher Voigt, the Daniel I.C. Wang Professor in the Department of Biological Engineering

    “As one of the world’s leading institutions of research and innovation, it is incumbent upon MIT to draw on our depth of knowledge, ingenuity, and ambition to tackle the hard climate problems now confronting the world,” says Richard Lester, MIT associate provost for international activities. “Together with collaborators across industry, finance, community, and government, the Climate Grand Challenges teams are looking to develop and implement high-impact, path-breaking climate solutions rapidly and at a grand scale.”

    The initial call for ideas in 2020 yielded nearly 100 letters of interest from almost 400 faculty members and senior researchers, representing 90 percent of MIT departments. After an extensive evaluation, 27 finalist teams received a total of $2.7 million to develop comprehensive research and innovation plans. The projects address four broad research themes: building equity and fairness into climate solutions; decarbonizing complex industries and processes; removing, managing, and storing greenhouse gases; and using data and science to forecast climate-related risk.

    To select the winning projects, research plans were reviewed by panels of international experts representing relevant scientific and technical domains as well as experts in processes and policies for innovation and scalability.

    “In response to climate change, the world really needs to do two things quickly: deploy the solutions we already have much more widely, and develop new solutions that are urgently needed to tackle this intensifying threat,” says Maria Zuber, MIT vice president for research. “These five flagship projects exemplify MIT’s strong determination to bring its knowledge and expertise to bear in generating new ideas and solutions that will help solve the climate problem.”

    “The Climate Grand Challenges flagship projects set a new standard for inclusive climate solutions that can be adapted and implemented across the globe,” says MIT Chancellor Melissa Nobles. “This competition propels the entire MIT research community — faculty, students, postdocs, and staff — to act with urgency around a worsening climate crisis, and I look forward to seeing the difference these projects can make.”

    “MIT’s efforts on climate research amid the climate crisis was a primary reason that I chose to attend MIT, and remains a reason that I view the Institute favorably. MIT has a clear opportunity to be a thought leader in the climate space in our own MIT way, which is why CGC fits in so well,” says senior Megan Xu, who served on the Climate Grand Challenges student committee and is studying ways to make the food system more sustainable.

    The Climate Grand Challenges competition is a key initiative of “Fast Forward: MIT’s Climate Action Plan for the Decade,” which the Institute published in May 2021. Fast Forward outlines MIT’s comprehensive plan for helping the world address the climate crisis. It consists of five broad areas of action: sparking innovation, educating future generations, informing and leveraging government action, reducing MIT’s own climate impact, and uniting and coordinating all of MIT’s climate efforts.

  • Finding the questions that guide MIT fusion research

    “One of the things I learned was, doing good science isn’t so much about finding the answers as figuring out what the important questions are.”

    As Martin Greenwald retires from the responsibilities of senior scientist and deputy director of the MIT Plasma Science and Fusion Center (PSFC), he reflects on his almost 50 years of science study, 43 of them as a researcher at MIT, pursuing the question of how to make the carbon-free energy of fusion a reality.

    Most of Greenwald’s important questions about fusion began after graduating from MIT with a BS in both physics and chemistry. Beginning graduate work at the University of California at Berkeley, he felt compelled to learn more about fusion as an energy source that could have “a real societal impact.” At the time, researchers were exploring new ideas for devices that could create and confine fusion plasmas. Greenwald worked on Berkeley’s “alternate concept” TORMAC, a Toroidal Magnetic Cusp. “It didn’t work out very well,” he laughs. “The first thing I was known for was making the measurements that shut down the program.”

    Believing the temperature of the plasma generated by the device would not be as high as his group leader expected, Greenwald developed hardware that could measure the low temperatures predicted by his own “back of the envelope calculations.” As he anticipated, his measurements showed that “this was not a fusion plasma; this was hardly a confined plasma at all.”

    With a PhD from Berkeley, Greenwald returned to MIT for a research position at the PSFC, attracted by the center’s “esprit de corps.”

    He arrived in time to participate in the final experiments on Alcator A, the first in a series of tokamaks built at MIT, all characterized by compact size and featuring high-field magnets. The tokamak design was then becoming favored as the most effective route to fusion: its doughnut-shaped vacuum chamber, surrounded by electromagnets, could confine the turbulent plasma long enough, while increasing its heat and density, to make fusion occur.

    Alcator A showed that the energy confinement time improves as plasma density increases. MIT’s succeeding device, Alcator C, was designed to use higher magnetic fields, boosting expectations that it would reach higher densities and better confinement. To attain these goals, however, Greenwald had to pursue a new technique that increased density by injecting pellets of frozen fuel into the plasma, a method he likens to throwing “snowballs in hell.” This work was notable for the creation of a new regime of enhanced plasma confinement on Alcator C. In those experiments, a confined plasma surpassed for the first time one of the two Lawson criteria — the minimum value of the product of the plasma density and confinement time — for making net power from fusion, a milestone the field had pursued since John Lawson published the criteria in 1957.
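
    In its modern textbook form, that criterion puts a floor under the product of density and confinement time (the form below is standard plasma physics, not specific to the Alcator results):

    ```latex
    % Lawson criterion (ignition form): fusion self-heating must outpace
    % energy losses, bounding the density-confinement product from below.
    n \,\tau_E \;\ge\; \frac{12\, k_B T}{E_{\mathrm{ch}}\, \langle \sigma v \rangle}
    % E_ch is the energy carried by charged fusion products and <sigma*v>
    % the temperature-dependent reaction rate; for deuterium-tritium fuel
    % near its optimal temperature this evaluates to roughly
    % n * tau_E >~ 1.5e20 s/m^3.
    ```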

    Greenwald continued to make a name for himself as part of a larger study into the physics of the Compact Ignition Tokamak — a high-field burning plasma experiment that the U.S. program was proposing to build in the late 1980s. The result, unexpectedly, was a new scaling law, later known as the “Greenwald Density Limit,” and a new theory for the mechanism of the limit. It has been used to accurately predict performance on much larger machines built since.
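
    The limit itself is strikingly simple, as conventionally written in the tokamak literature:

    ```latex
    % Greenwald density limit: the maximum line-averaged plasma density
    % scales with plasma current and inversely with cross-sectional area.
    n_G \;=\; \frac{I_p}{\pi a^2}
    % with n_G in units of 10^20 m^-3, plasma current I_p in megaamperes,
    % and minor radius a in meters.
    ```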

    The center’s next tokamak, Alcator C-Mod, started operation in 1993 and ran for more than 20 years, with Greenwald as the chair of its Experimental Program Committee. Larger than Alcator C, the new device supported a highly shaped plasma, strong radiofrequency heating, and an all-metal plasma-facing first wall. All of these would eventually be required in a fusion power system.

    C-Mod proved to be MIT’s most enduring fusion experiment to date, producing important results for 20 years. During that time Greenwald contributed not only to the experiments, but to mentoring the next generation. Research scientist Ryan Sweeney notes that “Martin quickly gained my trust as a mentor, in part due to his often casual dress and slightly untamed hair, which are embodiments of his transparency and his focus on what matters. He can quiet a room of PhDs and demand attention not by intimidation, but rather by his calmness and his ability to bring clarity to complicated problems, be they scientific or human in nature.”

    Greenwald worked closely with the group of students who, in PSFC Director Dennis Whyte’s class, came up with the tokamak concept that evolved into SPARC. MIT is now pursuing this compact, high-field tokamak with Commonwealth Fusion Systems, a startup that grew out of the collective enthusiasm for this concept, and the growing realization it could work. Greenwald now heads the Physics Group for the SPARC project at MIT. He has helped confirm the device’s physics basis in order to predict performance and guide engineering decisions.

    “Martin’s multifaceted talents are thoroughly embodied by, and imprinted on, SPARC,” says Whyte. “First, his leadership in its plasma confinement physics validation and publication place SPARC on a firm scientific footing. Secondly, the impact of the density limit he discovered, which shows that fuel density increases with magnetic field and decreasing the size of the tokamak, is critical in obtaining high fusion power density not just in SPARC, but in future power plants. Third, and perhaps most impressive, is Martin’s mentorship of the SPARC generation of leadership.”

    Greenwald’s expertise and easygoing personality have made him an asset as head of the PSFC Office for Computer Services and group leader for data acquisition and computing, and a sought-after member of many professional committees. He has been an APS Fellow since 2000, and was an APS Distinguished Lecturer in Plasma Physics (2001-02). He was also presented in 2014 with a Leadership Award from Fusion Power Associates. He is currently an associate editor for Physics of Plasmas and a member of the Lawrence Livermore National Laboratory Physical Sciences Directorate External Review Committee.

    Although leaving his full-time responsibilities, Greenwald will remain at MIT as a visiting scientist, a role he says will allow him to “stick my nose into everything without being responsible for anything.”

    “At some point in the race you have to hand off the baton,” he says. “And it doesn’t mean you’re not interested in the outcome; and it doesn’t mean you’re just going to walk away into the stands. I want to be there at the end when we succeed.”

  • Q&A: Climate Grand Challenges finalists on using data and science to forecast climate-related risk

    Note: This is the final article in a four-part interview series featuring the work of the 27 MIT Climate Grand Challenges finalist teams, which received a total of $2.7 million in startup funding to advance their projects. This month, the Institute will name a subset of the finalists as multiyear flagship projects.

    Advances in computation, artificial intelligence, robotics, and data science are enabling a new generation of observational tools and scientific modeling with the potential to produce timely, reliable, and quantitative analysis of future climate risks at a local scale. These projections can increase the accuracy and efficacy of early warning systems, improve emergency planning, and provide actionable information for climate mitigation and adaptation efforts, as human actions continue to change planetary conditions.

    In conversations prepared for MIT News, faculty from four Climate Grand Challenges teams with projects in the competition’s “Using data and science to forecast climate-related risk” category describe the promising new technologies that can help scientists understand the Earth’s climate system on a finer scale than ever before. (The other Climate Grand Challenges research themes include building equity and fairness into climate solutions; removing, managing, and storing greenhouse gases; and decarbonizing complex industries and processes.) The following responses have been edited for length and clarity.

    An observational system that can initiate a climate risk forecasting revolution

    Despite recent technological advances and massive volumes of data, climate forecasts remain highly uncertain. Gaps in observational capabilities create substantial challenges to predicting extreme weather events and establishing effective mitigation and adaptation strategies. R. John Hansman, the T. Wilson Professor of Aeronautics and Astronautics and director of the MIT International Center for Air Transportation, discusses the Stratospheric Airborne Climate Observatory System (SACOS) being developed together with Brent Minchew, the Cecil and Ida Green Career Development Professor in the Department of Earth, Atmospheric and Planetary Sciences (EAPS), and a team that includes researchers from MIT Lincoln Laboratory and Harvard University.

    Q: How does SACOS reduce uncertainty in climate risk forecasting?

    A: There is a critical need for higher spatial and temporal resolution observations of the climate system than are currently available through remote (satellite or airborne) and surface (in-situ) sensing. We are developing an ensemble of high-endurance, solar-powered aircraft with instrument systems capable of performing months-long climate observing missions that satellites or conventional aircraft alone cannot fulfill. Summer months are ideal for SACOS operations, as many key climate phenomena are active and the short nights reduce the required battery mass, vehicle size, and technical risk. These observations hold the potential to inform and predict, allowing emergency planners, policymakers, and the rest of society to better prepare for the changes to come.
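
    To make the night-length point concrete, here is a rough sizing sketch; the power draw and battery specific energy below are illustrative assumptions, not SACOS design figures. A solar aircraft must bank enough energy before sunset to fly until sunrise, so the required battery mass grows linearly with night duration:

        # Illustrative battery sizing for a solar-powered observing aircraft.
        # All values are assumptions for the sake of example, not SACOS specs.
        CRUISE_POWER_W = 2000            # assumed continuous power to stay aloft
        SPECIFIC_ENERGY_WH_PER_KG = 350  # assumed battery-pack specific energy

        def battery_mass_kg(night_hours: float) -> float:
            """Battery mass needed to supply cruise power through the night."""
            required_energy_wh = CRUISE_POWER_W * night_hours
            return required_energy_wh / SPECIFIC_ENERGY_WH_PER_KG

        print(battery_mass_kg(8.0))   # short summer night: about 46 kg
        print(battery_mass_kg(14.0))  # long winter night: about 80 kg

    Under these assumptions, a midsummer night demands roughly half the battery mass of a midwinter one, which is one reason months-long missions favor the summer season.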

    Q: Describe the types of observing missions where SACOS could provide critical improvements.

    A: The demise of the Antarctic Ice Sheet, which is leading to rising sea levels around the world and threatening the displacement of millions of people, is one example. Current sea level forecasts struggle to account for giant fissures that create massive icebergs and cause the Antarctic Ice Sheet to flow more rapidly into the ocean. SACOS can track these fissures to accurately forecast ice slippage and give impacted populations enough time to prepare or evacuate. Elsewhere, widespread droughts cause rampant wildfires and water shortages. SACOS has the ability to monitor soil moisture and humidity in critically dry regions to identify where and when wildfires and droughts are imminent. SACOS also offers the most effective method to measure, track, and predict local ozone depletion over North America, which results from increasingly severe summer thunderstorms.

    Quantifying and managing the risks of sea-level rise

    Prevailing estimates of sea-level rise range from approximately 20 centimeters to 2 meters by the end of the century, with the associated costs on the order of trillions of dollars. The instability of certain portions of the world’s ice sheets creates vast uncertainties, complicating how the world prepares for and responds to these potential changes. EAPS Professor Brent Minchew is leading another Climate Grand Challenges finalist team working on an integrated, multidisciplinary effort to improve the scientific understanding of sea-level rise and provide actionable information and tools to manage the risks it poses.

    Q: What have been the most significant challenges to understanding the potential rates of sea-level rise?

    A: West Antarctica is one of the most remote, inaccessible, and hostile places on Earth — to people and equipment. Thus, opportunities to observe the collapse of the West Antarctic Ice Sheet, which contains enough ice to raise global sea levels by about 3 meters, are limited, and current observations are only crudely resolved. It is essential that we understand how the floating edges of the ice sheet, often called ice shelves, fracture and collapse, because they provide critical forces that govern the rate of ice mass loss and can stabilize the West Antarctic Ice Sheet.

    Q: How will your project advance what is currently known about sea-level rise?

    A: We aim to advance global-scale projections of sea-level rise through novel observational technologies and computational models of ice sheet change, and to link those predictions to region- to neighborhood-scale estimates of costs and adaptation strategies. To do this, we propose two novel instruments: a first-of-its-kind drone that can fly for months at a time over Antarctica making continuous observations of critical areas, and an airdropped seismometer and GPS bundle that can be deployed to vulnerable and hard-to-reach areas of the ice sheet. This technology will provide greater data quality and density and will observe the ice sheet at frequencies that are currently inaccessible — elements that are essential for understanding the physics governing the evolution of the ice sheet and sea-level rise.

    Changing flood risk for coastal communities in the developing world

    Globally, more than 600 million people live in low-elevation coastal areas that face an increasing risk of flooding from sea-level rise. This includes two-thirds of cities with populations of more than 5 million and regions that conduct the vast majority of global trade. Dara Entekhabi, the Bacardi and Stockholm Water Foundations Professor in the Department of Civil and Environmental Engineering and professor in the Department of Earth, Atmospheric, and Planetary Sciences, outlines an interdisciplinary partnership with Miho Mazereeuw, associate professor of architecture and urbanism and director of the Urban Risk Lab in the School of Architecture and Planning, and Danielle Wood, assistant professor in the Program in Media Arts and Sciences and the Department of Aeronautics and Astronautics, that leverages data and technology to guide short-term responses and chart long-term adaptation pathways.

    Q: What is the key problem this program seeks to address?

    A: The accumulated heating of the Earth system due to fossil fuel burning is largely absorbed by the oceans, and the stored heat expands the ocean volume, leading to an increased base height for tides. When the high tides inundate a city, the condition is referred to as “sunny day” flooding, but the saline waters corrode infrastructure and wreak havoc on daily routines. The danger ahead for many coastal cities in the developing world is the combination of increasingly frequent high-tide intrusions and heavy precipitation storm events.
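
    The scale of that thermal expansion effect can be gauged with a back-of-the-envelope estimate (the figures below are round, illustrative values, not the team’s inputs): with a seawater thermal expansion coefficient of roughly 2 × 10⁻⁴ per kelvin, warming the top 700 meters of the ocean by 1 degree Celsius raises local sea level by about

        Δh ≈ α · ΔT · H ≈ (2 × 10⁻⁴ K⁻¹)(1 K)(700 m) ≈ 0.14 m,

    or roughly 14 centimeters, before counting any contribution from melting land ice.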

    Q: How will your proposed solutions impact flood risk management?

    A: We are producing detailed risk maps for coastal cities in developing countries using newly available, very high-resolution remote-sensing data from space-borne instruments, as well as historical tide records and regional storm characteristics. Using these datasets, we aim to produce street-by-street risk maps that provide local decision-makers and stakeholders with a way to estimate present and future flood risks. With the model of future tides and probabilistic precipitation events, we can forecast inundation from a given flood event, decadal changes under various climate-change and sea-level-rise projections, and the increasing likelihood of sunny-day flooding. Working closely with local partners, we will develop toolkits to explore short-term emergency response, as well as long-term mitigation and adaptation techniques, in six pilot locations in South and Southeast Asia, Africa, and South America.
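
    As a schematic of what one cell of such a street-by-street risk map involves, the sketch below estimates the annual probability that tide, sea-level rise, and rainfall-driven water together top a street’s elevation. Every distribution and number here is an invented placeholder, not one of the team’s calibrated inputs:

        import random

        # All parameters are illustrative placeholders, not calibrated values.
        STREET_ELEVATION_M = 1.8   # street height above present mean sea level
        SLR_SCENARIO_M = 0.25      # assumed sea-level rise by the planning horizon

        def sample_annual_max_water_level() -> float:
            """One synthetic year: highest tide plus coincident rainfall ponding."""
            annual_max_tide = random.gauss(1.35, 0.10)   # annual highest tide (m)
            rain_ponding = random.expovariate(1 / 0.10)  # rainfall depth, mean 0.1 m
            return SLR_SCENARIO_M + annual_max_tide + rain_ponding

        def annual_flood_probability(trials: int = 100_000) -> float:
            """Monte Carlo estimate of P(water tops the street in a given year)."""
            floods = sum(sample_annual_max_water_level() > STREET_ELEVATION_M
                         for _ in range(trials))
            return floods / trials

        print(f"Estimated annual flood probability: {annual_flood_probability():.3f}")

    A real analysis would replace the toy distributions with fitted tide harmonics, storm-surge statistics, and downscaled precipitation, then repeat the calculation for every street segment and every sea-level scenario.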

    Ocean vital signs

    On average, every person on Earth generates fossil fuel emissions equivalent to an 8-pound bag of carbon every day. Much of this is absorbed by the ocean, but there is wide variability in the estimates of oceanic absorption, which translates into differences of trillions of dollars in the required cost of mitigation. In the Department of Earth, Atmospheric and Planetary Sciences, Christopher Hill, a principal research engineer specializing in Earth and planetary computational science, works with Ryan Woosley, a principal research scientist focusing on the carbon cycle and ocean acidification. Hill explains that they hope to use artificial intelligence and machine learning to help resolve this uncertainty.
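
    That figure checks out with rough numbers. Assuming global CO₂ emissions of about 37 billion tonnes per year and a population of about 8 billion:

        37 Gt CO₂/yr × (12/44) ≈ 10 Gt of carbon per year
        10 × 10¹² kg ÷ (8 × 10⁹ people × 365 days) ≈ 3.5 kg ≈ 7.6 lb per person per day

    The 12/44 factor converts the mass of CO₂ to the mass of its carbon content, since carbon’s atomic mass of 12 accounts for that fraction of CO₂’s molecular mass of 44.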

    Q: What is the current state of knowledge on air-sea interactions?

    A: Obtaining specific, accurate field measurements of critical physical, chemical, and biological exchanges between the ocean and the planet has historically entailed expensive science missions with large ship-based infrastructure that leave gaps in real-time data about significant ocean climate processes. Recent advances in highly scalable in-situ autonomous observing and navigation, combined with airborne, remote sensing, and machine learning innovations, have the potential to transform data gathering, provide more accurate information, and address fundamental scientific questions around air-sea interaction.

    Q: How will your approach accelerate real-time, autonomous surface ocean observing from an experimental research endeavor to a permanent and impactful solution?

    A: Our project seeks to demonstrate how a scalable surface ocean observing network can be launched and operated, and to illustrate how this can reduce uncertainties in estimates of air-sea carbon dioxide exchange. With an initial high-impact goal of substantially reducing the vast uncertainties that plague our understanding of ocean uptake of carbon dioxide, we will gather critical measurements for improving extended weather and climate forecast models and reducing climate impact uncertainty. The results have the potential to more accurately identify trillions of dollars’ worth of economic activity.
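
    To illustrate the kind of machine-learning approach this implies (purely a sketch of a common pattern in ocean-carbon work, not a description of the team’s actual pipeline), one can train a regressor that fills the gaps between sparse in-situ carbon measurements using predictors a satellite can observe everywhere:

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        # Synthetic stand-in data; real inputs would be shipboard or
        # autonomous-platform pCO2 measurements paired with satellite fields.
        rng = np.random.default_rng(0)
        n = 2000
        sst = rng.uniform(0, 30, n)               # sea-surface temperature (deg C)
        salinity = rng.uniform(32, 37, n)         # practical salinity units
        chlorophyll = rng.lognormal(0.0, 1.0, n)  # mg/m^3, satellite-derived
        # Toy "true" surface-ocean pCO2 relationship plus measurement noise (uatm)
        pco2 = (280 + 4.0 * sst - 2.0 * (salinity - 35)
                - 5.0 * np.log1p(chlorophyll) + rng.normal(0, 5, n))

        # Fit on the sparse observations, then predict wherever satellites observe
        features = np.column_stack([sst, salinity, chlorophyll])
        model = RandomForestRegressor(n_estimators=200, random_state=0)
        model.fit(features, pco2)
        print(model.predict([[18.0, 35.2, 0.4]]))  # pCO2 estimate, unsampled point

    Gap-filled pCO₂ maps of this kind, differenced against atmospheric concentrations, are what ultimately constrain the air-sea flux estimates the project aims to sharpen.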