More stories

  • Using plant biology to address climate change

    On April 11, MIT announced five multiyear flagship projects in the first-ever Climate Grand Challenges, a new initiative to tackle complex climate problems and deliver breakthrough solutions to the world as quickly as possible. This article is the fourth in a five-part series highlighting the most promising concepts to emerge from the competition and the interdisciplinary research teams behind them.

    The impact of our changing climate on agriculture and food security — and how contemporary agriculture contributes to climate change — is at the forefront of MIT’s multidisciplinary project “Revolutionizing agriculture with low-emissions, resilient crops.” The project is one of five flagship winners in the Climate Grand Challenges competition, and brings together researchers from the departments of Biology, Biological Engineering, Chemical Engineering, and Civil and Environmental Engineering.

    “Our team’s research seeks to address two connected challenges: first, the need to reduce the greenhouse gas emissions produced by agricultural fertilizer; second, the fact that the yields of many current agricultural crops will decrease, due to the effects of climate change on plant metabolism,” says the project’s faculty lead, Christopher Voigt, the Daniel I.C. Wang Professor in MIT’s Department of Biological Engineering. “We are pursuing six interdisciplinary projects that are each key to our overall goal of developing low-emissions methods for fertilizing plants that are bioengineered to be more resilient and productive in a changing climate.”

    Whitehead Institute members Mary Gehring and Jing-Ke Weng, plant biologists who are also associate professors in MIT’s Department of Biology, will lead two of those projects.

    Promoting crop resilience

    For most of human history, climate change occurred gradually, over hundreds or thousands of years. That pace allowed plants to adapt to variations in temperature, precipitation, and atmospheric composition. However, human-driven climate change has occurred much more quickly, and crop plants have suffered: Crop yields are down in many regions, as is seed protein content in cereal crops.

    “If we want to ensure an abundant supply of nutritious food for the world, we need to develop fundamental mechanisms for bioengineering a wide variety of crop plants that will be both hardy and nutritious in the face of our changing climate,” says Gehring. In her previous work, she has shown that many aspects of plant reproduction and seed development are controlled by epigenetics — that is, by information outside of the DNA sequence. She has been using that knowledge and the research methods she has developed to identify ways to create varieties of seed-producing plants that are more productive and resilient than current food crops.

    But plant biology is complex, and while it is possible to develop plants that integrate robustness-enhancing traits by combining dissimilar parental strains, scientists are still learning how to ensure that the new traits are carried forward from one generation to the next. “Plants that carry the robustness-enhancing traits have ‘hybrid vigor,’ and we believe that the perpetuation of those traits is controlled by epigenetics,” Gehring explains. “Right now, some food crops, like corn, can be engineered to benefit from hybrid vigor, but those traits are not inherited. That’s why farmers growing many of today’s most productive varieties of corn must purchase and plant new batches of seeds each year. Moreover, many important food crops have not yet realized the benefits of hybrid vigor.”

    The project Gehring leads, “Developing Clonal Seed Production to Fix Hybrid Vigor,” aims to enable food crop plants to create seeds that are both more robust and genetically identical to the parent — and thereby able to pass beneficial traits from generation to generation.

    The process of clonal (or asexual) production of seeds that are genetically identical to the maternal parent is called apomixis. Gehring says, “Because apomixis is present in 400 flowering plant species — about 1 percent of flowering plant species — it is probable that genes and signaling pathways necessary for apomixis are already present within crop plants. Our challenge is to tweak those genes and pathways so that the plant switches reproduction from sexual to asexual.”

    The project will leverage the fact that genes and pathways related to autonomous asexual development of the endosperm — a seed’s nutritive tissue — exist in the model plant Arabidopsis thaliana. In previous work on Arabidopsis, Gehring’s lab researched a specific gene that, when misregulated, drives development of an asexual endosperm-like material. “Normally, that seed would not be viable,” she notes. “But we believe that by epigenetic tuning of the expression of additional relevant genes, we will enable the plant to retain that material — and help achieve apomixis.”

    If Gehring and her colleagues succeed in creating a gene-expression “formula” for introducing endosperm apomixis into a wide range of crop plants, they will have made a fundamental and important achievement. Such a method could be applied throughout agriculture to create and perpetuate new crop breeds able to withstand their changing environments while requiring less fertilizer and fewer pesticides.

    Creating “self-fertilizing” crops

    Roughly a quarter of greenhouse gas (GHG) emissions in the United States are a product of agriculture. Fertilizer production and use account for one third of those emissions and include nitrous oxide, which has 298 times the heat-trapping capacity of carbon dioxide, according to a 2018 Frontiers in Plant Science study. Most artificial fertilizer production also consumes huge quantities of natural gas and uses minerals mined from nonrenewable resources. After all that, much of the nitrogen fertilizer becomes runoff that pollutes local waterways. For those reasons, this Climate Grand Challenges flagship project aims to greatly reduce use of human-made fertilizers.
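
    The heat-trapping comparison above is what makes nitrous oxide reductions so valuable: emissions of different gases are put on a common scale by multiplying by a global warming potential (GWP). A minimal sketch of that conversion, using the 298-fold figure cited above (the emission quantity is illustrative, not a number from the study):

```python
# Convert a mass of N2O to CO2-equivalent mass using its global
# warming potential (the 298x heat-trapping figure cited above).
GWP_N2O = 298  # kg CO2-equivalent per kg N2O

def co2_equivalent(n2o_tonnes: float, gwp: float = GWP_N2O) -> float:
    """Return the CO2-equivalent tonnage for a given mass of N2O."""
    return n2o_tonnes * gwp

# Illustrative only: 1,000 tonnes of N2O carries the warming
# impact of about 298,000 tonnes of CO2.
print(co2_equivalent(1000))  # 298000
```

On this scale, even modest cuts in fertilizer-derived nitrous oxide translate into large CO2-equivalent reductions.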

    One tantalizing approach is to cultivate cereal crop plants — which account for about 75 percent of global food production — capable of drawing nitrogen from metabolic interactions with bacteria in the soil. Whitehead Institute’s Weng leads an effort to do just that: genetically bioengineer crops such as corn, rice, and wheat to, essentially, create their own fertilizer through a symbiotic relationship with nitrogen-fixing microbes.

    “Legumes such as bean and pea plants can form root nodules through which they receive nitrogen from rhizobia bacteria in exchange for carbon,” Weng explains. “This metabolic exchange means that legumes release far less greenhouse gas — and require far less investment of fossil energy — than do cereal crops, which use a huge portion of the artificially produced nitrogen fertilizers employed today.

    “Our goal is to develop methods for transferring legumes’ ‘self-fertilizing’ capacity to cereal crops,” Weng says. “If we can, we will revolutionize the sustainability of food production.”

    The project — formally entitled “Mimicking legume-rhizobia symbiosis for fertilizer production in cereals” — will be a multistage, five-year effort. It draws on Weng’s extensive studies of metabolic evolution in plants and his identification of molecules involved in formation of the root nodules that permit exchanges between legumes and nitrogen-fixing bacteria. It also leverages his expertise in reconstituting specific signaling and metabolic pathways in plants.

    Weng and his colleagues will begin by deciphering the full spectrum of small-molecule signaling processes that occur between legumes and rhizobium bacteria. Then they will genetically engineer an analogous system in nonlegume crop plants. Next, using state-of-the-art metabolomic methods, they will identify which small molecules excreted from legume roots prompt a nitrogen/carbon exchange from rhizobium bacteria. Finally, the researchers will genetically engineer the biosynthesis of those molecules in the roots of nonlegume plants and observe their effect on the rhizobium bacteria surrounding the roots.

    While the project is complex and technically challenging, its potential is staggering. “Focusing on corn alone, this could reduce the production and use of nitrogen fertilizer by 160,000 tons,” Weng notes. “And it could halve the related emissions of nitrous oxide gas.”

  • MIT engineers introduce the Oreometer

    When you twist open an Oreo cookie to get to the creamy center, you’re mimicking a standard test in rheology — the study of how a non-Newtonian material flows when twisted, pressed, or otherwise stressed. MIT engineers have now subjected the sandwich cookie to rigorous materials tests to get to the center of a tantalizing question: Why does the cookie’s cream stick to just one wafer when twisted apart?

    “There’s the fascinating problem of trying to get the cream to distribute evenly between the two wafers, which turns out to be really hard,” says Max Fan, an undergraduate in MIT’s Department of Mechanical Engineering.

    In pursuit of an answer, the team subjected cookies to standard rheology tests in the lab and found that no matter the flavor or amount of stuffing, the cream at the center of an Oreo almost always sticks to one wafer when twisted open. Only for older boxes of cookies does the cream sometimes separate more evenly between both wafers.

    The researchers also measured the torque required to twist open an Oreo, and found it to be similar to the torque required to turn a doorknob and about 1/10th what’s needed to twist open a bottlecap. The cream’s failure stress — i.e. the force per area required to get the cream to flow, or deform — is twice that of cream cheese and peanut butter, and about the same magnitude as mozzarella cheese. Judging from the cream’s response to stress, the team classifies its texture as “mushy,” rather than brittle, tough, or rubbery.

    So, why does the cookie’s cream glom to one side rather than splitting evenly between both? The manufacturing process may be to blame.

    “Videos of the manufacturing process show that they put the first wafer down, then dispense a ball of cream onto that wafer before putting the second wafer on top,” says Crystal Owens, an MIT mechanical engineering PhD candidate who studies the properties of complex fluids. “Apparently that little time delay may make the cream stick better to the first wafer.”

    The team’s study isn’t simply a sweet diversion from bread-and-butter research; it’s also an opportunity to make the science of rheology accessible to others. To that end, the researchers have designed a 3D-printable “Oreometer” — a simple device that firmly grasps an Oreo cookie and uses pennies and rubber bands to apply a controlled torque that progressively twists the cookie open. Instructions for the tabletop device are available online.

    The new study, “On Oreology, the fracture and flow of ‘milk’s favorite cookie,’” appears today in Kitchen Flows, a special issue of the journal Physics of Fluids. It was conceived of early in the Covid-19 pandemic, when many scientists’ labs were closed or difficult to access. In addition to Owens and Fan, co-authors are mechanical engineering professors Gareth McKinley and A. John Hart.

    Confection connection

    A standard test in rheology places a fluid, slurry, or other flowable material onto the base of an instrument known as a rheometer. A parallel plate above the base can be lowered onto the test material. The plate is then twisted as sensors track the applied rotation and torque.

    Owens, who regularly uses a laboratory rheometer to test fluid materials such as 3D-printable inks, couldn’t help noting a similarity with sandwich cookies. As she writes in the new study:

    “Scientifically, sandwich cookies present a paradigmatic model of parallel plate rheometry in which a fluid sample, the cream, is held between two parallel plates, the wafers. When the wafers are counter-rotated, the cream deforms, flows, and ultimately fractures, leading to separation of the cookie into two pieces.”

    While Oreo cream may not appear to possess fluid-like properties, it is considered a “yield stress fluid” — a soft solid when unperturbed that can start to flow under enough stress, the way toothpaste, frosting, certain cosmetics, and concrete do.

    Curious as to whether others had explored the connection between Oreos and rheology, Owens found mention of a 2016 Princeton University study in which physicists first reported that indeed, when twisting Oreos by hand, the cream almost always came off on one wafer.

    “We wanted to build on this to see what actually causes this effect and if we could control it if we mounted the Oreos carefully onto our rheometer,” she says.

    Cookie twist

    In an experiment that they would repeat for multiple cookies of various fillings and flavors, the researchers glued an Oreo to both the top and bottom plates of a rheometer and applied varying degrees of torque and angular rotation, noting the values that successfully twisted each cookie apart. They plugged the measurements into equations to calculate the cream’s viscoelasticity, or flowability. For each experiment, they also noted the cream’s “post-mortem distribution,” or where the cream ended up after twisting open.
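
    The torque-to-stress step in such a measurement can be sketched in a few lines. This is a schematic, not the study’s actual analysis: it assumes the stress is uniform across the wafer at the moment of yielding (a perfectly plastic approximation, giving tau = 3T / (2*pi*R^3)), and the torque and radius values below are illustrative:

```python
import math

def failure_stress(torque_nm: float, radius_m: float) -> float:
    """Estimate the yield (failure) stress in Pa of a sample twisted
    between parallel plates, assuming uniform stress over the plate
    at yield: tau = 3*T / (2*pi*R^3)."""
    return 3 * torque_nm / (2 * math.pi * radius_m ** 3)

# Illustrative numbers: ~0.1 N*m of torque on a wafer of ~22.5 mm radius.
tau = failure_stress(0.1, 0.0225)
print(f"{tau / 1000:.1f} kPa")  # ~4.2 kPa
```

Doubling the applied torque doubles the inferred stress, while a small increase in wafer radius lowers it sharply, since the radius enters cubed.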

    In all, the team went through about 20 boxes of Oreos, including regular, Double Stuf, and Mega Stuf levels of filling, and regular, dark chocolate, and “golden” wafer flavors. Surprisingly, they found that no matter the amount of cream filling or flavor, the cream almost always separated onto one wafer.

    “We had expected an effect based on size,” Owens says. “If there was more cream between layers, it should be easier to deform. But that’s not actually the case.”

    Curiously, when they mapped each cookie’s result to its original position in the box, they noticed the cream tended to stick to the inward-facing wafer: Cookies on the left side of the box twisted such that the cream ended up on the right wafer, whereas cookies on the right side separated with cream mostly on the left wafer. They suspect this box distribution may be a result of post-manufacturing environmental effects, such as heating or jostling that may cause cream to peel slightly away from the outer wafers, even before twisting.

    The understanding gained from the properties of Oreo cream could potentially be applied to the design of other complex fluid materials.

    “My 3D printing fluids are in the same class of materials as Oreo cream,” she says. “So, this new understanding can help me better design ink when I’m trying to print flexible electronics from a slurry of carbon nanotubes, because they deform in almost exactly the same way.”

    As for the cookie itself, she suggests that if the inside of Oreo wafers were more textured, the cream might grip better onto both sides and split more evenly when twisted.

    “As they are now, we found there’s no trick to twisting that would split the cream evenly,” Owens concludes.

    This research was supported, in part, by the MIT UROP program and by the National Defense Science and Engineering Graduate Fellowship Program.

  • Looking forward to forecast the risks of a changing climate

    On April 11, MIT announced five multiyear flagship projects in the first-ever Climate Grand Challenges, a new initiative to tackle complex climate problems and deliver breakthrough solutions to the world as quickly as possible. This article is the third in a five-part series highlighting the most promising concepts to emerge from the competition, and the interdisciplinary research teams behind them.

    Extreme weather events that were once considered rare have become noticeably less so, from intensifying hurricane activity in the North Atlantic to wildfires generating massive clouds of ozone-damaging smoke. But current climate models are unprepared when it comes to estimating the risk that these increasingly extreme events pose — and without adequate modeling, governments are left unable to take necessary precautions to protect their communities.

    MIT Department of Earth, Atmospheric and Planetary Sciences (EAPS) Professor Paul O’Gorman researches this trend by studying how climate affects the atmosphere and incorporating what he learns into climate models to improve their accuracy. One particular focus for O’Gorman has been changes in extreme precipitation and midlatitude storms that hit areas like New England.

    “These extreme events are having a lot of impact, but they’re also difficult to model or study,” he says. Seeing the pressing need for better climate models that can be used to develop preparedness plans and climate change mitigation strategies, O’Gorman and collaborators Kerry Emanuel, the Cecil and Ida Green Professor of Atmospheric Science in EAPS, and Miho Mazereeuw, associate professor in MIT’s Department of Architecture, are leading an interdisciplinary group of scientists, engineers, and designers to tackle this problem with their MIT Climate Grand Challenges flagship project, “Preparing for a new world of weather and climate extremes.”

    “We know already from observations and from climate model predictions that weather and climate extremes are changing and will change more,” O’Gorman says. “The grand challenge is preparing for those changing extremes.”

    Their proposal is one of five flagship projects recently announced by the MIT Climate Grand Challenges initiative — an Institute-wide effort catalyzing novel research and engineering innovations to address the climate crisis. Selected from a field of almost 100 submissions, the team will receive additional funding and exposure to help accelerate and scale their project goals. Other MIT collaborators on the proposal include researchers from the School of Engineering, the School of Architecture and Planning, the Office of Sustainability, the Center for Global Change Science, and the Institute for Data, Systems and Society.

    Weather risk modeling

    Fifteen years ago, Kerry Emanuel developed a simple hurricane model. It was based on physics equations, rather than statistics, and could run in real time, making it useful for risk assessment. Emanuel wondered if similar models could be used for long-term risk assessment of other hazards, such as changes in extreme weather driven by climate change.

    “I discovered, somewhat to my surprise and dismay, that almost all extant estimates of long-term weather risks in the United States are based not on physical models, but on historical statistics of the hazards,” says Emanuel. “The problem with relying on historical records is that they’re too short; while they can help estimate common events, they don’t contain enough information to make predictions for more rare events.”

    Another limitation of weather risk models that rely heavily on statistics: They have a built-in assumption that the climate is static.

    “Historical records rely on the climate at the time they were recorded; they can’t say anything about how hurricanes grow in a warmer climate,” says Emanuel. The models rely on fixed relationships between events; they assume that hurricane activity will stay the same, even while science is showing that warmer temperatures will most likely push typical hurricane activity beyond the tropics and into a much wider band of latitudes.

    As a flagship project, the goal is to eliminate this reliance on the historical record by emphasizing physical principles (e.g., the laws of thermodynamics and fluid mechanics) in next-generation models. The downside to this is that there are many variables that have to be included. Not only are there planetary-scale systems to consider, such as the global circulation of the atmosphere, but there are also small-scale, extremely localized events, like thunderstorms, that influence predictive outcomes.

    Trying to compute all of these at once is costly and time-consuming — and the results often can’t tell you the risk in a specific location. But there is a way to correct for this: “What’s done is to use a global model, and then use a method called downscaling, which tries to infer what would happen on very small scales that aren’t properly resolved by the global model,” explains O’Gorman. The team hopes to improve downscaling techniques so that they can be used to calculate the risk of very rare but impactful weather events.

    Global climate models, or general circulation models (GCMs), Emanuel explains, are constructed a bit like a jungle gym. Like the playground bars, the Earth is sectioned in an interconnected three-dimensional framework — only it’s divided into sections of 100 to 200 square kilometers at a time. Each node comprises a set of computations for characteristics like wind, rainfall, atmospheric pressure, and temperature within its bounds; the outputs of each node are connected to its neighbor. This framework is useful for creating a big picture idea of Earth’s climate system, but if you tried to zoom in on a specific location — like, say, to see what’s happening in Miami or Mumbai — the connecting nodes are too far apart to make predictions on anything specific to those areas.

    Scientists work around this problem by using downscaling. They use the same blueprint of the jungle gym, but within the nodes they weave a mesh of smaller features, incorporating equations for things like topography and vegetation or regional meteorological models to fill in the blanks. By creating a finer mesh over smaller areas they can predict local effects without needing to run the entire global model.
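
    The coarse-to-fine idea can be illustrated with a toy downscaling step: interpolate a coarse model field onto a finer mesh, then apply a local physical correction — here, adjusting temperature for fine-scale terrain using a standard atmospheric lapse rate. This is a schematic sketch of the concept only, not any group’s production method, and all the numbers are illustrative:

```python
def downscale_temperature(coarse_t, coarse_elev, fine_elev, lapse_rate=6.5e-3):
    """Toy 1-D downscaling: linearly interpolate a coarse temperature
    field onto a finer grid, then correct each fine cell for how far
    its real terrain sits above the smoothed coarse terrain, using a
    typical lapse rate (~6.5 degC per km of elevation)."""
    n_coarse = len(coarse_t)
    n_fine = len(fine_elev)
    result = []
    for i in range(n_fine):
        # Position of this fine cell in coarse-grid coordinates.
        x = i * (n_coarse - 1) / (n_fine - 1)
        j = min(int(x), n_coarse - 2)
        frac = x - j
        t = coarse_t[j] * (1 - frac) + coarse_t[j + 1] * frac
        e = coarse_elev[j] * (1 - frac) + coarse_elev[j + 1] * frac
        result.append(t - lapse_rate * (fine_elev[i] - e))
    return result

# Three coarse cells downscaled onto nine fine cells (illustrative data).
coarse_t = [15.0, 14.0, 12.0]          # deg C
coarse_elev = [200.0, 400.0, 800.0]    # m, smoothed terrain
fine_elev = [200, 350, 300, 400, 500, 450, 800, 900, 800]  # m, real terrain
print([round(t, 2) for t in downscale_temperature(coarse_t, coarse_elev, fine_elev)])
```

Real downscaling weaves in far more physics (topography, vegetation, regional meteorology), but the pattern is the same: a cheap refinement of the coarse field plus local corrections, instead of rerunning the whole global model at high resolution.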

    Of course, even this finer-resolution solution has its trade-offs. Nesting models within models can give a clearer picture of what’s happening in a specific region, but crunching all that data at once remains a computing challenge. The cost shows up as expense and time, or as predictions limited to shorter windows: where GCMs can be run over decades or centuries, a particularly complex local model may be restricted to timescales of just a few years at a time.

    “I’m afraid that most of the downscaling at present is brute force, but I think there’s room to do it in better ways,” says Emanuel, who sees the problem of finding new and novel methods of achieving this goal as an intellectual challenge. “I hope that through the Grand Challenges project we might be able to get students, postdocs, and others interested in doing this in a very creative way.”

    Adapting to weather extremes for cities and renewable energy

    Improving climate modeling is more than a scientific exercise in creativity, however. There’s a very real application for models that can accurately forecast risk in localized regions.

    Another problem is that progress in climate modeling has not kept up with the need for climate mitigation plans, especially in some of the most vulnerable communities around the globe.

    “It is critical for stakeholders to have access to this data for their own decision-making process. Every community is composed of a diverse population with diverse needs, and each locality is affected by extreme weather events in unique ways,” says Mazereeuw, the director of the MIT Urban Risk Lab. 

    A key piece of the team’s project is building on partnerships the Urban Risk Lab has developed with several cities to test their models once they have a usable product up and running. The cities were selected based on their vulnerability to increasing extreme weather events, such as tropical cyclones in Broward County, Florida, and Toa Baja, Puerto Rico, and extratropical storms in Boston, Massachusetts, and Cape Town, South Africa.

    In their proposal, the team outlines a variety of deliverables that the cities can ultimately use in their climate change preparations, with ideas such as online interactive platforms and workshops with stakeholders — such as local governments, developers, nonprofits, and residents — to learn directly what specific tools they need for their local communities. By doing so, they can craft plans addressing different scenarios in their region, involving events such as sea-level rise or heat waves, while also providing information and means of developing adaptation strategies for infrastructure under these conditions that will be the most effective and efficient for them.

    “We are acutely aware of the inequity of resources both in mitigating impacts and recovering from disasters. Working with diverse communities through workshops allows us to engage a lot of people, listen, discuss, and collaboratively design solutions,” says Mazereeuw.

    By the end of five years, the team is hoping that they’ll have better risk assessment and preparedness tool kits, not just for the cities that they’re partnering with, but for others as well.

    “MIT is well-positioned to make progress in this area,” says O’Gorman, “and I think it’s an important problem where we can make a difference.”

  • Structures considered key to gene expression are surprisingly fleeting

    In human chromosomes, DNA is coated by proteins to form an exceedingly long beaded string. This “string” is folded into numerous loops, which are believed to help cells control gene expression and facilitate DNA repair, among other functions. A new study from MIT suggests that these loops are very dynamic and shorter-lived than previously thought.

    In the new study, the researchers were able to monitor the movement of one stretch of the genome in a living cell for about two hours. They saw that this stretch was fully looped for only 3 to 6 percent of the time, with the loop lasting for only about 10 to 30 minutes. The findings suggest that scientists’ current understanding of how loops influence gene expression may need to be revised, the researchers say.

    “Many models in the field have been these pictures of static loops regulating these processes. What our new paper shows is that this picture is not really correct,” says Anders Sejr Hansen, the Underwood-Prescott Career Development Assistant Professor of Biological Engineering at MIT. “We suggest that the functional state of these domains is much more dynamic.”

    Hansen is one of the senior authors of the new study, along with Leonid Mirny, a professor in MIT’s Institute for Medical Engineering and Science and the Department of Physics, and Christoph Zechner, a group leader at the Max Planck Institute of Molecular Cell Biology and Genetics in Dresden, Germany, and the Center for Systems Biology Dresden. MIT postdoc Michele Gabriele, recent Harvard University PhD recipient Hugo Brandão, and MIT graduate student Simon Grosse-Holz are the lead authors of the paper, which appears today in Science.

    Out of the loop

    Using computer simulations and experimental data, scientists including Mirny’s group at MIT have shown that loops in the genome are formed by a process called extrusion, in which a molecular motor promotes the growth of progressively larger loops. The motor stops each time it encounters a “stop sign” on DNA. The motor that extrudes such loops is a protein complex called cohesin, while the DNA-bound protein CTCF serves as the stop sign. These cohesin-mediated loops between CTCF sites were seen in previous experiments.
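
    The extrusion mechanism described above can be caricatured in a few lines: cohesin loads at some position, and its two anchors walk outward until each reaches a CTCF “stop sign.” A toy sketch of the concept — positions and site locations are arbitrary, and real extrusion is stochastic and far more regulated than this deterministic version:

```python
import random

def extrude_loop(genome_length, ctcf_sites, start=None, rng=random):
    """Toy loop extrusion: cohesin loads at `start` (random if None),
    then its left and right anchors move outward one position at a
    time, each halting at the first CTCF site it reaches (or at the
    end of the chromosome)."""
    if start is None:
        start = rng.randrange(genome_length)
    left = right = start
    stops = set(ctcf_sites)
    while left > 0 and left not in stops:
        left -= 1
    while right < genome_length - 1 and right not in stops:
        right += 1
    return left, right  # final loop anchors

# Deterministic example: loading at position 50 between two CTCF sites.
print(extrude_loop(100, ctcf_sites=[20, 80], start=50))  # (20, 80)
```

In this cartoon the loop always ends anchored at the flanking CTCF sites; the new study’s point is that, in living cells, that fully extruded state is reached only transiently.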

    However, those experiments only offered a snapshot of a moment in time, with no information on how the loops change over time. In their new study, the researchers developed techniques that allowed them to fluorescently label CTCF DNA sites so they could image the DNA loops over several hours. They also created a new computational method that can infer the looping events from the imaging data.

    “This method was crucial for us to distinguish signal from noise in our experimental data and quantify looping,” Zechner says. “We believe that such approaches will become increasingly important for biology as we continue to push the limits of detection with experiments.”

    The researchers used their method to image a stretch of the genome in mouse embryonic stem cells. “If we put our data in the context of one cell division cycle, which lasts about 12 hours, the fully formed loop only actually exists for about 20 to 45 minutes, or about 3 to 6 percent of the time,” Grosse-Holz says.
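
    The percentages quoted here follow directly from the loop’s lifetime relative to the roughly 12-hour cell cycle; a quick check of the arithmetic:

```python
CELL_CYCLE_MIN = 12 * 60  # ~12-hour cell division cycle, in minutes

def fraction_looped(loop_minutes: float) -> float:
    """Fraction of the cell cycle spent in the fully looped state."""
    return loop_minutes / CELL_CYCLE_MIN

# 20 to 45 minutes of a 720-minute cycle is roughly 3 to 6 percent.
print(f"{fraction_looped(20):.1%} to {fraction_looped(45):.1%}")
```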

    “If the loop is only present for such a tiny period of the cell cycle and very short-lived, we shouldn’t think of this fully looped state as being the primary regulator of gene expression,” Hansen says. “We think we need new models for how the 3D structure of the genome regulates gene expression, DNA repair, and other functional downstream processes.”

    While fully formed loops were rare, the researchers found that partially extruded loops were present about 92 percent of the time. These smaller loops have been difficult to observe with the previous methods of detecting loops in the genome.

    “In this study, by integrating our experimental data with polymer simulations, we have now been able to quantify the relative extents of the unlooped, partially extruded, and fully looped states,” Brandão says.

    “Since these interactions are very short, but very frequent, the previous methodologies were not able to fully capture their dynamics,” Gabriele adds. “With our new technique, we can start to resolve transitions between fully looped and unlooped states.”

    The researchers hypothesize that these partial loops may play more important roles in gene regulation than fully formed loops. Strands of DNA run along each other as loops begin to form and then fall apart, and these interactions may help regulatory elements such as enhancers and gene promoters find each other.

    “More than 90 percent of the time, there are some transient loops, and presumably what’s important is having those loops that are being perpetually extruded,” Mirny says. “The process of extrusion itself may be more important than the fully looped state that only occurs for a short period of time.”

    More loops to study

    Since most of the other loops in the genome are weaker than the one the researchers studied in this paper, they suspect that many other loops will also prove to be highly transient. They now plan to use their new technique to study some of those other loops, in a variety of cell types.

    “There are about 10,000 of these loops, and we’ve looked at one,” Hansen says. “We have a lot of indirect evidence to suggest that the results would be generalizable, but we haven’t demonstrated that. Using the technology platform we’ve set up, which combines new experimental and computational methods, we can begin to approach other loops in the genome.”

    The researchers also plan to investigate the role of specific loops in disease. Many diseases, including a neurodevelopmental disorder called FOXG1 syndrome, could be linked to faulty loop dynamics. The researchers are now studying how both the normal and mutated form of the FOXG1 gene, as well as the cancer-causing gene MYC, are affected by genome loop formation.

    The research was funded by the National Institutes of Health, the National Science Foundation, the Mathers Foundation, a Pew-Stewart Cancer Research Scholar grant, the Chaires d’excellence Internationale Blaise Pascal, an American-Italian Cancer Foundation research scholarship, and the Max Planck Institute for Molecular Cell Biology and Genetics.

  • Computing our climate future

    On Monday, MIT announced five multiyear flagship projects in the first-ever Climate Grand Challenges, a new initiative to tackle complex climate problems and deliver breakthrough solutions to the world as quickly as possible. This article is the first in a five-part series highlighting the most promising concepts to emerge from the competition, and the interdisciplinary research teams behind them.

    With improvements to computer processing power and an increased understanding of the physical equations governing the Earth’s climate, scientists are continually working to refine climate models and improve their predictive power. But the tools they’re refining were originally conceived decades ago with only scientists in mind. When it comes to developing tangible climate action plans, these models remain inscrutable to the policymakers, public safety officials, civil engineers, and community organizers who need their predictive insight most.

    “What you end up having is a gap between what’s typically used in practice, and the real cutting-edge science,” says Noelle Selin, a professor in the Institute for Data, Systems and Society and the Department of Earth, Atmospheric and Planetary Sciences (EAPS), and co-lead with Professor Raffaele Ferrari on the MIT Climate Grand Challenges flagship project “Bringing Computation to the Climate Crisis.” “How can we use new computational techniques, new understandings, new ways of thinking about modeling, to really bridge that gap between state-of-the-art scientific advances and modeling, and people who are actually needing to use these models?”

    Driven by this question, the team isn’t just trying to refine current climate models; they’re building a new one from the ground up.

    This kind of game-changing advancement is exactly what the MIT Climate Grand Challenges is looking for, which is why the proposal has been named one of the five flagship projects in the ambitious Institute-wide program aimed at tackling the climate crisis. The proposal, which was selected from almost 100 submissions and was among 27 finalists, will receive additional funding and support to further the team’s goal of reimagining the climate modeling system. It also brings together contributors from across the Institute, including the MIT Schwarzman College of Computing, the School of Engineering, and the Sloan School of Management.

    When it comes to pursuing high-impact climate solutions that communities around the world can use, “it’s great to do it at MIT,” says Ferrari, EAPS Cecil and Ida Green Professor of Oceanography. “You’re not going to find many places in the world where you have the cutting-edge climate science, the cutting-edge computer science, and the cutting-edge policy science experts that we need to work together.”

    The climate model of the future

    The proposal builds on work that Ferrari began three years ago as part of a joint project with Caltech, the Naval Postgraduate School, and NASA’s Jet Propulsion Lab. Called the Climate Modeling Alliance (CliMA), the consortium of scientists, engineers, and applied mathematicians is constructing a climate model capable of more accurately projecting future changes in critical variables, such as clouds in the atmosphere and turbulence in the ocean, with uncertainties at least half the size of those in existing models.

    To do this, however, requires a new approach. For one thing, current models are too coarse in resolution — at the 100-to-200-kilometer scale — to resolve small-scale processes like cloud cover, rainfall, and sea ice extent. But also, explains Ferrari, part of this limitation in resolution is due to the fundamental architecture of the models themselves. The languages most global climate models are coded in were first created back in the 1960s and ’70s, largely by scientists for scientists. Since then, advances in computing driven by the corporate world and computer gaming have given rise to dynamic new computer languages, powerful graphics processing units, and machine learning.

    For climate models to take full advantage of these advancements, there’s only one option: starting over with a modern, more flexible language. Written in Julia, part of the MIT Julia Lab’s scientific machine learning technology, and spearheaded by Alan Edelman, a professor of applied mathematics in MIT’s Department of Mathematics, CliMA will be able to harness far more data than the current models can handle.

    “It’s been real fun finally working with people in computer science here at MIT,” Ferrari says. “Before it was impossible, because traditional climate models are in a language their students can’t even read.”

    The result is what’s being called the “Earth digital twin,” a climate model that can simulate global conditions on a large scale. This on its own is an impressive feat, but the team wants to take this a step further with their proposal.

    “We want to take this large-scale model and create what we call an ‘emulator’ that is only predicting a set of variables of interest, but it’s been trained on the large-scale model,” Ferrari explains. Emulators are not new technology, but what is new is that these emulators, being referred to as the “Earth digital cousins,” will take advantage of machine learning.

    “Now we know how to train a model if we have enough data to train them on,” says Ferrari. Machine learning for projects like this has only become possible in recent years as more observational data become available, along with improved computer processing power. The goal is to create smaller, more localized models by training them using the Earth digital twin. Doing so will save time and money, which is key if the digital cousins are going to be usable for stakeholders, like local governments and private-sector developers.
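    The emulator idea can be pictured with a toy sketch: treat the expensive “large model” as a black-box function, run it once on a coarse grid of inputs, and answer later queries from a cheap surrogate trained on those runs. Everything below (the logarithmic warming function, the interpolation scheme) is invented for illustration and stands in for CliMA’s far richer machine-learning machinery:

```python
import math

def large_model(co2_ppm):
    """Stand-in for an expensive global simulation: equilibrium warming
    (deg C) as a logarithmic function of CO2 concentration (illustrative)."""
    return 3.0 * math.log2(co2_ppm / 280.0)

# "Training": run the expensive model once on a coarse grid of inputs.
grid = list(range(280, 841, 40))
table = [(x, large_model(x)) for x in grid]

def emulator(co2_ppm):
    """Cheap surrogate: linear interpolation in the precomputed table."""
    for (x0, y0), (x1, y1) in zip(table, table[1:]):
        if x0 <= co2_ppm <= x1:
            t = (co2_ppm - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)
    raise ValueError("outside trained range")

# The emulator answers instantly and closely tracks the large model.
print(round(emulator(560), 2), round(large_model(560), 2))  # → 3.0 3.0
```

    The design point is that the expensive model is evaluated only during “training”; every later query hits the cheap surrogate, which is what makes locally tailored, real-time use plausible.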

    Adaptable predictions for average stakeholders

    When it comes to setting climate-informed policy, stakeholders need to understand the probability of an outcome within their own regions — in the same way that you would prepare for a hike differently if there’s a 10 percent chance of rain versus a 90 percent chance. The smaller Earth digital cousin models will be able to do things the larger model can’t do, like simulate local regions in real time and provide a wider range of probabilistic scenarios.

    “Right now, if you wanted to use output from a global climate model, you usually would have to use output that’s designed for general use,” says Selin, who is also the director of the MIT Technology and Policy Program. With the project, the team can take end-user needs into account from the very beginning while also incorporating their feedback and suggestions into the models, helping to “democratize the idea of running these climate models,” as she puts it. Doing so means building an interactive interface that eventually will give users the ability to change input values and run the new simulations in real time. The team hopes that, eventually, the Earth digital cousins could run on something as ubiquitous as a smartphone, although developments like that are currently beyond the scope of the project.

    The next thing the team will work on is building connections with stakeholders. Through participation of other MIT groups, such as the Joint Program on the Science and Policy of Global Change and the Climate and Sustainability Consortium, they hope to work closely with policymakers, public safety officials, and urban planners to give them predictive tools tailored to their needs that can provide actionable outputs important for planning. Faced with rising sea levels, for example, coastal cities could better visualize the threat and make informed decisions about infrastructure development and disaster preparedness; communities in drought-prone regions could develop long-term civil planning with an emphasis on water conservation and wildfire resistance.

    “We want to make the modeling and analysis process faster so people can get more direct and useful feedback for near-term decisions,” she says.

    The final piece of the challenge is to incentivize students now so that they can join the project and make a difference. Ferrari has already had luck garnering student interest after co-teaching a class with Edelman and seeing the enthusiasm students have about computer science and climate solutions.

    “We’re intending in this project to build a climate model of the future,” says Selin. “So it seems really appropriate that we would also train the builders of that climate model.”

  • Architecture isn’t just for humans anymore

    In a rural valley of northwestern Nevada, home to stretches of wetlands, sagebrush-grassland, and dozens of natural springs, is a 3,800-acre parcel of off-grid land known as Fly Ranch. Owned by Burning Man, the community that each year transforms the neighboring playa into a colorful free-wheeling temporary city, Fly Ranch is part of a long-term project to extend the festival’s experimental ethos beyond the one-week event.

    For recent MIT alumni Zhicheng Xu MArch ’22 and Mengqi Moon He SMArchS ’20, Fly Ranch presented a new challenge. Xu and He, who have backgrounds in landscape design, urbanism, and architecture, had been in the process of researching the use of timber as a building material, and thought the competition would be a good opportunity to experiment and showcase some of their initial research. “But because of our MIT education, we approached the problem with a very critical lens,” says Xu. “We were asking ourselves: Who are we designing for? What do we mean by shelter? Sheltering whom?”

    Architecture for other-than-human worlds

    Their winning proposal, “Lodgers,” selected among 185 entries and currently on view at the Wiesner Student Art Gallery, asks how to design a structure that will accommodate not only the land’s human inhabitants, but also the over 100 plant and animal species that call the desert home. In other words, what would an architecture look like that centered not only human needs, but also those of the broader ecosystem?

    Developing the project during the pandemic lockdowns, Xu and He pored over a long list of hundreds of local plants and animals — from red-tailed hawks to desert rats to bullfrogs — and designed the project with these species in mind. Combining new computational tools with the traditional Western Shoshone and Northern Paiute designs found in brush shelters and woven baskets, the thatched organic structures called “lodgers” feature bee towers, nesting platforms for birds, sugar-glazed logs for breeding beetle larvae, and composting toilets and environmental education classrooms for humans. 

    But it wasn’t until they visited Fly Ranch, in the spring of 2021, that Xu and He’s understanding of the project deepened. For several nights, they camped onsite with other competition finalists, alongside park rangers and longtime Burners, eating community meals together and learning first-hand the complexities of the desert. At one point during the trip, they were caught in a sandstorm while driving a trailer-load of supplies down a dirt road. The experience, they say, was an important lesson in humility, and how such extremes made the landscape what it was. “That’s why we later came to the term ‘coping with the friction’ because it’s always there,” He says. “There’s no solution.” Xu adds, “The different elements from the land — the water, the heat, the sound, the wind — are the elements we have to cope with in the project. Those little moments made us realize we need to reposition ourselves, stay humble, and try to understand the land.”

    Leave no trace

    While the deserts of the American West have long been vulnerable to human hubris — from large-scale military procedures to mining operations that have left deep scars on the landscape — Xu and He designed the “lodgers” to leave a light footprint. Instead of viewing buildings as permanent solutions, with the environment perceived as an obstacle to be overcome, Xu and He see their project as a “temporary inhabitant.” 

    To reduce carbon emissions, their goal was to adopt low-cost, low-tech, recycled materials that could be used without the need for special training or heavy equipment, so that the construction itself could be open to everyone in the community. In addition to scrap wood collected onsite, the project uses two-by-four lumber, among the most common and cheapest materials in American construction, and thatching for the facades created from the dry reeds and bulrush that grow abundantly in the region. If the structures are taken down, the use of renewable materials allows them to decompose naturally.

    Fly Ranch at MIT 

    Now, the MIT community has the opportunity to experience part of the Nevada desert — and be part of the process of participatory design. “We are very fortunate to be funded by the Council for the Arts at MIT,” says Xu. “With that funding, we were able to expand the team, so the format of the exhibition was more democratic than just designing and building.” With the help of their classmates Calvin Zhong ’18 and Wuyahuang Li SMArchS ’21, Xu and He have brought their proposal to life. The ambitious immersive installation includes architectural models, field recordings, projections, and artifacts such as the skeletons of turtles and fish collected at Fly Ranch. Inside the structure is a large communal table, where Xu and He hope to host workshops and conversations to encourage more dialogue and collaboration. Having learned from the design build, Xu and He are now collecting feedback from MIT professors and colleagues to bring the project to the next level. In the fall, they will debut the “lodgers” at the Lisbon Architectural Triennale, and soon hope to build a prototype at Fly Ranch itself.

    The structures, they hope, will inspire greater reflection on our entanglements with the other-than-human world, and the possibilities of an architecture designed to be impermanent. Humans, after all, are often only “occasional guests” in this landscape, and part of the greater cycles of emergence and decay. “To us, it’s a beautiful expression of how different species are entangled on the land. And us as humans is just another tiny piece in this entanglement,” says Xu. 

    Established as a gift from the MIT Class of 1983, the Wiesner Gallery honors the former president of MIT, Jerome Wiesner, for his support of the arts at the Institute. The gallery was fully renovated in fall 2016, thanks in part to the generosity of Harold ’44 and Arlene Schnitzer and the Council for the Arts at MIT, and now also serves as a central meeting space for MIT Student Arts Programming including the START Studio, Creative Arts Competition, Student Arts Advisory Board, and Arts Scholars. “Lodgers: Friction Between Neighbors” is on view in the Wiesner Student Art Gallery through April 29, and was funded in part by the Council for the Arts at MIT, a group of alumni and friends with a strong commitment to the arts and serving the MIT community.

  • Q&A: Climate Grand Challenges finalists on using data and science to forecast climate-related risk

    Note: This is the final article in a four-part interview series featuring the work of the 27 MIT Climate Grand Challenges finalist teams, which received a total of $2.7 million in startup funding to advance their projects. This month, the Institute will name a subset of the finalists as multiyear flagship projects.

    Advances in computation, artificial intelligence, robotics, and data science are enabling a new generation of observational tools and scientific modeling with the potential to produce timely, reliable, and quantitative analysis of future climate risks at a local scale. These projections can increase the accuracy and efficacy of early warning systems, improve emergency planning, and provide actionable information for climate mitigation and adaptation efforts, as human actions continue to change planetary conditions.

    In conversations prepared for MIT News, faculty from four Climate Grand Challenges teams with projects in the competition’s “Using data and science to forecast climate-related risk” category describe the promising new technologies that can help scientists understand the Earth’s climate system on a finer scale than ever before. (The other Climate Grand Challenges research themes include building equity and fairness into climate solutions, removing, managing, and storing greenhouse gases, and decarbonizing complex industries and processes.) The following responses have been edited for length and clarity.

    An observational system that can initiate a climate risk forecasting revolution

    Despite recent technological advances and massive volumes of data, climate forecasts remain highly uncertain. Gaps in observational capabilities create substantial challenges to predicting extreme weather events and establishing effective mitigation and adaptation strategies. R. John Hansman, the T. Wilson Professor of Aeronautics and Astronautics and director of the MIT International Center for Air Transportation, discusses the Stratospheric Airborne Climate Observatory System (SACOS) being developed together with Brent Minchew, the Cecil and Ida Green Career Development Professor in the Department of Earth, Atmospheric and Planetary Sciences (EAPS), and a team that includes researchers from MIT Lincoln Laboratory and Harvard University.

    Q: How does SACOS reduce uncertainty in climate risk forecasting?

    A: There is a critical need for higher spatial and temporal resolution observations of the climate system than are currently available through remote (satellite or airborne) and surface (in-situ) sensing. We are developing an ensemble of high-endurance, solar-powered aircraft with instrument systems capable of performing months-long climate observing missions that satellites or aircraft alone cannot fulfill. Summer months are ideal for SACOS operations, as many key climate phenomena are active and short night periods reduce the battery mass, vehicle size, and technical risks. These observations hold the potential to inform and predict, allowing emergency planners, policymakers, and the rest of society to better prepare for the changes to come.

    Q: Describe the types of observing missions where SACOS could provide critical improvements.

    A: The demise of the Antarctic Ice Sheet, which is leading to rising sea levels around the world and threatening the displacement of millions of people, is one example. Current sea level forecasts struggle to account for giant fissures that create massive icebergs and cause the Antarctic Ice Sheet to flow more rapidly into the ocean. SACOS can track these fissures to accurately forecast ice slippage and give impacted populations enough time to prepare or evacuate. Elsewhere, widespread droughts cause rampant wildfires and water shortages. SACOS has the ability to monitor soil moisture and humidity in critically dry regions to identify where and when wildfires and droughts are imminent. SACOS also offers the most effective method to measure, track, and predict local ozone depletion over North America, which has resulted in increasingly severe summer thunderstorms.

    Quantifying and managing the risks of sea-level rise

    Prevailing estimates of sea-level rise range from approximately 20 centimeters to 2 meters by the end of the century, with the associated costs on the order of trillions of dollars. The instability of certain portions of the world’s ice sheets creates vast uncertainties, complicating how the world prepares for and responds to these potential changes. EAPS Professor Brent Minchew is leading another Climate Grand Challenges finalist team working on an integrated, multidisciplinary effort to improve the scientific understanding of sea-level rise and provide actionable information and tools to manage the risks it poses.

    Q: What have been the most significant challenges to understanding the potential rates of sea-level rise?

    A: West Antarctica is one of the most remote, inaccessible, and hostile places on Earth — to people and equipment. Thus, opportunities to observe the collapse of the West Antarctic Ice Sheet, which contains enough ice to raise global sea levels by about 3 meters, are limited and current observations crudely resolved. It is essential that we understand how the floating edge of the ice sheets, often called ice shelves, fracture and collapse because they provide critical forces that govern the rate of ice mass loss and can stabilize the West Antarctic Ice Sheet.

    Q: How will your project advance what is currently known about sea-level rise?

    A: We aim to advance global-scale projections of sea-level rise through novel observational technologies and computational models of ice sheet change and to link those predictions to region- to neighborhood-scale estimates of costs and adaptation strategies. To do this, we propose two novel instruments: a first-of-its-kind drone that can fly for months at a time over Antarctica making continuous observations of critical areas and an airdropped seismometer and GPS bundle that can be deployed to vulnerable and hard-to-reach areas of the ice sheet. This technology will provide greater data quality and density and will observe the ice sheet at frequencies that are currently inaccessible — elements that are essential for understanding the physics governing the evolution of the ice sheet and sea-level rise.

    Changing flood risk for coastal communities in the developing world

    Globally, more than 600 million people live in low-elevation coastal areas that face an increasing risk of flooding from sea-level rise. This includes two-thirds of cities with populations of more than 5 million and regions that conduct the vast majority of global trade. Dara Entekhabi, the Bacardi and Stockholm Water Foundations Professor in the Department of Civil and Environmental Engineering and professor in the Department of Earth, Atmospheric, and Planetary Sciences, outlines an interdisciplinary partnership that leverages data and technology to guide short-term responses and chart long-term adaptation pathways, together with Miho Mazereeuw, associate professor of architecture and urbanism and director of the Urban Risk Lab in the School of Architecture and Planning, and Danielle Wood, assistant professor in the Program in Media Arts and Sciences and the Department of Aeronautics and Astronautics.

    Q: What is the key problem this program seeks to address?

    A: The accumulated heating of the Earth system due to fossil fuel burning is largely absorbed by the oceans, and the stored heat expands the ocean volume, leading to an increased base height for tides. When the high tides inundate a city, the condition is referred to as “sunny day” flooding, but the saline waters corrode infrastructure and wreak havoc on daily routines. The danger ahead for many coastal cities in the developing world is the combination of increasing high-tide intrusions and heavy precipitation storm events.

    Q: How will your proposed solutions impact flood risk management?

    A: We are producing detailed risk maps for coastal cities in developing countries using newly available, very high-resolution remote-sensing data from space-borne instruments, as well as historical tides records and regional storm characteristics. Using these datasets, we aim to produce street-by-street risk maps that provide local decision-makers and stakeholders with a way to estimate present and future flood risks. With the model of future tides and probabilistic precipitation events, we can forecast future inundation by a flooding event, decadal changes with various climate-change and sea-level rise projections, and an increase in the likelihood of sunny-day flooding. Working closely with local partners, we will develop toolkits to explore short-term emergency response, as well as long-term mitigation and adaptation techniques in six pilot locations in South and Southeast Asia, Africa, and South America.
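    The probabilistic forecasting described above can be pictured with a toy Monte Carlo model in which a day floods when high tide plus storm runoff exceeds a local elevation threshold. All distributions and numbers below are invented for illustration and are not calibrated to any real city:

```python
import random

random.seed(0)

def flooded(tide_mean=1.2, threshold_m=2.0):
    """One simulated day: flooding occurs when high tide plus storm-driven
    runoff exceeds a local elevation threshold (values illustrative)."""
    tide = random.gauss(tide_mean, 0.3)    # high-tide height, metres
    rain_runoff = random.expovariate(2.0)  # storm runoff contribution, metres
    return tide + rain_runoff > threshold_m

trials = 100_000
p_now = sum(flooded() for _ in range(trials)) / trials

# Sea-level rise shifts the tide baseline upward; re-estimate with +0.3 m.
p_future = sum(flooded(tide_mean=1.5) for _ in range(trials)) / trials

print(f"flood probability now ~{p_now:.3f}, with +0.3 m sea-level rise ~{p_future:.3f}")
```

    Repeating such simulations cell by cell over a city grid, with tide and precipitation statistics drawn from the historical records and climate projections mentioned above, is one way the street-by-street risk maps could be assembled.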

    Ocean vital signs

    On average, every person on Earth generates fossil fuel emissions equivalent to an 8-pound bag of carbon, every day. Much of this is absorbed by the ocean, but there is wide variability in the estimates of oceanic absorption, which translates into differences of trillions of dollars in the required cost of mitigation. In the Department of Earth, Atmospheric and Planetary Sciences, Christopher Hill, a principal research engineer specializing in Earth and planetary computational science, works with Ryan Woosley, a principal research scientist focusing on the carbon cycle and ocean acidification. Hill explains that they hope to use artificial intelligence and machine learning to help resolve this uncertainty.

    Q: What is the current state of knowledge on air-sea interactions?

    A: Obtaining specific, accurate field measurements of critical physical, chemical, and biological exchanges between the ocean and the planet have historically entailed expensive science missions with large ship-based infrastructure that leave gaps in real-time data about significant ocean climate processes. Recent advances in highly scalable in-situ autonomous observing and navigation combined with airborne, remote sensing, and machine learning innovations have the potential to transform data gathering, provide more accurate information, and address fundamental scientific questions around air-sea interaction.

    Q: How will your approach accelerate real-time, autonomous surface ocean observing from an experimental research endeavor to a permanent and impactful solution?

    A: Our project seeks to demonstrate how a scalable surface ocean observing network can be launched and operated, and to illustrate how this can reduce uncertainties in estimates of air-sea carbon dioxide exchange. With an initial high-impact goal of substantially eliminating the vast uncertainties that plague our understanding of ocean uptake of carbon dioxide, we will gather critical measurements for improving extended weather and climate forecast models and reducing climate impact uncertainty. The results have the potential to more accurately identify trillions of dollars worth of economic activity.

  • Ocean vital signs

    Without the ocean, the climate crisis would be even worse than it is. Each year, the ocean absorbs billions of tons of carbon from the atmosphere, preventing warming that greenhouse gas would otherwise cause. Scientists estimate about 25 to 30 percent of all carbon released into the atmosphere by both human and natural sources is absorbed by the ocean.

    “But there’s a lot of uncertainty in that number,” says Ryan Woosley, a marine chemist and a principal research scientist in the Department of Earth, Atmospheric and Planetary Sciences (EAPS) at MIT. Different parts of the ocean take in different amounts of carbon depending on many factors, such as the season and the amount of mixing from storms. Current models of the carbon cycle don’t adequately capture this variation.

    To close the gap, Woosley and a team of other MIT scientists developed a research proposal for the MIT Climate Grand Challenges competition — an Institute-wide campaign to catalyze and fund innovative research addressing the climate crisis. The team’s proposal, “Ocean Vital Signs,” involves sending a fleet of sailing drones to cruise the oceans taking detailed measurements of how much carbon the ocean is really absorbing. Those data would be used to improve the precision of global carbon cycle models and improve researchers’ ability to verify emissions reductions claimed by countries.

    “If we start to enact mitigation strategies — either through removing CO2 from the atmosphere or reducing emissions — we need to know where CO2 is going in order to know how effective they are,” says Woosley. Without more precise models, there’s no way to confirm whether observed carbon reductions were thanks to policy and people, or thanks to the ocean.

    “So that’s the trillion-dollar question,” says Woosley. “If countries are spending all this money to reduce emissions, is it enough to matter?”

    In February, the team’s Climate Grand Challenges proposal was named one of 27 finalists out of the almost 100 entries submitted. From among this list of finalists, MIT will announce in April the selection of five flagship projects to receive further funding and support.

    Woosley is leading the team along with Christopher Hill, a principal research engineer in EAPS. The team includes physical and chemical oceanographers, marine microbiologists, biogeochemists, and experts in computational modeling from across the department, in addition to collaborators from the Media Lab and the departments of Mathematics, Aeronautics and Astronautics, and Electrical Engineering and Computer Science.

    Today, data on the flux of carbon dioxide between the air and the oceans are collected in a piecemeal way. Research ships intermittently cruise out to gather data. Some commercial ships are also fitted with sensors. But these present a limited view of the entire ocean, and include biases. For instance, commercial ships usually avoid storms, which can increase the turnover of water exposed to the atmosphere and cause a substantial increase in the amount of carbon absorbed by the ocean.

    “It’s very difficult for us to get to it and measure that,” says Woosley. “But these drones can.”

    If funded, the team’s project would begin by deploying a few drones in a small area to test the technology. The wind-powered drones — made by a California-based company called Saildrone — would autonomously navigate through an area, collecting data on air-sea carbon dioxide flux continuously with solar-powered sensors. This would then scale up to more than 5,000 drone-days’ worth of observations, spread over five years, and in all five ocean basins.

    Those data would be used to feed neural networks to create more precise maps of how much carbon is absorbed by the oceans, shrinking the uncertainties involved in the models. These models would continue to be verified and improved by new data. “The better the models are, the more we can rely on them,” says Woosley. “But we will always need measurements to verify the models.”
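    The mapping step can be pictured with a deliberately simple stand-in for the neural networks the team plans to use: inverse-distance weighting, which spreads sparse point measurements onto a regular grid. The coordinates and flux values below are made up for illustration:

```python
import math

# Sparse "drone" measurements of air-sea CO2 flux: (lat, lon, flux).
# All values are invented for illustration.
obs = [(10.0, -30.0, 2.1), (12.0, -28.0, 1.7),
       (8.0, -25.0, 2.6), (11.0, -33.0, 1.9)]

def idw(lat, lon, power=2.0):
    """Estimate flux at an unsampled point by inverse-distance weighting."""
    num = den = 0.0
    for olat, olon, flux in obs:
        d = math.hypot(lat - olat, lon - olon)
        if d < 1e-9:
            return flux  # exactly at an observation point
        w = 1.0 / d ** power
        num += w * flux
        den += w
    return num / den

# Fill a coarse grid from the sparse observations.
grid = [[round(idw(lat, lon), 2) for lon in (-33, -30, -27)]
        for lat in (8, 10, 12)]
for row in grid:
    print(row)
```

    A trained neural network plays the same role as `idw` here, but can also fold in covariates such as sea-surface temperature, wind, and season, which is what shrinks the uncertainty rather than merely interpolating it.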

    Improved carbon cycle models are relevant beyond climate warming as well. “CO2 is involved in so much of how the world works,” says Woosley. “We’re made of carbon, and all the other organisms and ecosystems are as well. What does the perturbation to the carbon cycle do to these ecosystems?”

    One of the best understood impacts is ocean acidification. Carbon absorbed by the ocean reacts to form an acid. A more acidic ocean can have dire impacts on marine organisms like coral and oysters, whose calcium carbonate shells and skeletons can dissolve in the lower pH. Since the Industrial Revolution, the ocean has become about 30 percent more acidic on average.
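    The “30 percent more acidic” figure follows directly from the logarithmic pH scale. A quick check, assuming the commonly cited drop of roughly 0.11 pH units since pre-industrial times (the exact pH values here are approximate):

```python
import math

# pH is -log10 of the hydrogen-ion concentration, so a drop in pH
# multiplies [H+] by 10 raised to the size of the drop.
ph_preindustrial = 8.21  # approximate, commonly cited value
ph_today = 8.10          # approximate, commonly cited value

ratio = 10 ** (ph_preindustrial - ph_today)
percent_more_acidic = (ratio - 1) * 100
print(f"[H+] increase: {percent_more_acidic:.0f}%")  # ≈ 29%
```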

    “So while it’s great for us that the oceans have been taking up the CO2, it’s not great for the oceans,” says Woosley. “Knowing how this uptake affects the health of the ocean is important as well.”