More stories

  • Generative AI for smart grid modeling

    MIT’s Laboratory for Information and Decision Systems (LIDS) has been awarded $1,365,000 in funding from the Appalachian Regional Commission (ARC) to support its involvement with an innovative project, “Forming the Smart Grid Deployment Consortium (SGDC) and Expanding the HILLTOP+ Platform.”

    The grant was made available through ARC’s Appalachian Regional Initiative for Stronger Economies, which fosters regional economic transformation through multi-state collaboration.

    Led by Kalyan Veeramachaneni, research scientist and principal investigator at LIDS’ Data to AI Group, the project will focus on creating AI-driven generative models for customer load data. Veeramachaneni and colleagues will work alongside a team of universities and organizations led by Tennessee Tech University, including collaborators across Ohio, Pennsylvania, West Virginia, and Tennessee, to develop and deploy smart grid modeling services through the SGDC project.

    These generative models have far-reaching applications, including grid modeling and training algorithms for energy tech startups. Once trained on existing data, the models create additional, realistic data that can augment limited datasets or stand in for sensitive ones. Stakeholders can then use these models to understand and plan for specific what-if scenarios far beyond what could be achieved with existing data alone. For example, generated data can show the potential load on the grid if an additional 1,000 households were to adopt solar technologies, how that load might change throughout the day, and similar contingencies vital to future planning.
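
    A minimal sketch of this workflow, using the open-source Synthetic Data Vault (SDV) library maintained by Veeramachaneni's group (the article does not specify the project's actual tooling, and the file and column names below are hypothetical), might look like this:

        # Hedged sketch only: fit a generative model to tabular household load
        # data, then sample synthetic records. File and column names are
        # hypothetical; the project's actual models may differ.
        import pandas as pd
        from sdv.metadata import SingleTableMetadata
        from sdv.sampling import Condition
        from sdv.single_table import GaussianCopulaSynthesizer

        real_loads = pd.read_csv("household_loads.csv")  # e.g. columns: hour, kwh, has_solar

        metadata = SingleTableMetadata()
        metadata.detect_from_dataframe(real_loads)

        synthesizer = GaussianCopulaSynthesizer(metadata)
        synthesizer.fit(real_loads)

        # Augment a limited dataset, or stand in for a sensitive one.
        synthetic_loads = synthesizer.sample(num_rows=10_000)

        # What-if scenario: 1,000 additional households that adopt solar.
        solar_adopters = synthesizer.sample_from_conditions(
            [Condition(column_values={"has_solar": True}, num_rows=1000)]
        )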

    The generative AI models developed by Veeramachaneni and his team will provide inputs to modeling services based on the HILLTOP+ microgrid simulation platform, originally prototyped by MIT Lincoln Laboratory. HILLTOP+ will be used to model and test new smart grid technologies in a virtual “safe space,” providing rural electric utilities with increased confidence in deploying smart grid technologies, including utility-scale battery storage. Energy tech startups will also benefit from HILLTOP+ grid modeling services, enabling them to develop and virtually test their smart grid hardware and software products for scalability and interoperability.

    The project aims to assist rural electric utilities and energy tech startups in mitigating the risks associated with deploying these new technologies. “This project is a powerful example of how generative AI can transform a sector — in this case, the energy sector,” says Veeramachaneni. “In order to be useful, generative AI technologies and their development have to be closely integrated with domain expertise. I am thrilled to be collaborating with experts in grid modeling, and working alongside them to integrate the latest and greatest from my research group and push the boundaries of these technologies.”

    “This project is testament to the power of collaboration and innovation, and we look forward to working with our collaborators to drive positive change in the energy sector,” says Satish Mahajan, principal investigator for the project at Tennessee Tech and a professor of electrical and computer engineering. Tennessee Tech’s Center for Rural Innovation director, Michael Aikens, adds, “Together, we are taking significant steps towards a more sustainable and resilient future for the Appalachian region.”

  • MIT researchers remotely map crops, field by field

    Crop maps help scientists and policymakers track global food supplies and estimate how they might shift with climate change and growing populations. But getting accurate maps of the types of crops that are grown from farm to farm often requires on-the-ground surveys that only a handful of countries have the resources to maintain.

    Now, MIT engineers have developed a method to quickly and accurately label and map crop types without requiring in-person assessments of every single farm. The team’s method uses a combination of Google Street View images, machine learning, and satellite data to automatically determine the crops grown throughout a region, from one fraction of an acre to the next. 

    The researchers used the technique to automatically generate the first nationwide crop map of Thailand — a smallholder country where small, independent farms are the predominant form of agriculture. The team created a border-to-border map of Thailand’s four major crops — rice, cassava, sugarcane, and maize — and determined which of the four types was grown, every 10 meters and without gaps, across the entire country. The resulting map achieved an accuracy of 93 percent, which the researchers say is comparable to on-the-ground mapping efforts in high-income, big-farm countries.

    The team is applying their mapping technique to other countries such as India, where small farms sustain most of the population but the type of crops grown from farm to farm has historically been poorly recorded.

    “It’s a longstanding gap in knowledge about what is grown around the world,” says Sherrie Wang, the d’Arbeloff Career Development Assistant Professor in MIT’s Department of Mechanical Engineering, and the Institute for Data, Systems, and Society (IDSS). “The final goal is to understand agricultural outcomes like yield, and how to farm more sustainably. One of the key preliminary steps is to map what is even being grown — the more granularly you can map, the more questions you can answer.”

    Wang, along with MIT graduate student Jordi Laguarta Soler and Thomas Friedel of the agtech company PEAT GmbH, will present a paper detailing their mapping method later this month at the AAAI Conference on Artificial Intelligence.

    Ground truth

    Smallholder farms are often run by a single family or farmer, who subsist on the crops and livestock that they raise. It’s estimated that smallholder farms support two-thirds of the world’s rural population and produce 80 percent of the world’s food. Keeping tabs on what is grown and where is essential to tracking and forecasting food supplies around the world. But the majority of these small farms are in low- to middle-income countries, where few resources are devoted to keeping track of individual farms’ crop types and yields.

    Crop mapping efforts are mainly carried out in high-income regions such as the United States and Europe, where government agricultural agencies oversee crop surveys and send assessors to farms to label crops from field to field. These “ground truth” labels are then fed into machine-learning models that make connections between the ground labels of actual crops and satellite signals of the same fields. The models can then label and map wider swaths of farmland that assessors don’t cover but that satellites automatically do.

    “What’s lacking in low- and middle-income countries is this ground label that we can associate with satellite signals,” Laguarta Soler says. “Getting these ground truths to train a model in the first place has been limited in most of the world.”

    The team realized that, while many developing countries do not have the resources to maintain crop surveys, they could potentially use another source of ground data: roadside imagery, captured by services such as Google Street View and Mapillary, which send cars throughout a region to take continuous 360-degree images with dashcams and rooftop cameras.

    In recent years, such services have expanded into low- and middle-income countries. While the goal of these services is not specifically to capture images of crops, the MIT team saw that they could search the roadside images to identify crops.

    Cropped image

    In their new study, the researchers worked with Google Street View (GSV) images taken throughout Thailand — a country that the service has recently imaged fairly thoroughly, and which consists predominantly of smallholder farms.

    Starting with over 200,000 GSV images randomly sampled across Thailand, the team filtered out images that depicted buildings, trees, and general vegetation. About 81,000 images were crop-related. They set aside 2,000 of these, which they sent to an agronomist, who determined and labeled each crop type by eye. They then trained a convolutional neural network to automatically generate crop labels for the other 79,000 images, using various training methods, including iNaturalist, a web-based crowdsourced biodiversity database, and GPT-4V, a “multimodal large language model” that enables a user to input an image and ask the model to identify what the image is depicting. For each of the 81,000 images, the model generated a label of one of the four crops that the image was likely depicting — rice, maize, sugarcane, or cassava.
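
    To make this first stage concrete, here is a minimal transfer-learning sketch in PyTorch: fine-tune a pretrained CNN on the expert-labeled images, then use it to label the rest. The data loader and hyperparameters are hypothetical, not the authors' actual configuration.

        # Hedged sketch: fine-tune a pretrained ResNet on agronomist-labeled
        # roadside images, then use it to label the remaining ~79,000 images.
        import torch
        import torch.nn as nn
        from torchvision import models

        NUM_CLASSES = 4  # rice, maize, sugarcane, cassava

        model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
        model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)  # new 4-way head

        criterion = nn.CrossEntropyLoss()
        optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

        def train_epoch(loader):
            # `loader` is a hypothetical DataLoader yielding (images, labels) batches.
            model.train()
            for images, labels in loader:
                optimizer.zero_grad()
                loss = criterion(model(images), labels)
                loss.backward()
                optimizer.step()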

    The researchers then paired each labeled image with the corresponding satellite data taken of the same location throughout a single growing season. These satellite data include measurements across multiple wavelengths, such as a location’s greenness and its reflectivity (which can be a sign of water). 

    “Each type of crop has a certain signature across these different bands, which changes throughout a growing season,” Laguarta Soler notes.

    The team trained a second model to make associations between a location’s satellite data and its corresponding crop label. They then used this model to process satellite data taken of the rest of the country, where crop labels were not generated or available. From the associations that the model learned, it then assigned crop labels across Thailand, generating a country-wide map of crop types at a resolution of 10 meters.
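
    This second stage is, in essence, a supervised-learning problem: features from a pixel's satellite time series in, crop label out. A hedged sketch, with hypothetical file names and a generic classifier standing in for the authors' model:

        # Hedged sketch: learn satellite-signature -> crop-label associations.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import train_test_split

        # Hypothetical arrays: one row per labeled location; columns are band
        # measurements (e.g. greenness, reflectivity) sampled across a season.
        X = np.load("satellite_timeseries_features.npy")
        y = np.load("streetview_crop_labels.npy")  # 0=rice, 1=maize, 2=sugarcane, 3=cassava

        X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.2, random_state=0)

        clf = RandomForestClassifier(n_estimators=300, random_state=0)
        clf.fit(X_train, y_train)
        print("held-out accuracy:", clf.score(X_val, y_val))

        # The fitted model can then label every 10-meter pixel in the country
        # from its satellite time series alone.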

    This first-of-its-kind crop map included locations corresponding to the 2,000 GSV images that the researchers originally set aside and had labeled by the agronomist. These human-labeled images were used to validate the map’s labels, and when the team checked whether the map’s labels matched these expert, “gold standard” labels, they agreed 93 percent of the time.

    “In the U.S., we’re also looking at over 90 percent accuracy, whereas with previous work in India, we’ve only seen 75 percent because ground labels are limited,” Wang says. “Now we can create these labels in a cheap and automated way.”

    The researchers are moving to map crops across India, where roadside images via Google Street View and other services have recently become available.

    “There are over 150 million smallholder farmers in India,” Wang says. “India is covered in agriculture, almost wall-to-wall farms, but very small farms, and historically it’s been very difficult to create maps of India because there are very sparse ground labels.”

    The team is working to generate crop maps in India, which could be used to inform policies having to do with assessing and bolstering yields, as global temperatures and populations rise.

    “What would be interesting would be to create these maps over time,” Wang says. “Then you could start to see trends, and we can try to relate those things to anything like changes in climate and policies.”

  • New tool predicts flood risk from hurricanes in a warming climate

    Coastal cities and communities will face more frequent major hurricanes with climate change in the coming years. To help cities prepare, MIT scientists have developed a method to predict how much flooding a coastal community is likely to experience as hurricanes evolve over the next decades.

    When hurricanes make landfall, strong winds whip up salty ocean waters that generate storm surge in coastal regions. As the storms move over land, torrential rainfall can induce further flooding inland. When multiple flood sources such as storm surge and rainfall interact, they can compound a hurricane’s hazards, leading to significantly more flooding than would result from any one source alone. The new study introduces a physics-based method for predicting how the risk of such complex, compound flooding may evolve under a warming climate in coastal cities.

    One example of compound flooding’s impact is the aftermath of Hurricane Sandy in 2012. The storm made landfall on the East Coast of the United States as heavy winds whipped up a towering storm surge that combined with rainfall-driven flooding in some areas to cause historic and devastating floods across New York and New Jersey.

    In their study, the MIT team applied the new compound flood-modeling method to New York City to predict how climate change may influence the risk of compound flooding from Sandy-like hurricanes over the next decades.  

    They found that, in today’s climate, a Sandy-level compound flooding event is likely to hit New York City about once every 150 years. By midcentury, a warmer climate will drive up the frequency of such flooding, to once every 60 years. And by the end of the century, destructive Sandy-like floods will deluge the city every 30 years — a fivefold increase compared to the present climate.

    “Long-term average damages from weather hazards are usually dominated by the rare, intense events like Hurricane Sandy,” says study co-author Kerry Emanuel, professor emeritus of atmospheric science at MIT. “It is important to get these right.”

    While these are sobering projections, the researchers hope the flood forecasts can help city planners prepare and protect against future disasters. “Our methodology equips coastal city authorities and policymakers with essential tools to conduct compound flooding risk assessments from hurricanes in coastal cities at a detailed, granular level, extending to each street or building, in both current and future decades,” says study author Ali Sarhadi, a postdoc in MIT’s Department of Earth, Atmospheric and Planetary Sciences.

    The team’s open-access study appears online today in the Bulletin of the American Meteorological Society. Co-authors include Raphaël Rousseau-Rizzi at MIT’s Lorenz Center, Kyle Mandli at Columbia University, Jeffrey Neal at the University of Bristol, Michael Wiper at the Charles III University of Madrid, and Monika Feldmann at the Swiss Federal Institute of Technology Lausanne.

    The seeds of floods

    To forecast a region’s flood risk, weather modelers typically look to the past. Historical records contain measurements of previous hurricanes’ wind speeds, rainfall, and spatial extent, which scientists use to predict where and how much flooding may occur with coming storms. But Sarhadi believes these historical records are too limited and brief to predict future hurricanes’ risks.

    “Even if we had lengthy historical records, they wouldn’t be a good guide for future risks because of climate change,” he says. “Climate change is changing the structural characteristics, frequency, intensity, and movement of hurricanes, and we cannot rely on the past.”

    Sarhadi and his colleagues instead looked to predict a region’s risk of hurricane flooding in a changing climate using a physics-based risk assessment methodology. They first paired simulations of hurricane activity with coupled ocean and atmospheric models over time. With the hurricane simulations, developed originally by Emanuel, the researchers virtually scatter tens of thousands of “seeds” of hurricanes into a simulated climate. Most seeds dissipate, while a few grow into category-level storms, depending on the conditions of the ocean and atmosphere.

    When the team drives these hurricane simulations with climate models of ocean and atmospheric conditions under certain global temperature projections, they can see how hurricanes change, for instance in terms of intensity, frequency, and size, under past, current, and future climate conditions.

    The team then sought to precisely predict the level and degree of compound flooding from future hurricanes in coastal cities. The researchers first used rainfall models to simulate rain intensity for a large number of simulated hurricanes, then applied numerical models to hydraulically translate that rainfall intensity into flooding on the ground as the hurricanes make landfall, given regional information such as surface and topography characteristics. They also simulated the same hurricanes’ storm surges, using hydrodynamic models to translate hurricanes’ maximum wind speed and sea level pressure into surge height in coastal areas. The simulation further assessed the propagation of ocean waters into coastal areas, causing coastal flooding.

    Then, the team developed a numerical hydrodynamic model to predict how the two sources of hurricane-induced flooding, storm surge and rain-driven flooding, would simultaneously interact through time and space, as simulated hurricanes make landfall in coastal regions such as New York City, in both current and future climates.

    “There’s a complex, nonlinear hydrodynamic interaction between saltwater surge-driven flooding and freshwater rainfall-driven flooding, that forms compound flooding that a lot of existing methods ignore,” Sarhadi says. “As a result, they underestimate the risk of compound flooding.”

    Amplified risk

    With their flood-forecasting method in place, the team applied it to a specific test case: New York City. They used the multipronged method to predict the city’s risk of compound flooding from hurricanes, and more specifically from Sandy-like hurricanes, in present and future climates. Their simulations showed that the city’s odds of experiencing Sandy-like flooding will increase significantly over the next decades as the climate warms, from once every 150 years in the current climate, to every 60 years by 2050, and every 30 years by 2099.
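
    In risk terms, a return period is simply the reciprocal of the annual rate at which flooding exceeds a given threshold, summed over the synthetic storm set. A toy sketch of that final bookkeeping step, with hypothetical arrays and threshold rather than the study's actual outputs:

        # Hedged sketch: return period of Sandy-like flooding from synthetic storms.
        import numpy as np

        flood_depths = np.load("simulated_peak_flood_depths_m.npy")  # hypothetical: one per storm
        annual_rates = np.load("storm_annual_rates.npy")             # hypothetical: occurrences/year

        SANDY_LIKE_DEPTH_M = 2.8  # hypothetical threshold for "Sandy-level" flooding

        # Sum the annual rates of all storms that reach the threshold;
        # the return period is the reciprocal of that exceedance rate.
        exceedance_rate = annual_rates[flood_depths >= SANDY_LIKE_DEPTH_M].sum()
        print(f"Return period: {1.0 / exceedance_rate:.0f} years")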

    Interestingly, they found that much of this increase in risk has less to do with how hurricanes themselves will change with warming climates than with how sea levels will rise around the world.

    “In future decades, we will experience sea level rise in coastal areas, and we also incorporated that effect into our models to see how much that would increase the risk of compound flooding,” Sarhadi explains. “And in fact, we see sea level rise is playing a major role in amplifying the risk of compound flooding from hurricanes in New York City.”

    The team’s methodology can be applied to any coastal city to assess the risk of compound flooding from hurricanes and extratropical storms. With this approach, Sarhadi hopes decision-makers can make informed decisions regarding the implementation of adaptive measures, such as reinforcing coastal defenses to enhance infrastructure and community resilience.

    “Another aspect highlighting the urgency of our research is the projected 25 percent increase in coastal populations by midcentury, leading to heightened exposure to damaging storms,” Sarhadi says. “Additionally, we have trillions of dollars in assets situated in coastal flood-prone areas, necessitating proactive strategies to reduce damages from compound flooding from hurricanes under a warming climate.”

    This research was supported, in part, by Homesite Insurance.

  • Co-creating climate futures with real-time data and spatial storytelling

    Virtual story worlds and game engines aren’t just for video games anymore. They are now tools for scientists and storytellers to digitally twin existing physical spaces and then turn them into vessels to dream up speculative climate stories and build collective designs of the future. That’s the theory and practice behind the MIT WORLDING initiative.

    Twice this year, WORLDING matched world-class climate story teams working in XR (extended reality) with relevant labs and researchers across MIT. One global group returned for a virtual gathering online in partnership with Unity for Humanity, while another met for one weekend in person, hosted at the MIT Media Lab.

    “We are witnessing the birth of an emergent field that fuses climate science, urban planning, real-time 3D engines, nonfiction storytelling, and speculative fiction, and it is all fueled by the urgency of the climate crises,” says Katerina Cizek, lead designer of the WORLDING initiative at the Co-Creation Studio of MIT Open Documentary Lab. “Interdisciplinary teams are forming and blossoming around the planet to collectively imagine and tell stories of healthy, livable worlds in virtual 3D spaces and then finding direct ways to translate that back to earth, literally.”

    At this year’s virtual version of WORLDING, five multidisciplinary teams were selected from an open call. In a week-long series of research and development gatherings, the teams met with MIT scientists, staff, fellows, students, and graduates, as well as other leading figures in the field. Guests ranged from curators at film festivals such as Sundance and Venice, climate policy specialists, and award-winning media creators to software engineers and renowned Earth and atmosphere scientists. The teams heard from MIT scholars in diverse domains, including geomorphology, urban planning as acts of democracy, and climate researchers at MIT Media Lab.

    Mapping climate data

    “We are measuring the Earth’s environment in increasingly data-driven ways. Hundreds of terabytes of data are taken every day about our planet in order to study the Earth as a holistic system, so we can address key questions about global climate change,” explains Rachel Connolly, an MIT Media Lab research scientist focused on the “Future Worlds” research theme, in a talk to the group. “Why is this important for your work and storytelling in general? Having the capacity to understand and leverage this data is critical for those who wish to design for and successfully operate in the dynamic Earth environment.”

    Making sense of billions of data points was a key theme during this year’s sessions. In another talk, Taylor Perron, an MIT professor of Earth, atmospheric and planetary sciences, shared how his team uses computational modeling combined with many other scientific processes to better understand how geology, climate, and life intertwine to shape the surfaces of Earth and other planets. His work resonated with one WORLDING team in particular, one aiming to digitally reconstruct the pre-Hispanic Lake Texcoco — where current day Mexico City is now situated — as a way to contrast and examine the region’s current water crisis.

    Democratizing the future

    While WORLDING approaches rely on rigorous science and the interrogation of large datasets, they are also founded on democratizing community-led approaches.

    MIT Department of Urban Studies and Planning graduate Lafayette Cruise MCP ’19 met with the teams to discuss how he moved his own practice as a trained urban planner to include a futurist component involving participatory methods. “I felt we were asking the same limited questions in regards to the future we were wanting to produce. We’re very limited, very constrained, as to whose values and comforts are being centered. There are so many possibilities for how the future could be.”

    Scaling to reach billions

    This work scales from the very local to massive global populations. Climate policymakers are concerned with reaching billions of people in the line of fire. “We have a goal to reach 1 billion people with climate resilience solutions,” says Nidhi Upadhyaya, deputy director at Atlantic Council’s Adrienne Arsht-Rockefeller Foundation Resilience Center. To get that reach, Upadhyaya is turning to games. “There are 3.3 billion-plus people playing video games across the world. Half of these players are women. This industry is worth $300 billion. Africa is currently among the fastest-growing gaming markets in the world, and 55 percent of the global players are in the Asia Pacific region.” She reminded the group that this conversation is about policy and how formats of mass communication can be used for policymaking, bringing about change, changing behavior, and creating empathy within audiences.

    Socially engaged game development is also connected to education at Unity Technologies, a game engine company. “We brought together our education and social impact work because we really see it as a critical flywheel for our business,” said Jessica Lindl, vice president and global head of social impact/education at Unity Technologies, in the opening talk of WORLDING. “We upskill about 900,000 students, in university and high school programs around the world, and about 800,000 adults who are actively learning and reskilling and upskilling in Unity. Ultimately resulting in our mission of the ‘world is a better place with more creators in it,’ millions of creators who reach billions of consumers — telling the world stories, and fostering a more inclusive, sustainable, and equitable world.”

    Access to these technologies is key, especially the hardware. “Accessibility has been missing in XR,” explains Reginé Gilbert, who studies and teaches accessibility and disability in user experience design at New York University. “XR is being used in artificial intelligence, assistive technology, business, retail, communications, education, empathy, entertainment, recreation, events, gaming, health, rehabilitation, meetings, navigation, therapy, training, video programming, virtual assistance, wayfinding, and so many other uses. This is a fun fact for folks: 97.8 percent of the world hasn’t tried VR [virtual reality] yet, actually.”

    Meanwhile, new hardware is on its way. The WORLDING group got early insights into the highly anticipated Apple Vision Pro headset, which promises to integrate many forms of XR and personal computing in one device. “They’re really pushing this kind of pass-through or mixed reality,” said Dan Miller, a Unity engineer on the PolySpatial team collaborating with Apple, who described the experience of the device: “You are viewing the real world. You’re pulling up windows, you’re interacting with content. It’s a kind of spatial computing device where you have multiple apps open, whether it’s your email client next to your messaging client with a 3D game in the middle. You’re interacting with all these things in the same space and at different times.”

    “WORLDING combines our passion for social-impact storytelling and incredible innovative storytelling,” said Paisley Smith of the Unity for Humanity Program at Unity Technologies. She added, “This is an opportunity for creators to incubate their game-changing projects and connect with experts across climate, story, and technology.”

    Meeting at MIT

    In a new in-person iteration of WORLDING this year, organizers collaborated closely with Connolly at the MIT Media Lab to co-design an in-person weekend conference Oct. 25 – Nov. 7 with 45 scholars and professionals who visualize climate data at NASA, the National Oceanic and Atmospheric Administration, planetariums, and museums across the United States.

    A participant said of the event, “An incredible workshop that has had a profound effect on my understanding of climate data storytelling and how to combine different components together for a more [holistic] solution.”

    “With this gathering under our new Future Worlds banner,” says Dava Newman, director of the MIT Media Lab and the Apollo Program Professor of Astronautics, “the Media Lab seeks to affect human behavior and help societies everywhere to improve life here on Earth and in worlds beyond, so that all — the sentient, natural, and cosmic — worlds may flourish.”

    “WORLDING’s virtual-only component has been our biggest strength because it has enabled a true, international cohort to gather, build, and create together. But this year, an in-person version showed broader opportunities that spatial interactivity generates — informal Q&As, physical worksheets, and larger-scale ideation, all leading to deeper trust-building,” says WORLDING producer Srushti Kamat SM ’23.

    The future and potential of WORLDING lies in the ongoing dialogue between the virtual and physical, both in the work itself and in the format of the workshops.

  • Accelerated climate action needed to sharply reduce current risks to life and life-support systems

    Hottest day on record. Hottest month on record. Extreme marine heatwaves. Record-low Antarctic sea-ice.

    While El Niño is a short-term factor in this year’s record-breaking heat, human-caused climate change is the long-term driver. As global warming edges closer to 1.5 degrees Celsius — the aspirational upper limit set in the Paris Agreement in 2015 — it is ushering in more intense and frequent heatwaves, floods, wildfires, and other climate extremes much sooner than many expected, and current greenhouse gas emissions-reduction policies are far too weak to keep the planet from exceeding that threshold. In fact, on roughly one-third of days in 2023, the average global temperature was at least 1.5 C higher than pre-industrial levels. Faster and bolder action will be needed — from the in-progress United Nations Climate Change Conference (COP28) and beyond — to stabilize the climate and minimize risks to human (and nonhuman) lives and the life-support systems (e.g., food, water, shelter, and more) upon which they depend.

    Quantifying the risks posed by simply maintaining existing climate policies — and the benefits (i.e., avoided damages and costs) of accelerated climate action aligned with the 1.5 C goal — is the central task of the 2023 Global Change Outlook, recently released by the MIT Joint Program on the Science and Policy of Global Change.

    Based on a rigorous, integrated analysis of population and economic growth, technological change, Paris Agreement emissions-reduction pledges (Nationally Determined Contributions, or NDCs), geopolitical tensions, and other factors, the report presents the MIT Joint Program’s latest projections for the future of the earth’s energy, food, water, and climate systems, as well as prospects for achieving the Paris Agreement’s short- and long-term climate goals.

    The 2023 Global Change Outlook performs its risk-benefit analysis by focusing on two scenarios. The first, Current Trends, assumes that Paris Agreement NDCs are implemented through the year 2030, and maintained thereafter. While this scenario represents an unprecedented global commitment to limit greenhouse gas emissions, it neither stabilizes climate nor limits climate change. The second scenario, Accelerated Actions, extends from the Paris Agreement’s initial NDCs and aligns with its long-term goals. This scenario aims to limit and stabilize human-induced global climate warming to 1.5 C by the end of this century with at least a 50 percent probability. Uncertainty is quantified using 400-member ensembles of projections for each scenario.
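
    Because each scenario is represented by a 400-member ensemble, a statement like "at least a 50 percent probability" reduces to counting ensemble members. A toy illustration, with a hypothetical file name and numbers rather than the Outlook's data:

        # Hedged sketch: scenario probabilities from an ensemble of projections.
        import numpy as np

        # 400 simulated values of global warming in 2100 (deg C) for one scenario.
        warming_2100 = np.load("ensemble_warming_2100.npy")

        p_below_1p5 = np.mean(warming_2100 <= 1.5)
        p_below_2p0 = np.mean(warming_2100 <= 2.0)
        print(f"P(warming <= 1.5 C by 2100) = {p_below_1p5:.2f}")  # Accelerated Actions targets >= 0.50
        print(f"P(warming <= 2.0 C by 2100) = {p_below_2p0:.2f}")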

    This year’s report also includes a visualization tool that enables a higher-resolution exploration of both scenarios.

    Energy

    Between 2020 and 2050, population and economic growth are projected to drive continued increases in energy needs and electrification. Successful achievement of current Paris Agreement pledges will reinforce a shift away from fossil fuels, but additional actions will be required to accelerate the energy transition needed to cap global warming at 1.5 C by 2100.

    During this 30-year period under the Current Trends scenario, the share of fossil fuels in the global energy mix drops from 80 percent to 70 percent. Variable renewable energy (wind and solar) is the fastest-growing energy source, with more than an 8.6-fold increase. In the Accelerated Actions scenario, the share of low-carbon energy sources grows from 20 percent to slightly more than 60 percent, a much faster growth rate than in the Current Trends scenario; wind and solar energy undergo more than a 13.3-fold increase.

    While the electric power sector is expected to successfully scale up (with electricity production increasing by 73 percent under Current Trends, and 87 percent under Accelerated Actions) to accommodate increased demand (particularly for variable renewables), other sectors face stiffer challenges in their efforts to decarbonize.

    “Due to a sizeable need for hydrocarbons in the form of liquid and gaseous fuels for sectors such as heavy-duty long-distance transport, high-temperature industrial heat, agriculture, and chemical production, hydrogen-based fuels and renewable natural gas remain attractive options, but the challenges related to their scaling opportunities and costs must be resolved,” says MIT Joint Program Deputy Director Sergey Paltsev, a lead author of the 2023 Global Change Outlook.

    Water, food, and land

    With a global population projected to reach 9.9 billion by 2050, the Current Trends scenario indicates that more than half of the world’s population will experience pressure on their water supply, and that three of every 10 people will live in water basins where water resources face compounding societal and environmental pressures. Population projections under combined water stress in all scenarios reveal that the Accelerated Actions scenario would spare approximately 40 million of the additional 570 million people otherwise projected to be living in water-stressed basins at midcentury.

    Under the Current Trends scenario, agriculture and food production will keep growing. This will increase pressure for land-use change, water use, and use of energy-intensive inputs, which will also lead to higher greenhouse gas emissions. Under the Accelerated Actions scenario, agricultural and food output by 2050 is lower than under Current Trends, since this scenario dampens economic growth and increases production costs. Livestock production is more greenhouse gas emissions-intensive than crop and food production, so under carbon-pricing policies its costs and prices rise and demand falls. Such impacts are transmitted to the food sector and imply lower consumption of livestock-based products.

    Land-use changes in the Accelerated Actions scenario are similar to those in the Current Trends scenario by 2050, except for land dedicated to bioenergy production. At the world level, the Accelerated Actions scenario requires cropland area to increase by 1 percent and pastureland to decrease by 4.2 percent, but land use for bioenergy must increase by 44 percent.

    Climate trends

    Under the Current Trends scenario, the world is likely (more than 50 percent probability) to exceed 2 C global climate warming by 2060, 2.8 C by 2100, and 3.8 C by 2150. Our latest climate-model information indicates that maximum temperatures will likely outpace mean temperature trends over much of North and South America, Europe, northern and southeast Asia, and southern parts of Africa and Australasia. So as human-forced climate warming intensifies, these regions are expected to experience more pronounced record-breaking extreme heat events.

    Under the Accelerated Actions scenario, global temperature will continue to rise through the next two decades. But by 2050, global temperature will stabilize, and then slightly decline through the latter half of the century.

    “By 2100, the Accelerated Actions scenario indicates that the world can be virtually assured of remaining below 2 C of global warming,” says MIT Joint Program Deputy Director C. Adam Schlosser, a lead author of the report. “Nevertheless, additional policy mechanisms must be designed with more comprehensive targets that also support a cleaner environment, sustainable resources, as well as improved and equitable human health.”

    The Accelerated Actions scenario not only stabilizes the global increase in precipitation (by 2060), but substantially reduces the magnitude and potential range of that increase, to almost one-third of the precipitation changes projected under Current Trends. Any global increase in precipitation heightens flood risk worldwide, so policies aligned with the Accelerated Actions scenario would considerably reduce that risk.

    Prospects for meeting Paris Agreement climate goals

    Numerous countries and regions are progressing in fulfilling their Paris Agreement pledges. Many have declared more ambitious greenhouse gas emissions-mitigation goals, while financing to assist the least-developed countries in sustainable development is not forthcoming at the levels needed. In this year’s Global Stocktake Synthesis Report, the U.N. Framework Convention on Climate Change evaluated emissions reductions communicated by the parties of the Paris Agreement and concluded that global emissions are not on track to fulfill the most ambitious long-term global temperature goals of the Paris Agreement (to keep warming well below 2 C — and, ideally, 1.5 C — above pre-industrial levels), and there is a rapidly narrowing window to raise ambition and implement existing commitments in order to achieve those targets. The Current Trends scenario arrives at the same conclusion.

    The 2023 Global Change Outlook finds that both global temperature targets remain achievable, but require much deeper near-term emissions reductions than those embodied in current NDCs.

    Reducing climate risk

    This report explores two well-known sets of risks posed by climate change. The research it highlights indicates that elevated climate-related physical risks will continue to evolve by midcentury, along with heightened transition risks arising from the shifts in the political, technological, social, and economic landscapes that are likely to occur during the transition to a low-carbon economy.

    “Our Outlook shows that without aggressive actions the world will surpass critical greenhouse gas concentration thresholds and climate targets in the coming decades,” says MIT Joint Program Director Ronald Prinn. “While the costs of inaction are getting higher, the costs of action are more manageable.”

  • A mineral produced by plate tectonics has a global cooling effect, study finds

    MIT geologists have found that a clay mineral on the seafloor, called smectite, has a surprisingly powerful ability to sequester carbon over millions of years.

    Under a microscope, a single grain of the clay resembles the folds of an accordion. These folds are known to be effective traps for organic carbon.

    Now, the MIT team has shown that the carbon-trapping clays are a product of plate tectonics: When oceanic crust crushes against a continental plate, it can bring rocks to the surface that, over time, can weather into minerals including smectite. Eventually, the clay sediment settles back in the ocean, where the minerals trap bits of dead organisms in their microscopic folds. This keeps the organic carbon from being consumed by microbes and expelled back into the atmosphere as carbon dioxide.

    Over millions of years, smectite can have a global effect, helping to cool the entire planet. Through a series of analyses, the researchers showed that smectite was likely produced after several major tectonic events over the last 500 million years. During each tectonic event, the clays trapped enough carbon to cool the Earth and induce the subsequent ice age.

    The findings are the first to show that plate tectonics can trigger ice ages through the production of carbon-trapping smectite.

    These clays can be found in certain tectonically active regions today, and the scientists believe that smectite continues to sequester carbon, providing a natural, albeit slow-acting, buffer against humans’ climate-warming activities.

    “The influence of these unassuming clay minerals has wide-ranging implications for the habitability of planets,” says Joshua Murray, a graduate student in MIT’s Department of Earth, Atmospheric, and Planetary Sciences. “There may even be a modern application for these clays in offsetting some of the carbon that humanity has placed into the atmosphere.”

    Murray and Oliver Jagoutz, professor of geology at MIT, have published their findings today in Nature Geoscience.

    A clear and present clay

    The new study follows up on the team’s previous work, which showed that each of the Earth’s major ice ages was likely triggered by a tectonic event in the tropics. The researchers found that each of these tectonic events exposed ocean rocks called ophiolites to the atmosphere. They put forth the idea that, when a tectonic collision occurs in a tropical region, ophiolites can undergo certain weathering effects, such as exposure to wind, rain, and chemical interactions, that transform the rocks into various minerals, including clays.

    “Those clay minerals, depending on the kinds you create, influence the climate in different ways,” Murray explains.

    At the time, it was unclear which minerals could come out of this weathering effect, and whether and how these minerals could directly contribute to cooling the planet. So, while it appeared there was a link between plate tectonics and ice ages, the exact mechanism by which one could trigger the other was still in question.

    With the new study, the team looked to see whether their proposed tectonic tropical weathering process would produce carbon-trapping minerals, and in quantities that would be sufficient to trigger a global ice age.

    The team first looked through the geologic literature and compiled data on the ways in which major magmatic minerals weather over time, and on the types of clay minerals this weathering can produce. They then worked these measurements into a weathering simulation of different rock types that are known to be exposed in tectonic collisions.

    “Then we look at what happens to these rock types when they break down due to weathering and the influence of a tropical environment, and what minerals form as a result,” Jagoutz says.

    Next, they plugged each weathered, “end-product” mineral into a simulation of the Earth’s carbon cycle to see what effect a given mineral might have, either in interacting with organic carbon, such as bits of dead organisms, or with inorganic carbon, in the form of carbon dioxide in the atmosphere.

    From these analyses, one mineral had a clear presence and effect: smectite. Not only was the clay a naturally weathered product of tropical tectonics, it was also highly effective at trapping organic carbon. In theory, smectite seemed like a solid connection between tectonics and ice ages.

    But were enough of the clays actually present to trigger the previous four ice ages? Ideally, researchers should confirm this by finding smectite in ancient rock layers dating back to each global cooling period.

    “Unfortunately, as clays are buried by other sediments, they get cooked a bit, so we can’t measure them directly,” Murray says. “But we can look for their fingerprints.”

    A slow build

    The team reasoned that, as smectites are a product of ophiolites, these ocean rocks also bear characteristic elements such as nickel and chromium, which would be preserved in ancient sediments. If smectites were present in the past, nickel and chromium should be as well.

    To test this idea, the team looked through a database containing thousands of oceanic sedimentary rocks that were deposited over the last 500 million years. Over this time period, the Earth experienced four separate ice ages. Looking at rocks around each of these periods, the researchers observed large spikes of nickel and chromium, and inferred from this that smectite must also have been present.

    By their estimates, the clay mineral could have increased the preservation of organic carbon by less than one-tenth of a percent. In absolute terms, this is a minuscule amount. But over millions of years, they calculated that the clay’s accumulated, sequestered carbon was enough to trigger each of the four major ice ages.
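
    A rough order-of-magnitude estimate shows why such a small fractional change can matter. Using representative literature values rather than the paper's own numbers (modern marine organic carbon burial of roughly 0.16 GtC per year, and a preindustrial atmospheric reservoir of roughly 590 GtC), an extra 0.1 percent of burial sustained over a million years gives

        \Delta F \approx 0.001 \times 0.16\ \mathrm{GtC/yr} = 1.6 \times 10^{-4}\ \mathrm{GtC/yr},
        \qquad
        \Delta C \approx \Delta F \times 10^{6}\ \mathrm{yr} \approx 160\ \mathrm{GtC},

    on the order of a quarter of the carbon in the preindustrial atmosphere. Sustained over tens of millions of years, the cumulative drawdown becomes climatically significant.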

    “We found that you really don’t need much of this material to have a huge effect on the climate,” Jagoutz says.

    “These clays also have probably contributed some of the Earth’s cooling in the last 3 to 5 million years, before humans got involved,” Murray adds. “In the absence of humans, these clays are probably making a difference to the climate. It’s just such a slow process.”

    “Jagoutz and Murray’s work is a nice demonstration of how important it is to consider all biotic and physical components of the global carbon cycle,” says Lee Kump, a professor of geosciences at Penn State University, who was not involved with the study. “Feedbacks among all these components control atmospheric greenhouse gas concentrations on all time scales, from the annual rise and fall of atmospheric carbon dioxide levels to the swings from icehouse to greenhouse over millions of years.”

    Could smectites be harnessed intentionally to further bring down the world’s carbon emissions? Murray sees some potential, for instance to shore up carbon reservoirs such as regions of permafrost. Warming temperatures are predicted to melt permafrost and expose long-buried organic carbon. If smectites could be applied to these regions, the clays could prevent this exposed carbon from escaping into and further warming the atmosphere.

    “If you want to understand how nature works, you have to understand it on the mineral and grain scale,” Jagoutz says. “And this is also the way forward for us to find solutions for this climatic catastrophe. If you study these natural processes, there’s a good chance you will stumble on something that will be actually useful.”

    This research was funded, in part, by the National Science Foundation.

  • In a surprising finding, light can make water evaporate without heat

    Evaporation is happening all around us all the time, from the sweat cooling our bodies to the dew burning off in the morning sun. But science’s understanding of this ubiquitous process may have been missing a piece all this time.

    In recent years, some researchers have been puzzled upon finding that water in their experiments, which was held in a sponge-like material known as a hydrogel, was evaporating at a higher rate than could be explained by the amount of heat, or thermal energy, that the water was receiving. And the excess has been significant — a doubling, or even a tripling or more, of the theoretical maximum rate.

    After carrying out a series of new experiments and simulations, and reexamining some of the results from various groups that claimed to have exceeded the thermal limit, a team of researchers at MIT has reached a startling conclusion: Under certain conditions, at the interface where water meets air, light can directly bring about evaporation without the need for heat, and it actually does so even more efficiently than heat. In these experiments, the water was held in a hydrogel material, but the researchers suggest that the phenomenon may occur under other conditions as well.

    The findings are published this week in a paper in PNAS, by MIT postdoc Yaodong Tu, professor of mechanical engineering Gang Chen, and four others.

    The phenomenon might play a role in the formation and evolution of fog and clouds, and thus would be important to incorporate into climate models to improve their accuracy, the researchers say. And it might play an important part in many industrial processes such as solar-powered desalination of water, perhaps enabling alternatives to the step of converting sunlight to heat first.

    The new findings come as a surprise because water itself does not absorb light to any significant degree. That’s why you can see clearly through many feet of clean water to the surface below. So, when the team initially began exploring the process of solar evaporation for desalination, they first put particles of a black, light-absorbing material in a container of water to help convert the sunlight to heat.

    Then, the team came across the work of another group that had achieved an evaporation rate double the thermal limit — which is the highest possible amount of evaporation that can take place for a given input of heat, based on basic physical principles such as the conservation of energy. It was in these experiments that the water was bound up in a hydrogel. Although they were initially skeptical, Chen and Tu started their own experiments with hydrogels, including a piece of the material from the other group. “We tested it under our solar simulator, and it worked,” confirming the unusually high evaporation rate, Chen says. “So, we believed them now.” Chen and Tu then began making and testing their own hydrogels.
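
    The thermal limit itself follows directly from conservation of energy: the evaporation rate cannot exceed the absorbed heat flux divided by water's latent heat of vaporization. Under one-sun illumination of about 1,000 W/m² and a latent heat of about 2.45 MJ/kg (standard reference values, not figures from the paper), that gives

        \dot{m}_{\max} = \frac{q_{\mathrm{solar}}}{h_{fg}}
        \approx \frac{1000\ \mathrm{W/m^2}}{2.45 \times 10^{6}\ \mathrm{J/kg}}
        \approx 4.1 \times 10^{-4}\ \mathrm{kg/(m^2\,s)} \approx 1.5\ \mathrm{kg/(m^2\,h)},

    consistent with the roughly 1.5 kilograms per square meter per hour cited below as the current limit for solar desalination.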

    They began to suspect that the excess evaporation was being caused by the light itself — that photons of light were actually knocking bundles of water molecules loose from the water’s surface. This effect would only take place right at the boundary layer between water and air, at the surface of the hydrogel material — and perhaps also on the sea surface or the surfaces of droplets in clouds or fog.

    In the lab, they monitored the surface of a hydrogel, a JELL-O-like matrix consisting mostly of water bound by a sponge-like lattice of thin membranes. They measured its responses to simulated sunlight with precisely controlled wavelengths.

    The researchers subjected the water surface to different colors of light in sequence and measured the evaporation rate. They did this by placing a container of water-laden hydrogel on a scale and directly measuring the amount of mass lost to evaporation, as well as monitoring the temperature above the hydrogel surface. The lights were shielded to prevent them from introducing extra heat. The researchers found that the effect varied with color and peaked at a particular wavelength of green light. Such a color dependence has no relation to heat, and so supports the idea that it is the light itself that is causing at least some of the evaporation.

    Image: Puffs of white condensation on glass show water evaporating from a hydrogel under green light, without heat. (Courtesy of the researchers)

    The researchers tried to duplicate the observed evaporation rate with the same setup but using electricity, rather than light, to heat the material. Even though the thermal input was the same as in the other test, the amount of water that evaporated never exceeded the thermal limit. It did exceed the limit, however, when the simulated sunlight was on, confirming that light was the cause of the extra evaporation.

    Though water itself does not absorb much light, and neither does the hydrogel material itself, when the two combine they become strong absorbers, Chen says. That allows the material to harness the energy of the solar photons efficiently and exceed the thermal limit, without the need for any dark dyes for absorption.

    Having discovered this effect, which they have dubbed the photomolecular effect, the researchers are now working on how to apply it to real-world needs. They have a grant from the Abdul Latif Jameel Water and Food Systems Lab to study the use of this phenomenon to improve the efficiency of solar-powered desalination systems, and a Bose Grant to explore the phenomenon’s effects on climate change modeling.

    Tu explains that in standard desalination processes, “it normally has two steps: First we evaporate the water into vapor, and then we need to condense the vapor to liquify it into fresh water.” With this discovery, he says, potentially “we can achieve high efficiency on the evaporation side.” The process also could turn out to have applications in processes that require drying a material.

    Chen says that in principle, he thinks it may be possible to increase the limit of water produced by solar desalination, which is currently 1.5 kilograms per square meter per hour, by as much as three- or fourfold using this light-based approach. “This could potentially really lead to cheap desalination,” he says.

    Tu adds that this phenomenon could potentially also be leveraged in evaporative cooling processes, using the phase change to provide a highly efficient solar cooling system.

    Meanwhile, the researchers are also working closely with other groups who are attempting to replicate the findings, hoping to overcome skepticism that has faced the unexpected findings and the hypothesis being advanced to explain them.

    The research team also included Jiawei Zhou, Shaoting Lin, Mohammed Alshrah, and Xuanhe Zhao, all in MIT’s Department of Mechanical Engineering.

  • Desalination system could produce freshwater that is cheaper than tap water

    Engineers at MIT and in China are aiming to turn seawater into drinking water with a completely passive device that is inspired by the ocean, and powered by the sun.

    In a paper appearing today in the journal Joule, the team outlines the design for a new solar desalination system that takes in saltwater and heats it with natural sunlight.

    The configuration of the device allows water to circulate in swirling eddies, in a manner similar to the much larger “thermohaline” circulation of the ocean. This circulation, combined with the sun’s heat, drives water to evaporate, leaving salt behind. The resulting water vapor can then be condensed and collected as pure, drinkable water. In the meantime, the leftover salt continues to circulate through and out of the device, rather than accumulating and clogging the system.

    The new system has a higher water-production rate and a higher salt-rejection rate than all other passive solar desalination concepts currently being tested.

    The researchers estimate that if the system is scaled up to the size of a small suitcase, it could produce about 4 to 6 liters of drinking water per hour and last several years before requiring replacement parts. At this scale and performance, the system could produce drinking water at a rate and price that is cheaper than tap water.

    “For the first time, it is possible for water, produced by sunlight, to be even cheaper than tap water,” says Lenan Zhang, a research scientist in MIT’s Device Research Laboratory.

    The team envisions that a scaled-up device could passively produce enough drinking water to meet the daily requirements of a small family. The system could also supply off-grid, coastal communities where seawater is easily accessible.

    Zhang’s study co-authors include MIT graduate student Yang Zhong and Evelyn Wang, the Ford Professor of Engineering, along with Jintong Gao, Jinfang You, Zhanyu Ye, Ruzhu Wang, and Zhenyuan Xu of Shanghai Jiao Tong University in China.

    A powerful convection

    The team’s new system improves on their previous design — a similar concept of multiple layers, called stages. Each stage contained an evaporator and a condenser that used heat from the sun to passively separate salt from incoming water. That design, which the team tested on the roof of an MIT building, efficiently converted the sun’s energy to evaporate water, which was then condensed into drinkable water. But the salt that was left over quickly accumulated as crystals that clogged the system after a few days. In a real-world setting, a user would have to replace stages frequently, which would significantly increase the system’s overall cost.

    In a follow-up effort, they devised a solution with a similar layered configuration, this time with an added feature that helped to circulate the incoming water as well as any leftover salt. While this design prevented salt from settling and accumulating on the device, it desalinated water at a relatively low rate.

    In the latest iteration, the team believes it has landed on a design that achieves both a high water-production rate and high salt rejection, meaning that the system can quickly and reliably produce drinking water for an extended period. The key to their new design is a combination of their two previous concepts: a multistage system of evaporators and condensers that is also configured to boost the circulation of water — and salt — within each stage.

    “We introduce now an even more powerful convection, that is similar to what we typically see in the ocean, at kilometer-long scales,” Xu says.

    The small circulations generated in the team’s new system are similar to the “thermohaline” convection in the ocean — a phenomenon that drives the movement of water around the world, based on differences in sea temperature (“thermo”) and salinity (“haline”).

    “When seawater is exposed to air, sunlight drives water to evaporate. Once water leaves the surface, salt remains. And the higher the salt concentration, the denser the liquid, and this heavier water wants to flow downward,” Zhang explains. “By mimicking this kilometer-wide phenomenon in a small box, we can take advantage of this feature to reject salt.”

    Tapping out

    The heart of the team’s new design is a single stage that resembles a thin box, topped with a dark material that efficiently absorbs the heat of the sun. Inside, the box is separated into a top and bottom section. Water can flow through the top half, where the ceiling is lined with an evaporator layer that uses the sun’s heat to warm up and evaporate any water in direct contact. The water vapor is then funneled to the bottom half of the box, where a condensing layer air-cools the vapor into salt-free, drinkable liquid. The researchers set the entire box at a tilt within a larger, empty vessel, then attached a tube from the top half of the box down through the bottom of the vessel, and floated the vessel in saltwater.

    In this configuration, water can naturally push up through the tube and into the box, where the tilt of the box, combined with the thermal energy from the sun, induces the water to swirl as it flows through. The small eddies help to bring water in contact with the upper evaporating layer while keeping salt circulating, rather than settling and clogging.

    The team built several prototypes, with one, three, and 10 stages, and tested their performance in water of varying salinity, including natural seawater and water that was seven times saltier.

    From these tests, the researchers calculated that if each stage were scaled up to a square meter, it would produce up to 5 liters of drinking water per hour, and that the system could desalinate water without accumulating salt for several years. Given this extended lifetime, and the fact that the system is entirely passive, requiring no electricity to run, the team estimates that the overall cost of running the system would be cheaper than what it costs to produce tap water in the United States.
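
    As a rough sanity check on the household claim, assume about six peak-sun hours per day (an assumption not stated in the article). A one-square-meter stage producing 5 liters per hour then yields

        5\ \mathrm{L/h} \times 6\ \mathrm{h/day} \approx 30\ \mathrm{L/day},

    well above the roughly 10 liters per day a family of four needs for drinking, consistent with the team's suggestion that a suitcase-sized device could meet a small family's daily requirements.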

    “We show that this device is capable of achieving a long lifetime,” Zhong says. “That means that, for the first time, it is possible for drinking water produced by sunlight to be cheaper than tap water. This opens up the possibility for solar desalination to address real-world problems.”

    “This is a very innovative approach that effectively mitigates key challenges in the field of desalination,” says Guihua Yu, who develops sustainable water and energy storage systems at the University of Texas at Austin, and was not involved in the research. “The design is particularly beneficial for regions struggling with high-salinity water. Its modular design makes it highly suitable for household water production, allowing for scalability and adaptability to meet individual needs.”

    The research at Shanghai Jiao Tong University was supported by the Natural Science Foundation of China.