More stories

  • Accelerated climate action needed to sharply reduce current risks to life and life-support systems

    Hottest day on record. Hottest month on record. Extreme marine heatwaves. Record-low Antarctic sea-ice.

    While El Niño is a short-term factor in this year’s record-breaking heat, human-caused climate change is the long-term driver. As global warming edges closer to 1.5 degrees Celsius — the aspirational upper limit set in the Paris Agreement in 2015 — it is ushering in more intense and frequent heatwaves, floods, wildfires, and other climate extremes much sooner than many expected, and current greenhouse gas emissions-reduction policies are far too weak to keep the planet from exceeding that threshold. In fact, on roughly one-third of days in 2023, the average global temperature was at least 1.5 C higher than pre-industrial levels. Faster and bolder action will be needed — from the in-progress United Nations Climate Change Conference (COP28) and beyond — to stabilize the climate and minimize risks to human (and nonhuman) lives and the life-support systems (e.g., food, water, and shelter) upon which they depend.

    Quantifying the risks posed by simply maintaining existing climate policies — and the benefits (i.e., avoided damages and costs) of accelerated climate action aligned with the 1.5 C goal — is the central task of the 2023 Global Change Outlook, recently released by the MIT Joint Program on the Science and Policy of Global Change.

    Based on a rigorous, integrated analysis of population and economic growth, technological change, Paris Agreement emissions-reduction pledges (Nationally Determined Contributions, or NDCs), geopolitical tensions, and other factors, the report presents the MIT Joint Program’s latest projections for the future of the earth’s energy, food, water, and climate systems, as well as prospects for achieving the Paris Agreement’s short- and long-term climate goals.

    The 2023 Global Change Outlook performs its risk-benefit analysis by focusing on two scenarios. The first, Current Trends, assumes that Paris Agreement NDCs are implemented through the year 2030, and maintained thereafter. While this scenario represents an unprecedented global commitment to limit greenhouse gas emissions, it neither stabilizes climate nor limits climate change. The second scenario, Accelerated Actions, extends from the Paris Agreement’s initial NDCs and aligns with its long-term goals. This scenario aims to limit and stabilize human-induced global climate warming to 1.5 C by the end of this century with at least a 50 percent probability. Uncertainty is quantified using 400-member ensembles of projections for each scenario.
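
    To make the report’s probabilistic framing concrete, here is a minimal sketch, in Python, of how an exceedance probability is read off a 400-member ensemble. The warming values are invented stand-ins, not actual Joint Program output:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical stand-in for a 400-member ensemble of projected warming
    # in 2100 (degrees C above pre-industrial); the real ensembles come from
    # the Joint Program's integrated modeling framework.
    warming_2100 = rng.normal(loc=1.45, scale=0.25, size=400)

    # A scenario limits warming to 1.5 C "with at least 50 percent
    # probability" if no more than half of the ensemble members exceed 1.5 C.
    p_exceed = np.mean(warming_2100 > 1.5)
    print(f"P(warming > 1.5 C) ~ {p_exceed:.2f}")
    print("meets the 1.5 C goal with >= 50% probability:", p_exceed <= 0.5)
    ```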

    This year’s report also includes a visualization tool that enables a higher-resolution exploration of both scenarios.

    Energy

    Between 2020 and 2050, population and economic growth are projected to drive continued increases in energy needs and electrification. Successful achievement of current Paris Agreement pledges will reinforce a shift away from fossil fuels, but additional actions will be required to accelerate the energy transition needed to cap global warming at 1.5 C by 2100.

    During this 30-year period under the Current Trends scenario, the share of fossil fuels in the global energy mix drops from 80 percent to 70 percent. Variable renewable energy (wind and solar) is the fastest growing energy source with more than an 8.6-fold increase. In the Accelerated Actions scenario, the share of low-carbon energy sources grows from 20 percent to slightly more than 60 percent, a much faster growth rate than in the Current Trends scenario; wind and solar energy undergo more than a 13.3-fold increase.
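
    For intuition, a fold increase over a fixed horizon implies a compound annual growth rate. A quick sketch using the figures above (the fold increases and 30-year horizon are the report’s; the arithmetic is ours):

    ```python
    def cagr(fold_increase: float, years: int) -> float:
        """Compound annual growth rate implied by a total fold increase."""
        return fold_increase ** (1 / years) - 1

    # Wind and solar growth implied by each scenario, 2020-2050.
    print(f"Current Trends:      {cagr(8.6, 30):.1%} per year")   # ~7.4%
    print(f"Accelerated Actions: {cagr(13.3, 30):.1%} per year")  # ~9.0%
    ```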

    While the electric power sector is expected to successfully scale up (with electricity production increasing by 73 percent under Current Trends, and 87 percent under Accelerated Actions) to accommodate increased demand (particularly for variable renewables), other sectors face stiffer challenges in their efforts to decarbonize.

    “Due to a sizeable need for hydrocarbons in the form of liquid and gaseous fuels for sectors such as heavy-duty long-distance transport, high-temperature industrial heat, agriculture, and chemical production, hydrogen-based fuels and renewable natural gas remain attractive options, but the challenges related to their scaling opportunities and costs must be resolved,” says MIT Joint Program Deputy Director Sergey Paltsev, a lead author of the 2023 Global Change Outlook.

    Water, food, and land

    With a global population projected to reach 9.9 billion by 2050, the Current Trends scenario indicates that more than half of the world’s population will experience pressures on their water supply, and that three of every 10 people will live in water basins facing compounding societal and environmental pressures on water resources. Projections of population under combined water stress reveal that the Accelerated Actions scenario can spare approximately 40 million of the additional 570 million people who would otherwise be living in water-stressed basins at mid-century.

    Under the Current Trends scenario, agriculture and food production will keep growing. This will increase pressure for land-use change, water use, and use of energy-intensive inputs, which will also lead to higher greenhouse gas emissions. The Accelerated Actions scenario shows less agricultural and food output by 2050 than the Current Trends scenario, since accelerated action dampens economic growth and increases production costs. Livestock production is more greenhouse gas-intensive than crop and food production, so carbon-pricing policies raise its costs and prices and drive its demand downward. These impacts are transmitted to the food sector and imply lower consumption of livestock-based products.

    Land-use changes in the Accelerated Actions scenario are similar to those in the Current Trends scenario by 2050, except for land dedicated to bioenergy production. At the world level, the Accelerated Actions scenario requires cropland area to increase by 1 percent and pastureland to decrease by 4.2 percent, but land use for bioenergy must increase by 44 percent.

    Climate trends

    Under the Current Trends scenario, the world is likely (more than 50 percent probability) to exceed 2 C global climate warming by 2060, 2.8 C by 2100, and 3.8 C by 2150. Our latest climate-model information indicates that maximum temperatures will likely outpace mean temperature trends over much of North and South America, Europe, northern and southeast Asia, and southern parts of Africa and Australasia. So as human-forced climate warming intensifies, these regions are expected to experience more pronounced record-breaking extreme heat events.

    Under the Accelerated Actions scenario, global temperature will continue to rise through the next two decades. But by 2050, global temperature will stabilize, and then slightly decline through the latter half of the century.

    “By 2100, the Accelerated Actions scenario indicates that the world can be virtually assured of remaining below 2 C of global warming,” says MIT Joint Program Deputy Director C. Adam Schlosser, a lead author of the report. “Nevertheless, additional policy mechanisms must be designed with more comprehensive targets that also support a cleaner environment, sustainable resources, as well as improved and equitable human health.”

    The Accelerated Actions scenario not only stabilizes the global increase in precipitation (by 2060), but also substantially reduces the magnitude and potential range of that increase, to roughly one-third of the precipitation changes projected under Current Trends. Any global increase in precipitation heightens flood risk worldwide, so policies aligned with the Accelerated Actions scenario would considerably reduce that risk.

    Prospects for meeting Paris Agreement climate goals

    Numerous countries and regions are progressing in fulfilling their Paris Agreement pledges. Many have declared more ambitious greenhouse gas emissions-mitigation goals, but financing to assist the least-developed countries in sustainable development is not forthcoming at the levels needed. In this year’s Global Stocktake Synthesis Report, the U.N. Framework Convention on Climate Change evaluated the emissions reductions communicated by the parties to the Paris Agreement and concluded that global emissions are not on track to fulfill the agreement’s most ambitious long-term temperature goals (to keep warming well below 2 C — and, ideally, 1.5 C — above pre-industrial levels), and that the window to raise ambition and implement existing commitments is rapidly narrowing. The Current Trends scenario arrives at the same conclusion.

    The 2023 Global Change Outlook finds that both global temperature targets remain achievable, but require much deeper near-term emissions reductions than those embodied in current NDCs.

    Reducing climate risk

    This report explores two well-known sets of risks posed by climate change. The research it highlights indicates that elevated climate-related physical risks will continue to evolve through mid-century, along with heightened transition risks arising from the shifts in the political, technological, social, and economic landscapes that are likely to occur during the transition to a low-carbon economy.

    “Our Outlook shows that without aggressive actions the world will surpass critical greenhouse gas concentration thresholds and climate targets in the coming decades,” says MIT Joint Program Director Ronald Prinn. “While the costs of inaction are getting higher, the costs of action are more manageable.”

  • Ayomikun Ayodeji ’22 named a 2024 Rhodes Scholar

    Ayomikun “Ayo” Ayodeji ’22 from Lagos, Nigeria, has been selected as a Rhodes Scholar for West Africa. He will begin fully funded postgraduate studies at Oxford University in the U.K. next fall.

    Ayodeji was supported by Associate Dean Kim Benard and the Distinguished Fellowships team in Career Advising and Professional Development, and received additional mentorship from the Presidential Committee on Distinguished Fellowships.

    “Ayo has worked hard to develop his vision and to express it in ways that will capture the imagination of the broader world. It is a thrill to see him recognized this year as a Rhodes Scholar,” says Professor Nancy Kanwisher, who co-chairs the committee along with Professor Will Broadhead.

    Ayodeji graduated from MIT in 2022 with BS degrees in chemical engineering and management. He is currently an associate at Boston Consulting Group.

    He is passionate about championing reliable energy access across the African landscape and fostering culturally inclusive communities. As a Rhodes Scholar, he will pursue an MSc in energy systems and an MSc in global governance and diplomacy.

    During his time at MIT, Ayodeji’s curiosity for energy innovations was fueled by his research on perovskite solar cells under the MIT Energy Initiative. He then went on to intern at Pioneer Natural Resources, where he explored applications of machine learning tools in well completions. At BCG, Ayodeji supports both public- and private-sector clients on a variety of renewable energy topics, including clean energy transition, decarbonization roadmaps, and workforce development.

    Ayodeji’s community-oriented mindset led him to team up with a group of friends and partner with the Northeast Children’s Trust (NECT), an organization that helps children affected by the Boko Haram insurgency in northeastern Nigeria. The project, sponsored by Davis Projects for Peace and MIT’s PKG Center, expanded NECT’s programs via an offline, portable classroom server.

    Ayodeji served as an undergraduate representative on the MIT Department of Chemical Engineering’s Diversity, Equity, and Inclusion Committee. He was also vice president of the MIT African Students’ Association and a coordinator for the annual MIT International Students Orientation.

  • MIT startup has big plans to pull carbon from the air

    In order to avoid the worst effects of climate change, the United Nations has said we’ll need to not only reduce emissions but also remove carbon dioxide from the atmosphere. One method for achieving carbon removal is direct air capture and storage. Such technologies are still in their infancy, but many efforts are underway to scale them up quickly in hopes of heading off the most catastrophic effects of climate change.

    The startup Noya, founded by Josh Santos ’14, is working to accelerate direct-air carbon removal with a low-power, modular system that can be mass manufactured and deployed around the world. The company plans to power its system with renewable energy and build its facilities near injection wells to store carbon underground.

    Using third-party auditors to verify the amount of carbon dioxide captured and stored, Noya is selling carbon credits to help organizations reach net-zero emissions targets.

    “Think of our systems for direct air capture like solar panels for carbon negativity,” says Santos, who formerly played a role in Tesla’s much-publicized manufacturing scale-up for its Model 3 electric sedan. “We can stack these boxes in a LEGO-like fashion to achieve scale in the field.”

    The three-year-old company is currently building its first commercial pilot facility, and says its first full-scale commercial facility will have the capacity to pull millions of tons of carbon from the air each year. Noya has already secured millions of dollars in presales from organizations including Shopify, Watershed, and a university endowment to help build its first facilities.

    Santos says the ambitious approach, which is driven by the urgent need to scale carbon removal solutions, was influenced by his time at MIT.

    “I need to thank all of my MIT professors,” Santos says. “I don’t think any of this would be possible without the way in which MIT opened up my horizons by showing me what’s possible when you work really hard.”

    Finding a purpose

    Growing up in the southeastern U.S., Santos says he first recognized climate change as an issue by experiencing the increasing intensity of hurricanes in his neighborhood. One year a hurricane forced his family to evacuate their town. When they returned, their church was gone.

    “The storm left a really big mark on me and how I thought about the world,” Santos says. “I realized how much climate change can impact people.”

    When Santos came to MIT as an undergraduate, he took coursework related to climate change and energy systems, eventually majoring in chemical engineering. He also learned about startups through courses he took at the MIT Sloan School of Management and by taking part in MIT’s Undergraduate Research Opportunities Program (UROP), which exposed him to researchers in the early stages of commercializing research from MIT labs.

    More than the coursework, though, Santos says MIT instilled in him a desire to make a positive impact on the world, in part through a four-day development workshop called LeaderShape that he took one January during the Institute’s Independent Activities Period (IAP).

    “LeaderShape teaches students how to lead with integrity, and the core lesson is that any privilege you have you should try to leverage to improve the lives of other people,” Santos says. “That really stuck with me. Going to MIT is a huge privilege, and it makes me feel like I have a responsibility to put that privilege to work to the betterment of society. It shaped a lot of how I view my career.”

    After graduation, Santos worked at Tesla, then at Harley-Davidson, where he worked on electric powertrains. Eventually he decided electric vehicle technology couldn’t solve climate change on its own, so in the spring of 2020 he founded Noya with his friend Daniel Cavaro.

    The initial idea for Noya was to attach carbon capture devices to cooling towers to keep equipment costs low. The founders pivoted in response to the passage of the Inflation Reduction Act in 2022 because their machines weren’t big enough to qualify for the new tax credits in the law, which required each system to capture at least 1,000 tons of CO2 per year.

    Noya’s new systems will combine thousands of its modular units to create massive facilities that can capture millions of tons of CO2 right next to existing injection wells.

    Each of Noya’s units is about the size of a solar panel: roughly 6 feet wide, 4.5 feet tall, and 1 foot thick. A fan blows air through tiny channels in each unit that contain Noya’s carbon capture material. That material consists of an activated carbon monolith and a proprietary chemical feedstock that binds to the carbon dioxide in the air. When the material becomes saturated, electricity is applied to it and a light vacuum collects a pure stream of carbon dioxide.

    The goal is for each of Noya’s modules to remove about 60 tons of CO2 from the atmosphere per year.

    “Other direct air capture companies need a big hot piece of equipment — like an oven, steam generator, or kiln — that takes electricity and converts it to get heat to the material,” Santos says. “Any lost heat into the surrounding environment is excess cost. We skip the need for the excess equipment and their inefficiencies by adding the electricity directly to the material itself.”

    Scaling with urgency

    From its office in Oakland, California, Noya is putting an experimental module through tests to optimize its design. Noya will launch its first testing facility, which should remove about 350 tons of CO2 per year, in 2024. It has already secured renewable energy and injection storage partners for that facility. Over the next few years Noya plans to capture and remove thousands of tons of CO2, and the company’s first commercial-scale facility will aim to remove about 3 million tons of carbon annually.
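
    A back-of-the-envelope check of what that scale implies, using the two figures reported above (a 60-ton-per-year module and a 3-million-ton-per-year facility):

    ```python
    module_capacity_t_per_yr = 60          # design goal per module
    facility_target_t_per_yr = 3_000_000   # first commercial-scale facility

    modules_needed = facility_target_t_per_yr / module_capacity_t_per_yr
    print(f"~{modules_needed:,.0f} modules per facility")  # ~50,000
    ```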

    “That design is what we’ll replicate across the world to grow our planetary impact,” Santos says. “We’re trying to scale up as fast as possible.”

    Noya has already sold all of the carbon credits it expects to generate in its first five years, and the founders believe the growing demand from companies and governments to purchase high-quality carbon credits will outstrip supply for at least the next 10 years in the nascent carbon removal industry, which also includes approaches like enhanced rock weathering, biomass carbon storage, and ocean alkalinity enhancement.

    “We’re going to need something like 30 companies the size of Shell to achieve the scale we need,” Santos says. “I think there will be large companies in each of those verticals. We’re in the early innings here.”

    Santos believes the carbon removal market can scale without government mandates, but he also sees increasing government and public support for carbon removal technologies around the world.

    “Carbon removal is a waste management problem,” Santos says. “You can’t just throw trash in the middle of the street. The way we currently deal with trash is polluters pay to clean up their waste. Carbon removal should be like that. CO2 is a waste product, and we should have regulations in place that are requiring polluters, like businesses, to clean up their waste emissions. It’s a public good to provide cleaner air.”

  • How to tackle the global deforestation crisis

    Imagine if France, Germany, and Spain were completely blanketed in forests — and then all those trees were quickly chopped down. That’s nearly the amount of deforestation that occurred globally between 2001 and 2020, with profound consequences.

    Deforestation is a major contributor to climate change, producing between 6 and 17 percent of global greenhouse gas emissions, according to a 2009 study. Meanwhile, because trees also absorb carbon dioxide, removing it from the atmosphere, they help keep the Earth cooler. And climate change aside, forests protect biodiversity.

    “Climate change and biodiversity make this a global problem, not a local problem,” says MIT economist Ben Olken. “Deciding to cut down trees or not has huge implications for the world.”

    But deforestation is often financially profitable, so it continues at a rapid rate. Researchers can now measure this trend closely: In the last quarter-century, satellite-based technology has led to a paradigm change in charting deforestation. New deforestation datasets based on the Landsat satellites, for instance, track forest change since 2000 at 30-meter resolution, while many other products now offer frequent imaging at even finer resolution.

    “Part of this revolution in measurement is accuracy, and the other part is coverage,” says Clare Balboni, an assistant professor of economics at the London School of Economics (LSE). “On-site observation is very expensive and logistically challenging, and you’re talking about case studies. These satellite-based data sets just open up opportunities to see deforestation at scale, systematically, across the globe.”

    Balboni and Olken have now helped write a new paper providing a road map for thinking about this crisis. The open-access article, “The Economics of Tropical Deforestation,” appears this month in the Annual Review of Economics. The co-authors are Balboni, a former MIT faculty member; Aaron Berman, a PhD candidate in MIT’s Department of Economics; Robin Burgess, an LSE professor; and Olken, MIT’s Jane Berkowitz Carlton and Dennis William Carlton Professor of Microeconomics. Balboni and Olken have also conducted primary research in this area, along with Burgess.

    So, how can the world tackle deforestation? It starts with understanding the problem.

    Replacing forests with farms

    Several decades ago, some thinkers, including the famous MIT economist Paul Samuelson in the 1970s, built models to study forests as a renewable resource; Samuelson calculated the “maximum sustained yield” at which a forest could be cleared while being regrown. These frameworks were designed to think about tree farms or the U.S. national forest system, where a fraction of trees would be cut each year, and then new trees would be grown over time to take their place.

    But deforestation today, particularly in tropical areas, often looks very different, and forest regeneration is not common.

    Indeed, as Balboni and Olken emphasize, deforestation is now rampant partly because the profits from chopping down trees come not just from timber, but from replacing forests with agriculture. In Brazil, deforestation has increased along with agricultural prices; in Indonesia, clearing trees accelerated as the global price of palm oil went up, leading companies to replace forests with palm tree orchards.

    All this tree-clearing creates a familiar situation: The globally shared costs of climate change from deforestation are “externalities,” as economists say, imposed on everyone else by the people removing forest land. It is akin to a company that dumps pollution into a river, degrading the water quality for residents downstream.

    “Economics has changed the way it thinks about this over the last 50 years, and two things are central,” Olken says. “The relevance of global externalities is very important, and the conceptualization of alternate land uses is very important.” This also means traditional forest-management guidance about regrowth is not enough. With the economic dynamics in mind, which policies might work, and why?

    The search for solutions

    As Balboni and Olken note, economists often recommend “Pigouvian” taxes (named after the British economist Arthur Pigou) in these cases, levied against people imposing externalities on others. And yet, it can be hard to identify who is doing the deforesting.
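
    In textbook form (our notation, not the review’s): if clearing a hectare yields the landowner a private benefit $B$ but imposes an external damage $E$ on everyone else, the landowner clears whenever $B > 0$, while society prefers clearing only when $B > E$. Setting a Pigouvian tax equal to the marginal external damage restores alignment:

    $$ t^* = E \quad\Longrightarrow\quad \text{clear} \iff B - t^* > 0 \iff B > E. $$

    The practical difficulty is exactly the one noted above: the tax must be levied on whoever does the clearing, and that party is often hard to identify.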

    Instead of taxing people for clearing forests, governments can pay people to keep forests intact. The UN uses Payments for Environmental Services (PES) as part of its REDD+ (Reducing Emissions from Deforestation and forest Degradation) program. However, it is similarly tough to identify the optimal landowners to subsidize, and these payments may not match the quick cash-in of deforestation. A 2017 study in Uganda showed PES reduced deforestation somewhat; a 2022 study in Indonesia found no reduction; another 2022 study, in Brazil, showed again that some forest protection resulted.

    “There’s mixed evidence from many of these [studies],” Balboni says. These policies, she notes, must reach people who would otherwise clear forests, and a key question is, “How can we assess their success compared to what would have happened anyway?”

    Some places have tried cash transfer programs for larger populations. In Indonesia, a 2020 study found such subsidies reduced deforestation near villages by 30 percent. But in Mexico, a similar program meant more people could afford milk and meat, again creating demand for more agriculture and thus leading to more forest-clearing.

    At this point, it might seem that laws simply banning deforestation in key areas would work best — indeed, about 16 percent of the world’s land overall is protected in some way. Yet the dynamics of protection are tricky. Even with protected areas in place, there is still “leakage” of deforestation into other regions. 

    Still more approaches exist, including “nonstate agreements,” such as the Amazon Soy Moratorium in Brazil, in which grain traders pledged not to buy soy from deforested lands, and reduced deforestation without “leakage.”

    Also, intriguingly, a 2008 policy change in the Brazilian Amazon made agricultural credit harder to obtain by requiring recipients to comply with environmental and land registration rules. The result? Deforestation dropped by up to 60 percent over nearly a decade. 

    Politics and pulp

    Overall, Balboni and Olken observe, beyond “externalities,” two major challenges exist. One, it is often unclear who holds property rights in forests. In these circumstances, deforestation seems to increase. Two, deforestation is subject to political battles.

    For instance, as economist Bard Harstad of Stanford University has observed, environmental lobbying is asymmetric. Balboni and Olken write: “The conservationist lobby must pay the government in perpetuity … while the deforestation-oriented lobby need pay only once to deforest in the present.” And political instability leads to more deforestation because “the current administration places lower value on future conservation payments.”
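
    The asymmetry follows from a standard perpetuity calculation (our illustration, not Harstad’s exact model). At discount rate $r$, a conservation lobby promising an annual payment $c$ must commit a present value of

    $$ PV_{\text{conserve}} = \sum_{t=1}^{\infty} \frac{c}{(1+r)^t} = \frac{c}{r}, $$

    while the deforestation-oriented lobby pays a single up-front amount. Political instability acts like a higher effective $r$, shrinking what the conservation side can credibly offer and tilting outcomes toward deforestation.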

    Even so, national political measures can work. In the Amazon from 2001 to 2005, Brazilian deforestation rates were three to four times higher than on similar land across the border, but that imbalance vanished once the country passed conservation measures in 2006. However, deforestation ramped up again after a 2014 change in government. Looking at particular monitoring approaches, a study of Brazil’s satellite-based Real-Time System for Detection of Deforestation (DETER), launched in 2004, suggests that a 50 percent annual increase in its use in municipalities created a 25 percent reduction in deforestation from 2006 to 2016.

    How precisely politics matters may depend on the context. In a 2021 paper, Balboni and Olken (with three colleagues) found that deforestation actually decreased around elections in Indonesia. Conversely, in Brazil, one study found that deforestation rates were 8 to 10 percent higher where mayors were running for re-election between 2002 and 2012, suggesting incumbents had deforestation industry support.

    “The research there is aiming to understand what the political economy drivers are,” Olken says, “with the idea that if you understand those things, reform in those countries is more likely.”

    Looking ahead, Balboni and Olken also suggest that new research estimating the value of intact forest land could influence public debates. And while many scholars have studied deforestation in Brazil and Indonesia, fewer have examined the Democratic Republic of Congo, another deforestation leader, and sub-Saharan Africa.

    Deforestation is an ongoing crisis. But thanks to satellites and many recent studies, experts know vastly more about the problem than they did a decade or two ago, and with an economics toolkit, can evaluate the incentives and dynamics at play.

    “To the extent that there’s ambiguity across different contexts with different findings, part of the point of our review piece is to draw out common themes — the important considerations in determining which policy levers can [work] in different circumstances,” Balboni says. “That’s a fast-evolving area. We don’t have all the answers, but part of the process is bringing together growing evidence about [everything] that affects how successful those choices can be.”

  • Explained: The 1.5 C climate benchmark

    The summer of 2023 has been a season of weather extremes.

    In June, uncontrolled wildfires ripped through parts of Canada, sending smoke into the U.S. and setting off air quality alerts in dozens of downwind states. In July, the world set the hottest global temperature on record, which it held for three days in a row, then broke again on day four.

    From July into August, unrelenting heat blanketed large parts of Europe, Asia, and the U.S., while India faced a torrential monsoon season, and heavy rains flooded regions in the northeastern U.S. And most recently, whipped up by high winds and dry vegetation, a historic wildfire tore through Maui, devastating an entire town.

    These extreme weather events are mainly a consequence of climate change driven by humans’ continued burning of coal, oil, and natural gas. Climate scientists agree that extreme weather such as what people experienced this summer will likely grow more frequent and intense in the coming years unless something is done, on a persistent and planet-wide scale, to rein in global temperatures.

    Just how much reining-in are they talking about? The number that is internationally agreed upon is 1.5 degrees Celsius. To prevent worsening and potentially irreversible effects of climate change, the world’s average temperature should not exceed that of preindustrial times by more than 1.5 degrees Celsius (2.7 degrees Fahrenheit).

    As more regions around the world face extreme weather, it’s worth taking stock of the 1.5-degree bar, where the planet stands in relation to this threshold, and what can be done at the global, regional, and personal level, to “keep 1.5 alive.”

    Why 1.5 C?

    In 2015, in response to the growing urgency of climate impacts, nearly every country in the world signed on to the Paris Agreement, a landmark international treaty under which 195 nations pledged to hold the Earth’s temperature to “well below 2 degrees Celsius above pre-industrial levels” and, going further, to aim to “limit the temperature increase to 1.5 degrees Celsius above pre-industrial levels.”

    The treaty did not define a particular preindustrial period, though scientists generally consider the years from 1850 to 1900 to be a reliable reference; this period predates humans’ large-scale use of fossil fuels and is also the earliest time when global observations of land and sea temperatures are available. During this period, the average global temperature, while swinging up and down in certain years, generally hovered around 13.5 degrees Celsius, or 56.3 degrees Fahrenheit.

    The treaty was informed by a fact-finding report which concluded that even global warming of 1.5 degrees Celsius above the preindustrial average, sustained over an extended, decades-long period, would lead to high risks for “some regions and vulnerable ecosystems.” The recommendation, then, was to set 1.5 degrees Celsius as a “defense line”: if the world can keep below this line, it could potentially avoid the more extreme and irreversible climate effects that would occur with a 2 degrees Celsius increase, and for some places, with an even smaller increase than that.

    But, as many regions are experiencing today, keeping below the 1.5 line is no guarantee of avoiding extreme global warming effects.

    “There is nothing magical about the 1.5 number, other than that is an agreed aspirational target. Keeping at 1.4 is better than 1.5, and 1.3 is better than 1.4, and so on,” says Sergey Paltsev, deputy director of MIT’s Joint Program on the Science and Policy of Global Change. “The science does not tell us that if, for example, the temperature increase is 1.51 degrees Celsius, then it would definitely be the end of the world. Similarly, if the temperature would stay at 1.49 degrees increase, it does not mean that we will eliminate all impacts of climate change. What is known: The lower the target for an increase in temperature, the lower the risks of climate impacts.”

    How close are we to 1.5 C?

    In 2022, the average global temperature was about 1.15 degrees Celsius above preindustrial levels. According to the World Meteorological Organization (WMO), the cyclical weather phenomenon La Niña recently contributed to temporarily cooling and dampening the effects of human-induced climate change. La Niña lasted for three years and ended around March of 2023.

    In May, the WMO issued a report that projected a significant likelihood (66 percent) that the world would exceed the 1.5 degrees Celsius threshold in the next four years. This breach would likely be driven by human-induced climate change, combined with a warming El Niño — a cyclical weather phenomenon that temporarily heats up ocean regions and pushes global temperatures higher.

    This summer, an El Niño is underway, and the event typically raises global temperatures in the year after it sets in, which in this case would be 2024. The WMO predicts that, for each of the next four years, the global average temperature is likely to swing between 1.1 and 1.8 degrees Celsius above preindustrial levels.

    Though there is a good chance the world will get hotter than the 1.5-degree limit as a result of El Niño, the breach would be temporary and would not, by itself, mean the Paris Agreement has failed; the agreement aims to keep global temperatures below the 1.5-degree limit over the long term (averaged over several decades rather than a single year).

    “But we should not forget that this is a global average, and there are variations regionally and seasonally,” says Elfatih Eltahir, the H.M. King Bhumibol Professor and Professor of Civil and Environmental Engineering at MIT. “This year, we had extreme conditions around the world, even though we haven’t reached the 1.5 C threshold. So, even if we control the average at a global magnitude, we are going to see events that are extreme, because of climate change.”

    More than a number

    To hold the planet’s long-term average temperature to below the 1.5-degree threshold, the world will have to reach net zero emissions by the year 2050, according to the Intergovernmental Panel on Climate Change (IPCC). This means that, in terms of the emissions released by the burning of coal, oil, and natural gas, the entire world will have to remove as much as it puts into the atmosphere.

    “In terms of innovations, we need all of them — even those that may seem quite exotic at this point: fusion, direct air capture, and others,” Paltsev says.

    The task of curbing emissions in time is particularly daunting for the United States, which has generated more cumulative carbon dioxide emissions than any other country in the world.

    “The U.S.’s burning of fossil fuels and consumption of energy is just way above the rest of the world. That’s a persistent problem,” Eltahir says. “And the national statistics are an aggregate of what a lot of individuals are doing.”

    At an individual level, there are things that can be done to help bring down one’s personal emissions, and potentially chip away at rising global temperatures.

    “We are consumers of products that either embody greenhouse gases, such as meat, clothes, computers, and homes, or we are directly responsible for emitting greenhouse gases, such as when we use cars, airplanes, electricity, and air conditioners,” Paltsev says. “Our everyday choices affect the amount of emissions that are added to the atmosphere.”

    But to compel people to change their emissions, it may be less about a number, and more about a feeling.

    “To get people to act, my hypothesis is, you need to reach them not just by convincing them to be good citizens and saying it’s good for the world to keep below 1.5 degrees, but showing how they individually will be impacted,” says Eltahir, who specializes in the study of regional climates, focusing on how climate change impacts the water cycle and the frequency of extreme weather such as heat waves.

    “True climate progress requires a dramatic change in how the human system gets its energy,” Paltsev says. “It is a huge undertaking. Are you ready personally to make sacrifices and to change the way of your life? If one gets an honest answer to that question, it would help to understand why true climate progress is so difficult to achieve.”

  • Q&A: A high-tech take on Wagner’s “Parsifal” opera

    The world-famous Bayreuth Festival in Germany, annually centered around the works of composer Richard Wagner, launched this summer on July 25 with a production that has been making headlines. Director Jay Scheib, an MIT faculty member, has created a version of Wagner’s celebrated opera “Parsifal” that is set in an apocalyptic future (rather than the original Medieval past), and uses augmented reality headset technology for a portion of the audience, among other visual effects. People using the headsets see hundreds of additional visuals, from fast-moving clouds to arrows being shot at them. The AR portion of the production was developed through a team led by designer and MIT Technical Instructor Joshua Higgason.

    The new “Parsifal” has engendered extensive media attention and discussion among opera followers and the viewing public. Five years in the making, it was developed with the encouragement of Bayreuth Festival general manager Katharina Wagner, Richard Wagner’s great-granddaughter. The production runs until Aug. 27, and can also be streamed on Stage+. Scheib, the Class of 1949 Professor in MIT’s Music and Theater Arts program, recently talked to MIT News about the project from Bayreuth.

    Q: Your production of “Parsifal” led off this year’s entire Bayreuth festival. How’s it going?

    A: From my point of view it’s going quite swimmingly. The leading German opera critics and the audiences have been super-supportive and Bayreuth makes it possible for a work to evolve … Given the complexity of the technical challenge of making an AR project function in an opera house, the bar was so high, it was a difficult challenge, and we’re really happy we found a way forward, a way to make it work, and a way to make it fit into an artistic process. I feel great.

    Q: You offer a new interpretation of “Parsifal,” and a new setting for it. What is it, and why did you choose to interpret it this way?

    A: One of the main themes in “Parsifal” is that the long-time king of this holy grail cult is wounded, and his wound will not heal. [With that in mind], we looked at what the world was like when the opera premiered in the late 19th century, around the time of what was known as the Great African Scramble, when Europe re-drew the map of Africa, largely based on resources, including mineral resources.

    Cobalt remains [the focus of] dirty mining practices in the Democratic Republic of Congo, and is a requirement for a lot of our electronic objects, in particular batteries. There are also these massive copper deposits discovered under a Buddhist temple in Afghanistan, and lithium under a sacred site in Nevada. We face an intense challenge in climate change, and the predictions are not good. Some of our solutions like electric cars require these materials, so they’re only solutions for some people, while others suffer [where minerals are being mined]. We started thinking about how wounds never heal, and when the prospect of creating a better world opens new wounds in other communities. … That became a theme. It also comes out of the time when we were making it, when Covid happened and George Floyd was murdered, which created an opportunity in the U.S. to start speaking very openly about wounds that have not healed.

    We set it in a largely post-human environment, where we didn’t succeed, and everything has collapsed. In the third act, there’s derelict mining equipment, and the holy water is this energy-giving force, but in fact it’s this lithium-ion pool, which gives us energy and then poisons us. That’s the theme we created.

    Q: What were your goals about integrating the AR technology into the opera, and how did you achieve that?

    A: First, I was working with my collaborator Joshua Higgason. No one had ever really done this before, so we just started researching whether it was possible. And most of the people we talked to said, “Don’t do it. It’s just not going to work.” Having always been a daredevil at heart, I was like, “Oh, come on, we can figure this out.”

    We were diligent in exploring the possibilities. We made multiple trips to Bayreuth and made millimeter-accurate laser scans of the auditorium and the stage. We built a variety of models to see how to make AR work in a large environment, where 2,000 headsets could respond simultaneously. We built a team of animators and developers and programmers and designers, from Portugal to Cambridge to New York to Hungary, the U.K., and a group in Germany. Josh led this team, and they got after it, but it took us the better part of two years to make it possible for an audience, some of whom don’t really use smartphones, to put on an AR headset and have it just work.

    I can’t even believe we did this. But it’s working.

    Q: In opera there’s hopefully a productive tension between tradition and innovation. How do you think about that when it comes to Wagner at Bayreuth?

    A: Innovation is the tradition at Bayreuth. Musically and scenographically. “Parsifal” was composed for this particular opera house, and I’m incredibly respectful of what this event is made for. We are trying to create a balanced and unified experience, between the scenic design and the AR and the lighting and the costume design, and create perfect moments of convergence where you really lose yourself in the environment. I believe wholly in the production and the performers are extraordinary. Truly, truly, truly extraordinary.

    Q: People have been focused on the issue of bringing AR to Bayreuth, but what has Bayreuth brought to you as a director?

    A: Working in Bayreuth has been an incredible experience. The level of intellectual integrity among the technicians is extraordinary. The amount of care and patience and curiosity and expertise in Bayreuth is off the charts. This community of artists is the greatest. … People come here because it’s an incredible meeting of the minds, and for that I’m immensely filled with gratitude every day I come into the rehearsal room. The conductor, Pablo Heras-Casado, and I have been working on this for several years. And the music is still first. We’re setting up technology not to overtake the music, but to support it, and visually amplify it.

    It must be said that Katharina Wagner has been one of the most powerfully supportive artistic directors I have ever worked with. I find it inspiring to witness her tenacity and vision in seeing all of this through, despite the hurdles. It’s been a great collaboration. That’s the essence: great collaboration.

  • System tracks movement of food through global humanitarian supply chain

    Although more than enough food is produced to feed everyone in the world, as many as 828 million people face hunger today. Poverty, social inequity, climate change, natural disasters, and political conflicts all contribute to inhibiting access to food. For decades, the U.S. Agency for International Development (USAID) Bureau for Humanitarian Assistance (BHA) has been a leader in global food assistance, supplying millions of metric tons of food to recipients worldwide. Alleviating hunger — and the conflict and instability hunger causes — is critical to U.S. national security.

    But BHA is only one player within a large, complex supply chain in which food gets handed off between more than 100 partner organizations before reaching its final destination. Traditionally, the movement of food through the supply chain has been a black-box operation, with stakeholders largely out of the loop about what happens to the food once it leaves their custody. This lack of direct visibility into operations is due to siloed data repositories, insufficient data sharing among stakeholders, and different data formats that operators must manually sort through and standardize. As a result, accurate, real-time information — such as where food shipments are at any given time, which shipments are affected by delays or food recalls, and when shipments have arrived at their final destination — is lacking. A centralized system capable of tracing food along its entire journey, from manufacture through delivery, would enable a more effective humanitarian response to food-aid needs.

    In 2020, a team from MIT Lincoln Laboratory began engaging with BHA to create an intelligent dashboard for their supply-chain operations. This dashboard brings together the expansive food-aid datasets from BHA’s existing systems into a single platform, with tools for visualizing and analyzing the data. When the team started developing the dashboard, they quickly realized the need for considerably more data than BHA had access to.

    “That’s where traceability comes in, with each handoff partner contributing key pieces of information as food moves through the supply chain,” explains Megan Richardson, a researcher in the laboratory’s Humanitarian Assistance and Disaster Relief Systems Group.

    Richardson and the rest of the team have been working with BHA and their partners to scope, build, and implement such an end-to-end traceability system. This system consists of serialized, unique identifiers (IDs) — akin to fingerprints — that are assigned to individual food items at the time they are produced. These individual IDs remain linked to items as they are aggregated along the supply chain, first domestically and then internationally. For example, individually tagged cans of vegetable oil get packaged into cartons; cartons are placed onto pallets and transported via railway and truck to warehouses; pallets are loaded onto shipping containers at U.S. ports; and pallets are unloaded and cartons are unpackaged overseas.
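
    The key property of such a system is that item-level identity survives each aggregation step. Here is a minimal sketch of that idea in Python (illustrative only; the laboratory’s actual identifiers, schema, and software are not described in this article):

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class Unit:
        uid: str                                          # serialized, unique ID
        children: list["Unit"] = field(default_factory=list)
        history: list[str] = field(default_factory=list)

        def record(self, event: str) -> None:
            """Log an event for this unit and everything aggregated inside it."""
            self.history.append(event)
            for child in self.children:
                child.record(event)

    # Cans are packed into a carton, and cartons onto a pallet...
    cans = [Unit(f"CAN-{i:04d}") for i in range(12)]
    carton = Unit("CTN-0001", children=cans)
    pallet = Unit("PLT-0001", children=[carton])

    pallet.record("loaded onto container at U.S. port")
    print(cans[0].history)  # the item-level trace survives aggregation
    ```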

    With a trace

    Today, visibility at the single-item level doesn’t exist. Most suppliers mark pallets with a lot number (a lot is a batch of items produced in the same run), but this is for internal purposes (i.e., to track issues stemming back to their production supply, like over-enriched ingredients or machinery malfunction), not data sharing. So, organizations know which supplier lot a pallet and carton are associated with, but they can’t track the unique history of an individual carton or item within that pallet. As the lots move further downstream toward their final destination, they are often mixed with lots from other productions, and possibly other commodity types altogether, because of space constraints. On the international side, such mixing and the lack of granularity make it difficult to quickly pull commodities out of the supply chain if food safety concerns arise. Current response times can span several months.

    “Commodities are grouped differently at different stages of the supply chain, so it is logical to track them in those groupings where needed,” Richardson says. “Our item-level granularity serves as a form of Rosetta Stone to enable stakeholders to efficiently communicate throughout these stages. We’re trying to enable a way to track not only the movement of commodities, including through their lot information, but also any problems arising independent of lot, like exposure to high humidity levels in a warehouse. Right now, we have no way to associate commodities with histories that may have resulted in an issue.”

    “You can now track your checked luggage across the world and the fish on your dinner plate,” adds Brice MacLaren, also a researcher in the laboratory’s Humanitarian Assistance and Disaster Relief Systems Group. “So, this technology isn’t new, but it’s new to BHA as they evolve their methodology for commodity tracing. The traceability system needs to be versatile, working across a wide variety of operators who take custody of the commodity along the supply chain and fitting into their existing best practices.”

    As food products make their way through the supply chain, operators at each receiving point would be able to scan these IDs via a Lincoln Laboratory-developed mobile application (app) to indicate a product’s current location and transaction status — for example, that it is en route on a particular shipping container or stored in a certain warehouse. This information would get uploaded to a secure traceability server. By scanning a product, operators would also see its history up until that point.   
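
    A scan, in other words, produces a small transaction record tied to an ID. One hypothetical shape such a record might take (every field name here is invented for illustration; the actual app and server schema are not public):

    ```python
    scan_event = {
        "uid": "CTN-0001",                    # scanned carton's unique ID
        "timestamp": "2024-06-01T08:42:00Z",
        "location": "Port of Houston, TX",
        "status": "loaded",                   # e.g., received, stored, en route
        "container": "CONT-7781",             # current shipping container
        "operator": "partner-warehouse-12",   # who performed the scan
    }
    ```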

    Hitting the mark

    At the laboratory, the team tested the feasibility of their traceability technology, exploring different ways to mark and scan items. In their testing, they considered barcodes and radio-frequency identification (RFID) tags and handheld and fixed scanners. Their analysis revealed 2D barcodes (specifically data matrices) and smartphone-based scanners were the most feasible options in terms of how the technology works and how it fits into existing operations and infrastructure.

    “We needed to come up with a solution that would be practical and sustainable in the field,” MacLaren says. “While scanners can automatically read any RFID tags in close proximity as someone is walking by, they can’t discriminate exactly where the tags are coming from. RFID is expensive, and it’s hard to read commodities in bulk. On the other hand, a phone can scan a barcode on a particular box and tell you that code goes with that box. The challenge then becomes figuring out how to present the codes for people to easily scan without significantly interrupting their usual processes for handling and moving commodities.” 

    As the team learned from partner representatives in Kenya and Djibouti, offloading at the ports is a chaotic, fast operation. At manual warehouses, porters fling bags over their shoulders or stack cartons atop their heads any which way they can and run them to a drop point; at bagging terminals, commodities come down a conveyor belt and land this way or that way. With this variability comes several questions: How many barcodes do you need on an item? Where should they be placed? What size should they be? What will they cost? The laboratory team is considering these questions, keeping in mind that the answers will vary depending on the type of commodity; vegetable oil cartons will have different specifications than, say, 50-kilogram bags of wheat or peas.

    Leaving a mark

    Leveraging results from their testing and insights from international partners, the team has been running a traceability pilot evaluating how their proposed system meshes with real-world domestic and international operations. The current pilot features a domestic component in Houston, Texas, and an international component in Ethiopia, and focuses on tracking individual cartons of vegetable oil and identifying damaged cans. The Ethiopian team with Catholic Relief Services recently received a container filled with pallets of uniquely barcoded cartons of vegetable oil cans (in the next pilot, the cans will be barcoded, too). They are now scanning items and collecting data on product damage by using smartphones with the laboratory-developed mobile traceability app on which they were trained. 

    “The partners in Ethiopia are comparing a couple lid types to determine whether some are more resilient than others,” Richardson says. “With the app — which is designed to scan commodities, collect transaction data, and keep history — the partners can take pictures of damaged cans and see if a trend with the lid type emerges.”

    Next, the team will run a series of pilots with the World Food Program (WFP), the world’s largest humanitarian organization. The first pilot will focus on data connectivity and interoperability, and the team will engage with suppliers to directly print barcodes on individual commodities instead of applying barcode labels to packaging, as they did in the initial feasibility testing. The WFP will provide input on which of their operations are best suited for testing the traceability system, considering factors like the network bandwidth of WFP staff and local partners, the commodity types being distributed, and the country context for scanning. The BHA will likely also prioritize locations for system testing.

    “Our goal is to provide an infrastructure to enable as close to real-time data exchange as possible between all parties, given intermittent power and connectivity in these environments,” MacLaren says.

    In subsequent pilots, the team will try to integrate their approach with existing systems that partners rely on for tracking procurements, inventory, and movement of commodities under their custody so that this information is automatically pushed to the traceability server. The team also hopes to add a capability for real-time alerting of statuses, like the departure and arrival of commodities at a port or the exposure of unclaimed commodities to the elements. Real-time alerts would enable stakeholders to more efficiently respond to food-safety events. Currently, partners are forced to take a conservative approach, pulling out more commodities from the supply chain than are actually suspect, to reduce risk of harm. Both BHA and WFP are interested in testing out a food-safety event during one of the pilots to see how the traceability system works in enabling rapid communication response.

    To implement this technology at scale will require some standardization for marking different commodity types as well as give and take among the partners on best practices for handling commodities. It will also require an understanding of country regulations and partner interactions with subcontractors, government entities, and other stakeholders.

    “Within several years, I think it’s possible for BHA to use our system to mark and trace all their food procured in the United States and sent internationally,” MacLaren says.

    Once collected, the trove of traceability data could be harnessed for other purposes, among them analyzing historical trends, predicting future demand, and assessing the carbon footprint of commodity transport. In the future, a similar traceability system could scale for nonfood items, including medical supplies distributed to disaster victims, resources like generators and water trucks localized in emergency-response scenarios, and vaccines administered during pandemics. Several groups at the laboratory are also interested in such a system to track items such as tools deployed in space or equipment people carry through different operational environments.

    “When we first started this program, colleagues were asking why the laboratory was involved in simple tasks like making a dashboard, marking items with barcodes, and using hand scanners,” MacLaren says. “Our impact here isn’t about the technology; it’s about providing a strategy for coordinated food-aid response and successfully implementing that strategy. Most importantly, it’s about people getting fed.”

  • Study: The ocean’s color is changing as a consequence of climate change

    The ocean’s color has changed significantly over the last 20 years, and the global trend is likely a consequence of human-induced climate change, report scientists at MIT, the National Oceanography Center in the U.K., and elsewhere.  

    In a study appearing today in Nature, the team writes that they have detected changes in ocean color over the past two decades that cannot be explained by natural, year-to-year variability alone. These color shifts, though subtle to the human eye, have occurred over 56 percent of the world’s oceans — an expanse that is larger than the total land area on Earth.

    In particular, the researchers found that tropical ocean regions near the equator have become steadily greener over time. The shift in ocean color indicates that ecosystems within the surface ocean must also be changing, as the color of the ocean is a literal reflection of the organisms and materials in its waters.

    At this point, the researchers cannot say how exactly marine ecosystems are changing to reflect the shifting color. But they are pretty sure of one thing: Human-induced climate change is likely the driver.

    “I’ve been running simulations that have been telling me for years that these changes in ocean color are going to happen,” says study co-author Stephanie Dutkiewicz, senior research scientist in MIT’s Department of Earth, Atmospheric and Planetary Sciences and the Center for Global Change Science. “To actually see it happening for real is not surprising, but frightening. And these changes are consistent with man-induced changes to our climate.”

    “This gives additional evidence of how human activities are affecting life on Earth over a huge spatial extent,” adds lead author B. B. Cael PhD ’19 of the National Oceanography Center in Southampton, U.K. “It’s another way that humans are affecting the biosphere.”

    The study’s co-authors also include Stephanie Henson of the National Oceanography Center, Kelsey Bisson at Oregon State University, and Emmanuel Boss of the University of Maine.

    Above the noise

    The ocean’s color is a visual product of whatever lies within its upper layers. Generally, waters that are deep blue reflect very little life, whereas greener waters indicate the presence of ecosystems, mainly phytoplankton — plant-like microbes that are abundant in the upper ocean and that contain the green pigment chlorophyll. The pigment helps plankton harvest sunlight, which they use to capture carbon dioxide from the atmosphere and convert it into sugars.

    Phytoplankton are the foundation of the marine food web that sustains progressively more complex organisms, on up to krill, fish, seabirds, and marine mammals. Phytoplankton are also a powerful muscle in the ocean’s ability to capture and store carbon dioxide. Scientists are therefore keen to monitor phytoplankton across the surface oceans and to see how these essential communities might respond to climate change. To do so, scientists have tracked changes in chlorophyll, based on the ratio of how much blue versus green light is reflected from the ocean surface, which can be monitored from space.
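
    That ratio-based approach can be made concrete with a short sketch. NASA’s operational “OCx” chlorophyll algorithms are polynomials in the logarithm of a blue-to-green reflectance ratio; the coefficients below are placeholders chosen only to show the shape of the calculation, not the tuned MODIS values:

    ```python
    import numpy as np

    def band_ratio_chlorophyll(rrs_blue: float, rrs_green: float,
                               coeffs=(0.25, -2.7, 1.8, 0.0, -1.2)) -> float:
        """Chlorophyll estimate (mg/m^3) from a blue/green reflectance ratio,
        in the polynomial style of NASA's OCx algorithms (placeholder
        coefficients, for illustration only)."""
        x = np.log10(rrs_blue / rrs_green)
        log_chl = sum(a * x**i for i, a in enumerate(coeffs))
        return 10.0 ** log_chl

    # Bluer water (high blue/green ratio) maps to low chlorophyll,
    # greener water to high chlorophyll.
    print(band_ratio_chlorophyll(0.008, 0.004))  # ~0.4 mg/m^3
    print(band_ratio_chlorophyll(0.004, 0.004))  # ~1.8 mg/m^3
    ```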

    But around a decade ago, Henson, a co-author of the current study, published a paper with colleagues showing that, if scientists were tracking chlorophyll alone, it would take at least 30 years of continuous monitoring to detect any trend driven specifically by climate change. The reason, the team argued, was that the large natural variations in chlorophyll from year to year would overwhelm any anthropogenic influence on chlorophyll concentrations. It would therefore take several decades to pick out a meaningful, climate-change-driven signal amid the normal noise.

    In 2019, Dutkiewicz and her colleagues published a separate paper showing, through a new model, that the natural variation in other ocean colors is much smaller than that of chlorophyll. Any signal of climate-change-driven change should therefore be easier to detect over the smaller, normal variations of other ocean colors. They predicted that such changes should be apparent within 20 years of monitoring, rather than 30.

    “So I thought, doesn’t it make sense to look for a trend in all these other colors, rather than in chlorophyll alone?” Cael says. “It’s worth looking at the whole spectrum, rather than just trying to estimate one number from bits of the spectrum.”

    The power of seven

    In the current study, Cael and the team analyzed measurements of ocean color taken by the Moderate Resolution Imaging Spectroradiometer (MODIS) aboard the Aqua satellite, which has been monitoring ocean color for 21 years. MODIS takes measurements in seven visible wavelengths, including the two colors researchers traditionally use to estimate chlorophyll.

    The differences in color that the satellite picks up are too subtle for human eyes to differentiate. Much of the ocean appears blue to our eye, whereas the true color may contain a mix of subtler wavelengths, from blue to green and even red.

    Cael carried out a statistical analysis using all seven ocean colors measured by the satellite from 2002 to 2022 together. He first looked at how much the seven colors changed from region to region during a given year, which gave him an idea of their natural variations. He then zoomed out to see how these annual variations in ocean color changed over a longer stretch of two decades. This analysis turned up a clear trend, above the normal year-to-year variability.
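
    Conceptually, the test is whether a 20-year trend stands out against year-to-year noise. A toy version of that comparison on synthetic data (the paper’s statistics are more careful than this sketch):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    years = np.arange(2002, 2023)

    # Hypothetical reflectance anomaly for one wavelength in one region:
    # a small secular trend buried in interannual noise.
    anomaly = 0.004 * (years - years[0]) + rng.normal(0, 0.01, years.size)

    # Least-squares trend, compared against the detrended variability.
    slope, intercept = np.polyfit(years, anomaly, 1)
    residuals = anomaly - (slope * years + intercept)
    snr = abs(slope) * (years.size - 1) / residuals.std()
    print(f"trend = {slope:.4f} per year; 20-year change ~ {snr:.1f}x noise")
    ```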

    To see whether this trend is related to climate change, he then looked to Dutkiewicz’s model from 2019. This model simulated the Earth’s oceans under two scenarios: one with the addition of greenhouse gases, and one without. The greenhouse-gas model predicted that a significant trend should show up within 20 years, and that this trend should cause changes to ocean color in about 50 percent of the world’s surface oceans — almost exactly what Cael found in his analysis of real-world satellite data.

    “This suggests that the trends we observe are not a random variation in the Earth system,” Cael says. “This is consistent with anthropogenic climate change.”

    The team’s results show that monitoring ocean colors beyond chlorophyll could give scientists a clearer, faster way to detect climate-change-driven changes to marine ecosystems.

    “The color of the oceans has changed,” Dutkiewicz says. “And we can’t say how. But we can say that changes in color reflect changes in plankton communities that will impact everything that feeds on plankton. It will also change how much the ocean will take up carbon, because different types of plankton have different abilities to do that. So, we hope people take this seriously. It’s not only models that are predicting these changes will happen. We can now see it happening, and the ocean is changing.”

    This research was supported, in part, by NASA.