More stories

  • J-WAFS launches Food and Climate Systems Transformation Alliance

    Food systems around the world are increasingly at risk from the impacts of climate change. At the same time, these systems, which include all activities from food production to consumption and food waste, are responsible for about one-third of the human-caused greenhouse gas emissions warming the planet. 

    To drive research-based innovation that will make food systems more resilient and sustainable, MIT’s Abdul Latif Jameel Water and Food Systems Lab (J-WAFS) announced the launch of a new initiative at an event during the UN Climate Change Conference in Glasgow, Scotland, last week. The initiative, called the Food and Climate Systems Transformation (FACT) Alliance, will better connect researchers to farmers, food businesses, policymakers, and other food systems stakeholders around the world. 

    “Time is not on our side,” says Greg Sixt, the director of the FACT Alliance and research manager for food and climate systems at J-WAFS. “To date, the research community hasn’t delivered actionable solutions quickly enough or in the policy-relevant form needed if time-critical changes are to be made to our food systems. The FACT Alliance aims to change this.”

    Why, in fact, do our food systems need transformation?

    At COP26 (which stands for “conference of the parties” to the UN Framework Convention on Climate Change, being held for the 26th time this year), a number of countries have pledged to end deforestation, reduce methane emissions, and cease public financing of coal power. In his keynote address at the FACT Alliance event, Professor Pete Smith of the University of Aberdeen, an alliance member institution, noted that food and agriculture also need to be addressed because “there’s an interaction between climate change and the food system.” 

    The UN Intergovernmental Panel on Climate Change warns that a two-degree Celsius increase in average global temperature over preindustrial levels could trigger a worldwide food crisis, and emissions from food systems alone could push us past the two-degree mark even if energy-related emissions were zeroed out. 

    Smith said dramatic and rapid transformations are needed to deliver safe, nutritious food for all, with reduced environmental impact and increased resilience to climate change. With a global network of leading research institutions and collaborating stakeholder organizations, the FACT Alliance aims to facilitate new, solutions-oriented research for addressing the most challenging aspects of food systems in the era of climate change. 

    How the FACT Alliance works

    Central to the work of the FACT Alliance is the development of new methodologies for aligning data across scales and food systems components, improving data access, integrating research across the diverse disciplines that address aspects of food systems, making stakeholders partners in the research process, and assessing impact in the context of complex and interconnected food and climate systems. 

    The FACT Alliance will conduct what’s known as “convergence research,” which meets complex problems with approaches that embody deep integration across disciplines. This kind of research calls for close association with the stakeholders who both make decisions and are directly affected by how food systems work, be they farmers, extension services (i.e., agricultural advisories), policymakers, international aid organizations, consumers, or others. By inviting stakeholders and collaborators to be part of the research process, the FACT Alliance allows for engagement at the scale, geography, and scope that is most relevant to the needs of each, integrating global and local teams to achieve better outcomes. 

    “Doing research in isolation of all the stakeholders and in isolation of the goals that we want to achieve will not deliver the transformation that we need,” said Smith. “The problem is too big for us to solve in isolation, and we need broad alliances to tackle the issue, and that’s why we developed the FACT Alliance.” 

    Members and collaborators

    Led by MIT’s J-WAFS, the FACT Alliance is currently made up of 16 core members and an associated network of collaborating stakeholder organizations. 

    “As the central convener of MIT research on food systems, J-WAFS catalyzes collaboration across disciplines,” says Maria Zuber, vice president for research at MIT. “Now, by bringing together a world-class group of research institutions and stakeholders from key sectors, the FACT Alliance aims to advance research that will help alleviate climate impacts on food systems and mitigate food system impacts on climate.”

    J-WAFS co-hosted the COP26 event “Bridging the Science-Policy Gap for Impactful, Demand-Driven Food Systems Innovation” with Columbia University, the American University of Beirut, and the CGIAR research program Climate Change, Agriculture and Food Security (CCAFS). The event featured a panel discussion with several FACT Alliance members and the UK Foreign, Commonwealth and Development Office (FCDO).

  • Q&A: Options for the Diablo Canyon nuclear plant

    The Diablo Canyon nuclear plant in California, the only one still operating in the state, is set to close in 2025. A team of researchers at MIT’s Center for Advanced Nuclear Energy Systems, Abdul Latif Jameel Water and Food Systems Lab, and Center for Energy and Environmental Policy Research; Stanford’s Precourt Energy Institute; and energy analysis firm LucidCatalyst LLC have analyzed the potential benefits the plant could provide if its operation were extended to 2030 or 2045.

    They found that this nuclear plant could simultaneously help to stabilize the state’s electric grid, provide desalinated water to supplement the state’s chronic water shortages, and provide carbon-free hydrogen fuel for transportation. MIT News asked report co-authors Jacopo Buongiorno, the TEPCO Professor of Nuclear Science and Engineering, and John Lienhard, the Jameel Professor of Water and Food, to discuss the group’s findings.

    Q: Your report suggests co-locating a major desalination plant alongside the existing Diablo Canyon power plant. What would be the potential benefits from operating a desalination plant in conjunction with the power plant?

    Lienhard: The cost of desalinated water produced at Diablo Canyon would be lower than for a stand-alone plant because the cost of electricity would be significantly lower and you could take advantage of the existing infrastructure for the intake of seawater and the outfall of brine. Electricity would be cheaper because there’s not an electrical transmission charge — you’re right at the plant, and the power doesn’t have to be sent across the grid.

    Depending on the scale at which the desalination plant is built, you could make a very significant impact on the water shortfalls of state and federal projects in the area. In fact, one of the numbers that came out of this study was that an intermediate-sized desalination plant there would produce more fresh water than the highest estimate of the net yield from the proposed Delta Conveyance Project on the Sacramento River. You could get that amount of water at Diablo Canyon for an investment cost less than half as large, and without the associated impacts that would come with the Delta Conveyance Project.

    And the technology envisioned for desalination here, reverse osmosis, is available off the shelf. You can buy this equipment today. In fact, it’s already in use in California and thousands of other places around the world.

    Q: You discuss in the report three potential products from the Diablo Canyon plant: desalinated water, power for the grid, and clean hydrogen. How well can the plant accommodate all of those efforts, and are there advantages to combining them as opposed to doing any one of them separately?

    Buongiorno: California, like many other regions in the world, is facing multiple challenges as it seeks to reduce carbon emissions on a grand scale. First, the wide deployment of intermittent energy sources such as solar and wind creates a great deal of variability on the grid that can be balanced by dispatchable firm power generators like Diablo. So, the first mission for Diablo is to continue to provide reliable, clean electricity to the grid.

    The second challenge is the prolonged drought and water scarcity for the state in general. And one way to address that is water desalination co-located with the nuclear plant at the Diablo site, as John explained.

    The third challenge is decarbonizing the transportation sector. A possible approach is replacing conventional cars and trucks with vehicles powered by fuel cells, which consume hydrogen. Hydrogen has to be produced from a primary energy source, and nuclear power, through a process called electrolysis, can do that quite efficiently and in a manner that is carbon-free.

    Our economic analysis took into account the expected revenue from selling these multiple products — electricity for the grid, hydrogen for the transportation sector, water for farmers or other local users — as well as the costs associated with deploying the new facilities needed to produce desalinated water and hydrogen. We found that, if Diablo’s operating license were extended until 2035, it would cut carbon emissions by an average of 7 million metric tons a year — a more than 11 percent reduction from 2017 levels — and save ratepayers $2.6 billion in power system costs.

    Further delaying the retirement of Diablo to 2045 would spare 90,000 acres of land that would need to be dedicated to renewable energy production to replace the facility’s capacity, and it would save ratepayers up to $21 billion in power system costs.

    Finally, if Diablo were operated as a polygeneration facility that provides electricity, desalinated water, and hydrogen simultaneously, its value, quantified in terms of dollars per unit of electricity generated, could increase by 50 percent.

    Lienhard: Most of the desalination scenarios that we considered did not consume the full electrical output of that plant, meaning that under most scenarios you would continue to make electricity and do something with it, beyond just desalination. I think it’s also important to remember that this power plant produces 15 percent of California’s carbon-free electricity today and is responsible for 8 percent of the state’s total electrical production. In other words, Diablo Canyon is a very large factor in California’s decarbonization. When or if this plant goes offline, the near-term outcome is likely to be increased reliance on natural gas to produce electricity, meaning a rise in California’s carbon emissions.

    Q: This plant in particular has been highly controversial since its inception. What’s your assessment of the plant’s safety beyond its scheduled shutdown, and how do you see this report as contributing to the decision-making about that shutdown?

    Buongiorno: The Diablo Canyon Nuclear Power Plant has a very strong safety record. The potential safety concern for Diablo is related to its proximity to several fault lines. Being located in California, the plant was designed to withstand large earthquakes to begin with. Following the Fukushima accident in 2011, the Nuclear Regulatory Commission reviewed the plant’s ability to withstand external events (e.g., earthquakes, tsunamis, floods, tornadoes, wildfires, hurricanes) of exceptionally rare and severe magnitude. After nine years of assessment, the NRC concluded that “existing seismic capacity or effective flood protection [at Diablo Canyon] will address the unbounded reevaluated hazards.” That is, Diablo was designed and built to withstand even the rarest and strongest earthquakes that are physically possible at this site.

    As an additional level of protection, the plant has been retrofitted with special equipment and procedures meant to ensure reliable cooling of the reactor core and spent fuel pool under a hypothetical scenario in which all design-basis safety systems have been disabled by a severe external event.

    Lienhard: As for the potential impact of this report, PG&E [the California utility] has already made the decision to shut down the plant, and we and others hope that decision will be revisited and reversed. We believe that this report gives the relevant stakeholders and policymakers a lot of information about options and value associated with keeping the plant running, and about how California could benefit from clean water and clean power generated at Diablo Canyon. It’s not up to us to make the decision, of course — that is a decision that must be made by the people of California. All we can do is provide information.

    Q: What are the biggest challenges or obstacles to seeing these ideas implemented?

    Lienhard: California has very strict environmental protection regulations, and it’s good that they do. One of the areas of great concern to California is the health of the ocean and protection of the coastal ecosystem. As a result, very strict rules are in place about the intake and outfall of both power plants and desalination plants, to protect marine life. Our analysis suggests that this combined plant can be implemented within the parameters prescribed by the California Ocean Plan and that it can meet the regulatory requirements.

    We believe that deeper analysis would be needed before you could proceed. You would need to do site studies and really get out into the water and look in detail at what’s there. But the preliminary analysis is positive. A second challenge is that the discourse in California around nuclear power has generally not been very supportive, and similarly some groups in California oppose desalination. We expect that both of those points of view would be part of the conversation about whether or not to proceed with this project.

    Q: How particular is this analysis to the specifics of this location? Are there aspects of it that apply to other nuclear plants, domestically or globally?

    Lienhard: Hundreds of nuclear plants around the world are situated along the coast, and many are in water-stressed regions. Although our analysis focused on Diablo Canyon, we believe that the general findings are applicable to many other seaside nuclear plants, so this approach and these conclusions could potentially be applied at hundreds of sites worldwide.

  • Saving seaweed with machine learning

    Last year, Charlene Xia ’17, SM ’20, found herself at a crossroads. She was finishing up her master’s degree in media arts and sciences from the MIT Media Lab and had just submitted applications to doctoral degree programs. All Xia could do was sit and wait. In the meantime, she narrowed down her career options, regardless of whether she was accepted to any program.

    “I had two thoughts: I’m either going to get a PhD to work on a project that protects our planet, or I’m going to start a restaurant,” recalls Xia.

    Xia pored over her extensive cookbook collection, researching international cuisines as she anxiously awaited word about her graduate school applications. She even looked into the cost of a food truck permit in the Boston area. Just as she started hatching plans to open a plant-based skewer restaurant, Xia received word that she had been accepted into the mechanical engineering graduate program at MIT.

    Shortly after starting her doctoral studies, Xia’s advisor, Professor David Wallace, approached her with an interesting opportunity. MathWorks, a software company known for developing the MATLAB computing platform, had announced a new seed funding program in MIT’s Department of Mechanical Engineering. The program encouraged collaborative research projects focused on the health of the planet.

    “I saw this as a super-fun opportunity to combine my passion for food, my technical expertise in ocean engineering, and my interest in sustainably helping our planet,” says Xia.

    Video from MIT Mechanical Engineering: “Saving Seaweed with Machine Learning”

    Wallace knew Xia would be up to the task of taking an interdisciplinary approach to solve an issue related to the health of the planet. “Charlene is a remarkable student with extraordinary talent and deep thoughtfulness. She is pretty much fearless, embracing challenges in almost any domain with the well-founded belief that, with effort, she will become a master,” says Wallace.

    Alongside Wallace and Associate Professor Stefanie Mueller, Xia proposed a project to predict and prevent the spread of diseases in aquaculture. The team focused on seaweed farms in particular.

    Already popular in East Asian cuisines, seaweed holds tremendous potential as a sustainable food source for the world’s ever-growing population. In addition to its nutritive value, seaweed combats various environmental threats. It helps fight climate change by absorbing excess carbon dioxide in the atmosphere, and can also absorb fertilizer run-off, keeping coasts cleaner.

    As with so much of marine life, seaweed is threatened by the very thing it helps mitigate: climate change. Climate stressors like warm temperatures or minimal sunlight encourage the growth of the harmful bacteria that cause ice-ice disease. Within days, entire seaweed farms can be decimated by unchecked bacterial growth.

    To solve this problem, Xia turned to the microbiota present in these seaweed farms as a predictive indicator of any threat to the seaweed or livestock. “Our project is to develop a low-cost device that can detect and prevent diseases before they affect seaweed or livestock by monitoring the microbiome of the environment,” says Xia.

    The team pairs old technology with the latest in computing. Using a submersible digital holographic microscope, they take a 2D image. They then use a machine learning system known as a neural network to convert the 2D image into a representation of the microbiome present in the 3D environment.

    “Using a machine learning network, you can take a 2D image and reconstruct it almost in real time to get an idea of what the microbiome looks like in a 3D space,” says Xia.
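
    The article doesn’t include the team’s code, but the general idea of the reconstruction step can be sketched. Below is a minimal, hypothetical example in PyTorch of a convolutional network that maps one 2D hologram to a stack of depth slices, which together approximate a 3D volume; the architecture, layer sizes, and image dimensions here are illustrative assumptions, not the team’s actual model.

    ```python
    # Hypothetical sketch only: a tiny CNN that turns a single-channel 2D
    # hologram into D depth slices, i.e., a coarse 3D volume of the sample.
    import torch
    import torch.nn as nn

    class HologramTo3D(nn.Module):
        def __init__(self, depth_slices: int = 16):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(1, 32, kernel_size=3, padding=1),
                nn.ReLU(),
                nn.Conv2d(32, 64, kernel_size=3, padding=1),
                nn.ReLU(),
                # One output channel per depth slice: (B, D, H, W).
                nn.Conv2d(64, depth_slices, kernel_size=3, padding=1),
            )

        def forward(self, hologram: torch.Tensor) -> torch.Tensor:
            return self.net(hologram)

    model = HologramTo3D(depth_slices=16)
    frame = torch.rand(1, 1, 128, 128)  # one 128x128 hologram (batch of 1)
    volume = model(frame)               # shape: (1, 16, 128, 128)
    print(volume.shape)
    ```

    In practice such a network would be trained on pairs of holograms and known 3D reconstructions, and a model this small could plausibly run on a small embedded computer like the one described next.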

    The software can run on a small Raspberry Pi attached to the holographic microscope. To figure out how to communicate these data back to the research team, Xia drew upon her master’s degree research.

    In that work, under the guidance of Professor Allan Adams and Professor Joseph Paradiso in the Media Lab, Xia focused on developing small underwater communication devices that can relay data about the ocean back to researchers. Rather than the usual $4,000, these devices were designed to cost less than $100, helping lower the cost barrier for those interested in uncovering the many mysteries of our oceans. The communication devices can be used to relay data about the ocean environment from the machine learning algorithms.

    By combining these low-cost communication devices with microscopic images and machine learning, Xia hopes to design a low-cost, real-time monitoring system that can be scaled to cover entire seaweed farms.

    “It’s almost like having the ‘internet of things’ underwater,” adds Xia. “I’m developing this whole underwater camera system alongside the wireless communication I developed that can give me the data while I’m sitting on dry land.”

    Armed with these data about the microbiome, Xia and her team can detect whether or not a disease is about to strike and jeopardize seaweed or livestock before it is too late.

    While Xia still daydreams about opening a restaurant, she hopes the seaweed project will prompt people to rethink how they consider food production in general.

    “We should think about farming and food production in terms of the entire ecosystem,” she says. “My meta-goal for this project would be to get people to think about food production in a more holistic and natural way.”

  • Countering climate change with cool pavements

    Pavements are an abundant urban surface, covering around 40 percent of American cities. But in addition to carrying traffic, they can also emit heat.

    Due to what’s called the urban heat island effect, densely built, impermeable surfaces like pavements can absorb solar radiation and warm up their surroundings by re-emitting that radiation as heat. This phenomenon poses a serious threat to cities. It increases air temperatures by as much as 7 degrees Fahrenheit and contributes to health and environmental risks — risks that climate change will magnify.

    In response, researchers at the MIT Concrete Sustainability Hub (MIT CSHub) are studying how a surface that ordinarily heightens urban heat islands can instead lessen their intensity. Their research focuses on “cool pavements,” which reflect more solar radiation and emit less heat than conventional paving surfaces.

    A recent study by a team of current and former MIT CSHub researchers in the journal Environmental Science & Technology outlines cool pavements and their implementation. The study found that they could lower air temperatures in Boston and Phoenix by up to 1.7 degrees Celsius (3 F) and 2.1 C (3.7 F), respectively. They would also reduce greenhouse gas emissions, cutting total emissions by up to 3 percent in Boston and 6 percent in Phoenix. Achieving these savings, however, requires that cool pavement strategies be selected according to the climate, traffic, and building configurations of each neighborhood.

    Cities like Los Angeles and Phoenix have already conducted sizeable experiments with cool pavements, but the technology is still not widely implemented. The CSHub team hopes their research can guide future cool paving projects to help cities cope with a changing climate.

    Scratching the surface

    It’s well known that darker surfaces get hotter in sunlight than lighter ones. Climate scientists use a metric called “albedo” to help describe this phenomenon.

    “Albedo is a measure of surface reflectivity,” explains Hessam AzariJafari, the paper’s lead author and a postdoc at the MIT CSHub. “Surfaces with low albedo absorb more light and tend to be darker, while high-albedo surfaces are brighter and reflect more light.”

    Albedo is central to cool pavements. Typical paving surfaces, like conventional asphalt, possess a low albedo, absorbing more radiation and emitting more heat. Cool pavements, however, use brighter materials that reflect more than three times as much radiation and, consequently, re-emit far less heat.
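
    As a back-of-the-envelope illustration of what those albedo differences mean (the albedo values below are typical figures from the literature, not numbers from the study), consider the energy balance of two surfaces under peak sunlight:

    ```python
    # Illustrative albedo arithmetic, not data from the CSHub study.
    incident = 1000.0  # W/m^2, rough peak solar irradiance on a clear day

    for surface, albedo in [("aged asphalt", 0.10), ("new concrete", 0.35)]:
        reflected = albedo * incident       # bounced back toward the sky
        absorbed = (1 - albedo) * incident  # heats the pavement
        print(f"{surface}: reflects {reflected:.0f} W/m^2, "
              f"absorbs {absorbed:.0f} W/m^2")
    ```

    Here the concrete reflects 3.5 times as much radiation as the aged asphalt, consistent with the “more than three times” figure above.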

    “We can build cool pavements in many different ways,” says Randolph Kirchain, a researcher in the Materials Science Laboratory and co-director of the Concrete Sustainability Hub. “Brighter materials like concrete and lighter-colored aggregates offer higher albedo, while existing asphalt pavements can be made ‘cool’ through reflective coatings.”

    CSHub researchers examined several of these options in a study of Boston and Phoenix, analyzing the outcomes when concrete, reflective asphalt, and reflective concrete replaced conventional asphalt pavements — which make up more than 95 percent of pavements worldwide.

    Situational awareness

    For a comprehensive understanding of the environmental benefits of cool pavements in Boston and Phoenix, researchers had to look beyond just paving materials. That’s because in addition to lowering air temperatures, cool pavements exert direct and indirect impacts on climate change.  

    “The one direct impact is radiative forcing,” notes AzariJafari. “By reflecting radiation back into the atmosphere, cool pavements exert a radiative forcing, meaning that they change the Earth’s energy balance by sending more energy out of the atmosphere — similar to the polar ice caps.”

    Cool pavements also exert complex, indirect climate change impacts by altering energy use in adjacent buildings.

    “On the one hand, by lowering temperatures, cool pavements can reduce some need for AC [air conditioning] in the summer while increasing heating demand in the winter,” says AzariJafari. “Conversely, by reflecting light — called incident radiation — onto nearby buildings, cool pavements can warm structures up, which can increase AC usage in the summer and lower heating demand in the winter.”

    What’s more, albedo effects are only a portion of the overall life cycle impacts of a cool pavement. In fact, impacts from construction and materials extraction (referred to together as embodied impacts) and the use of the pavement both dominate the life cycle. The primary use phase impact of a pavement — apart from albedo effects  — is excess fuel consumption: Pavements with smooth surfaces and stiff structures cause less excess fuel consumption in the vehicles that drive on them.

    Assessing the climate-change impacts of cool pavements, then, is an intricate process — one involving many trade-offs. In their study, the researchers sought to analyze and measure them.

    A full reflection

    To determine the ideal implementation of cool pavements in Boston and Phoenix, researchers investigated the life cycle impacts of shifting from conventional asphalt pavements to three cool pavement options: reflective asphalt, concrete, and reflective concrete.

    To do this, they used coupled physical simulations to model buildings in thousands of hypothetical neighborhoods. Using these data, they then trained a neural network model to predict impacts based on building and neighborhood characteristics. With this tool in place, it was possible to estimate the impact of cool pavements for each of the thousands of roads and hundreds of thousands of buildings in Boston and Phoenix.
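
    The study’s actual inputs and model aren’t listed in this article, but the surrogate-model workflow it describes can be sketched: run expensive physical simulations offline, then fit a fast learned model to their outputs. In the hypothetical example below, the three neighborhood features and the synthetic “simulation” output are stand-ins chosen purely for illustration.

    ```python
    # Hypothetical surrogate-model sketch; features and targets are made up.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    n = 5000
    # Per-neighborhood features: building height (m), built density (0-1),
    # pavement albedo (0-1).
    X = np.column_stack([
        rng.uniform(3, 60, n),
        rng.uniform(0.05, 0.8, n),
        rng.uniform(0.05, 0.4, n),
    ])
    # Stand-in for a physical simulation's output, e.g., the change in
    # building energy demand caused by the pavement (arbitrary units).
    y = 0.8 * X[:, 0] * X[:, 1] * X[:, 2] + rng.normal(0, 0.5, n)

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    surrogate = make_pipeline(
        StandardScaler(),
        MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000,
                     random_state=0),
    ).fit(X_train, y_train)

    # The trained network predicts in microseconds what the simulation
    # computes slowly, so it can be applied to every road and building.
    print("R^2 on held-out simulated neighborhoods:",
          surrogate.score(X_test, y_test))
    ```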

    In addition to albedo effects, they also looked at the embodied impacts for all pavement types and the effect of pavement type on vehicle excess fuel consumption due to surface qualities, stiffness, and deterioration rate.

    After assessing the life cycle impacts of each cool pavement type, the researchers calculated which material — conventional asphalt, reflective asphalt, concrete, or reflective concrete — benefited each neighborhood most. They found that while cool pavements were advantageous in Boston and Phoenix overall, the ideal materials varied greatly within and between both cities.

    “One benefit that was universal across neighborhood type and paving material was the impact of radiative forcing,” notes AzariJafari. “This was particularly the case in areas with shorter, less-dense buildings, where the effect was most pronounced.”

    Unlike radiative forcing, however, changes to building energy demand differed by location. In Boston, cool pavements reduced energy demand as often as they increased it across all neighborhoods. In Phoenix, cool pavements increased building energy demand in most census tracts because of the radiation they reflected onto buildings. When factoring in radiative forcing, though, cool pavements ultimately had a net benefit.

    Only after considering embodied emissions and impacts on fuel consumption did the ideal pavement type emerge for each neighborhood. After factoring in uncertainty over the life cycle, the researchers found that reflective concrete pavements had the best results, proving optimal in 53 percent and 73 percent of the neighborhoods in Boston and Phoenix, respectively.

    Once again, uncertainties and variations emerged. In Boston, replacing conventional asphalt pavements with a cool option was always preferred, while in Phoenix concrete pavements — reflective or not — had better outcomes due to their rigidity at high temperatures, which minimized vehicle fuel consumption. And despite the dominance of concrete in Phoenix, in 17 percent of its neighborhoods all reflective paving options proved roughly equally effective, while in 1 percent of cases conventional pavements were actually superior.

    “Though the climate change impacts we studied have proven numerous and often at odds with each other, our conclusions are unambiguous: Cool pavements could offer immense climate change mitigation benefits for both cities,” says Kirchain.

    The improvements to air temperatures would be noticeable: the team found that cool pavements would lower peak summer air temperatures in Boston by 1.7 C (3 F) and in Phoenix by 2.1 C (3.7 F). The carbon dioxide emissions reductions would likewise be impressive. Boston would decrease its carbon dioxide emissions by as much as 3 percent over 50 years while reductions in Phoenix would reach 6 percent over the same period.

    This analysis is one of the most comprehensive studies of cool pavements to date — but there’s more to investigate. Just as with pavements, it’s also possible to adjust building albedo, which may result in changes to building energy demand. Intensive grid decarbonization and the introduction of low-carbon concrete mixtures may also alter the emissions generated by cool pavements.

    There’s still lots of ground to cover for the CSHub team. But by studying cool pavements, they’ve elevated a brilliant climate change solution and opened avenues for further research and future mitigation.

    The MIT Concrete Sustainability Hub is a team of researchers from several departments across MIT working on concrete and infrastructure science, engineering, and economics. Its research is supported by the Portland Cement Association and the Ready Mixed Concrete Research and Education Foundation.

  • Global warming begets more warming, new paleoclimate study finds

    It is increasingly clear that the prolonged drought conditions, record-breaking heat, sustained wildfires, and frequent, more extreme storms experienced in recent years are a direct result of rising global temperatures brought on by humans’ addition of carbon dioxide to the atmosphere. And a new MIT study on extreme climate events in Earth’s ancient history suggests that today’s planet may become more volatile as it continues to warm.

    The study, appearing today in Science Advances, examines the paleoclimate record of the last 66 million years, during the Cenozoic era, which began shortly after the extinction of the dinosaurs. The scientists found that during this period, fluctuations in the Earth’s climate experienced a surprising “warming bias.” In other words, there were far more warming events — periods of prolonged global warming, lasting thousands to tens of thousands of years — than cooling events. What’s more, warming events tended to be more extreme, with greater shifts in temperature, than cooling events.

    The researchers say a possible explanation for this warming bias may lie in a “multiplier effect,” whereby a modest degree of warming — for instance from volcanoes releasing carbon dioxide into the atmosphere — naturally speeds up certain biological and chemical processes that enhance these fluctuations, leading, on average, to still more warming.

    Interestingly, the team observed that this warming bias disappeared about 5 million years ago, around the time when ice sheets started forming in the Northern Hemisphere. It’s unclear what effect the ice has had on the Earth’s response to climate shifts. But as today’s Arctic ice recedes, the new study suggests that a multiplier effect may kick back in, and the result may be a further amplification of human-induced global warming.

    “The Northern Hemisphere’s ice sheets are shrinking, and could potentially disappear as a long-term consequence of human actions,” says the study’s lead author Constantin Arnscheidt, a graduate student in MIT’s Department of Earth, Atmospheric and Planetary Sciences. “Our research suggests that this may make the Earth’s climate fundamentally more susceptible to extreme, long-term global warming events such as those seen in the geologic past.”

    Arnscheidt’s study co-author is Daniel Rothman, professor of geophysics at MIT and co-founder and co-director of MIT’s Lorenz Center.

    A volatile push

    For their analysis, the team consulted large databases of sediments containing deep-sea benthic foraminifera — single-celled organisms that have been around for hundreds of millions of years and whose hard shells are preserved in sediments. The composition of these shells is affected by ocean temperature as the organisms grow; the shells are therefore considered a reliable proxy for the Earth’s ancient temperatures.

    For decades, scientists have analyzed the composition of these shells, collected from all over the world and dated to various time periods, to track how the Earth’s temperature has fluctuated over millions of years. 

    “When using these data to study extreme climate events, most studies have focused on individual large spikes in temperature, typically of a few degrees Celsius warming,” Arnscheidt says. “Instead, we tried to look at the overall statistics and consider all the fluctuations involved, rather than picking out the big ones.”

    The team first carried out a statistical analysis of the data and observed that, over the last 66 million years, the distribution of global temperature fluctuations didn’t resemble a standard bell curve, with symmetric tails representing an equal probability of extreme warm and extreme cool fluctuations. Instead, the curve was noticeably lopsided, skewed toward more warm than cool events. The curve also exhibited a noticeably longer tail, representing warm events that were more extreme, or of higher temperature, than the most extreme cold events.
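
    A lopsided curve of this kind is what statisticians call positive skew. As a purely illustrative sketch (using synthetic numbers, not the foraminifera record), the asymmetry the team describes can be quantified with a single statistic:

    ```python
    # Synthetic illustration of a "warming bias" as positive skewness.
    import numpy as np
    from scipy.stats import skew

    rng = np.random.default_rng(42)
    symmetric = rng.normal(0, 1, 100_000)             # standard bell curve
    warm_biased = rng.lognormal(0, 0.5, 100_000) - 1  # longer warm (positive) tail

    print(f"symmetric fluctuations:   skew = {skew(symmetric):+.2f}")   # about 0
    print(f"warm-biased fluctuations: skew = {skew(warm_biased):+.2f}") # clearly > 0
    ```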

    “This indicates there’s some sort of amplification relative to what you would otherwise have expected,” Arnscheidt says. “Everything’s pointing to something fundamental that’s causing this push, or bias toward warming events.”

    “It’s fair to say that the Earth system becomes more volatile, in a warming sense,” Rothman adds.

    A warming multiplier

    The team wondered whether this warming bias might have been a result of “multiplicative noise” in the climate-carbon cycle. Scientists have long understood that higher temperatures, up to a point, tend to speed up biological and chemical processes. Because the carbon cycle, which is a key driver of long-term climate fluctuations, is itself composed of such processes, increases in temperature may lead to larger fluctuations, biasing the system towards extreme warming events.

    In mathematics, there exists a set of equations that describes such general amplifying, or multiplicative, effects. The researchers applied this multiplicative theory to their analysis to see whether the equations could predict the asymmetrical distribution, including the degree of its skew and the length of its tails.
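
    A generic toy version of this idea (not the paper’s actual equations or parameters) compares fluctuations whose size is independent of temperature with fluctuations whose size grows as the climate warms; only the latter, multiplicative case produces the warm-skewed statistics described above:

    ```python
    # Toy additive vs. multiplicative noise; illustrative only.
    import numpy as np
    from scipy.stats import skew

    rng = np.random.default_rng(1)
    dt, steps = 0.01, 200_000
    T_add = np.zeros(steps)  # noise amplitude independent of temperature
    T_mul = np.zeros(steps)  # noise amplitude grows when the climate is warm

    for i in range(1, steps):
        dW = rng.normal(0.0, np.sqrt(dt))
        T_add[i] = T_add[i-1] - T_add[i-1]*dt + 0.5*dW
        T_mul[i] = T_mul[i-1] - T_mul[i-1]*dt + 0.5*(1 + 0.8*T_mul[i-1])*dW

    print(f"additive noise:       skew = {skew(T_add):+.2f}")  # roughly symmetric
    print(f"multiplicative noise: skew = {skew(T_mul):+.2f}")  # biased toward warming
    ```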

    In the end, they found that the data, and the observed bias toward warming, could be explained by the multiplicative theory. In other words, it’s very likely that, over the last 66 million years, periods of modest warming were on average further enhanced by multiplier effects, such as the response of biological and chemical processes that further warmed the planet.

    As part of the study, the researchers also looked at the correlation between past warming events and changes in Earth’s orbit. Over hundreds of thousands of years, Earth’s orbit around the sun regularly becomes more or less elliptical. But scientists have wondered why many past warming events appeared to coincide with these changes, and why these events feature outsized warming compared with what the change in Earth’s orbit could have wrought on its own.

    So, Arnscheidt and Rothman incorporated the Earth’s orbital changes into the multiplicative model and their analysis of Earth’s temperature changes, and found that multiplier effects could predictably amplify, on average, the modest temperature rises due to changes in Earth’s orbit.

    “Climate warms and cools in synchrony with orbital changes, but the orbital cycles themselves would predict only modest changes in climate,” Rothman says. “But if we consider a multiplicative model, then modest warming, paired with this multiplier effect, can result in extreme events that tend to occur at the same time as these orbital changes.”
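
    Extending the toy model above with a weak periodic push (a stand-in for orbital forcing; again, these are illustrative assumptions rather than the study’s model) shows the same mechanism at work: when the forcing is in its warm phase, the state-dependent noise is larger, so the biggest excursions cluster at the same times as the forcing peaks.

    ```python
    # Toy multiplicative model with a weak periodic "orbital" forcing;
    # illustrative only, not the study's equations.
    import numpy as np

    rng = np.random.default_rng(2)
    dt, steps = 0.01, 400_000
    t = np.arange(steps) * dt
    forcing = 0.3 * np.sin(2 * np.pi * t / 50.0)  # slow, modest periodic push

    T = np.zeros(steps)
    for i in range(1, steps):
        dW = rng.normal(0.0, np.sqrt(dt))
        T[i] = T[i-1] + (forcing[i-1] - T[i-1])*dt + 0.5*(1 + 0.8*T[i-1])*dW

    warm = forcing > 0.15   # near the warm peaks of the cycle
    cool = forcing < -0.15  # near the cool troughs
    print(f"fluctuation size, warm phase: {T[warm].std():.2f}")  # larger
    print(f"fluctuation size, cool phase: {T[cool].std():.2f}")  # smaller
    ```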

    “Humans are forcing the system in a new way,” Arnscheidt adds. “And this study is showing that, when we increase temperature, we’re likely going to interact with these natural, amplifying effects.”

    This research was supported, in part, by MIT’s School of Science.

  • Electrifying cars and light trucks to meet Paris climate goals

    On Aug. 5, the White House announced that it seeks to ensure that 50 percent of all new passenger vehicles sold in the United States by 2030 are powered by electricity. The purpose of this target is to enable the U.S. to remain competitive with China in the growing electric vehicle (EV) market and meet its international climate commitments. Setting ambitious EV sales targets and transitioning to zero-carbon power sources in the United States and other nations could lead to significant reductions in carbon dioxide and other greenhouse gas emissions in the transportation sector and move the world closer to achieving the Paris Agreement’s long-term goal of keeping global warming well below 2 degrees Celsius relative to preindustrial levels.

    At this time, electrification of the transportation sector is occurring primarily in private light-duty vehicles (LDVs). In 2020, the global EV fleet exceeded 10 million, but that’s a tiny fraction of the cars and light trucks on the road. How much of the LDV fleet will need to go electric to keep the Paris climate goal in play? 

    To help answer that question, researchers at the MIT Joint Program on the Science and Policy of Global Change and MIT Energy Initiative have assessed the potential impacts of global efforts to reduce carbon dioxide emissions on the evolution of LDV fleets over the next three decades.

    Using an enhanced version of the multi-region, multi-sector MIT Economic Projection and Policy Analysis (EPPA) model that includes a representation of the household transportation sector, they projected changes for the 2020-50 period in LDV fleet composition, carbon dioxide emissions, and related impacts for 18 different regions. Projections were generated under four increasingly ambitious climate mitigation scenarios: a “Reference” scenario based on current market trends and fuel efficiency policies; a “Paris Forever” scenario in which current Paris Agreement commitments (Nationally Determined Contributions, or NDCs) are maintained but not strengthened after 2030; a “Paris to 2 C” scenario in which decarbonization actions are enhanced to be consistent with capping global warming at 2 C; and an “Accelerated Actions” scenario that caps global warming at 1.5 C through much more aggressive emissions targets than the current NDCs.

    Based on projections spanning the first three scenarios, the researchers found that the global EV fleet will likely grow to about 95-105 million EVs by 2030, and 585-823 million EVs by 2050. In the Accelerated Actions scenario, global EV stock reaches more than 200 million vehicles in 2030, and more than 1 billion in 2050, accounting for two-thirds of the global LDV fleet. The research team also determined that EV uptake will likely grow but vary across regions over the 30-year study time frame, with China, the United States, and Europe remaining the largest markets. Finally, the researchers found that while EVs play a role in reducing oil use, a more substantial reduction in oil consumption comes from economy-wide carbon pricing. The results appear in a study in the journal Economics of Energy & Environmental Policy.

    “Our study shows that EVs can contribute significantly to reducing global carbon emissions at a manageable cost,” says MIT Joint Program Deputy Director and MIT Energy Initiative Senior Research Scientist Sergey Paltsev, the lead author. “We hope that our findings will help decision-makers to design efficient pathways to reduce emissions.”  

    To boost the EV share of the global LDV fleet, the study’s co-authors recommend more ambitious policies to mitigate climate change and decarbonize the electric grid. They also envision an “integrated system approach” to transportation that emphasizes making internal combustion engine vehicles more efficient, a long-term shift to low- and net-zero carbon fuels, and systemic efficiency improvements through digitalization, smart pricing, and multi-modal integration. While the study focuses on EV deployment, the authors also stress the need for investment in all possible decarbonization options related to transportation, including enhancing public transportation, avoiding urban sprawl through strategic land-use planning, and reducing the use of private motorized transport by shifting to walking, biking, and mass transit.

    This research is an extension of the authors’ contribution to the MIT Mobility of the Future study.