More stories

  • Developing electricity-powered, low-emissions alternatives to carbon-intensive industrial processes

    On April 11, 2022, MIT announced five multiyear flagship projects in the first-ever Climate Grand Challenges, a new initiative to tackle complex climate problems and deliver breakthrough solutions to the world as quickly as possible. This is the second article in a five-part series highlighting the most promising concepts to emerge from the competition, and the interdisciplinary research teams behind them.

    One of the biggest leaps humankind could take toward drastically lower global greenhouse gas emissions would be the complete decarbonization of industry. But without low-cost, environmentally friendly substitutes for industrial materials, the traditional production of steel, cement, ammonia, and ethylene will continue pumping out billions of tons of carbon annually; these sectors alone are responsible for at least one-third of global greenhouse gas emissions.

    A major problem is that industrial manufacturers, whose success depends on reliable, cost-efficient, and large-scale production methods, are too heavily invested in processes that have historically been powered by fossil fuels to quickly switch to new alternatives. It’s a machine that kicked on more than 100 years ago, and which MIT electrochemical engineer Yet-Ming Chiang says we can’t shut off without major disruptions to the world’s massive supply chain of these materials. What’s needed, Chiang says, is a broader, collaborative clean energy effort that takes “targeted fundamental research, all the way through to pilot demonstrations that greatly lowers the risk for adoption of new technology by industry.”

    This would be a new approach to decarbonization of industrial materials production that relies on largely unexplored but cleaner electrochemical processes. New production methods could be optimized and integrated into the industrial machine to make it run on low-cost, renewable electricity in place of fossil fuels. 

    Recognizing this, Chiang, the Kyocera Professor in the Department of Materials Science and Engineering, teamed with research collaborator Bilge Yildiz, the Breene M. Kerr Professor of Nuclear Science and Engineering and professor of materials science and engineering, with key input from Karthish Manthiram, visiting professor in the Department of Chemical Engineering, to submit a project proposal to the MIT Climate Grand Challenges. Their plan: to create an innovation hub on campus that would bring together MIT researchers individually investigating decarbonization of steel, cement, ammonia, and ethylene under one roof, combining research equipment and directly collaborating on new methods to produce these four key materials.

    Many researchers across MIT have already signed on to join the effort, including Antoine Allanore, associate professor of metallurgy, who specializes in the development of sustainable materials and manufacturing processes, and Elsa Olivetti, the Esther and Harold E. Edgerton Associate Professor in the Department of Materials Science and Engineering, who is an expert in materials economics and sustainability. Other MIT faculty currently involved include Fikile Brushett, Betar Gallant, Ahmed Ghoniem, William Green, Jeffrey Grossman, Ju Li, Yuriy Román-Leshkov, Yang Shao-Horn, Robert Stoner, Yogesh Surendranath, Timothy Swager, and Kripa Varanasi.

    “The team we brought together has the expertise needed to tackle these challenges, including electrochemistry — using electricity to decarbonize these chemical processes — and materials science and engineering, process design and scale-up, technoeconomic analysis, and system integration, which is all needed for this to go out from our labs to the field,” says Yildiz.

    Selected from a field of more than 100 proposals, their Center for Electrification and Decarbonization of Industry (CEDI) will be the first such institute worldwide dedicated to testing and scaling the most innovative and promising technologies in sustainable chemicals and materials. CEDI will work to facilitate rapid translation of lab discoveries into affordable, scalable industry solutions, with potential to offset as much as 15 percent of greenhouse gas emissions. The team estimates that some CEDI projects already underway could be commercialized within three years.

    “The real timeline is as soon as possible,” says Chiang.

    To achieve CEDI’s ambitious goals, a physical location is key, one staffed with permanent faculty as well as undergraduates, graduate students, and postdocs. Yildiz says the center’s success will depend on engaging student researchers to carry the work forward and address the biggest ongoing challenges to decarbonizing industry.

    “We are training young scientists, students, on the learned urgency of the problem,” says Yildiz. “We empower them with the skills needed, and even if an individual project does not find the implementation in the field right away, at least, we would have trained the next generation that will continue to go after them in the field.”

    Chiang’s background in electrochemistry showed him how the efficiency of cement production could benefit from adopting clean electricity sources, and Yildiz’s work on ethylene, the source of plastic and one of industry’s most valued chemicals, has revealed overlooked cost benefits to switching to electrochemical processes with less expensive starting materials. With industry partners, they hope to continue these lines of fundamental research along with Allanore, who is focused on electrifying steel production, and Manthiram, who is developing new processes for ammonia. Olivetti will focus on understanding risks and barriers to implementation. This multilateral approach aims to speed up the timeline to industry adoption of new technologies at the scale needed for global impact.

    “One of the points of emphasis in this whole center is going to be applying technoeconomic analysis of what it takes to be successful at a technical and economic level, as early in the process as possible,” says Chiang.

    The impact of large-scale industry adoption of clean energy sources in these four key areas that CEDI plans to target first would be profound, as these sectors are currently responsible for 7.5 billion tons of emissions annually. There is the potential for even greater impact on emissions as new knowledge is applied to other industrial products beyond the initial four targets of steel, cement, ammonia, and ethylene. Meanwhile, the center will stand as a hub to attract new industry, government stakeholders, and research partners to collaborate on urgently needed solutions, both newly arising and long overdue.
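    The emissions figures above can be cross-checked with quick arithmetic. The roughly 50-billion-ton figure for total annual global greenhouse gas emissions is an outside assumption used only for this illustration, not a number from the article:

```python
# Rough cross-check of the emissions figures quoted above.
# Assumption (not from the article): total global greenhouse gas
# emissions are roughly 50 billion tons CO2-equivalent per year.
GLOBAL_EMISSIONS_GT = 50.0
SECTOR_EMISSIONS_GT = 7.5   # steel, cement, ammonia, ethylene (from the article)

share = SECTOR_EMISSIONS_GT / GLOBAL_EMISSIONS_GT
print(f"Four-sector share of global emissions: {share:.0%}")
```

    Under that assumption, the four target sectors account for about 15 percent of global emissions, consistent with the offset potential cited earlier.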

    When Chiang and Yildiz first met to discuss ideas for MIT Climate Grand Challenges, they decided they wanted to build a climate research center that functioned unlike any other to help pivot large industry toward decarbonization. Beyond considering how new solutions will impact industry’s bottom line, CEDI will also investigate unique synergies that could arise from the electrification of industry, like processes that would create new byproducts that could be the feedstock to other industry processes, reducing waste and increasing efficiencies in the larger system. And because industry is so good at scaling, those added benefits would be widespread, finally replacing century-old technologies with critical updates designed to improve production and markedly reduce industry’s carbon footprint sooner rather than later.

    “Everything we do, we’re going to try to do with urgency,” Chiang says. “The fundamental research will be done with urgency, and the transition to commercialization, we’re going to do with urgency.”

  • A new heat engine with no moving parts is as efficient as a steam turbine

    Engineers at MIT and the National Renewable Energy Laboratory (NREL) have designed a heat engine with no moving parts. Their new demonstrations show that it converts heat to electricity with over 40 percent efficiency — a performance better than that of traditional steam turbines.

    The heat engine is a thermophotovoltaic (TPV) cell, similar to a solar panel’s photovoltaic cells, that passively captures high-energy photons from a white-hot heat source and converts them into electricity. The team’s design can generate electricity from a heat source of between 1,900 and 2,400 degrees Celsius, or up to about 4,300 degrees Fahrenheit.

    The researchers plan to incorporate the TPV cell into a grid-scale thermal battery. The system would absorb excess energy from renewable sources such as the sun and store that energy in heavily insulated banks of hot graphite. When the energy is needed, such as on overcast days, TPV cells would convert the heat into electricity, and dispatch the energy to a power grid.
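    The thermal-battery concept above can be summarized with a toy energy balance. Only the roughly 40 percent TPV conversion efficiency comes from this article; the heater and storage-loss figures are placeholder assumptions for the sketch:

```python
# Toy round-trip energy balance for the proposed thermal battery.
# Only the ~40% TPV efficiency comes from the article; the other
# figures are illustrative assumptions.
surplus_in_mwh = 100.0   # excess renewable electricity to be stored
heater_eff = 0.99        # resistive heating is near-lossless (assumption)
storage_loss = 0.05      # heat leaked from the insulated graphite (assumption)
tpv_eff = 0.40           # demonstrated TPV conversion efficiency

heat_stored = surplus_in_mwh * heater_eff
heat_available = heat_stored * (1 - storage_loss)
electricity_out = heat_available * tpv_eff

round_trip = electricity_out / surplus_in_mwh
print(f"Electricity recovered: {electricity_out:.1f} MWh "
      f"(round-trip efficiency {round_trip:.0%})")
```

    Under these assumptions, a bit under 40 percent of the stored electricity comes back out; the TPV step dominates the losses, which is why its efficiency is the key metric.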

    With the new TPV cell, the team has now successfully demonstrated the main parts of the system in separate, small-scale experiments. They are working to integrate the parts to demonstrate a fully operational system. From there, they hope to scale up the system to replace fossil-fuel-driven power plants and enable a fully decarbonized power grid, supplied entirely by renewable energy.

    “Thermophotovoltaic cells were the last key step toward demonstrating that thermal batteries are a viable concept,” says Asegun Henry, the Robert N. Noyce Career Development Professor in MIT’s Department of Mechanical Engineering. “This is an absolutely critical step on the path to proliferate renewable energy and get to a fully decarbonized grid.”

    Henry and his collaborators have published their results today in the journal Nature. Co-authors at MIT include Alina LaPotin, Kevin Schulte, Kyle Buznitsky, Colin Kelsall, Andrew Rohskopf, and Evelyn Wang, the Ford Professor of Engineering and head of the Department of Mechanical Engineering, along with collaborators at NREL in Golden, Colorado.

    Jumping the gap

    More than 90 percent of the world’s electricity comes from sources of heat such as coal, natural gas, nuclear energy, and concentrated solar energy. For a century, steam turbines have been the industrial standard for converting such heat sources into electricity.

    On average, steam turbines reliably convert about 35 percent of a heat source into electricity, and the most efficient heat engines to date reach about 60 percent. But the machinery depends on moving parts that are temperature-limited. Heat sources hotter than 2,000 degrees Celsius, such as Henry’s proposed thermal battery system, would be too hot for turbines.

    In recent years, scientists have looked into solid-state alternatives — heat engines with no moving parts that could potentially work efficiently at higher temperatures.

    “One of the advantages of solid-state energy converters is that they can operate at higher temperatures with lower maintenance costs because they have no moving parts,” Henry says. “They just sit there and reliably generate electricity.”

    Thermophotovoltaic cells offered one exploratory route toward solid-state heat engines. Much like solar cells, TPV cells could be made from semiconducting materials with a particular bandgap — the gap between a material’s valence band and its conduction band. If a photon with a high enough energy is absorbed by the material, it can kick an electron across the bandgap, where the electron can then conduct, and thereby generate electricity — doing so without moving rotors or blades.
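    A back-of-envelope sketch makes the photon-versus-bandgap picture concrete. Using Wien’s displacement law to find the blackbody emission peak shows that even a white-hot source emits most of its photons near or below a typical TPV bandgap (the 1.2 eV gap here is an illustrative value, not one from the paper), which is why recovering sub-bandgap light matters so much:

```python
# Peak photon energy of a white-hot emitter vs. a semiconductor bandgap.
# Physical constants are CODATA values; the 1.2 eV bandgap is illustrative.
H = 6.62607015e-34       # Planck constant, J*s
C = 2.99792458e8         # speed of light, m/s
WIEN_B = 2.897771955e-3  # Wien displacement constant, m*K
EV = 1.602176634e-19     # joules per electron-volt

def peak_photon_energy_ev(temp_c: float) -> float:
    """Photon energy (eV) at the blackbody emission peak of a source at temp_c."""
    temp_k = temp_c + 273.15
    peak_wavelength = WIEN_B / temp_k   # Wien's displacement law
    return H * C / peak_wavelength / EV

bandgap_ev = 1.2  # illustrative TPV-cell bandgap
for t in (1900, 2400):
    e = peak_photon_energy_ev(t)
    side = "above" if e > bandgap_ev else "below"
    print(f"{t} C source: peak photons ~{e:.2f} eV ({side} a {bandgap_ev} eV gap)")
```

    At 1,900 to 2,400 degrees Celsius the spectral peak sits around 0.9 to 1.1 eV, so only the high-energy tail of the spectrum clears a gap like this; everything else must be recycled rather than absorbed as waste heat.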

    To date, most TPV cells have only reached efficiencies of around 20 percent, with the record at 32 percent, as they have been made of relatively low-bandgap materials that convert lower-temperature, low-energy photons, and therefore convert energy less efficiently.

    Catching light

    In their new TPV design, Henry and his colleagues looked to capture higher-energy photons from a higher-temperature heat source, thereby converting energy more efficiently. The team’s new cell does so with higher-bandgap materials and multiple junctions, or material layers, compared with existing TPV designs.

    The cell is fabricated from three main regions: a high-bandgap alloy, which sits over a slightly lower-bandgap alloy, underneath which is a mirror-like layer of gold. The first layer captures a heat source’s highest-energy photons and converts them into electricity, while lower-energy photons that pass through the first layer are captured by the second and converted to add to the generated voltage. Any photons that pass through this second layer are then reflected by the mirror, back to the heat source, rather than being absorbed as wasted heat.
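    A rough spectral accounting illustrates how the three regions divide up the incoming light. The 1.4 eV / 1.2 eV tandem bandgaps below are illustrative assumptions for the sketch, not values reported in the article:

```python
import numpy as np

# What fraction of a 2,400 C blackbody's radiant power lands in each
# layer of the tandem cell? The 1.4 eV / 1.2 eV bandgaps are assumptions.
H, C, KB = 6.62607015e-34, 2.99792458e8, 1.380649e-23
EV = 1.602176634e-19
T = 2400 + 273.15  # source temperature, K

def planck(lam):
    """Blackbody spectral exitance (W/m^2 per m of wavelength) at temperature T."""
    return (2 * np.pi * H * C**2 / lam**5) / np.expm1(H * C / (lam * KB * T))

lam = np.linspace(0.2e-6, 20e-6, 200_000)  # wavelength grid, m
dlam = lam[1] - lam[0]
power = planck(lam)
total = power.sum() * dlam

lam_top = H * C / (1.4 * EV)  # cutoff for the high-bandgap top junction
lam_bot = H * C / (1.2 * EV)  # cutoff for the lower-bandgap second junction

frac_top = power[lam < lam_top].sum() * dlam / total
frac_mid = power[(lam >= lam_top) & (lam < lam_bot)].sum() * dlam / total
frac_back = 1 - frac_top - frac_mid  # sub-bandgap light the gold mirror returns

print(f"top junction: {frac_top:.0%}, second junction: {frac_mid:.0%}, "
      f"reflected back to the source: {frac_back:.0%}")
```

    Under these assumptions, well over half of the emitted power is sub-bandgap light, which is why a near-perfect back mirror that returns it to reheat the source, rather than absorbing it as waste, is central to the cell’s high efficiency.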

    The team tested the cell’s efficiency by placing it over a heat flux sensor — a device that directly measures the heat absorbed by the cell. They exposed the cell to a high-temperature lamp and concentrated the light onto the cell. They then varied the bulb’s intensity, or temperature, and observed how the cell’s power efficiency — the amount of power it produced, compared with the heat it absorbed — changed with temperature. Over a range of 1,900 to 2,400 degrees Celsius, the new TPV cell maintained an efficiency of around 40 percent.
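    In other words, the reported figure is electrical power out divided by net heat absorbed, where the denominator excludes the light the mirror sends back. A toy calculation with invented numbers, not the paper’s measurements:

```python
# Efficiency as measured: electrical power out over heat actually absorbed.
# All numbers here are invented for illustration.
incident_w = 10.0    # radiant power hitting the cell
reflected_w = 6.5    # sub-bandgap light returned by the gold mirror
electrical_w = 1.4   # power delivered by the two junctions

heat_absorbed = incident_w - reflected_w
efficiency = electrical_w / heat_absorbed
print(f"TPV efficiency = {electrical_w} / {heat_absorbed:.1f} = {efficiency:.0%}")
```

    This is why a highly reflective back layer raises the measured efficiency: every reflected watt leaves the denominator rather than counting as absorbed-but-wasted heat.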

    “We can get a high efficiency over a broad range of temperatures relevant for thermal batteries,” Henry says.

    The cell in the experiments is about a square centimeter. For a grid-scale thermal battery system, Henry envisions the TPV cells would have to scale up to about 10,000 square feet (about a quarter of a football field), and would operate in climate-controlled warehouses to draw power from huge banks of stored solar energy. He points out that an infrastructure exists for making large-scale photovoltaic cells, which could also be adapted to manufacture TPVs.

    “There’s definitely a huge net positive here in terms of sustainability,” Henry says. “The technology is safe, environmentally benign in its life cycle, and can have a tremendous impact on abating carbon dioxide emissions from electricity production.”

    This research was supported, in part, by the U.S. Department of Energy.

  • Computing our climate future

    On Monday, MIT announced five multiyear flagship projects in the first-ever Climate Grand Challenges, a new initiative to tackle complex climate problems and deliver breakthrough solutions to the world as quickly as possible. This article is the first in a five-part series highlighting the most promising concepts to emerge from the competition, and the interdisciplinary research teams behind them.

    With improvements to computer processing power and an increased understanding of the physical equations governing the Earth’s climate, scientists are continually working to refine climate models and improve their predictive power. But the tools they’re refining were originally conceived decades ago with only scientists in mind. When it comes to developing tangible climate action plans, these models remain inscrutable to the policymakers, public safety officials, civil engineers, and community organizers who need their predictive insight most.

    “What you end up having is a gap between what’s typically used in practice, and the real cutting-edge science,” says Noelle Selin, a professor in the Institute for Data, Systems and Society and the Department of Earth, Atmospheric and Planetary Sciences (EAPS), and co-lead with Professor Raffaele Ferrari on the MIT Climate Grand Challenges flagship project “Bringing Computation to the Climate Crisis.” “How can we use new computational techniques, new understandings, new ways of thinking about modeling, to really bridge that gap between state-of-the-art scientific advances and modeling, and people who are actually needing to use these models?”

    Using this as a driving question, the team won’t just be trying to refine current climate models; they’re building a new one from the ground up.

    This kind of game-changing advancement is exactly what the MIT Climate Grand Challenges is looking for, which is why the proposal has been named one of the five flagship projects in the ambitious Institute-wide program aimed at tackling the climate crisis. The proposal, which was selected from 100 submissions and was among 27 finalists, will receive additional funding and support to further the team’s goal of reimagining the climate modeling system. It also brings together contributors from across the Institute, including the MIT Schwarzman College of Computing, the School of Engineering, and the Sloan School of Management.

    When it comes to pursuing high-impact climate solutions that communities around the world can use, “it’s great to do it at MIT,” says Ferrari, EAPS Cecil and Ida Green Professor of Oceanography. “You’re not going to find many places in the world where you have the cutting-edge climate science, the cutting-edge computer science, and the cutting-edge policy science experts that we need to work together.”

    The climate model of the future

    The proposal builds on work that Ferrari began three years ago as part of a joint project with Caltech, the Naval Postgraduate School, and NASA’s Jet Propulsion Lab. Called the Climate Modeling Alliance (CliMA), the consortium of scientists, engineers, and applied mathematicians is constructing a climate model capable of more accurately projecting future changes in critical variables, such as clouds in the atmosphere and turbulence in the ocean, with uncertainties at least half the size of those in existing models.

    To do this, however, requires a new approach. For one thing, current models are too coarse in resolution — at the 100-to-200-kilometer scale — to resolve small-scale processes like cloud cover, rainfall, and sea ice extent. But also, explains Ferrari, part of this limitation in resolution is due to the fundamental architecture of the models themselves. The languages most global climate models are coded in were first created back in the 1960s and ’70s, largely by scientists for scientists. Since then, advances in computing driven by the corporate world and computer gaming have given rise to dynamic new computer languages, powerful graphics processing units, and machine learning.

    For climate models to take full advantage of these advancements, there’s only one option: starting over with a modern, more flexible language. Written in Julia, part of the MIT Julia Lab’s scientific machine learning technology, and spearheaded by Alan Edelman, a professor of applied mathematics in MIT’s Department of Mathematics, CliMA will be able to harness far more data than the current models can handle.

    “It’s been real fun finally working with people in computer science here at MIT,” Ferrari says. “Before it was impossible, because traditional climate models are in a language their students can’t even read.”

    The result is what’s being called the “Earth digital twin,” a climate model that can simulate global conditions on a large scale. This on its own is an impressive feat, but the team wants to take this a step further with their proposal.

    “We want to take this large-scale model and create what we call an ‘emulator’ that is only predicting a set of variables of interest, but it’s been trained on the large-scale model,” Ferrari explains. Emulators are not new technology, but what is new is that these emulators, being referred to as the “Earth digital cousins,” will take advantage of machine learning.

    “Now we know how to train a model if we have enough data to train them on,” says Ferrari. Machine learning for projects like this has only become possible in recent years as more observational data become available, along with improved computer processing power. The goal is to create smaller, more localized models by training them using the Earth digital twin. Doing so will save time and money, which is key if the digital cousins are going to be usable for stakeholders, like local governments and private-sector developers.
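    The emulator idea described above can be sketched in a few lines: run the expensive model at a modest number of points, fit a cheap surrogate to those runs, then answer new queries from the surrogate. The quadratic `expensive_model` below is a hypothetical stand-in for a costly simulation, not anything from CliMA:

```python
import numpy as np

# Minimal sketch of training an emulator on an expensive model.
# "expensive_model" is a hypothetical stand-in for a full simulation run.
def expensive_model(x):
    """Pretend costly model mapping a forcing x to a climate response."""
    return 1.5 * x**2 - 0.3 * x + 2.0

# 1) Run the expensive model at a modest number of training points.
x_train = np.linspace(-1, 1, 20)
y_train = expensive_model(x_train)

# 2) Train the emulator (here: a least-squares polynomial fit).
coeffs = np.polyfit(x_train, y_train, deg=2)

# 3) The emulator now answers new queries at negligible cost.
x_query = 0.37
approx = np.polyval(coeffs, x_query)
exact = expensive_model(x_query)
print(f"emulator: {approx:.4f}, expensive model: {exact:.4f}")
```

    Real digital cousins would replace the polynomial with a machine-learning model trained on high-resolution digital-twin output, but the workflow, expensive runs up front and cheap localized queries afterward, is the same.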

    Adaptable predictions for average stakeholders

    When it comes to setting climate-informed policy, stakeholders need to understand the probability of an outcome within their own regions — in the same way that you would prepare for a hike differently if there’s a 10 percent chance of rain versus a 90 percent chance. The smaller Earth digital cousin models will be able to do things the larger model can’t do, like simulate local regions in real time and provide a wider range of probabilistic scenarios.

    “Right now, if you wanted to use output from a global climate model, you usually would have to use output that’s designed for general use,” says Selin, who is also the director of the MIT Technology and Policy Program. With the project, the team can take end-user needs into account from the very beginning while also incorporating their feedback and suggestions into the models, helping to “democratize the idea of running these climate models,” as she puts it. Doing so means building an interactive interface that eventually will give users the ability to change input values and run the new simulations in real time. The team hopes that, eventually, the Earth digital cousins could run on something as ubiquitous as a smartphone, although developments like that are currently beyond the scope of the project.

    The next thing the team will work on is building connections with stakeholders. Through participation of other MIT groups, such as the Joint Program on the Science and Policy of Global Change and the Climate and Sustainability Consortium, they hope to work closely with policymakers, public safety officials, and urban planners to give them predictive tools tailored to their needs that can provide actionable outputs important for planning. Faced with rising sea levels, for example, coastal cities could better visualize the threat and make informed decisions about infrastructure development and disaster preparedness; communities in drought-prone regions could develop long-term civil planning with an emphasis on water conservation and wildfire resistance.

    “We want to make the modeling and analysis process faster so people can get more direct and useful feedback for near-term decisions,” she says.

    The final piece of the challenge is to incentivize students now so that they can join the project and make a difference. Ferrari has already had luck garnering student interest after co-teaching a class with Edelman and seeing the enthusiasm students have about computer science and climate solutions.

    “We’re intending in this project to build a climate model of the future,” says Selin. “So it seems really appropriate that we would also train the builders of that climate model.”

  • MIT announces five flagship projects in first-ever Climate Grand Challenges competition

    MIT today announced the five flagship projects selected in its first-ever Climate Grand Challenges competition. These multiyear projects will define a dynamic research agenda focused on unraveling some of the toughest unsolved climate problems and bringing high-impact, science-based solutions to the world on an accelerated basis.

    Representing the most promising concepts to emerge from the two-year competition, the five flagship projects will receive additional funding and resources from MIT and others to develop their ideas and swiftly transform them into practical solutions at scale.

    “Climate Grand Challenges represents a whole-of-MIT drive to develop game-changing advances to confront the escalating climate crisis, in time to make a difference,” says MIT President L. Rafael Reif. “We are inspired by the creativity and boldness of the flagship ideas and by their potential to make a significant contribution to the global climate response. But given the planet-wide scale of the challenge, success depends on partnership. We are eager to work with visionary leaders in every sector to accelerate this impact-oriented research, implement serious solutions at scale, and inspire others to join us in confronting this urgent challenge for humankind.”

    Brief descriptions of the five Climate Grand Challenges flagship projects are provided below.

    Bringing Computation to the Climate Challenge

    This project leverages advances in artificial intelligence, machine learning, and data sciences to improve the accuracy of climate models and make them more useful to a variety of stakeholders — from communities to industry. The team is developing a digital twin of the Earth that harnesses more data than ever before to reduce and quantify uncertainties in climate projections.

    Research leads: Raffaele Ferrari, the Cecil and Ida Green Professor of Oceanography in the Department of Earth, Atmospheric and Planetary Sciences, and director of the Program in Atmospheres, Oceans, and Climate; and Noelle Eckley Selin, director of the Technology and Policy Program and professor with a joint appointment in the Institute for Data, Systems, and Society and the Department of Earth, Atmospheric and Planetary Sciences

    Center for Electrification and Decarbonization of Industry

    This project seeks to reinvent and electrify the processes and materials behind hard-to-decarbonize industries like steel, cement, ammonia, and ethylene production. A new innovation hub will perform targeted fundamental research and engineering with urgency, pushing the technological envelope on electricity-driven chemical transformations.

    Research leads: Yet-Ming Chiang, the Kyocera Professor of Materials Science and Engineering, and Bilge Yıldız, the Breene M. Kerr Professor in the Department of Nuclear Science and Engineering and professor in the Department of Materials Science and Engineering

    Preparing for a new world of weather and climate extremes

    This project addresses key gaps in knowledge about intensifying extreme events such as floods, hurricanes, and heat waves, and quantifies their long-term risk in a changing climate. The team is developing a scalable climate-change adaptation toolkit to help vulnerable communities and low-carbon energy providers prepare for these extreme weather events.

    Research leads: Kerry Emanuel, the Cecil and Ida Green Professor of Atmospheric Science in the Department of Earth, Atmospheric and Planetary Sciences and co-director of the MIT Lorenz Center; Miho Mazereeuw, associate professor of architecture and urbanism in the Department of Architecture and director of the Urban Risk Lab; and Paul O’Gorman, professor in the Program in Atmospheres, Oceans, and Climate in the Department of Earth, Atmospheric and Planetary Sciences

    The Climate Resilience Early Warning System

    The CREWSnet project seeks to reinvent climate change adaptation with a novel forecasting system that empowers underserved communities to interpret local climate risk, proactively plan for their futures incorporating resilience strategies, and minimize losses. CREWSnet will initially be demonstrated in southwestern Bangladesh, serving as a model for similarly threatened regions around the world.

    Research leads: John Aldridge, assistant leader of the Humanitarian Assistance and Disaster Relief Systems Group at MIT Lincoln Laboratory, and Elfatih Eltahir, the H.M. King Bhumibol Professor of Hydrology and Climate in the Department of Civil and Environmental Engineering

    Revolutionizing agriculture with low-emissions, resilient crops

    This project works to revolutionize the agricultural sector with climate-resilient crops and fertilizers that have the ability to dramatically reduce greenhouse gas emissions from food production.

    Research lead: Christopher Voigt, the Daniel I.C. Wang Professor in the Department of Biological Engineering

    “As one of the world’s leading institutions of research and innovation, it is incumbent upon MIT to draw on our depth of knowledge, ingenuity, and ambition to tackle the hard climate problems now confronting the world,” says Richard Lester, MIT associate provost for international activities. “Together with collaborators across industry, finance, community, and government, the Climate Grand Challenges teams are looking to develop and implement high-impact, path-breaking climate solutions rapidly and at a grand scale.”

    The initial call for ideas in 2020 yielded nearly 100 letters of interest from almost 400 faculty members and senior researchers, representing 90 percent of MIT departments. After an extensive evaluation, 27 finalist teams received a total of $2.7 million to develop comprehensive research and innovation plans. The projects address four broad research themes.

    To select the winning projects, research plans were reviewed by panels of international experts representing relevant scientific and technical domains as well as experts in processes and policies for innovation and scalability.

    “In response to climate change, the world really needs to do two things quickly: deploy the solutions we already have much more widely, and develop new solutions that are urgently needed to tackle this intensifying threat,” says Maria Zuber, MIT vice president for research. “These five flagship projects exemplify MIT’s strong determination to bring its knowledge and expertise to bear in generating new ideas and solutions that will help solve the climate problem.”

    “The Climate Grand Challenges flagship projects set a new standard for inclusive climate solutions that can be adapted and implemented across the globe,” says MIT Chancellor Melissa Nobles. “This competition propels the entire MIT research community — faculty, students, postdocs, and staff — to act with urgency around a worsening climate crisis, and I look forward to seeing the difference these projects can make.”

    “MIT’s efforts on climate research amid the climate crisis was a primary reason that I chose to attend MIT, and remains a reason that I view the Institute favorably. MIT has a clear opportunity to be a thought leader in the climate space in our own MIT way, which is why CGC fits in so well,” says senior Megan Xu, who served on the Climate Grand Challenges student committee and is studying ways to make the food system more sustainable.

    The Climate Grand Challenges competition is a key initiative of “Fast Forward: MIT’s Climate Action Plan for the Decade,” which the Institute published in May 2021. Fast Forward outlines MIT’s comprehensive plan for helping the world address the climate crisis. It consists of five broad areas of action: sparking innovation, educating future generations, informing and leveraging government action, reducing MIT’s own climate impact, and uniting and coordinating all of MIT’s climate efforts.

  • New England renewables + Canadian hydropower

    The urgent need to cut carbon emissions has prompted a growing number of U.S. states to commit to achieving 100 percent clean electricity by 2040 or 2050. But figuring out how to meet those commitments and still have a reliable and affordable power system is a challenge. Wind and solar installations will form the backbone of a carbon-free power system, but what technologies can meet electricity demand when those intermittent renewable sources are not adequate?

    In general, the options being discussed include nuclear power, natural gas with carbon capture and storage (CCS), and energy storage technologies such as new and improved batteries and chemical storage in the form of hydrogen. But in the northeastern United States, there is one more possibility being proposed: electricity imported from hydropower plants in the neighboring Canadian province of Quebec.

    The proposition makes sense. Those plants can produce as much electricity as about 40 large nuclear power plants, and some power generated in Quebec already comes to the Northeast. So, there could be abundant additional supply to fill any shortfall when New England’s intermittent renewables underproduce. However, U.S. wind and solar investors view Canadian hydropower as a competitor and argue that reliance on foreign supply discourages further U.S. investment.

    Two years ago, three researchers affiliated with the MIT Center for Energy and Environmental Policy Research (CEEPR) — Emil Dimanchev SM ’18, now a PhD candidate at the Norwegian University of Science and Technology; Joshua Hodge, CEEPR’s executive director; and John Parsons, a senior lecturer in the MIT Sloan School of Management — began wondering whether viewing Canadian hydro as another source of electricity might be too narrow. “Hydropower is a more-than-hundred-year-old technology, and plants are already built up north,” says Dimanchev. “We might not need to build something new. We might just need to use those plants differently or to a greater extent.”

    So the researchers decided to examine the potential role and economic value of Quebec’s hydropower resource in a future low-carbon system in New England. Their goal was to help inform policymakers, utility decision-makers, and others about how best to incorporate Canadian hydropower into their plans and to determine how much time and money New England should spend to integrate more hydropower into its system. What they found out was surprising, even to them.

    The analytical methods

    To explore possible roles for Canadian hydropower to play in New England’s power system, the MIT researchers first needed to predict how the regional power system might look in 2050 — both the resources in place and how they would be operated, given any policy constraints. To perform that analysis, they used GenX, a modeling tool originally developed by Jesse Jenkins SM ’14, PhD ’18 and Nestor Sepulveda SM ’16, PhD ’20 while they were researchers at the MIT Energy Initiative (MITEI).

    The GenX model is designed to support decision-making related to power system investment and real-time operation and to examine the impacts of possible policy initiatives on those decisions. Given information on current and future technologies — different kinds of power plants, energy storage technologies, and so on — GenX calculates the combination of equipment and operating conditions that can meet a defined future demand at the lowest cost. The GenX modeling tool can also incorporate specified policy constraints, such as limits on carbon emissions.
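    The core of such a least-cost calculation can be illustrated in a few lines. The sketch below picks the cheapest mix of wind and dispatchable gas for a three-hour toy horizon; every number in it (costs, demand, wind profile) is an invented assumption, and it stands in for what GenX does over thousands of hours, dozens of technologies, and explicit policy constraints.

```python
# Toy least-cost capacity planning in the spirit of GenX (illustrative only:
# the real model optimizes thousands of hours, many technologies, storage,
# transmission, and policy limits). All numbers below are invented assumptions.
demand       = [60, 100, 80]      # MW of load in three sample hours
wind_profile = [1.0, 0.3, 0.6]    # per-MW wind output in each hour
WIND_FIXED   = 50                 # $/MW of wind installed (assumed)
GAS_FIXED    = 30                 # $/MW of gas installed (assumed)
GAS_VARIABLE = 40                 # $/MWh of gas generated (assumed)

def system_cost(wind_cap):
    """Total cost if we build `wind_cap` MW of wind and fill gaps with gas."""
    residual = [max(0.0, d - wind_cap * w) for d, w in zip(demand, wind_profile)]
    gas_cap = max(residual)       # gas capacity must cover the worst hour
    return (WIND_FIXED * wind_cap + GAS_FIXED * gas_cap
            + GAS_VARIABLE * sum(residual))

# Search a coarse grid of wind builds for the cheapest overall mix.
best_wind = min(range(0, 401), key=system_cost)
best_cost = system_cost(best_wind)
# best_wind -> 60 MW of wind, with gas filling the residual (best_cost -> 10500.0)
```

    A real capacity-expansion model replaces this grid search with a linear program and adds storage, ramping limits, and an emissions cap, but the objective is the same: the lowest total fixed-plus-variable cost that still meets demand in every hour.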

    For their study, Dimanchev, Hodge, and Parsons set parameters in the GenX model using data and assumptions derived from a variety of sources to build a representation of the interconnected power systems in New England, New York, and Quebec. (They included New York to account for that state’s existing demand on the Canadian hydro resources.) For data on the available hydropower, they turned to Hydro-Québec, the public utility that owns and operates most of the hydropower plants in Quebec.

    It’s standard in such analyses to include real-world engineering constraints on equipment, such as how quickly certain power plants can be ramped up and down. With help from Hydro-Québec, the researchers also put hour-to-hour operating constraints on the hydropower resource.

    Most of Hydro-Québec’s plants are “reservoir hydropower” systems. In them, when power isn’t needed, the flow on a river is restrained by a dam downstream of a reservoir, and the reservoir fills up. When power is needed, the dam is opened, and the water in the reservoir runs through downstream pipes, turning turbines and generating electricity. Proper management of such a system requires adhering to certain operating constraints. For example, to prevent flooding, reservoirs must not be allowed to overfill — especially prior to spring snowmelt. And generation can’t be increased too quickly because a sudden flood of water could erode the river edges or disrupt fishing or water quality.

    Based on projections from the National Renewable Energy Laboratory and elsewhere, the researchers specified electricity demand for every hour of the year 2050, and the model calculated the cost-optimal mix of technologies and system operating regime that would satisfy that hourly demand, including the dispatch of the Hydro-Québec hydropower system. In addition, the model determined how electricity would be traded among New England, New York, and Quebec.

    Effects of decarbonization limits on technology mix and electricity trading

    To examine the impact of the emissions-reduction mandates in the New England states, the researchers ran the model assuming reductions in carbon emissions between 80 percent and 100 percent relative to 1990 levels. The results of those runs show that, as emissions limits get more stringent, New England uses more wind and solar and extends the lifetime of its existing nuclear plants. To balance the intermittency of the renewables, the region uses natural gas plants, demand-side management, battery storage (modeled as lithium-ion batteries), and trading with Quebec’s hydropower-based system. Meanwhile, the optimal mix in Quebec is mostly composed of existing hydro generation. Some solar is added, but new reservoirs are built only if renewable costs are assumed to be very high.

    The most significant — and perhaps surprising — outcome is that in all the scenarios, the hydropower-based system of Quebec is not only an exporter but also an importer of electricity, with the direction of flow on the Quebec-New England transmission lines changing over time.

    Historically, energy has always flowed from Quebec to New England. The model results for 2018 show electricity flowing from north to south, with the quantity capped by the current transmission capacity limit of 2,225 megawatts (MW).

    An analysis for 2050, assuming that New England decarbonizes 90 percent and the capacity of the transmission lines remains the same, finds electricity flows going both ways. Flows from north to south still dominate. But for nearly 3,500 of the 8,760 hours of the year, electricity flows in the opposite direction — from New England to Quebec. And for more than 2,200 of those hours, the flow going north is at the maximum the transmission lines can carry.

    The direction of flow is motivated by economics. When renewable generation is abundant in New England, prices are low, and it’s cheaper for Quebec to import electricity from New England and conserve water in its reservoirs. Conversely, when New England’s renewables are scarce and prices are high, New England imports hydro-generated electricity from Quebec.

    So rather than delivering electricity, Canadian hydro provides a means of storing the electricity generated by the intermittent renewables in New England.

    “We see this in our modeling because when we tell the model to meet electricity demand using these resources, the model decides that it is cost-optimal to use the reservoirs to store energy rather than anything else,” says Dimanchev. “We should be sending the energy back and forth, so the reservoirs in Quebec are in essence a battery that we use to store some of the electricity produced by our intermittent renewables and discharge it when we need it.”
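    The reservoir-as-battery behavior can be sketched hour by hour. In the toy simulation below, only the 2,225 MW transmission limit comes from the article; the hourly generation, demand, and reservoir figures are invented assumptions. The rule charges the reservoir (power flows north) when New England renewables exceed demand and discharges it (power flows south) when they fall short.

```python
# Toy hour-by-hour sketch of the "reservoir as battery" behavior described
# above. Only TX_CAP is from the article; all other numbers are assumptions.
TX_CAP = 2225            # MW, current Quebec-New England transmission limit
reservoir = 10_000       # MWh of energy stored behind the dams (assumed)

ne_renewables = [3000, 1500, 4000, 800]    # MW of NE wind+solar each hour (assumed)
ne_demand     = [2500, 2500, 2500, 2500]   # MW of NE load each hour (assumed)

flows = []   # positive: Quebec -> New England; negative: New England -> Quebec
for gen, load in zip(ne_renewables, ne_demand):
    if gen >= load:
        # Renewable surplus: cheap NE power flows north, Quebec holds back water.
        flow = -min(gen - load, TX_CAP)
        reservoir += -flow                 # conserved water ~ stored energy
    else:
        # Renewable deficit: Quebec releases water and exports to NE.
        flow = min(load - gen, TX_CAP, reservoir)
        reservoir -= flow
    flows.append(flow)
# flows -> [-500, 1000, -1500, 1700]: the line reverses direction hour to hour.
```

    In the study’s model this dispatch emerges from cost optimization rather than a greedy rule, but the qualitative pattern is the same: north-to-south flow in scarce hours, south-to-north flow in surplus hours.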

    Given that outcome, the researchers decided to explore the impact of expanding the transmission capacity between New England and Quebec. Building transmission lines is always contentious, but what would be the impact if it could be done?

    Their model results show that when transmission capacity is increased from 2,225 MW to 6,225 MW, flows in both directions are greater, and in both cases the flow is at the new maximum for more than 1,000 hours.

    Results of the analysis thus confirm that the economic response to expanded transmission capacity is more two-way trading. To continue the battery analogy, more transmission capacity to and from Quebec effectively increases the rate at which the battery can be charged and discharged.

    Effects of two-way trading on the energy mix

    What impact would the advent of two-way trading have on the mix of energy-generating sources in New England and Quebec in 2050?

    Assuming current transmission capacity, in New England, the change from one-way to two-way trading increases both wind and solar power generation and to a lesser extent nuclear; it also decreases the use of natural gas with CCS. The hydro reservoirs in Canada can provide long-duration storage — over weeks, months, and even seasons — so there is less need for natural gas with CCS to cover any gaps in supply. The level of imports is slightly lower, but now there are also exports. Meanwhile, in Quebec, two-way trading reduces solar power generation, and the use of wind disappears. Exports are roughly the same, but now there are imports as well. Thus, two-way trading reallocates renewables from Quebec to New England, where it’s more economical to install and operate solar and wind systems.

    Another analysis examined the impact on the energy mix of assuming two-way trading plus expanded transmission capacity. For New England, greater transmission capacity allows wind, solar, and nuclear to expand further; natural gas with CCS all but disappears; and both imports and exports increase significantly. In Quebec, solar decreases still further, and both exports and imports of electricity increase.

    Those results assume that the New England power system decarbonizes by 99 percent in 2050 relative to 1990 levels. But at 90 percent and even 80 percent decarbonization levels, the model concludes that natural gas capacity decreases with the addition of new transmission relative to the current transmission scenario. Existing plants are retired, and new plants are not built as they are no longer economically justified. Since natural gas plants are the only source of carbon emissions in the 2050 energy system, the researchers conclude that the greater access to hydro reservoirs made possible by expanded transmission would accelerate the decarbonization of the electricity system.

    Effects of transmission changes on costs

    The researchers also explored how two-way trading with expanded transmission capacity would affect costs in New England and Quebec, assuming 99 percent decarbonization in New England. New England’s savings on fixed costs (investments in new equipment) are largely due to a decreased need to invest in more natural gas with CCS, and its savings on variable costs (operating costs) are due to a reduced need to run those plants. Quebec’s savings on fixed costs come from a reduced need to invest in solar generation. The increase in cost — borne by New England — reflects the construction and operation of the increased transmission capacity. The net benefit for the region is substantial.

    Thus, the analysis shows that everyone wins as transmission capacity increases — and the benefit grows as the decarbonization target tightens. At 99 percent decarbonization, the overall New England-Quebec region pays about $21 per megawatt-hour (MWh) of electricity with today’s transmission capacity but only $18/MWh with expanded transmission. Assuming 100 percent reduction in carbon emissions, the region pays $29/MWh with current transmission capacity and only $22/MWh with expanded transmission.
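    Those per-megawatt-hour figures imply that the relative benefit of expanded transmission grows with the decarbonization target; a quick check of the quoted numbers:

```python
# Sanity-checking the quoted system costs ($/MWh, combined NE-Quebec region).
costs = {
    "99% decarbonization":  (21, 18),   # (current transmission, expanded)
    "100% decarbonization": (29, 22),
}
savings = {name: (cur - exp) / cur for name, (cur, exp) in costs.items()}
# Expanded transmission trims costs by about 14% at 99% decarbonization and
# about 24% at 100%: the benefit grows as the target tightens.
```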

    Addressing misconceptions

    These results shed light on several misconceptions that policymakers, supporters of renewable energy, and others tend to have.

    The first misconception is that the New England renewables and Canadian hydropower are competitors. The modeling results instead show that they’re complementary. When the power systems in New England and Quebec work together as an integrated system, the Canadian reservoirs are used part of the time to store the renewable electricity. And with more access to hydropower storage in Quebec, there’s generally more renewable investment in New England.

    The second misconception arises when policymakers refer to Canadian hydro as a “baseload resource,” which implies a dependable source of electricity — particularly one that supplies power all the time. “Our study shows that by viewing Canadian hydropower as a baseload source of electricity — or indeed a source of electricity at all — you’re not taking full advantage of what that resource can provide,” says Dimanchev. “What we show is that Quebec’s reservoir hydro can provide storage, specifically for wind and solar. It’s a solution to the intermittency problem that we foresee in carbon-free power systems for 2050.”

    While the MIT analysis focuses on New England and Quebec, the researchers believe that their results may have wider implications. As power systems in many regions expand production of renewables, the value of storage grows. Some hydropower systems have storage capacity that has not yet been fully utilized and could be a good complement to renewable generation. Taking advantage of that capacity can lower the cost of deep decarbonization and help move some regions toward a decarbonized supply of electricity.

    This research was funded by the MIT Center for Energy and Environmental Policy Research, which is supported in part by a consortium of industry and government associates.

    This article appears in the Autumn 2021 issue of Energy Futures, the magazine of the MIT Energy Initiative.


    Leveraging science and technology against the world’s top problems

    Looking back on nearly a half-century at MIT, Richard K. Lester, associate provost and Japan Steel Industry Professor, sees a “somewhat eccentric professional trajectory.”

    But while his path has been irregular, there has been a clearly defined through line, Lester says: the emergence of new science and new technologies, the potential of these developments to shake up the status quo and address some of society’s most consequential problems, and what the outcomes might mean for America’s place in the world.

    Perhaps no assignment in Lester’s portfolio better captures this theme than the new MIT Climate Grand Challenges competition. Spearheaded by Lester and Maria Zuber, MIT vice president for research, and launched at the height of the pandemic in summer 2020, this initiative is designed to mobilize the entire MIT research community around tackling “the really hard, challenging problems currently standing in the way of an effective global response to the climate emergency,” says Lester. “The focus is on those problems where progress requires developing and applying frontier knowledge in the natural and social sciences and cutting-edge technologies. This is the MIT community swinging for the fences in areas where we have a comparative advantage.”

    This is a passion project for him, not least because it has engaged colleagues from nearly all of MIT’s departments. After nearly 100 initial ideas were submitted by more than 300 faculty, 27 teams were named finalists and received funding to develop comprehensive research and innovation plans in such areas as decarbonizing complex industries; risk forecasting and adaptation; advancing climate equity; and carbon removal, management, and storage. In April, a small subset of this group will become multiyear flagship projects, augmenting the work of existing MIT units that are pursuing climate research. Lester is sunny in the face of these extraordinarily complex problems. “This is a bottom-up effort with exciting proposals, and where the Institute is collectively committed — it’s MIT at its best.”

    Nuclear to the core

    This initiative carries a particular resonance for Lester, who remains deeply engaged in nuclear engineering. “The role of nuclear energy is central and will need to become even more central if we’re to succeed in addressing the climate challenge,” he says. He also acknowledges that for nuclear energy technologies — both fission and fusion — to play a vital role in decarbonizing the economy, they must not just win “in the court of public opinion, but in the marketplace,” he says. “Over the years, my research has sought to elucidate what needs to be done to overcome these obstacles.”

    In fact, Lester has been campaigning for much of his career for a U.S. nuclear innovation agenda, a commitment that takes on increased urgency as the contours of the climate crisis sharpen. He argues for the rapid development and testing of nuclear technologies that can complement the renewable but intermittent energy sources of sun and wind. Whether powerful, large-scale, molten-salt-cooled reactors or small, modular, light water reactors, nuclear batteries or promising new fusion projects, U.S. energy policy must embrace nuclear innovation, says Lester, or risk losing the high-stakes race for a sustainable future.

    Chancing into a discipline

    Lester’s introduction to nuclear science was pure happenstance.

    Born in the English industrial city of Leeds, he grew up in a musical family and played piano, violin, and then viola. “It was a big part of my life,” he says, and for a time, music beckoned as a career. He tumbled into a chemical engineering concentration at Imperial College, London, after taking a job in a chemical factory following high school. “There’s a certain randomness to life, and in my case, it’s reflected in my choice of major, which had a very large impact on my ultimate career.”

    In his second year, Lester talked his way into running a small experiment in the university’s research reactor, on radiation effects in materials. “I got hooked, and began thinking of studying nuclear engineering.” But there were few graduate programs in British universities at the time. Then serendipity struck again. The instructor of Lester’s single humanities course at Imperial had previously taught at MIT, and suggested Lester take a look at the nuclear program there. “I will always be grateful to him (and, indirectly, to MIT’s Humanities program) for opening my eyes to the existence of this institution where I’ve spent my whole adult life,” says Lester.

    He arrived at MIT with the notion of mitigating the harms of nuclear weapons. It was a time when the nuclear arms race “was an existential threat in everyone’s life,” he recalls. He targeted his graduate studies on nuclear proliferation. But he also encountered an electrifying study by MIT meteorologist Jule Charney. “Professor Charney produced one of the first scientific assessments of the effects on climate of increasing CO2 concentrations in the atmosphere, with quantitative estimates that have not fundamentally changed in 40 years.”

    Lester shifted directions. “I came to MIT to work on nuclear security, but stayed in the nuclear field because of the contributions that it can and must make in addressing climate change,” he says.

    Research and policy

    His path forward, Lester believed, would involve applying his science and technology expertise to critical policy problems, grounded in immediate, real-world concerns, and aiming for broad policy impacts. Even as a member of the Department of Nuclear Science and Engineering (NSE), he joined with colleagues from many MIT departments to study American industrial practices and what was required to make them globally competitive, and then founded MIT’s Industrial Performance Center (IPC). His work at the IPC with interdisciplinary teams of faculty and students on the sources of productivity and innovation took him to many countries at different stages of industrialization, including China, Taiwan, Japan, and Brazil.

    Lester’s wide-ranging work yielded books (including the MIT Press bestseller “Made in America”), advisory positions with governments, corporations, and foundations, and unexpected collaborations. “My interests were always fairly broad, and being at MIT made it possible to team up with world-leading scholars and extraordinary students not just in nuclear engineering, but in many other fields such as political science, economics, and management,” he says.

    Forging cross-disciplinary ties and bringing creative people together around a common goal proved a valuable skill as Lester stepped into positions of ever-greater responsibility at the Institute. He didn’t exactly relish the prospect of a desk job, though. “I religiously avoided administrative roles until I felt I couldn’t keep avoiding them,” he says.

    Today, as associate provost, he tends to MIT’s international activities — a daunting task given increasing scrutiny of research universities’ globe-spanning research partnerships and education of foreign students. But even in the midst of these consuming chores, Lester remains devoted to his home department. “Being a nuclear engineer is a central part of my identity,” he says.

    To students entering the nuclear field nearly 50 years after he did, who are understandably “eager to fix everything that seems wrong immediately,” he has a message: “Be patient. The hard things, the ones that are really worth doing, will take a long time to do.” Putting the climate crisis behind us will take two generations, Lester believes. Current students will start the job, but it will also take the efforts of their children’s generation before it is done. “So we need you to be energetic and creative, of course, but whatever you do we also need you to be patient and to have ‘stick-to-itiveness’ — and maybe also a moral compass that our generation has lacked.”


    Q&A: Climate Grand Challenges finalists on using data and science to forecast climate-related risk

    Note: This is the final article in a four-part interview series featuring the work of the 27 MIT Climate Grand Challenges finalist teams, which received a total of $2.7 million in startup funding to advance their projects. This month, the Institute will name a subset of the finalists as multiyear flagship projects.

    Advances in computation, artificial intelligence, robotics, and data science are enabling a new generation of observational tools and scientific modeling with the potential to produce timely, reliable, and quantitative analysis of future climate risks at a local scale. These projections can increase the accuracy and efficacy of early warning systems, improve emergency planning, and provide actionable information for climate mitigation and adaptation efforts, as human actions continue to change planetary conditions.

    In conversations prepared for MIT News, faculty from four Climate Grand Challenges teams with projects in the competition’s “Using data and science to forecast climate-related risk” category describe the promising new technologies that can help scientists understand the Earth’s climate system on a finer scale than ever before. (The other Climate Grand Challenges research themes include building equity and fairness into climate solutions, removing, managing, and storing greenhouse gases, and decarbonizing complex industries and processes.) The following responses have been edited for length and clarity.

    An observational system that can initiate a climate risk forecasting revolution

    Despite recent technological advances and massive volumes of data, climate forecasts remain highly uncertain. Gaps in observational capabilities create substantial challenges to predicting extreme weather events and establishing effective mitigation and adaptation strategies. R. John Hansman, the T. Wilson Professor of Aeronautics and Astronautics and director of the MIT International Center for Air Transportation, discusses the Stratospheric Airborne Climate Observatory System (SACOS) being developed together with Brent Minchew, the Cecil and Ida Green Career Development Professor in the Department of Earth, Atmospheric and Planetary Sciences (EAPS), and a team that includes researchers from MIT Lincoln Laboratory and Harvard University.

    Q: How does SACOS reduce uncertainty in climate risk forecasting?

    A: There is a critical need for higher spatial and temporal resolution observations of the climate system than are currently available through remote (satellite or airborne) and surface (in-situ) sensing. We are developing an ensemble of high-endurance, solar-powered aircraft with instrument systems capable of performing months-long climate observing missions that satellites or aircraft alone cannot fulfill. Summer months are ideal for SACOS operations, as many key climate phenomena are active and short night periods reduce the battery mass, vehicle size, and technical risks. These observations hold the potential to inform and predict, allowing emergency planners, policymakers, and the rest of society to better prepare for the changes to come.

    Q: Describe the types of observing missions where SACOS could provide critical improvements.

    A: The demise of the Antarctic Ice Sheet, which is leading to rising sea levels around the world and threatening the displacement of millions of people, is one example. Current sea level forecasts struggle to account for giant fissures that create massive icebergs and cause the Antarctic Ice Sheet to flow more rapidly into the ocean. SACOS can track these fissures to accurately forecast ice slippage and give impacted populations enough time to prepare or evacuate. Elsewhere, widespread droughts cause rampant wildfires and water shortages. SACOS has the ability to monitor soil moisture and humidity in critically dry regions to identify where and when wildfires and droughts are imminent. SACOS also offers the most effective method to measure, track, and predict local ozone depletion over North America, which has resulted in increasingly severe summer thunderstorms.

    Quantifying and managing the risks of sea-level rise

    Prevailing estimates of sea-level rise range from approximately 20 centimeters to 2 meters by the end of the century, with the associated costs on the order of trillions of dollars. The instability of certain portions of the world’s ice sheets creates vast uncertainties, complicating how the world prepares for and responds to these potential changes. EAPS Professor Brent Minchew is leading another Climate Grand Challenges finalist team working on an integrated, multidisciplinary effort to improve the scientific understanding of sea-level rise and provide actionable information and tools to manage the risks it poses.

    Q: What have been the most significant challenges to understanding the potential rates of sea-level rise?

    A: West Antarctica is one of the most remote, inaccessible, and hostile places on Earth — to people and equipment. Thus, opportunities to observe the collapse of the West Antarctic Ice Sheet, which contains enough ice to raise global sea levels by about 3 meters, are limited and current observations crudely resolved. It is essential that we understand how the floating edge of the ice sheets, often called ice shelves, fracture and collapse because they provide critical forces that govern the rate of ice mass loss and can stabilize the West Antarctic Ice Sheet.

    Q: How will your project advance what is currently known about sea-level rise?

    A: We aim to advance global-scale projections of sea-level rise through novel observational technologies and computational models of ice sheet change and to link those predictions to region- to neighborhood-scale estimates of costs and adaptation strategies. To do this, we propose two novel instruments: a first-of-its-kind drone that can fly for months at a time over Antarctica making continuous observations of critical areas and an airdropped seismometer and GPS bundle that can be deployed to vulnerable and hard-to-reach areas of the ice sheet. This technology will provide greater data quality and density and will observe the ice sheet at frequencies that are currently inaccessible — elements that are essential for understanding the physics governing the evolution of the ice sheet and sea-level rise.

    Changing flood risk for coastal communities in the developing world

    Globally, more than 600 million people live in low-elevation coastal areas that face an increasing risk of flooding from sea-level rise. This includes two-thirds of cities with populations of more than 5 million and regions that conduct the vast majority of global trade. Dara Entekhabi, the Bacardi and Stockholm Water Foundations Professor in the Department of Civil and Environmental Engineering and professor in the Department of Earth, Atmospheric, and Planetary Sciences, outlines an interdisciplinary partnership that leverages data and technology to guide short-term and chart long-term adaptation pathways with Miho Mazereeuw, associate professor of architecture and urbanism and director of the Urban Risk Lab in the School of Architecture and Planning, and Danielle Wood, assistant professor in the Program in Media Arts and Sciences and the Department of Aeronautics and Astronautics.

    Q: What is the key problem this program seeks to address?

    A: The accumulated heating of the Earth system due to fossil fuel burning is largely absorbed by the oceans, and the stored heat expands the ocean volume, raising the base height for tides. When the high tides inundate a city, the condition is referred to as “sunny day” flooding, but the saline waters corrode infrastructure and wreak havoc on daily routines. The danger ahead for many coastal cities in the developing world is the combination of increasing high-tide intrusions and heavy precipitation storm events.

    Q: How will your proposed solutions impact flood risk management?

    A: We are producing detailed risk maps for coastal cities in developing countries using newly available, very high-resolution remote-sensing data from space-borne instruments, as well as historical tides records and regional storm characteristics. Using these datasets, we aim to produce street-by-street risk maps that provide local decision-makers and stakeholders with a way to estimate present and future flood risks. With the model of future tides and probabilistic precipitation events, we can forecast future inundation by a flooding event, decadal changes with various climate-change and sea-level rise projections, and an increase in the likelihood of sunny-day flooding. Working closely with local partners, we will develop toolkits to explore short-term emergency response, as well as long-term mitigation and adaptation techniques in six pilot locations in South and Southeast Asia, Africa, and South America.

    Ocean vital signs

    On average, every person on Earth generates fossil fuel emissions equivalent to an 8-pound bag of carbon every day. Much of this is absorbed by the ocean, but there is wide variability in the estimates of oceanic absorption, which translates into differences of trillions of dollars in the required cost of mitigation. In the Department of Earth, Atmospheric and Planetary Sciences, Christopher Hill, a principal research engineer specializing in Earth and planetary computational science, works with Ryan Woosley, a principal research scientist focusing on the carbon cycle and ocean acidification. Hill explains that they hope to use artificial intelligence and machine learning to help resolve this uncertainty.

    Q: What is the current state of knowledge on air-sea interactions?

    A: Obtaining specific, accurate field measurements of critical physical, chemical, and biological exchanges between the ocean and the planet has historically entailed expensive science missions with large ship-based infrastructure that leave gaps in real-time data about significant ocean climate processes. Recent advances in highly scalable in-situ autonomous observing and navigation combined with airborne, remote sensing, and machine learning innovations have the potential to transform data gathering, provide more accurate information, and address fundamental scientific questions around air-sea interaction.

    Q: How will your approach accelerate real-time, autonomous surface ocean observing from an experimental research endeavor to a permanent and impactful solution?

    A: Our project seeks to demonstrate how a scalable surface ocean observing network can be launched and operated, and to illustrate how this can reduce uncertainties in estimates of air-sea carbon dioxide exchange. With an initial high-impact goal of substantially eliminating the vast uncertainties that plague our understanding of ocean uptake of carbon dioxide, we will gather critical measurements for improving extended weather and climate forecast models and reducing climate impact uncertainty. The results have the potential to more accurately identify trillions of dollars’ worth of economic activity.


    MIT Energy Conference focuses on climate’s toughest challenges

    This year’s MIT Energy Conference, the largest student-led event of its kind, included keynote talks and panels that tackled some of the thorniest remaining challenges in the global effort to cut back on climate-altering emissions. These include the production of construction materials such as steel and cement, and the role of transportation including aviation and shipping. While the challenges are formidable, approaches incorporating methods such as fusion, heat pumps, energy efficiency, and the use of hydrogen hold promise, participants said.

    The two-day conference, held on March 31 and April 1 for more than 900 participants, included keynote lectures, 14 panel discussions, a fireside chat, networking events, and more. The event this year included the final round of the annual MIT Climate and Energy Prize, whose winning team receives $100,000 and other support. The prize, awarded since 2007, has led to the creation of more than 220 companies and $1.1 billion in investments.

    This year’s winner is a project that hopes to provide an innovative, efficient waterless washing machine aimed at the vast majority of the world’s people, who still do laundry by hand.

    “A truly consequential moment in history”

    In his opening keynote address, Fatih Birol, executive director of the International Energy Agency, noted that this year’s conference was taking place during the unprovoked invasion of Ukraine by Russia, a leading gas and oil exporter. As a result, “global oil markets are going through a major turmoil,” he said.

    He said that Russian oil exports are expected to drop by 3 million barrels a day, and that international efforts to release reserves and promote increased production elsewhere will help, but will not suffice. “We have to look to other measures” to make up the shortfall, he said, noting that his agency has produced a 10-point plan of measures to help reduce global demand for oil.

    Europe gets 45 percent of its natural gas from Russia, and the agency also has developed a 10-point plan to help alleviate expected shortages there, including measures to improve energy efficiency in homes and industries, promote renewable heating sources, and postpone retirement of some nuclear plants. But he emphasized that “our goals to reach our climate targets should not be yet another victim of Mr. Putin and his allies.”  Unfortunately, Birol said, “I see that addressing climate change is sliding down in the policy agenda of many governments.”

    But he sees reasons for optimism as well about the feasibility of the global target, agreed to by countries representing 80 percent of the global economy, of reaching net-zero carbon dioxide emissions by 2050. The IEA has developed a roadmap for the entire energy sector to get there, which many governments now use as a benchmark, according to Birol.

    In addition, the trend is already clear, he said. “More than 90 percent of all power plants installed in the world [last year] were renewable energy,” mainly solar and wind. And 10 percent of cars sold worldwide last year, and 20 percent in Europe, were electric cars. “Please remember that in 2019 it was only 2 percent!” he said. He also predicted that “nuclear is going to make a comeback in many countries,” both in terms of large plants and newer small modular reactors.

    Birol said that “I hope that the current crisis gives governments the impetus to address the energy security concerns, to reach our climate goals, and … [to] choose the right direction at this very important turning point.”

    The conference’s second day began with keynote talks by Gina McCarthy, national climate advisor at the White House Office of Domestic Climate Policy, and Maria Zuber, MIT’s vice president for research. In her address, Zuber said, “This conference comes at a truly consequential moment in history — a moment that puts into stark relief the enormous risks created by our current fossil-fuel based energy system — risks we cannot continue to accept.”

    She added that “time is not on our side.” To meet global commitments for limiting climate impacts, the world needs to reduce emissions by about half by 2030, and get to net zero by 2050. “In other words, we need to transform our entire global energy system in a few decades,” she said. She cited MIT’s “Fast Forward” climate action plan, issued last year, as presenting the two tracks that the world needs to pursue simultaneously: going as far as possible, as fast as possible, with the tools that exist now, while also innovating and investing in new ideas, technologies, practices, and institutions that may be needed to reach the net-zero goal.

    On the first track, she said, citing an IEA report, “from here until 2040, we can get most of the emissions reductions we need with technologies that are currently available or on the verge of becoming commercially available.” These include electrifying and boosting efficiency in buildings, industry, and transportation; increasing the portion of electricity coming from emissions-free sources; and investing in new infrastructure such as electric vehicle charging stations.

    But more than that is needed, she pointed out. For example, the amount of methane that leaks away into the atmosphere from fossil fuel operations is equivalent to all the natural gas used in Europe’s power sector, Zuber said. Recovering and selling that methane can dramatically reduce global methane emissions, often at little or no cost.

    For the longer run, “we need track-two solutions to decarbonize tough industries like aviation, shipping, chemicals, concrete, and steel,” and to remove carbon dioxide from the atmosphere. She described some of the promising technologies that are in the pipeline. Fusion, for example, has moved from being a scientific challenge to an engineering problem whose solution seems well underway, she said.

    Another important area is food-related systems, which currently account for a third of all global emissions. For example, fertilizer production uses a very energy-intensive process, but work on plants engineered to fix nitrogen directly could make a significant dent.

    These and several other advanced research areas may not all pan out, but some undoubtedly will, and will help curb climate change as well as create new jobs and reduce pollution.

    Though the problems we face are complex, they are not insurmountable, Zuber said. “We don’t need a miracle. What we need is to move along the two tracks I’ve outlined with determination, ingenuity, and fierce urgency.”

    The promise and challenges of hydrogen

    Other conference speakers took on some of the less-discussed but crucial areas that also need to be addressed in order to limit global warming to 1.5 degrees. Heavy transportation, and aviation in particular, have been considered especially challenging. In his keynote address, Glenn Llewellyn, vice president for zero-emission aircraft at Airbus, outlined several approaches his company is working on to develop competitive midrange alternative airliners by 2035 that use either batteries or fuel cells powered by hydrogen. The early-stage designs demonstrate that, contrary to some projections, there is a realistic pathway to weaning that industry from its present reliance on fossil fuel, chiefly kerosene.

    Hydrogen has real potential as an aviation fuel, he said, either used directly, in fuel cells for power or burned in jet engines for propulsion, or used indirectly as a feedstock for synthetic fuels. Both approaches are being studied by the company, he said, including a hybrid model that uses both hydrogen fuel cells and hydrogen-fueled jet engines. The company projects a range of 2,000 nautical miles for a jet carrying 200 to 300 passengers, he said — all with no direct emissions and no contrails.

    But this vision will not be practical, Llewellyn said, unless economies of scale help to significantly lower the cost of hydrogen production. “Hydrogen is at the hub of aviation decarbonization,” he said. But that kind of price reduction seems quite feasible, he said, given that other major industries are also seriously looking at the use of hydrogen for their own decarbonization plans, including the production of steel and cement.

    Such uses were the subject of a panel discussion entitled “Deploying the Hydrogen Economy.” Hydrogen production technology exists, but not nearly at the scale that’s needed, which is about 500 million tons a year, pointed out moderator Dharik Mallapragada of the MIT Energy Initiative.

    Yet in some applications, the use of hydrogen both reduces emissions and is economically competitive. Preeti Pande of Plug Power said that her company, which produces hydrogen fuel cells, has found a significant market in an unexpected place: forklifts, used in warehouses and factories worldwide. Replacing current battery-powered versions with fuel-cell versions turns out to be a win-win for the companies that use them, saving money while helping to meet decarbonization goals.

    Lindsay Ashby of Avangrid Renewables said that the company has installed fuel-cell buses in Barcelona that run entirely on hydrogen generated by solar panels. The company is also building a 100-megawatt solar facility to produce hydrogen for the production of fertilizer, another major industry in need of decarbonization because of its large emissions footprint. And Brett Perleman of the Center for Houston’s Future said of his city that “we’re already a hydrogen hub today, just not green hydrogen” since the gas is currently mostly produced as a byproduct of fossil fuels. But that is changing rapidly, he said, and Houston, along with several other cities, aims to be a center of activity for hydrogen produced from renewable, non-carbon-emitting sources. They aim to be producing 1,000 tons a day by 2028, “and I think we’ll end up exceeding that,” he said.

    For industries that can switch to renewably generated electricity, that is typically the best choice, Perleman said. “But for those that can’t, hydrogen is a great option,” and that includes aviation, shipping, and rail. “The big oil companies all have plans in place” to develop clean hydrogen production, he said. “It’s not just a dream, but a reality.”

    For shipping, which tends to rely on bunker fuel, a particularly high-emissions fossil fuel, another potential option could be a new generation of small nuclear plants, said Jeff Navin of Terrapower, a company currently developing such units. “Finding replacements for coal, oil, or natural gas for industrial purposes is very hard,” he said, but often what these processes require is consistent high heat, which nuclear can deliver, as long as costs and regulatory issues can be resolved.  

    MIT professor of nuclear engineering Jacopo Buongiorno pointed out that the primary reasons for delays and cost overruns in nuclear plants have had to do with issues at the construction site, many of which could be alleviated by having smaller, factory-built modular plants, or by building multiple units at a time of a standardized design. If the government would take on the nuclear waste disposal, as some other countries have done, then nuclear power could play an important part in the decarbonization of many industries, he said.

    Student-led startups

    The two-day conference concluded with the final round of the annual MIT Climate and Energy Prize, consisting of the five finalist teams presenting brief pitches for their startup company ideas, followed by questions from the panel of judges. This year’s finalists included a team called Muket, dedicated to finding ways of reducing methane emissions from cattle and dairy farms. Feed additives or other measures could cut the emissions by 50 percent, the team estimates.

    A team called Ivu Biologics described a system for incorporating nitrogen-fixing microbes into the coatings of seeds, thereby reducing the need for added fertilizers, whose production is a major greenhouse gas source. The company is making use of seed-coating technology developed at MIT over the last few years. Another team, called Mesophase, also based on MIT-developed technology, aims to replace the condensers used in power plants and other industrial systems with much more efficient versions, thus increasing the energy output from a given amount of fuel or other heat source.

    A team called TerraTrade aims to facilitate the adoption of power purchase agreements by companies, institutions, and governments, acting as a broker that creates and administers such agreements. This makes it easier for even smaller entities to take part in plans that help enable the rapid development of renewable, fossil-fuel-free energy production.

    The grand prize of $100,000 was awarded to a team called Ultropia, which is developing a combined clothes washer and dryer that uses ultrasound instead of water for its cleaning. The system does use a small amount of water, but it can be recycled, making the machines usable even in areas where water is scarce. The devices could have a great impact on the estimated 6 billion people in the world today who still wash clothes by hand, the team says, and because the machines are so efficient, they would require far less energy to run than conventional washers and dryers.