More stories

  • New maps show airplane contrails over the U.S. dropped steeply in 2020

    As Covid-19’s initial wave crested around the world, travel restrictions and a drop in passengers led to a record number of grounded flights in 2020. The air travel reduction cleared the skies of not just jets but also the fluffy white contrails they produce high in the atmosphere.

    MIT engineers have mapped the contrails that were generated over the United States in 2020, and compared the results to prepandemic years. They found that on any given day in 2018, and again in 2019, contrails covered a total area equal to Massachusetts and Connecticut combined. In 2020, this contrail coverage shrank by about 20 percent, mirroring a similar drop in U.S. flights.  

    While 2020’s contrail dip may not be surprising, the findings are proof that the team’s mapping technique works. Their study marks the first time researchers have captured the fine and ephemeral details of contrails over a large continental scale.

    Now, the researchers are applying the technique to predict where in the atmosphere contrails are likely to form. The cloud-like formations are known to play a significant role in aviation-related global warming. The team is working with major airlines to forecast regions in the atmosphere where contrails may form, and to reroute planes around these regions to minimize contrail production.

    “This kind of technology can help divert planes to prevent contrails, in real time,” says Steven Barrett, professor and associate head of MIT’s Department of Aeronautics and Astronautics. “There’s an unusual opportunity to halve aviation’s climate impact by eliminating most of the contrails produced today.”

    Barrett and his colleagues have published their results today in the journal Environmental Research Letters. His co-authors at MIT include graduate student Vincent Meijer, former graduate student Luke Kulik, research scientists Sebastian Eastham, Florian Allroggen, and Raymond Speth, and LIDS Director and professor Sertac Karaman.

    Trail training

    About half of the aviation industry’s contribution to global warming comes directly from planes’ carbon dioxide emissions. The other half is thought to be a consequence of their contrails. The signature white tails are produced when a plane’s hot, humid exhaust mixes with cool humid air high in the atmosphere. Emitted in thin lines, contrails quickly spread out and can act as blankets that trap the Earth’s outgoing heat.

    While a single contrail may not have much of a warming effect, taken together contrails have a significant impact. But the estimates of this effect are uncertain and based on computer modeling as well as limited satellite data. What’s more, traditional computer vision algorithms that analyze contrail data have a hard time discerning the wispy tails from natural clouds.

    To precisely pick out and track contrails over a large scale, the MIT team looked to images taken by NASA’s GOES-16, a geostationary satellite that hovers over the same swath of the Earth, including the United States, taking continuous, high-resolution images.

    The team first obtained about 100 images taken by the satellite and trained a group of people to interpret the remote sensing data and label every pixel in each image as either part of a contrail or not. They used this labeled dataset to train a computer-vision algorithm to distinguish a contrail from a cloud or other image feature.
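
    As a rough illustration of that supervised, per-pixel labeling setup, here is a minimal Python sketch using PyTorch. The tiny network, the random stand-in images and masks, and the training settings are all invented for illustration; they are not the team's actual model, data, or pipeline.

        import torch
        from torch import nn

        # Toy stand-ins for the labeled dataset: single-channel satellite tiles and
        # per-pixel contrail masks (1 = contrail, 0 = not). Shapes are illustrative.
        images = torch.rand(8, 1, 64, 64)
        masks = (torch.rand(8, 1, 64, 64) > 0.95).float()

        # A deliberately small fully convolutional network that outputs one logit per
        # pixel, mirroring the "contrail or not" per-pixel decision.
        model = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 1),
        )
        optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
        loss_fn = nn.BCEWithLogitsLoss()

        for epoch in range(5):  # a handful of passes over the toy batch
            optimizer.zero_grad()
            loss = loss_fn(model(images), masks)
            loss.backward()
            optimizer.step()

        # Per-pixel probabilities -> a binary contrail map for one tile.
        with torch.no_grad():
            contrail_map = torch.sigmoid(model(images[:1])) > 0.5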

    The researchers then ran the algorithm on about 100,000 satellite images, amounting to nearly 6 trillion pixels, each pixel representing an area of about 2 square kilometers. The images covered the contiguous U.S., along with parts of Canada and Mexico, and were taken about every 15 minutes, between Jan. 1, 2018, and Dec. 31, 2020.

    The algorithm automatically classified each pixel as either a contrail or not a contrail, and generated daily maps of contrails over the United States. These maps mirrored the major flight paths of most U.S. airlines, with some notable differences. For instance, contrail “holes” appeared around major airports, which reflects the fact that planes landing and taking off around airports are generally not high enough in the atmosphere for contrails to form.

    “The algorithm knows nothing about where planes fly, and yet when processing the satellite imagery, it resulted in recognizable flight routes,” Barrett says. “That’s one piece of evidence that says this method really does capture contrails over a large scale.”

    Cloudy patterns

    Based on the algorithm’s maps, the researchers calculated the total area covered each day by contrails in the U.S. On an average day in 2018 and in 2019, U.S. contrails took up about 43,000 square kilometers. This coverage dropped by 20 percent in March 2020 as the pandemic set in. From then on, contrails slowly reappeared as air travel resumed through the year.
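
    The area bookkeeping implied by the roughly 2-square-kilometer pixel size is simple. Here is a small Python sketch; the mask array and grid dimensions are made up and stand in for the full contiguous-U.S. scenes.

        import numpy as np

        PIXEL_AREA_KM2 = 2.0  # approximate footprint of one GOES-16 pixel, per the article

        # Hypothetical stack of daily contrail masks (days, rows, cols); True where a
        # pixel was classified as contrail. The grid size here is a toy stand-in.
        rng = np.random.default_rng(0)
        daily_masks = rng.random((365, 100, 150)) > 0.999

        daily_area_km2 = daily_masks.sum(axis=(1, 2)) * PIXEL_AREA_KM2
        print(f"mean daily contrail coverage: {daily_area_km2.mean():,.0f} km^2")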

    The team also observed daily and seasonal patterns. In general, contrails appeared to peak in the morning and decline in the afternoon. This may be a training artifact: As natural cirrus clouds are more likely to form in the afternoon, the algorithm may have trouble discerning contrails amid the clouds later in the day. But it might also be an important indication about when contrails form most. Contrails also peaked in late winter and early spring, when more of the air is naturally colder and more conducive for contrail formation.

    The team has now adapted the technique to predict where contrails are likely to form in real time. Avoiding these regions, Barrett says, could take a significant, almost immediate chunk out of aviation’s global warming contribution.  

    “Most measures to make aviation sustainable take a long time,” Barrett says. “[Contrail avoidance] could be accomplished in a few years, because it requires small changes to how aircraft are flown, with existing airplanes and observational technology. It’s a near-term way of reducing aviation’s warming by about half.”

    The team is now working toward this objective of large-scale contrail avoidance using real-time satellite observations.

    This research was supported in part by NASA and the MIT Environmental Solutions Initiative.

  • Q&A: Climate Grand Challenges finalists on building equity and fairness into climate solutions

    Note: This is the first in a four-part interview series that will highlight the work of the Climate Grand Challenges finalists, ahead of the April announcement of several multiyear, flagship projects.

    The finalists in MIT’s first-ever Climate Grand Challenges competition each received $100,000 to develop bold, interdisciplinary research and innovation plans designed to attack some of the world’s most difficult and unresolved climate problems. The 27 teams are addressing four Grand Challenge problem areas: building equity and fairness into climate solutions; decarbonizing complex industries and processes; removing, managing, and storing greenhouse gases; and using data and science for improved climate risk forecasting.  

    In a conversation prepared for MIT News, faculty from three of the teams in the competition’s “Building equity and fairness into climate solutions” category share their thoughts on the need for inclusive solutions that prioritize disadvantaged and vulnerable populations, and discuss how they are working to accelerate their research to achieve the greatest impact. The following responses have been edited for length and clarity.

    The Equitable Resilience Framework

    Any effort to solve the most complex global climate problems must recognize the unequal burdens borne by different groups, communities, and societies — and should be equitable as well as effective. Janelle Knox-Hayes, associate professor in the Department of Urban Studies and Planning, leads a team that is developing processes and practices for equitable resilience, starting with a local pilot project in Boston over the next five years and extending to other cities and regions of the country. The Equitable Resilience Framework (ERF) is designed to create long-term economic, social, and environmental transformations by increasing the capacity of interconnected systems and communities to respond to a broad range of climate-related events. 

    Q: What is the problem you are trying to solve?

    A: Inequity is one of the severe impacts of climate change and resonates in both mitigation and adaptation efforts. It is important for climate strategies to address challenges of inequity and, if possible, to design strategies that enhance justice, equity, and inclusion, while also enhancing the efficacy of mitigation and adaptation efforts. Our framework offers a blueprint for how communities, cities, and regions can begin to undertake this work.

    Q: What are the most significant barriers that have impacted progress to date?

    A: There is considerable inertia in policymaking. Climate change requires a rethinking not only of directives but also of the pathways and techniques of policymaking. This is an obstacle and part of the reason our project was designed to scale up from local pilot projects. Another consideration is that the private sector can be more adaptive and nimble in its adoption of creative techniques. Working with the MIT Climate and Sustainability Consortium, there may be ways in which we could modify the ERF to help companies address similar internal adaptation and resilience challenges.

    Protecting and enhancing natural carbon sinks

    Deforestation and forest degradation of strategic ecosystems in the Amazon, Central Africa, and Southeast Asia continue to reduce capacity to capture and store carbon through natural systems and threaten even the most aggressive decarbonization plans. John Fernandez, professor in the Department of Architecture and director of the Environmental Solutions Initiative, reflects on his work with Daniela Rus, professor of electrical engineering and computer science and director of the Computer Science and Artificial Intelligence Laboratory, and Joann de Zegher, assistant professor of Operations Management at MIT Sloan, to protect tropical forests by deploying a three-part solution that integrates targeted technology breakthroughs, deep community engagement, and innovative bioeconomic opportunities. 

    Q: Why is the problem you seek to address a “grand challenge”?

    A: We are trying to bring the latest technology to monitoring, assessing, and protecting tropical forests, as well as other carbon-rich and highly biodiverse ecosystems. This is a grand challenge because natural sinks around the world are threatening to release enormous quantities of stored carbon that could lead to runaway global warming. When combined with deep community engagement, particularly with indigenous and afro-descendant communities, this integrated approach promises to deliver substantially enhanced efficacy in conservation coupled to robust and sustainable local development.

    Q: What is known about this problem and what questions remain unanswered?

    A: Satellites, drones, and other technologies are acquiring more data about natural carbon sinks than ever before. The problem is well-described in certain locations such as the eastern Amazon, which has shifted from a net carbon sink to a net carbon emitter. It is also well-known that indigenous peoples are the most effective stewards of the ecosystems that store the greatest amounts of carbon. One of the key questions that remains to be answered is identifying the bioeconomy opportunities inherent in the natural wealth of tropical forests and other important ecosystems, opportunities that are essential to their sustained protection and conservation.

    Reducing group-based disparities in climate adaptation

    Race, ethnicity, caste, religion, and nationality are often linked to vulnerability to the adverse effects of climate change, and if left unchecked, threaten to exacerbate long-standing inequities. A team led by Evan Lieberman, professor of political science and director of the MIT Global Diversity Lab and MIT International Science and Technology Initiatives, Danielle Wood, assistant professor in the Program in Media Arts and Sciences and the Department of Aeronautics and Astronautics, and Siqi Zheng, professor of urban and real estate sustainability in the Center for Real Estate and the Department of Urban Studies and Planning, is seeking to reduce ethnic and racial group-based disparities in the capacity of urban communities to adapt to the changing climate. Working with partners in nine coastal cities, they will measure the distribution of climate-related burdens and resiliency through satellites, a custom mobile app, and natural language processing of social media, to help design and test communication campaigns that provide accurate information about risks and remediation to impacted groups.

    Q: How has this problem evolved?

    A: Group-based disparities continue to intensify within and across countries, owing in part to some randomness in the location of adverse climate events, as well as deep legacies of unequal human development. In turn, economically and politically privileged groups routinely hoard resources for adaptation. In a few cases — notably the United States, Brazil, and with respect to climate-related migrancy, in South Asia — there has been a great deal of research documenting the extent of such disparities. However, we lack common metrics, and for the most part, such disparities are only understood where key actors have politicized the underlying problems. In much of the world, relatively vulnerable and excluded groups may not even be fully aware of the nature of the challenges they face or the resources they require.

    Q: Who will benefit most from your research? 

    A: The greatest beneficiaries will be members of those vulnerable groups who lack the resources and infrastructure to withstand adverse climate shocks. We believe that it will be important to develop solutions such that relatively privileged groups do not perceive them as punitive or zero-sum, but rather as long-term solutions for collective benefit that are both sound and just.

  • Study reveals chemical link between wildfire smoke and ozone depletion

    The Australian wildfires in 2019 and 2020 were historic for how far and fast they spread, and for how long and powerfully they burned. All told, the devastating “Black Summer” fires blazed across more than 43 million acres of land, and killed or displaced nearly 3 billion animals. The fires also injected over 1 million tons of smoke particles into the atmosphere, reaching up to 35 kilometers above Earth’s surface — a mass and reach comparable to that of an erupting volcano.

    Now, atmospheric chemists at MIT have found that the smoke from those fires set off chemical reactions in the stratosphere that contributed to the destruction of ozone, which shields the Earth from incoming ultraviolet radiation. The team’s study, appearing this week in the Proceedings of the National Academy of Sciences, is the first to establish a chemical link between wildfire smoke and ozone depletion.

    In March 2020, shortly after the fires subsided, the team observed a sharp drop in nitrogen dioxide in the stratosphere, which is the first step in a chemical cascade that is known to end in ozone depletion. The researchers found that this drop in nitrogen dioxide directly correlates with the amount of smoke that the fires released into the stratosphere. They estimate that this smoke-induced chemistry depleted the column of ozone by 1 percent.

    To put this in context, they note that the phaseout of ozone-depleting gases under a worldwide agreement to stop their production has led to about a 1 percent ozone recovery from earlier ozone decreases over the past 10 years — meaning that the wildfires canceled those hard-won diplomatic gains for a short period. If future wildfires grow stronger and more frequent, as they are predicted to do with climate change, ozone’s projected recovery could be delayed by years. 

    “The Australian fires look like the biggest event so far, but as the world continues to warm, there is every reason to think these fires will become more frequent and more intense,” says lead author Susan Solomon, the Lee and Geraldine Martin Professor of Environmental Studies at MIT. “It’s another wakeup call, just as the Antarctic ozone hole was, in the sense of showing how bad things could actually be.”

    The study’s co-authors include Kane Stone, a research scientist in MIT’s Department of Earth, Atmospheric, and Planetary Sciences, along with collaborators at multiple institutions including the University of Saskatchewan, Jinan University, the National Center for Atmospheric Research, and the University of Colorado at Boulder.

    Chemical trace

    Massive wildfires are known to generate pyrocumulonimbus — towering clouds of smoke that can reach into the stratosphere, the layer of the atmosphere that lies between about 15 and 50 kilometers above the Earth’s surface. The smoke from Australia’s wildfires reached well into the stratosphere, as high as 35 kilometers.

    In 2021, Solomon’s co-author, Pengfei Yu at Jinan University, carried out a separate study of the fires’ impacts and found that the accumulated smoke warmed parts of the stratosphere by as much as 2 degrees Celsius — a warming that persisted for six months. The study also found hints of ozone destruction in the Southern Hemisphere following the fires.

    Solomon wondered whether smoke from the fires could have depleted ozone through a chemistry similar to volcanic aerosols. Major volcanic eruptions can also reach into the stratosphere, and in 1989, Solomon discovered that the particles in these eruptions can destroy ozone through a series of chemical reactions. As the particles form in the atmosphere, they gather moisture on their surfaces. Once wet, the particles can react with circulating chemicals in the stratosphere, including dinitrogen pentoxide, which reacts with the particles to form nitric acid.

    Normally, dinitrogen pentoxide is broken down by sunlight into various nitrogen species, including nitrogen dioxide, a compound that binds with chlorine-containing chemicals in the stratosphere. When volcanic aerosols convert dinitrogen pentoxide into nitric acid, nitrogen dioxide drops, and the chlorine compounds take another path, morphing into chlorine monoxide, the main human-made agent that destroys ozone.

    “This chemistry, once you get past that point, is well-established,” Solomon says. “Once you have less nitrogen dioxide, you have to have more chlorine monoxide, and that will deplete ozone.”

    Cloud injection

    In the new study, Solomon and her colleagues looked at how concentrations of nitrogen dioxide in the stratosphere changed following the Australian fires. If these concentrations dropped significantly, it would signal that wildfire smoke depletes ozone through the same chemical reactions as some volcanic eruptions.

    The team looked to observations of nitrogen dioxide taken by three independent satellites that have surveyed the Southern Hemisphere for varying lengths of time. They compared each satellite’s record in the months and years leading up to and following the Australian fires. All three records showed a significant drop in nitrogen dioxide in March 2020. For one satellite’s record, the drop represented a record low among observations spanning the last 20 years.
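
    The comparison logic behind that check can be sketched in a few lines of Python: for each record, compare March 2020 against a baseline of earlier Marches. The series below are random placeholders, not the actual satellite data, and the statistics are illustrative only.

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical monthly-mean stratospheric NO2 for March of each year, for
        # three satellite records (arbitrary units); real records differ in length.
        years = np.arange(2005, 2021)
        records = {name: rng.normal(10.0, 0.3, size=years.size)
                   for name in ("sat_A", "sat_B", "sat_C")}

        for name, march_means in records.items():
            baseline = march_means[years < 2020]          # pre-fire Marches
            value_2020 = march_means[years == 2020][0]    # March 2020
            z = (value_2020 - baseline.mean()) / baseline.std(ddof=1)
            is_record_low = value_2020 < baseline.min()
            print(f"{name}: March-2020 anomaly = {z:+.1f} sigma, record low: {is_record_low}")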

    To check that the nitrogen dioxide decrease was a direct chemical effect of the fires’ smoke, the researchers carried out atmospheric simulations using a global, three-dimensional model that simulates hundreds of chemical reactions in the atmosphere, from the surface on up through the stratosphere.

    The team injected a cloud of smoke particles into the model, simulating what was observed from the Australian wildfires. They assumed that the particles, like volcanic aerosols, gathered moisture. They then ran the model multiple times and compared the results to simulations without the smoke cloud.

    In every simulation incorporating wildfire smoke, the team found that as the amount of smoke particles increased in the stratosphere, concentrations of nitrogen dioxide decreased, matching the observations of the three satellites.

    “The behavior we saw, of more and more aerosols, and less and less nitrogen dioxide, in both the model and the data, is a fantastic fingerprint,” Solomon says. “It’s the first time that science has established a chemical mechanism linking wildfire smoke to ozone depletion. It may only be one chemical mechanism among several, but it’s clearly there. It tells us these particles are wet and they had to have caused some ozone depletion.”

    She and her collaborators are looking into other reactions triggered by wildfire smoke that might further contribute to stripping ozone. For the time being, the major driver of ozone depletion remains chlorofluorocarbons, or CFCs — chemicals such as old refrigerants that have been banned under the Montreal Protocol, though they continue to linger in the stratosphere. But as global warming leads to stronger, more frequent wildfires, their smoke could have a serious, lasting impact on ozone.

    “Wildfire smoke is a toxic brew of organic compounds that are complex beasts,” Solomon says. “And I’m afraid ozone is getting pummeled by a whole series of reactions that we are now furiously working to unravel.”

    This research was supported in part by the National Science Foundation and NASA.

  • First-ever Climate Grand Challenges recognizes 27 finalists

    All-carbon buildings, climate-resilient crops, and new tools to improve the prediction of extreme weather events are just a few of the 27 bold, interdisciplinary research projects selected as finalists from a field of almost 100 proposals in the first MIT Climate Grand Challenges competition. Each of the finalist teams received $100,000 to develop a comprehensive research and innovation plan.

    A subset of the finalists will make up a portfolio of multiyear projects that will receive additional funding and other support to develop high-impact, science-based mitigation and adaptation solutions on an accelerated basis. These flagship projects, which will be announced later this spring, will augment the work of the many MIT units already pursuing climate-related research activities.

    “Climate change poses a suite of challenges of immense urgency, complexity and scale. At MIT, we are bringing our particular strengths to bear through our community — a rare concentration of ingenuity and determination, rooted in a vibrant innovation ecosystem,” President L. Rafael Reif says. “Through MIT’s Climate Grand Challenges, we are engaging hundreds of our brilliant faculty and researchers in the search for solutions with enormous potential for impact.”

    The Climate Grand Challenges launched in July 2020 with the goal of mobilizing the entire MIT research community around developing solutions to some of the most complex unsolved problems in emissions reduction, climate change adaptation and resilience, risk forecasting, carbon removal, and understanding the human impacts of climate change.

    An event in April will showcase the flagship projects, bringing together public and private sector partners with the MIT teams to begin assembling the necessary resources for developing, implementing, and scaling these solutions rapidly.

    A whole-of-MIT effort

    Part of a wide array of major climate programs outlined last year in “Fast Forward: MIT’s Climate Action Plan for the Decade,” the Climate Grand Challenges focuses on problems where progress depends on the application of forefront knowledge in the physical, life, and social sciences and the advancement of cutting-edge technologies.

    “We don’t have the luxury of time in responding to the intensifying climate crisis,” says Vice President for Research Maria Zuber, who oversees the implementation of MIT’s climate action plan. “The Climate Grand Challenges are about marshaling the wide and deep knowledge and methods of the MIT community around transformative research that can help accelerate our collective response to climate change.”

    If successful, the solutions will have tangible effects, changing the way people live and work. Examples of these new approaches range from developing cost-competitive long-term energy-storage systems to using drone technologies and artificial intelligence to study the role of the deep ocean in the climate crisis. Many projects also aim to increase the humanistic understanding of these phenomena, recognizing that technological advances alone will not address the widespread impacts of climate change, and a comparable behavioral and cultural shift is needed to stave off future threats.

    “To achieve net-zero emissions later this century we must deploy the tools and technologies we already have,” says Richard Lester, associate provost for international activities. “But we’re still far from having everything needed to get there in ways that are equitable and affordable. Nor do we have the solutions in hand that will allow communities — especially the most vulnerable ones — to adapt to the disruptions that will occur even if the world does get to net-zero. Climate Grand Challenges is creating a new opportunity for the MIT research community to attack some of these hard, unsolved problems, and to engage with partners in industry, government, and the nonprofit sector to accelerate the whole cycle of activities needed to implement solutions at scale.” 

    Selecting the finalist projects

    A 24-person faculty committee convened by Lester and Zuber with members from all five of MIT’s schools and the MIT Schwarzman College of Computing led the planning and initial call for ideas. A smaller group of committee members was charged with evaluating nearly 100 letters of interest, representing 90 percent of MIT departments and involving almost 400 MIT faculty members and senior researchers as well as colleagues from other research institutions.

    “Effectively confronting the climate emergency requires risk taking and sustained investment over a period of many decades,” says Anantha Chandrakasan, dean of the School of Engineering. “We have a responsibility to use our incredible resources and expertise to tackle some of the most challenging problems in climate mitigation and adaptation, and the opportunity to make major advances globally.”

    Lester and Zuber charged a second faculty committee with organizing a rigorous and thorough evaluation of the plans developed by the 27 finalist teams. Drawing on an extensive review process involving international panels of prominent experts, MIT will announce a small group of flagship Grand Challenge projects in April. 

    Each of the 27 finalist teams is addressing one of four broad Grand Challenge problems:

    Building equity and fairness into climate solutions

    Policy innovation and experimentation for effective and equitable climate solutions, led by Abhijit Banerjee, Iqbal Dhaliwal, and Claire Walsh
    Protecting and enhancing natural carbon sinks – Natural Climate and Community Solutions (NCCS), led by John Fernandez, Daniela Rus, and Joann de Zegher
    Reducing group-based disparities in climate adaptation, led by Evan Lieberman, Danielle Wood, and Siqi Zheng
    Reinventing climate change adaptation – The Climate Resilience Early Warning System (CREWSnet), led by John Aldridge and Elfatih Eltahir
    The Deep Listening Project: Communication infrastructure for collaborative adaptation, led by Eric Gordon, Yihyun Lim, and James Paradis
    The Equitable Resilience Framework, led by Janelle Knox-Hayes

    Decarbonizing complex industries and processes

    Carbon >Building, led by Mark Goulthorpe
    Center for Electrification and Decarbonization of Industry, led by Yet-Ming Chiang and Bilge Yildiz
    Decarbonizing and strengthening the global energy infrastructure using nuclear batteries, led by Jacopo Buongiorno
    Emissions reduction through innovation in the textile industry, led by Yuly Fuentes-Medel and Greg Rutledge
    Rapid decarbonization of freight mobility, led by Yossi Sheffi and Matthias Winkenbach
    Revolutionizing agriculture with low-emissions, resilient crops, led by Christopher Voigt
    Solar fuels as a vector for climate change mitigation, led by Yuriy Román-Leshkov and Yogesh Surendranath
    The MIT Low-Carbon Co-Design Institute, led by Audun Botterud, Dharik Mallapragada, and Robert Stoner
    Tough to Decarbonize Transportation, led by Steven Barrett and William Green

    Removing, managing, and storing greenhouse gases

    Demonstrating safe, globally distributed geological CO2 storage at scale, led by Bradford Hager, Howard Herzog, and Ruben Juanes
    Deploying versatile carbon capture technologies and storage at scale, led by Betar Gallant, Bradford Hager, and T. Alan Hatton
    Directed Evolution of Biological Carbon Fixation Working Group at MIT (DEBC-MIT), led by Edward Boyden and Matthew Shoulders
    Managing sources and sinks of carbon in terrestrial and coastal ecosystems, led by Charles Harvey, Tami Lieberman, and Heidi Nepf
    Strategies to Reduce Atmospheric Methane, led by Desiree Plata
    The Advanced Carbon Mineralization Initiative, led by Edward Boyden, Matěj Peč, and Yogesh Surendranath

    Using data and science to forecast climate-related risk

    Bringing computation to the climate challenge, led by Noelle Eckley Selin and Raffaele Ferrari
    Ocean vital signs, led by Christopher Hill and Ryan Woosley
    Preparing for a new world of weather and climate extremes, led by Kerry Emanuel, Miho Mazereeuw, and Paul O’Gorman
    Quantifying and managing the risks of sea-level rise, led by Brent Minchew
    Stratospheric Airborne Climate Observatory System to initiate a climate risk forecasting revolution, led by R. John Hansman and Brent Minchew
    The future of coasts – Changing flood risk for coastal communities in the developing world, led by Dara Entekhabi, Miho Mazereeuw, and Danielle Wood

    To learn more about the MIT Climate Grand Challenges, visit climategrandchallenges.mit.edu.

  • Overcoming a bottleneck in carbon dioxide conversion

    If researchers could find a way to chemically convert carbon dioxide into fuels or other products, they might make a major dent in greenhouse gas emissions. But many such processes that have seemed promising in the lab haven’t performed as expected in scaled-up formats that would be suitable for use with a power plant or other emissions sources.

    Now, researchers at MIT have identified, quantified, and modeled a major reason for poor performance in such conversion systems. The culprit turns out to be a local depletion of the carbon dioxide gas right next to the electrodes being used to catalyze the conversion. The problem can be alleviated, the team found, by simply pulsing the current off and on at specific intervals, allowing time for the gas to build back up to the needed levels next to the electrode.

    The findings, which could spur progress on developing a variety of materials and designs for electrochemical carbon dioxide conversion systems, were published today in the journal Langmuir, in a paper by MIT postdoc Álvaro Moreno Soto, graduate student Jack Lake, and professor of mechanical engineering Kripa Varanasi.

    “Carbon dioxide mitigation is, I think, one of the important challenges of our time,” Varanasi says. While much of the research in the area has focused on carbon capture and sequestration, in which the gas is pumped into some kind of deep underground reservoir or converted to an inert solid such as limestone, another promising avenue has been converting the gas into other carbon compounds such as methane or ethanol, to be used as fuel, or ethylene, which serves as a precursor to useful polymers.

    There are several ways to do such conversions, including electrochemical, thermocatalytic, photothermal, or photochemical processes. “Each of these has problems or challenges,” Varanasi says. The thermal processes require very high temperature, and they don’t produce very high-value chemical products, which is a challenge with the light-activated processes as well, he says. “Efficiency is always at play, always an issue.”

    The team has focused on the electrochemical approaches, with a goal of getting “higher-C products” — compounds that contain more carbon atoms and tend to be higher-value fuels because of their energy per weight or volume. In these reactions, the biggest challenge has been curbing competing reactions that can take place at the same time, especially the splitting of water molecules into oxygen and hydrogen.

    The reactions take place as a stream of liquid electrolyte with the carbon dioxide dissolved in it passes over a metal catalytic surface that is electrically charged. But as the carbon dioxide gets converted, it leaves behind a region in the electrolyte stream where it has essentially been used up, and so the reaction within this depleted zone turns toward water splitting instead. This unwanted reaction uses up energy and greatly reduces the overall efficiency of the conversion process, the researchers found.

    “There’s a number of groups working on this, and a number of catalysts that are out there,” Varanasi says. “In all of these, I think the hydrogen co-evolution becomes a bottleneck.”

    This depletion, they found, can be counteracted with a pulsed system — a cycle of simply turning off the voltage, stopping the reaction, and giving the carbon dioxide time to spread back into the depleted zone and reach usable levels again, before resuming the reaction.

    Often, the researchers say, groups have found promising catalyst materials but haven’t run their lab tests long enough to observe these depletion effects, and thus have been frustrated in trying to scale up their systems. Furthermore, the concentration of carbon dioxide next to the catalyst dictates the products that are made. Hence, depletion can also change the mix of products that are produced and can make the process unreliable. “If you want to be able to make a system that works at industrial scale, you need to be able to run things over a long period of time,” Varanasi says, “and you need to not have these kinds of effects that reduce the efficiency or reliability of the process.”

    The team studied three different catalyst materials, including copper, and “we really focused on making sure that we understood and can quantify the depletion effects,” Lake says. In the process they were able to develop a simple and reliable way of monitoring the efficiency of the conversion process as it happens, by measuring the changing pH levels, a measure of acidity, in the system’s electrolyte.

    In their tests, they used more sophisticated analytical tools to characterize reaction products, including gas chromatography for analysis of the gaseous products, and nuclear magnetic resonance characterization for the system’s liquid products. But their analysis showed that the simple pH measurement of the electrolyte next to the electrode during operation could provide a sufficient measure of the efficiency of the reaction as it progressed.

    This ability to easily monitor the reaction in real-time could ultimately lead to a system optimized by machine-learning methods, controlling the production rate of the desired compounds through continuous feedback, Moreno Soto says.
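
    To make the pulsed-operation and pH-feedback idea concrete, here is a toy Python control loop: run the reaction until a near-electrode depletion signal crosses a threshold, pause so diffusion can replenish the CO2, then resume. All rates, thresholds, and the pH proxy are invented for illustration and are not the team's actual controller or measurements.

        # Normalized CO2 concentration next to the electrode; depletes while the
        # voltage is on, recovers by diffusion while it is off.
        co2_local = 1.0
        DEPLETION_RATE = 0.08    # per step while the voltage is on
        RECOVERY_RATE = 0.15     # per step while the voltage is off
        DEPLETION_LIMIT = 0.4    # proxy threshold: local pH drifts alkaline as CO2 runs out

        voltage_on = True
        for step in range(60):
            if voltage_on:
                co2_local = max(0.0, co2_local - DEPLETION_RATE)
                if co2_local < DEPLETION_LIMIT:
                    voltage_on = False      # pause: stop the reaction, avoid water splitting
            else:
                co2_local = min(1.0, co2_local + RECOVERY_RATE)
                if co2_local > 0.9:
                    voltage_on = True       # replenished: resume conversion
            print(f"step {step:2d}  CO2 {co2_local:4.2f}  {'ON' if voltage_on else 'off'}")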

    Now that the process is understood and quantified, other approaches to mitigating the carbon dioxide depletion might be developed, the researchers say, and could easily be tested using their methods.

    This work shows, Lake says, that “no matter what your catalyst material is” in such an electrocatalytic system, “you’ll be affected by this problem.” And now, by using the model they developed, it’s possible to determine exactly what kind of time window needs to be evaluated to get an accurate sense of the material’s overall efficiency and what kind of system operations could maximize its effectiveness.

    The research was supported by Shell, through the MIT Energy Initiative.

  • Climate modeling confirms historical records showing rise in hurricane activity

    When forecasting how storms may change in the future, it helps to know something about their past. Judging from historical records dating back to the 1850s, hurricanes in the North Atlantic have become more frequent over the last 150 years.

    However, scientists have questioned whether this upward trend is a reflection of reality, or simply an artifact of lopsided record-keeping. If 19th-century storm trackers had access to 21st-century technology, would they have recorded more storms? This inherent uncertainty has kept scientists from relying on storm records, and the patterns within them, for clues to how climate influences storms.

    A new MIT study published today in Nature Communications has used climate modeling, rather than storm records, to reconstruct the history of hurricanes and tropical cyclones around the world. The study finds that North Atlantic hurricanes have indeed increased in frequency over the last 150 years, similar to what historical records have shown.

    In particular, major hurricanes, and hurricanes in general, are more frequent today than in the past. And those that make landfall appear to have grown more powerful, carrying more destructive potential.

    Curiously, while the North Atlantic has seen an overall increase in storm activity, the same trend was not observed in the rest of the world. The study found that the frequency of tropical cyclones globally has not changed significantly in the last 150 years.

    “The evidence does point, as the original historical record did, to long-term increases in North Atlantic hurricane activity, but no significant changes in global hurricane activity,” says study author Kerry Emanuel, the Cecil and Ida Green Professor of Atmospheric Science in MIT’s Department of Earth, Atmospheric, and Planetary Sciences. “It certainly will change the interpretation of climate’s effects on hurricanes — that it’s really the regionality of the climate, and that something happened to the North Atlantic that’s different from the rest of the globe. It may have been caused by global warming, which is not necessarily globally uniform.”

    Chance encounters

    The most comprehensive record of tropical cyclones is compiled in a database known as the International Best Track Archive for Climate Stewardship (IBTrACS). This historical record includes modern measurements from satellites and aircraft that date back to the 1940s. The database’s older records are based on reports from ships and islands that happened to be in a storm’s path. These earlier records date back to 1851, and overall the database shows an increase in North Atlantic storm activity over the last 150 years.

    “Nobody disagrees that that’s what the historical record shows,” Emanuel says. “On the other hand, most sensible people don’t really trust the historical record that far back in time.”

    Recently, scientists have used a statistical approach to identify storms that the historical record may have missed. To do so, they consulted all the digitally reconstructed shipping routes in the Atlantic over the last 150 years and mapped these routes over modern-day hurricane tracks. They then estimated the chance that a ship would encounter or entirely miss a hurricane’s presence. This analysis found a significant number of early storms were likely missed in the historical record. Accounting for these missed storms, they concluded that there was a chance that storm activity had not changed over the last 150 years.
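
    The core of that statistical approach is a geometric overlap question: would any ship route have passed close enough to a storm track to observe it? The Python sketch below uses random points as placeholders for digitized shipping lanes and hurricane tracks, and a crude distance threshold in degrees; it illustrates the detection-probability idea only, not the published method.

        import numpy as np

        rng = np.random.default_rng(1)

        # Hypothetical geometry: ship-route points and storm-track points as
        # (lon, lat) samples in a small box.
        ship_points = rng.uniform(-80, -40, size=(2000, 2))
        storm_tracks = [rng.uniform(-80, -40, size=(30, 2)) for _ in range(200)]

        ENCOUNTER_RADIUS_DEG = 1.0  # how close a ship must pass to "see" the storm

        def storm_detected(track):
            # A storm counts as detected if any ship point lies near any track point.
            d = np.linalg.norm(ship_points[:, None, :] - track[None, :, :], axis=-1)
            return bool((d < ENCOUNTER_RADIUS_DEG).any())

        detected = sum(storm_detected(t) for t in storm_tracks)
        print(f"estimated fraction of storms a ship would have encountered: "
              f"{detected / len(storm_tracks):.2f}")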

    But Emanuel points out that hurricane paths in the 19th century may have looked different from today’s tracks. What’s more, the scientists may have missed key shipping routes in their analysis, as older routes have not yet been digitized.

    “All we know is, if there had been a change [in storm activity], it would not have been detectable using digitized ship records,” Emanuel says. “So I thought, there’s an opportunity to do better, by not using historical data at all.”

    Seeding storms

    Instead, he estimated past hurricane activity using dynamical downscaling — a technique that his group developed and has applied over the last 15 years to study climate’s effect on hurricanes. The technique starts with a coarse global climate simulation and embeds within this model a finer-resolution model that simulates features as small as hurricanes. The combined models are then fed with real-world measurements of atmospheric and ocean conditions. Emanuel then scatters the realistic simulation with hurricane “seeds” and runs the simulation forward in time to see which seeds bloom into full-blown storms.
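
    As a cartoon of the seeding step only: scatter many weak "proto-storm" seeds across a basin and let a simple environmental rule decide which ones grow. The real downscaling embeds a full hurricane model in reanalysis winds and thermodynamics; the fields, coefficients, and threshold below are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(2)

        N_SEEDS = 10_000
        potential_intensity = rng.uniform(20, 90, size=N_SEEDS)   # m/s, set by the "environment"
        wind_shear = rng.uniform(0, 20, size=N_SEEDS)              # m/s, hostile when large

        # A seed "blooms" into a storm only if the environment supports intensification.
        survives = (potential_intensity - 2.5 * wind_shear) > 35
        print(f"{survives.sum()} of {N_SEEDS} seeds became storms "
              f"({100 * survives.mean():.1f}% survival rate)")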

    For the new study, Emanuel embedded a hurricane model into a climate “reanalysis” — a type of climate model that combines observations from the past with climate simulations to generate accurate reconstructions of past weather patterns and climate conditions. He used a particular subset of climate reanalyses that only accounts for observations collected from the surface — for instance from ships, which have recorded weather conditions and sea surface temperatures consistently since the 1850s, as opposed to from satellites, which only began systematic monitoring in the 1970s.

    “We chose to use this approach to avoid any artificial trends brought about by the introduction of progressively different observations,” Emanuel explains.

    He ran an embedded hurricane model on three different climate reanalyses, simulating tropical cyclones around the world over the past 150 years. Across all three models, he observed “unequivocal increases” in North Atlantic hurricane activity.

    “There’s been this quite large increase in activity in the Atlantic since the mid-19th century, which I didn’t expect to see,” Emanuel says.

    Within this overall rise in storm activity, he also observed a “hurricane drought” — a period during the 1970s and 80s when the number of yearly hurricanes momentarily dropped. This pause in storm activity can also be seen in historical records, and Emanuel’s group proposes a cause: sulfate aerosols, which were byproducts of fossil fuel combustion, likely set off a cascade of climate effects that cooled the North Atlantic and temporarily suppressed hurricane formation.

    “The general trend over the last 150 years was increasing storm activity, interrupted by this hurricane drought,” Emanuel notes. “And at this point, we’re more confident of why there was a hurricane drought than why there is an ongoing, long-term increase in activity that began in the 19th century. That is still a mystery, and it bears on the question of how global warming might affect future Atlantic hurricanes.”

    This research was supported, in part, by the National Science Foundation.

  • At UN climate change conference, trying to “keep 1.5 alive”

    After a one-year delay caused by the Covid-19 pandemic, negotiators from nearly 200 countries met this month in Glasgow, Scotland, at COP26, the United Nations climate change conference, to hammer out a new global agreement to reduce greenhouse gas emissions and prepare for climate impacts. A delegation of approximately 20 faculty, staff, and students from MIT was on hand to observe the negotiations, share and conduct research, and launch new initiatives.

    On Saturday, Nov. 13, following two weeks of negotiations in the cavernous Scottish Events Campus, countries’ representatives agreed to the Glasgow Climate Pact. The pact reaffirms the goal of the 2015 Paris Agreement “to pursue efforts” to limit the global average temperature increase to 1.5 degrees Celsius above preindustrial levels, and recognizes that achieving this goal requires “reducing global carbon dioxide emissions by 45 percent by 2030 relative to the 2010 level and to net zero around mid-century.”

    “On issues like the need to reach net-zero emissions, reduce methane pollution, move beyond coal power, and tighten carbon accounting rules, the Glasgow pact represents some meaningful progress, but we still have so much work to do,” says Maria Zuber, MIT’s vice president for research, who led the Institute’s delegation to COP26. “Glasgow showed, once again, what a wicked complex problem climate change is, technically, economically, and politically. But it also underscored the determination of a global community of people committed to addressing it.”

    An “ambition gap”

    Both within the conference venue and at protests that spilled through the streets of Glasgow, one rallying cry was “keep 1.5 alive.” Alok Sharma, who was appointed by the UK government to preside over COP26, said in announcing the Glasgow pact: “We can now say with credibility that we have kept 1.5 degrees alive. But, its pulse is weak and it will only survive if we keep our promises and translate commitments into rapid action.”

    In remarks delivered during the first week of the conference, Sergey Paltsev, deputy director of MIT’s Joint Program on the Science and Policy of Global Change, presented findings from the latest MIT Global Change Outlook, which showed a wide gap between countries’ nationally determined contributions (NDCs) — the UN’s term for greenhouse gas emissions reduction pledges — and the reductions needed to put the world on track to meet the goals of the Paris Agreement and, now, the Glasgow pact.

    Pointing to this ambition gap, Paltsev called on all countries to do more, faster, to cut emissions. “We could dramatically reduce overall climate risk through more ambitious policy measures and investments,” says Paltsev. “We need to employ an integrated approach of moving to zero emissions in energy and industry, together with sustainable development and nature-based solutions, simultaneously improving human well-being and providing biodiversity benefits.”

    Finalizing the Paris rulebook

    A key outcome of COP26 (COP stands for “conference of the parties” to the UN Framework Convention on Climate Change, held for the 26th time) was the development of a set of rules to implement Article 6 of the Paris Agreement, which provides a mechanism for countries to receive credit for emissions reductions that they finance outside their borders, and to cooperate by buying and selling emissions reductions on international carbon markets.

    An agreement on this part of the Paris “rulebook” had eluded negotiators in the years since the Paris climate conference, in part because negotiators were concerned about how to prevent double-counting, wherein both buyers and sellers would claim credit for the emissions reductions.

    Michael Mehling, the deputy director of MIT’s Center for Energy and Environmental Policy Research (CEEPR) and an expert on international carbon markets, drew on a recent CEEPR working paper to describe critical negotiation issues under Article 6 during an event at the conference on Nov. 10 with climate negotiators and private sector representatives.

    He cited research that finds that Article 6, by leveraging the cost-efficiency of global carbon markets, could cut in half the cost that countries would incur to achieve their nationally determined contributions. “Which, seen from another angle, means you could double the ambition of these NDCs at no additional cost,” Mehling noted in his talk, adding that, given the persistent ambition gap, “any such opportunity is bitterly needed.”
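
    The arithmetic behind that remark can be written out under the strong simplifying assumption of a constant per-unit abatement cost (in reality, marginal costs rise with ambition):

        C  = c \cdot R              % cost C of achieving emissions reduction R at unit cost c
        c' = \tfrac{c}{2}           % Article 6 trading halves the effective unit cost
        R' = \frac{C}{c'} = 2R      % the same budget C now buys twice the reduction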

    Andreas Haupt, a graduate student in the Institute for Data, Systems, and Society, joined MIT’s COP26 delegation to follow Article 6 negotiations. Haupt described the final days of negotiations over Article 6 as a “roller coaster.” Once negotiators reached an agreement, he says, “I felt relieved, but also unsure how strong of an effect the new rules, with all their weaknesses, will have. I am curious and hopeful regarding what will happen in the next year until the next large-scale negotiations in 2022.”

    Nature-based climate solutions

    World leaders also announced new agreements on the sidelines of the formal UN negotiations. One such agreement, a declaration on forests signed by more than 100 countries, commits to “working collectively to halt and reverse forest loss and land degradation by 2030.”

    A team from MIT’s Environmental Solutions Initiative (ESI), which has been working with policymakers and other stakeholders on strategies to protect tropical forests and advance other nature-based climate solutions in Latin America, was at COP26 to discuss their work and make plans for expanding it.

    Marcela Angel, a research associate at ESI, moderated a panel discussion featuring John Fernández, professor of architecture and ESI’s director, focused on protecting and enhancing natural carbon sinks, particularly tropical forests such as the Amazon that are at risk of deforestation, forest degradation, and biodiversity loss.

    “Deforestation and associated land use change remain one of the main sources of greenhouse gas emissions in most Amazonian countries, such as Brazil, Peru, and Colombia,” says Angel. “Our aim is to support these countries, whose nationally determined contributions depend on the effectiveness of policies to prevent deforestation and promote conservation, with an approach based on the integration of targeted technology breakthroughs, deep community engagement, and innovative bioeconomic opportunities for local communities that depend on forests for their livelihoods.”

    Energy access and renewable energy

    Worldwide, an estimated 800 million people lack access to electricity, and billions more have only limited or erratic electrical service. Providing universal access to energy is one of the UN’s sustainable development goals, which poses a dual challenge: expanding energy access while avoiding a rise in greenhouse gas emissions.

    Rob Stoner, deputy director for science and technology of the MIT Energy Initiative (MITEI), and Ignacio Pérez-Arriaga, a visiting professor at the Sloan School of Management, attended COP26 to share their work as members of the Global Commission to End Energy Poverty, a collaboration between MITEI and the Rockefeller Foundation. It brings together global energy leaders from industry, the development finance community, academia, and civil society to identify ways to overcome barriers to investment in the energy sectors of countries with low energy access.

    The commission’s work helped to motivate the formation, announced at COP26 on Nov. 2, of the Global Energy Alliance for People and Planet, a multibillion-dollar commitment by the Rockefeller and IKEA foundations and Bezos Earth Fund to support access to renewable energy around the world.

    Another MITEI member of the COP26 delegation, Martha Broad, the initiative’s executive director, spoke about MIT research to inform the U.S. goal of scaling offshore wind energy capacity from approximately 30 megawatts today to 30 gigawatts by 2030, including significant new capacity off the coast of New England.

    Broad described research, funded by MITEI member companies, on a coating that can be applied to the blades of wind turbines to prevent icing that would require the turbines’ shutdown; the use of machine learning to inform preventative turbine maintenance; and methodologies for incorporating the effects of climate change into projections of future wind conditions to guide wind farm siting decisions today. She also spoke broadly about the need for public and private support to scale promising innovations.

    “Clearly, both the public sector and the private sector have a role to play in getting these technologies to the point where we can use them in New England, and also where we can deploy them affordably for the developing world,” Broad said at an event sponsored by America Is All In, a coalition of nonprofit and business organizations.

    Food and climate alliance

    Food systems around the world are increasingly at risk from the impacts of climate change. At the same time, these systems, which include all activities from food production to consumption and food waste, are responsible for about one-third of the human-caused greenhouse gas emissions warming the planet.

    At COP26, MIT’s Abdul Latif Jameel Water and Food Systems Lab announced the launch of a new alliance to drive research-based innovation that will make food systems more resilient and sustainable, called the Food and Climate Systems Transformation (FACT) Alliance. With 16 member institutions, the FACT Alliance will better connect researchers to farmers, food businesses, policymakers, and other food systems stakeholders around the world.

    Looking ahead

    By the end of 2022, the Glasgow pact asks countries to revisit their nationally determined contributions and strengthen them to bring them in line with the temperature goals of the Paris Agreement. The pact also “notes with deep regret” the failure of wealthier countries to collectively provide poorer countries $100 billion per year in climate financing that they pledged in 2009 to begin in 2020.

    These and other issues will be on the agenda for COP27, to be held in Sharm El-Sheikh, Egypt, next year.

    “Limiting warming to 1.5 degrees is broadly accepted as a critical goal for avoiding worsening climate consequences, but it’s clear that current national commitments will not get us there,” says ESI’s Fernández. “We will need stronger emissions reductions pledges, especially from the largest greenhouse gas emitters. At the same time, expanding creativity, innovation, and determination from every sector of society, including research universities, to get on with real-world solutions is essential. At Glasgow, MIT was front and center in energy systems, cities, nature-based solutions, and more. The year 2030 is right around the corner, so we can’t afford to let up for one minute.”

  • MIT makes strides on climate action plan

    Two recent online events related to MIT’s ambitious new climate action plan highlighted several areas of progress, including uses of the campus as a real-life testbed for climate impact research, the creation of new planning bodies with opportunities for input from all parts of the MIT community, and a variety of moves toward reducing the Institute’s own carbon footprint in ways that may also provide a useful model for others.

    On Monday, MIT’s Office of Sustainability held its seventh annual “Sustainability Connect” event, bringing together students, faculty, staff, and alumni to learn about and share ideas for addressing climate change. This year’s virtual event emphasized the work toward carrying out the climate plan, titled “Fast Forward: MIT’s Climate Action Plan for the Decade,” which was announced in May. An earlier event, the “MIT Climate Tune-in” on Nov. 3, provided an overview of the many areas of MIT’s work to tackle climate change and featured a video message from Maria Zuber, MIT’s vice president for research, who was attending the COP26 international climate meeting in Glasgow, Scotland, as part of an 18-member team from MIT.

    Zuber pointed out some significant progress that was made at the conference, including a broad agreement by over 100 nations to end deforestation by the end of the decade; she also noted that the U.S. and E.U. are leading a global coalition of countries committed to curbing methane emissions by 30 percent from 2020 levels by decade’s end. “It’s easy to be pessimistic,” she said, “but being here in Glasgow, I’m actually cautiously optimistic, seeing the thousands and thousands of people here who are working toward meaningful climate action. And I know that same spirit exists on our own campus also.”

    As for MIT’s own climate plan, Zuber emphasized three points: “We’re committed to action; second of all, we’re committed to moving fast; and third, we’ve organized ourselves better for success.” That organization includes the creation of the MIT Climate Steering Committee, to oversee and coordinate MIT’s strategies on climate change; the Climate Nucleus, to oversee the management and implementation of the new plan; and three working groups that are forming now, to involve all parts of the MIT community.

    The “Fast Forward” plan calls for reducing the campus’s net greenhouse gas emissions to zero by 2026 and eliminating all such emissions, including indirect ones, by 2050. At Monday’s event, Director of Sustainability Julie Newman pointed out that the climate plan includes no less than 14 specific commitments related to the campus itself. These can be grouped into five broad areas, she said: mitigation, resiliency, electric vehicle infrastructure, investment portfolio sustainability, and climate leadership. “Each of these commitments has due dates, and they range from the tactical to the strategic,” she said. “We’re in the midst of activating our internal teams” to address these commitments, she added, noting that there are 30 teams that involve 75 faculty and researcher members, plus up to eight student positions.

    One specific project that is well underway involves preparing a detailed map of the flood risks to the campus as sea levels rise and storm surges increase. While previous attempts to map out the campus flooding risks had treated buildings essentially as uniform blocks, the new project has already mapped out in detail the location, elevation, and condition of every access point — doors, windows, and drains — in every building on the main campus, and now plans to extend the work to the residence buildings and outlying parts of campus. The project’s methods for identifying and quantifying the risks to specific parts of the campus, Newman said, represent “part of our mission for leveraging the campus as a test bed” by creating a map that is “true to the nature of the topography and the infrastructure,” in order to be prepared for the effects of climate change.
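
    The basic exposure check such a map enables is straightforward: compare the surveyed elevation of each access point against a flood scenario's water level. The Python sketch below uses hypothetical buildings, elevations, and a made-up scenario level; the real project draws on detailed campus survey data.

        access_points = [
            # (building, opening, threshold elevation in meters above a common datum)
            ("Building 1", "east door", 3.2),
            ("Building 1", "loading-dock drain", 2.1),
            ("Building 7", "basement window", 1.8),
        ]

        scenario_water_level_m = 2.5   # e.g., a storm surge on top of sea-level rise

        for building, opening, elevation in access_points:
            exposed = elevation < scenario_water_level_m
            status = "WATER ENTRY RISK" if exposed else "above scenario level"
            print(f"{building:<10} {opening:<20} {elevation:4.1f} m  -> {status}")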

    Also speaking at the Sustainability Connect event, Vice President for Campus Services and Stewardship Joe Higgins outlined a variety of measures that are underway to cut the carbon footprint of the campus as much as possible, as quickly as possible. Part of that, he explained, involves using the campus as a testbed for the development of the equivalent of a “smart thermostat” system for campus buildings. While such products exist commercially for homeowners, there is no such system yet for large institutional or commercial buildings.

    There is a team actively developing such a pilot program in some MIT buildings, he said, focusing on some large lab buildings that have especially high energy usage. They are examining the use of artificial intelligence to reduce energy consumption, he noted. By adding systems to monitor energy use, temperatures, occupancy, and so on, and to control heating, lighting and air conditioning systems, Higgins said at least a 3 to 5 percent reduction in energy use can be realized. “It may be well beyond that,” he added. “There’s a huge opportunity here.”
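
    The logic of such a system can be suggested with a toy Python rule: relax heating and cooling setpoints when a zone is unoccupied, and hold tight conditions where lab requirements demand it. The setpoints, sensor inputs, and function below are invented placeholders; a real building system would read many more signals and use far richer control, including the AI methods mentioned above.

        OCCUPIED_SETPOINTS_C = (21.0, 24.0)     # (heat to, cool to)
        UNOCCUPIED_SETPOINTS_C = (17.0, 28.0)   # wider band saves energy

        def choose_setpoints(occupancy_count: int, is_lab_with_strict_needs: bool):
            # Some lab spaces must hold tight conditions regardless of occupancy.
            if is_lab_with_strict_needs or occupancy_count > 0:
                return OCCUPIED_SETPOINTS_C
            return UNOCCUPIED_SETPOINTS_C

        # Example: an office zone reported empty overnight.
        heat_to, cool_to = choose_setpoints(occupancy_count=0, is_lab_with_strict_needs=False)
        print(f"overnight setpoints: heat to {heat_to} C, cool to {cool_to} C")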

    Higgins also outlined the ongoing plan to convert the existing steam distribution system for campus heating into a hot water system. Though the massive undertaking may take decades to complete, he said that project alone may reduce campus carbon emissions by 10 percent. Other efforts include the installation of an additional 400 kilowatts of rooftop solar installations.

    Jeremy Gregory, executive director of MIT’s climate and sustainability consortium, described efforts to deal with the most far-reaching areas of greenhouse gas emission, the so-called Scope 3 emissions. He explained that Scope 1 is the direct emissions from the campus itself, from buildings and vehicles; Scope 2 includes indirect emissions from the generation of electricity; and Scope 3 is “everything else.” That includes employee travel, buildings that MIT leases from others and to others, and all goods and services, he added, “so it includes a lot of different categories of emissions.” Gregory said his team, including several student fellows, is actively investigating and quantifying these Scope 3 emissions at MIT, along with potential methods of reducing them.
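
    A minimal sketch of the bookkeeping behind the Scope 1/2/3 framing, in Python; the categories and tonnages are placeholders, not MIT's actual inventory.

        emission_sources_tco2e = {
            ("scope 1", "campus buildings and vehicles"): 150_000,
            ("scope 2", "purchased electricity"): 40_000,
            ("scope 3", "employee travel"): 25_000,
            ("scope 3", "leased buildings"): 10_000,
            ("scope 3", "purchased goods and services"): 60_000,
        }

        totals = {}
        for (scope, _category), tco2e in emission_sources_tco2e.items():
            totals[scope] = totals.get(scope, 0) + tco2e

        for scope in sorted(totals):
            print(f"{scope}: {totals[scope]:,} tCO2e")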

    Professor Noelle Selin, who was recently named as co-chair of the new Climate Nucleus along with Professor Anne White, outlined their plans for the coming year, including the setting up of the three working groups.

    Selin said the nucleus consists of representatives of departments, labs, centers, and institutes that have significant responsibilities under the climate plan. That body will make recommendations to the steering committee, which includes the deans of all five of MIT’s schools and the MIT Schwarzman College of Computing, “about how to amplify MIT’s impact in the climate sphere. We have an implementation role, but we also have an accelerator pedal that can really make MIT’s climate impact more ambitious, and really push the buttons and make sure that the Institute’s commitments are actually borne out in reality.”

    The MIT Climate Tune-In also featured Selin and White, as well as a presentation on MIT’s expanded educational offerings on climate and sustainability, from Sarah Meyers, ESI’s education program manager; students Derek Allmond and Natalie Northrup; and postdoc Peter Godart. Professor Dennis Whyte also spoke about MIT and Commonwealth Fusion Systems’ recent historic advance toward commercial fusion energy. Organizers said that the Climate Tune-In event is the first of what they hope will be many opportunities to hear updates on the wide range of work happening across campus to implement the Fast Forward plan, and to spark conversations within the MIT community.