More stories

  •

    Earth can regulate its own temperature over millennia, new study finds

    The Earth’s climate has undergone some big changes, from global volcanism to planet-cooling ice ages and dramatic shifts in solar radiation. And yet life, for the last 3.7 billion years, has kept on beating.

    Now, a study by MIT researchers in Science Advances confirms that the planet harbors a “stabilizing feedback” mechanism that acts over hundreds of thousands of years to pull the climate back from the brink, keeping global temperatures within a steady, habitable range.

    Just how does it accomplish this? A likely mechanism is “silicate weathering” — a geological process by which the slow and steady weathering of silicate rocks involves chemical reactions that ultimately draw carbon dioxide out of the atmosphere and into ocean sediments, trapping the gas in rocks.

    Scientists have long suspected that silicate weathering plays a major role in regulating the Earth’s carbon cycle. The mechanism of silicate weathering could provide a geologically constant force in keeping carbon dioxide — and global temperatures — in check. But there’s never been direct evidence for the continual operation of such a feedback, until now.

    The new findings are based on a study of paleoclimate data that record changes in average global temperatures over the last 66 million years. The MIT team applied a mathematical analysis to see whether the data revealed any patterns characteristic of stabilizing phenomena that reined in global temperatures on a geologic timescale.

    They found that indeed there appears to be a consistent pattern in which the Earth’s temperature swings are dampened over timescales of hundreds of thousands of years. The duration of this effect is similar to the timescales over which silicate weathering is predicted to act.

    The results are the first to use actual data to confirm the existence of a stabilizing feedback, the mechanism of which is likely silicate weathering. This stabilizing feedback would explain how the Earth has remained habitable through dramatic climate events in the geologic past.

    “On the one hand, it’s good because we know that today’s global warming will eventually be canceled out through this stabilizing feedback,” says Constantin Arnscheidt, a graduate student in MIT’s Department of Earth, Atmospheric and Planetary Sciences (EAPS). “But on the other hand, it will take hundreds of thousands of years to happen, so not fast enough to solve our present-day issues.”

    The study is co-authored by Arnscheidt and Daniel Rothman, professor of geophysics at MIT.

    Stability in data

    Scientists have previously seen hints of a climate-stabilizing effect in the Earth’s carbon cycle: Chemical analyses of ancient rocks have shown that the flux of carbon in and out of Earth’s surface environment has remained relatively balanced, even through dramatic swings in global temperature. Furthermore, models of silicate weathering predict that the process should have some stabilizing effect on the global climate. And finally, the fact of the Earth’s enduring habitability points to some inherent, geologic check on extreme temperature swings.

    “You have a planet whose climate was subjected to so many dramatic external changes. Why did life survive all this time? One argument is that we need some sort of stabilizing mechanism to keep temperatures suitable for life,” Arnscheidt says. “But it’s never been demonstrated from data that such a mechanism has consistently controlled Earth’s climate.”

    Arnscheidt and Rothman sought to confirm whether a stabilizing feedback has indeed been at work, by looking at data of global temperature fluctuations through geologic history. They worked with a range of global temperature records compiled by other scientists, from the chemical composition of ancient marine fossils and shells, as well as preserved Antarctic ice cores.

    “This whole study is only possible because there have been great advances in improving the resolution of these deep-sea temperature records,” Arnscheidt notes. “Now we have data going back 66 million years, with data points at most thousands of years apart.”

    Speeding to a stop

    To the data, the team applied the mathematical theory of stochastic differential equations, which is commonly used to reveal patterns in widely fluctuating datasets.

    “We realized this theory makes predictions for what you would expect Earth’s temperature history to look like if there had been feedbacks acting on certain timescales,” Arnscheidt explains.

    Using this approach, the team analyzed the history of average global temperatures over the last 66 million years, considering the entire period over different timescales, such as tens of thousands of years versus hundreds of thousands, to see whether any patterns of stabilizing feedback emerged within each timescale.

    “To some extent, it’s like your car is speeding down the street, and when you put on the brakes, you slide for a long time before you stop,” Rothman says. “There’s a timescale over which frictional resistance, or a stabilizing feedback, kicks in, when the system returns to a steady state.”

    Without stabilizing feedbacks, fluctuations of global temperature should grow with timescale. But the team’s analysis revealed a regime in which fluctuations did not grow, implying that a stabilizing mechanism reined in the climate before fluctuations grew too extreme. The timescale for this stabilizing effect — hundreds of thousands of years — coincides with what scientists predict for silicate weathering.
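    The fluctuation-versus-timescale signature described above can be illustrated with a toy model (a hypothetical sketch, not the authors' actual analysis): a noise-driven temperature with no feedback executes a random walk whose swings keep growing with timescale, while adding a linear stabilizing feedback (an Ornstein-Uhlenbeck process with damping timescale tau) makes the swings plateau once the timescale exceeds tau.

```python
import numpy as np

# Toy illustration: compare how fluctuations grow with timescale for an
# unstabilized random walk vs. the same walk with a linear stabilizing
# feedback of strength 1/tau (an Ornstein-Uhlenbeck process).
rng = np.random.default_rng(0)

n_steps = 200_000
dt = 1.0        # arbitrary time unit
tau = 100.0     # feedback (damping) timescale
sigma = 0.1     # noise amplitude

noise = rng.normal(0.0, sigma * np.sqrt(dt), n_steps)

# Unstabilized: dT = noise
walk = np.cumsum(noise)

# Stabilized: dT = -(T / tau) dt + noise
ou = np.empty(n_steps)
ou[0] = 0.0
for i in range(1, n_steps):
    ou[i] = ou[i - 1] * (1.0 - dt / tau) + noise[i]

def fluctuation(x, lag):
    """RMS size of changes over a given timescale (lag)."""
    d = x[lag:] - x[:-lag]
    return float(np.sqrt(np.mean(d * d)))

# The random walk's fluctuations keep growing with lag; the damped
# process's fluctuations level off for lags well beyond tau.
for lag in (10, 100, 1_000, 10_000):
    print(lag, fluctuation(walk, lag), fluctuation(ou, lag))
```

Printing the fluctuation amplitude at each lag shows the random walk growing roughly as the square root of the timescale, while the stabilized series plateaus past tau — the kind of non-growing regime the MIT analysis searched for in the temperature records.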

    Interestingly, Arnscheidt and Rothman found that on longer timescales, the data did not reveal any stabilizing feedbacks. That is, there doesn’t appear to be any recurring pull-back of global temperatures on timescales longer than a million years. Over these longer timescales, then, what has kept global temperatures in check?

    “There’s an idea that chance may have played a major role in determining why, after more than 3 billion years, life still exists,” Rothman offers.

    In other words, as the Earth’s temperature fluctuates over longer stretches, these fluctuations may simply happen to remain small enough, in a geologic sense, for a stabilizing feedback such as silicate weathering to periodically rein the climate back in and, more to the point, keep it within a habitable zone.

    “There are two camps: Some say random chance is a good enough explanation, and others say there must be a stabilizing feedback,” Arnscheidt says. “We’re able to show, directly from data, that the answer is probably somewhere in between. In other words, there was some stabilization, but pure luck likely also played a role in keeping Earth continuously habitable.”

    This research was supported, in part, by a MathWorks fellowship and the National Science Foundation.

  •

    Advancing the energy transition amidst global crises

    “The past six years have been the warmest on the planet, and our track record on climate change mitigation is drastically short of what it needs to be,” said Robert C. Armstrong, MIT Energy Initiative (MITEI) director and the Chevron Professor of Chemical Engineering, introducing MITEI’s 15th Annual Research Conference.

    At the symposium, participants from academia, industry, and finance acknowledged the deepening difficulties of decarbonizing a world rocked by geopolitical conflicts and suffering from supply chain disruptions, energy insecurity, inflation, and a persistent pandemic. In spite of this grim backdrop, the conference offered evidence of significant progress in the energy transition. Researchers provided glimpses of a low-carbon future, presenting advances in such areas as long-duration energy storage, carbon capture, and renewable technologies.

    In his keynote remarks, Ernest J. Moniz, the Cecil and Ida Green Professor of Physics and Engineering Systems Emeritus, founding director of MITEI, and former U.S. secretary of energy, highlighted “four areas that have materially changed in the last year” that could shake up, and possibly accelerate, efforts to address climate change.

    Extreme weather seems to be propelling the public and policy makers of both U.S. parties toward “convergence … at least in recognition of the challenge,” Moniz said. He perceives a growing consensus that climate goals will require — in diminishing order of certainty — firm (always-on) power to complement renewable energy sources, a fuel (such as hydrogen) flowing alongside electricity, and removal of atmospheric carbon dioxide (CO2).

    Russia’s invasion of Ukraine, with its “weaponization of natural gas” and global energy impacts, underscores the idea that climate, energy security, and geopolitics “are now more or less recognized widely as one conversation.” Moniz pointed as well to new U.S. laws on climate change and infrastructure that will amplify the role of science and technology and “address the drive to technological dominance by China.”

    The rapid transformation of energy systems will require a comprehensive industrial policy, Moniz said. Government and industry must select and rapidly develop low-carbon fuels, firm power sources (possibly including nuclear power), CO2 removal systems, and long-duration energy storage technologies. “We will need to make progress on all fronts literally in this decade to come close to our goals for climate change mitigation,” he concluded.

    Global cooperation?

    Over two days, conference participants delved into many of the issues Moniz raised. In one of the first panels, scholars pondered whether the international community could forge a coordinated climate change response. The United States’ rift with China, especially over technology trade policies, loomed large.

    “Hatred of China is a bipartisan hobby and passion, but a blanket approach isn’t right, even for the sake of national security,” said Yasheng Huang, the Epoch Foundation Professor of Global Economics and Management at the MIT Sloan School of Management. “Although the United States and China working together would have huge effects for both countries, it is politically unpalatable in the short term,” said F. Taylor Fravel, the Arthur and Ruth Sloan Professor of Political Science and director of the MIT Security Studies Program. John E. Parsons, deputy director for research at the MIT Center for Energy and Environmental Policy Research, suggested that the United States should use this moment “to get our own act together … and start doing things,” such as building nuclear power plants in a cost-effective way.

    Debating carbon removal

    Several panels took up the matter of carbon emissions and the most promising technologies for contending with them. Charles Harvey, MIT professor of civil and environmental engineering, and Howard Herzog, a senior research engineer at MITEI, set the stage early, debating whether capturing carbon was essential to reaching net-zero targets.

    “I have no trouble getting to net zero without carbon capture and storage,” said David Keith, the Gordon McKay Professor of Applied Physics at Harvard University, in a subsequent roundtable. Carbon capture seems more risky to Keith than solar geoengineering, which involves injecting sulfur aerosols into the stratosphere to offset the heat-trapping impacts of CO2.

    There are new ways of moving carbon from where it’s a problem to where it’s safer. Kripa K. Varanasi, MIT professor of mechanical engineering, described a process for modulating the pH of ocean water to remove CO2. Timothy Krysiek, managing director for Equinor Ventures, talked about construction of a 900-kilometer pipeline transporting CO2 from northern Germany to a large-scale storage site located in Norwegian waters 3,000 meters below the seabed. “We can use these offshore Norwegian assets as a giant carbon sink for Europe,” he said.

    A startup showcase featured additional approaches to the carbon challenge. Mantel, which received MITEI Seed Fund money, is developing molten salt material to capture carbon for long-term storage or for use in generating electricity. Verdox has come up with an electrochemical process for capturing dilute CO2 from the atmosphere.

    But while much of the global warming discussion focuses on CO2, other greenhouse gases are menacing. Another panel discussed measuring and mitigating these pollutants. “Methane has 82 times more warming power than CO2 from the point of emission,” said Desirée L. Plata, MIT associate professor of civil and environmental engineering. “Cutting methane is the strongest lever we have to slow climate change in the next 25 years — really the only lever.”

    Steven Hamburg, chief scientist and senior vice president of the Environmental Defense Fund, cautioned that emission of hydrogen molecules into the atmosphere can cause increases in other greenhouse gases such as methane, ozone, and water vapor. As researchers and industry turn to hydrogen as a fuel or as a feedstock for commercial processes, “we will need to minimize leakage … or risk increasing warming,” he said.

    Supply chains, markets, and new energy ventures

    In panels on energy storage and the clean energy supply chain, there were interesting discussions of challenges ahead. High-density energy materials such as lithium, cobalt, nickel, copper, and vanadium, which are needed for grid-scale energy storage, electric vehicles (EVs), and other clean energy technologies, can be difficult to source. “These often come from water-stressed regions, and we need to be super thoughtful about environmental stresses,” said Elsa Olivetti, the Esther and Harold E. Edgerton Associate Professor in Materials Science and Engineering. She also noted that, in light of the explosive growth in demand for metals such as lithium, recycling EVs won’t be of much help until EVs are much further along in their adoption cycle. “The amount of material coming back from end-of-life batteries is minor,” she said.

    Arvind Sanger, founder and managing partner of Geosphere Capital, said that the United States should be developing its own rare earths and minerals, although gaining the know-how will take time, and overcoming “NIMBYism” (not in my backyard-ism) is a challenge. Sanger emphasized that we must continue to use “denser sources of energy” to catalyze the energy transition over the next decade. In particular, Sanger noted that “for every transition technology, steel is needed,” and steel is made in furnaces that use coal and natural gas. “It’s completely woolly-headed to think we can just go to a zero-fossil fuel future in a hurry,” he said.

    The topic of power markets occupied another panel, which focused on ways to ensure the distribution of reliable and affordable zero-carbon energy. Integrating intermittent resources such as wind and solar into the grid requires a suite of retail markets and new digital tools, said Anuradha Annaswamy, director of MIT’s Active-Adaptive Control Laboratory. Tim Schittekatte, a postdoc at the MIT Sloan School of Management, proposed auctions as a way of insuring consumers against periods of high market costs.

    Another panel described the very different investment needs of new energy startups, such as longer research and development phases. Hooisweng Ow, technology principal at Eni Next LLC Ventures, which is developing drilling technology for geothermal energy, recommends joint development and partnerships to reduce risk. Michael Kearney SM ’11, PhD ’19, SM ’19 is a partner at The Engine, a venture firm built by MIT that invests in path-breaking technology to solve the toughest challenges in climate and other problems. Kearney believes the emergence of new technologies and markets will bring on “a labor transition on an order of magnitude never seen before in this country.” “Workforce development is not a natural zone for startups … and this will have to change,” he said.

    Supporting the global South

    The opportunities and challenges of the energy transition look quite different in the developing world. In conversation with Robert Armstrong, Luhut Binsar Pandjaitan, the coordinating minister for maritime affairs and investment of the Republic of Indonesia, reported that his “nation is rich with solar, wind, and energy transition minerals like nickel and copper,” but cannot on its own develop renewable energy, reduce carbon emissions, and improve grid infrastructure. “Education is a top priority, and we are very far behind in high technologies,” he said. “We need help and support from MIT to achieve our target.”

    Technologies that could springboard Indonesia and other nations of the global South toward their climate goals are emerging in MITEI-supported projects and at young companies MITEI helped spawn. Among the promising innovations unveiled at the conference are new materials and designs for cooling buildings in hot climates and reducing the environmental costs of construction, and a sponge-like substance that passively sucks moisture out of the air to lower the energy required for running air conditioners in humid climates.

    Other ideas on the move from lab to market have great potential for industrialized nations as well, such as a computational framework for maximizing the energy output of ocean-based wind farms; a process for using ammonia as a renewable fuel with no CO2 emissions; long-duration energy storage derived from the oxidation of iron; and a laser-based method for unlocking geothermal steam to drive power plants.

  •

    Methane research takes on new urgency at MIT

    One of the most notable climate change provisions in the 2022 Inflation Reduction Act is the first U.S. federal tax on a greenhouse gas (GHG). That the fee targets methane (CH4), rather than carbon dioxide (CO2), emissions is indicative of the urgency the scientific community has placed on reducing this short-lived but powerful gas. Methane persists in the air about 12 years — compared to more than 1,000 years for CO2 — yet it immediately causes about 120 times more warming upon release. The gas is responsible for at least a quarter of today’s gross warming. 

    “Methane has a disproportionate effect on near-term warming,” says Desiree Plata, the director of MIT Methane Network. “CH4 does more damage than CO2 no matter how long you run the clock. By removing methane, we could potentially avoid critical climate tipping points.” 

    Because GHGs have a runaway effect on climate, reductions made now will have a far greater impact than the same reductions made in the future. Cutting methane emissions will slow the thawing of permafrost, which could otherwise lead to massive methane releases, as well as reduce increasing emissions from wetlands.  

    “The goal of MIT Methane Network is to reduce methane emissions by 45 percent by 2030, which would save up to 0.5 degree C of warming by 2100,” says Plata, an associate professor of civil and environmental engineering at MIT and director of the Plata Lab. “When you consider that governments are trying for a 1.5-degree reduction of all GHGs by 2100, this is a big deal.” 

    At normal concentrations, methane, like CO2, poses no direct health risks. Yet methane contributes to the formation of high levels of ozone. In the lower atmosphere, ozone is a key component of air pollution, which leads to “higher rates of asthma and increased emergency room visits,” says Plata.

    Methane-related projects at the Plata Lab include a filter made of zeolite — the same clay-like material used in cat litter — designed to convert methane into CO2 at dairy farms and coal mines. At first glance, the technology would appear to be a bit of a hard sell, since it converts one GHG into another. Yet the zeolite filter’s low carbon and dollar costs, combined with the disproportionate warming impact of methane, make it a potential game-changer.

    The sense of urgency about methane has been amplified by recent studies that show humans are generating far more methane emissions than previously estimated, and that the rates are rising rapidly. Exactly how much methane is in the air is uncertain. Current methods for measuring atmospheric methane, such as ground, drone, and satellite sensors, “are not readily abundant and do not always agree with each other,” says Plata.  

    The Plata Lab is collaborating with Tim Swager in the MIT Department of Chemistry to develop low-cost methane sensors. “We are developing chemiresistive sensors that cost about a dollar that you could place near energy infrastructure to back-calculate where leaks are coming from,” says Plata.

    The researchers are working on improving the accuracy of the sensors using machine learning techniques and are planning to integrate internet-of-things technology to transmit alerts. Plata and Swager are not alone in focusing on data collection: the Inflation Reduction Act adds significant funding for methane sensor research. 

    Other research at the Plata Lab includes the development of nanomaterials and heterogeneous catalysis techniques for environmental applications. The lab also explores mitigation solutions for industrial waste, particularly those related to the energy transition. Plata is the co-founder of a lithium-ion battery recycling startup called Nth Cycle.

    On a more fundamental level, the Plata Lab is exploring how to develop products with environmental and social sustainability in mind. “Our overarching mission is to change the way that we invent materials and processes so that environmental objectives are incorporated along with traditional performance and cost metrics,” says Plata. “It is important to do that rigorous assessment early in the design process.”

    MIT amps up methane research 

    The MIT Methane Network brings together 26 researchers from MIT along with representatives of other institutions “that are dedicated to the idea that we can reduce methane levels in our lifetime,” says Plata. The organization supports research such as Plata’s zeolite and sensor projects, as well as designing pipeline-fixing robots, developing methane-based fuels for clean hydrogen, and researching the capture and conversion of methane into liquid chemical precursors for pharmaceuticals and plastics. Other members are researching policies to encourage more sustainable agriculture and land use, as well as methane-related social justice initiatives. 

    “Methane is an especially difficult problem because it comes from all over the place,” says Plata. A recent Global Carbon Project study estimated that half of methane emissions are caused by humans. This is led by waste and agriculture (28 percent), including cow and sheep belching, rice paddies, and landfills.  

    Fossil fuels represent 18 percent of the total budget. Of this, about 63 percent is derived from oil and gas production and pipelines, 33 percent from coal mining activities, and 5 percent from industry and transportation. Human-caused biomass burning, primarily from slash-and-burn agriculture, emits about 4 percent of the global total.  

    The other half of the methane budget includes natural methane emissions from wetlands (20 percent) and other natural sources (30 percent). The latter includes permafrost melting and natural biomass burning, such as forest fires started by lightning.  
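    As a quick arithmetic check on the shares quoted above (a reader's sanity check, not part of the study), the human-caused and natural components each sum to about half of the total budget:

```python
# Rounded shares of the global methane budget, in percent of the total,
# as quoted in the text above.
anthropogenic = {
    "waste and agriculture": 28,
    "fossil fuels": 18,
    "biomass burning": 4,
}
natural = {
    "wetlands": 20,
    "other natural sources": 30,
}

human_share = sum(anthropogenic.values())
natural_share = sum(natural.values())
print(human_share, natural_share)  # each half of the budget

# Sub-shares within the fossil fuel slice (percent of that 18% slice):
# oil and gas 63, coal 33, industry and transport 5. They total 101
# rather than 100 because each figure is independently rounded.
fossil_breakdown = {"oil and gas": 63, "coal": 33, "industry and transport": 5}
print(sum(fossil_breakdown.values()))
```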

    With increases in global warming and population, the line between anthropogenic and natural causes is getting fuzzier. “Human activities are accelerating natural emissions,” says Plata. “Climate change increases the release of methane from wetlands and permafrost and leads to larger forest and peat fires.”  

    The calculations can get complicated. For example, wetlands provide benefits in CO2 capture, biological diversity, and resiliency to sea level rise that more than compensate for their methane releases. Meanwhile, draining swamps for development increases emissions.

    Over 100 nations have signed onto the U.N.’s Global Methane Pledge to cut anthropogenic methane emissions by at least 30 percent within the next 10 years. The U.N. report estimates that this goal can be achieved using proven technologies and that about 60 percent of these reductions can be accomplished at low cost.

    Much of the savings would come from greater efficiencies in fossil fuel extraction, processing, and delivery. The methane fees in the Inflation Reduction Act are primarily focused on encouraging fossil fuel companies to accelerate ongoing efforts to cap old wells, flare off excess emissions, and tighten pipeline connections.  

    Fossil fuel companies have already made far greater pledges to reduce methane than they have for CO2, which is central to their business. This is due in part to the potential savings, and in part to preparation for methane regulations expected from the Environmental Protection Agency in late 2022. The regulations build upon existing EPA oversight of drilling operations, and will likely be exempt from the U.S. Supreme Court’s ruling that limits the federal government’s ability to regulate GHGs.

    Zeolite filter targets methane in dairy and coal 

    The “low-hanging fruit” of gas stream mitigation addresses most of the 20 percent of total methane emissions in which the gas is released in sufficiently high concentrations for flaring. Plata’s zeolite filter aims to address the thornier challenge of reducing the 80 percent of non-flammable dilute emissions. 

    Plata found inspiration in decades-old catalysis research for turning methane into methanol. One strategy has been to use an abundant, low-cost aluminosilicate clay called zeolite.  

    “The methanol creation process is challenging because you need to separate a liquid, and it has very low efficiency,” says Plata. “Yet zeolite can be very efficient at converting methane into CO2, and it is much easier because it does not require liquid separation. Converting methane to CO2 sounds like a bad thing, but there is a major anti-warming benefit. And because methane is much more dilute than CO2, the relative CO2 contribution is minuscule.”  

    Using zeolite to create methanol requires highly concentrated methane, high temperatures and pressures, and industrial processing conditions. Yet Plata’s process, which dopes the zeolite with copper, operates in the presence of oxygen at much lower temperatures under typical pressures. “We let the methane proceed the way it wants from a thermodynamic perspective from methane to methanol down to CO2,” says Plata. 

    Researchers around the world are working on other dilute methane removal technologies. Projects include spraying iron salt aerosols into sea air where they react with natural chlorine or bromine radicals, thereby capturing methane. Most of these geoengineering solutions, however, are difficult to measure and would require massive scale to make a difference.  

    Plata is focusing her zeolite filters on environments where concentrations are high, but not so high as to be flammable. “We are trying to scale zeolite into filters that you could snap onto the side of a cross-ventilation fan in a dairy barn or in a ventilation air shaft in a coal mine,” says Plata. “For every packet of air we bring in, we take a lot of methane out, so we get more bang for our buck.”  

    The major challenge is creating a filter that can handle high flow rates without getting clogged or falling apart. Dairy barn air handlers can push air at up to 5,000 cubic feet per minute and coal mine handlers can approach 500,000 CFM. 

    Plata is exploring engineering options including fluidized bed reactors with floating catalyst particles. Another filter solution, based in part on catalytic converters, features “higher-order geometric structures where you have a porous material with a long path length where the gas can interact with the catalyst,” says Plata. “This avoids the challenge with fluidized beds of containing catalyst particles in the reactor. Instead, they are fixed within a structured material.”  

    Competing technologies for removing methane from mine shafts “operate at temperatures of 1,000 to 1,200 degrees C, requiring a lot of energy and risking explosion,” says Plata. “Our technology avoids safety concerns by operating at 300 to 400 degrees C. It reduces energy use and provides more tractable deployment costs.” 

    Potentially, energy and dollar costs could be further reduced in coal mines by capturing the heat generated by the conversion process. “In coal mines, you have enrichments above a half-percent methane, but below the 4 percent flammability threshold,” says Plata. “The excess heat from the process could be used to generate electricity using off-the-shelf converters.” 

    Plata’s dairy barn research is funded by the Gerstner Family Foundation and the coal mining project by the U.S. Department of Energy. “The DOE would like us to spin out the technology for scale-up within three years,” says Plata. “We cannot guarantee we will hit that goal, but we are trying to develop this as quickly as possible. Our society needs to start reducing methane emissions now.”

  •

    3Q: Why Europe is so vulnerable to heat waves

    This year saw high-temperature records shattered across much of Europe, as crops withered in the fields due to widespread drought. Is this a harbinger of things to come as the Earth’s climate steadily warms up?

    Elfatih Eltahir, MIT professor of civil and environmental engineering and H. M. King Bhumibol Professor of Hydrology and Climate, and former doctoral student Alexandre Tuel PhD ’20 recently published a piece in the Bulletin of the Atomic Scientists describing how their research helps explain this anomalous European weather. The findings are based in part on analyses described in their book “Future Climate of the Mediterranean and Europe,” published earlier this year. MIT News asked the two authors to describe the dynamics behind these extreme weather events.

    Q: Was the European heat wave this summer anticipated based on existing climate models?

    Eltahir: Climate models project increasingly dry summers over Europe. This is especially true for the second half of the 21st century, and for southern Europe. Extreme dryness is often associated with hot conditions and heat waves, since any reduction in evaporation heats the soil and the air above it. In general, models agree in making such projections about European summers. However, understanding the physical mechanisms responsible for these projections is an active area of research.

    The same models that project dry summers over southern Europe also project dry winters over the neighboring Mediterranean Sea. In fact, the Mediterranean Sea stands out as one of the most significantly impacted regions — a literal “hot spot” — for winter droughts triggered by climate change. Again, until recently, the association between the projections of summer dryness over Europe and dry winters over the Mediterranean was not understood.

    In recent MIT doctoral research, carried out in the Department of Civil and Environmental Engineering, a hypothesis was developed to explain why the Mediterranean stands out as a hot spot for winter droughts under climate change. Further, the same theory offers a mechanistic understanding that connects the projections of dry summers over southern Europe and dry winters over the Mediterranean.

    What is exciting about the climate observed over Europe last summer is that the drought started and developed with spatial and temporal patterns consistent with our proposed theory, and in particular with the connection to the dry conditions observed over the Mediterranean during the previous winter.

    Q: What is it about the area around the Mediterranean basin that produces such unusual weather extremes?

    Eltahir: Multiple factors come together to cause extreme heat waves such as the one that Europe has experienced this summer, as well as previously, in 2003, 2015, 2018, 2019, and 2020. Among these, however, mutual influences between atmospheric dynamics and surface conditions, known as land-atmosphere feedbacks, seem to play a very important role.

    In the current climate, southern Europe is located in the transition zone between the dry subtropics (the Sahara Desert in North Africa) and the relatively wet midlatitudes (with a climate similar to that of the Pacific Northwest). High summertime temperatures tend to make the precipitation that falls to the ground evaporate quickly, and as a consequence soil moisture during summer is very dependent on springtime precipitation. A dry spring in Europe (such as the 2022 one) causes dry soils in late spring and early summer. This lack of surface water in turn limits surface evaporation during summer. Two important consequences follow: First, incoming radiative energy from the sun preferentially goes into increasing air temperature rather than evaporating water; and second, the inflow of water into air layers near the surface decreases, which makes the air drier and precipitation less likely. Combined, these two influences increase the likelihood of heat waves and droughts.
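    The partitioning Eltahir describes can be sketched in a few lines of code. In this toy model, the soil-moisture threshold (`sm_crit`), the 0.7 cap on the evaporative fraction, and the 500 W/m^2 of net radiation are all illustrative assumptions for demonstration, not values from the research:

```python
# Toy surface energy balance: partition net radiation into latent and
# sensible heat as a function of soil moisture (all values illustrative).

def partition_energy(net_radiation, soil_moisture, sm_crit=0.3):
    """Split net radiation (W/m^2) into latent and sensible heat fluxes.

    soil_moisture: volumetric fraction (0-1); sm_crit is a hypothetical
    threshold above which evaporation is no longer moisture-limited.
    """
    # Evaporative fraction rises linearly with soil moisture up to sm_crit.
    evap_fraction = min(soil_moisture / sm_crit, 1.0) * 0.7  # cap at 0.7
    latent = evap_fraction * net_radiation   # energy that evaporates water
    sensible = net_radiation - latent        # energy that heats the air
    return latent, sensible

wet = partition_energy(500, 0.35)   # moist soil after a wet spring
dry = partition_energy(500, 0.05)   # dry soil after a dry spring
print(f"wet soil: latent={wet[0]:.0f} W/m^2, sensible={wet[1]:.0f} W/m^2")
print(f"dry soil: latent={dry[0]:.0f} W/m^2, sensible={dry[1]:.0f} W/m^2")
```

    With moist soil, most of the incoming energy goes into evaporating water; with dry soil, nearly all of it heats the air instead, which is the feedback that primes heat waves.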

    Tuel: Through land-atmosphere feedbacks, dry springs provide a favorable environment for persistent warm and dry summers but are of course not enough to directly cause heat waves. A spark is required to ignite the fuel. In Europe and elsewhere, this spark is provided by large-scale atmospheric dynamics. If an anticyclone sets over an area with very dry soils, surface temperature can quickly shoot up as land-atmosphere feedbacks come into play, developing into a heat wave that can persist for weeks.

    The sensitivity to springtime precipitation makes southern Europe and the Mediterranean particularly prone to persistent summer heat waves. This will play an increasingly important role in the future, as spring precipitation is expected to decline, making scorching summers even more likely in this corner of the world. The decline in spring precipitation, which originates as an anomalously dry winter around the Mediterranean, is very robust across climate projections. Southern Europe and the Mediterranean really stand out from most other land areas, where precipitation will on average increase with global warming.

    In our work, we showed that this Mediterranean winter decline was driven by two independent factors: on the one hand, trends in the large-scale circulation, notably stationary atmospheric waves, and on the other hand, reduced warming of the Mediterranean Sea relative to the surrounding continents — a well-known feature of global warming. Both factors lead to increased surface air pressure and reduced precipitation over the Mediterranean and Southern Europe.

    Q: What can we expect over the coming decades in terms of the frequency and severity of these kinds of droughts, floods, and other extremes in European weather?

    Tuel: Climate models have long shown that the frequency and intensity of heat waves were bound to increase as the global climate warms, and Europe is no exception. The reason is simple: As the global temperature rises, the temperature distribution shifts toward higher values, and heat waves become more intense and more frequent. Southern Europe and the Mediterranean, however, will be hit particularly hard. The reason for this is related to the land-atmosphere feedbacks we just discussed. Winter precipitation over the Mediterranean and spring precipitation over southern Europe will decline significantly, which will lead to a decrease in early summer soil moisture over southern Europe and will push average summer temperatures even higher; the region will become a true climate change hot spot. In that sense, 2022 may really be a taste of the future. The succession of recent heat waves in Europe, however, suggests that things may be going faster than climate model projections imply. Decadal variability or poorly understood trends in large-scale atmospheric dynamics may play a role here, though that is still debated. Another possibility is that climate models tend to underestimate the magnitude of land-atmosphere feedbacks and downplay the influence of dry soil moisture anomalies on summertime weather.

    Potential trends in floods are more difficult to assess because floods result from a multiplicity of factors, like extreme precipitation, soil moisture levels, or land cover. Extreme precipitation is generally expected to increase in most regions, but very high uncertainties remain, notably because extreme precipitation is highly dependent on atmospheric dynamics about which models do not always agree. What is almost certain is that with warming, the water content of the atmosphere increases (following a law of thermodynamics known as the Clausius-Clapeyron relationship). Thus, if the dynamics are favorable to precipitation, a lot more of it may fall in a warmer climate. Last year’s floods in Germany, for example, were triggered by unprecedented heavy rainfall, which climate change made more likely.
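    The Clausius-Clapeyron scaling Tuel mentions can be checked directly. This sketch integrates the relation assuming a constant latent heat of vaporization (a standard textbook approximation; the reference values used are conventional physical constants, not figures from the interview):

```python
import math

# Saturation vapor pressure from the Clausius-Clapeyron relation,
# integrated assuming constant latent heat of vaporization.
L_V = 2.5e6      # latent heat of vaporization, J/kg
R_V = 461.5      # gas constant for water vapor, J/(kg K)
E_S0, T0 = 611.0, 273.15  # reference: ~611 Pa at 0 degrees C

def saturation_vapor_pressure(temp_k):
    """Saturation vapor pressure (Pa) at temperature temp_k (K)."""
    return E_S0 * math.exp((L_V / R_V) * (1.0 / T0 - 1.0 / temp_k))

# How much more water can the air hold per degree of warming near 15 C?
e_288 = saturation_vapor_pressure(288.15)
e_289 = saturation_vapor_pressure(289.15)
print(f"increase per kelvin near 15 C: {100 * (e_289 / e_288 - 1):.1f}%")
```

    Near typical surface temperatures this yields roughly a 6-7 percent rise in the atmosphere's water-holding capacity per kelvin, which is why a warmer atmosphere can deliver substantially heavier downpours when the dynamics cooperate.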

  • in

    Getting the carbon out of India’s heavy industries

    The world’s third largest carbon emitter after China and the United States, India ranks seventh in a major climate risk index. Unless India, along with the nearly 200 other signatory nations of the Paris Agreement, takes aggressive action to keep global warming well below 2 degrees Celsius relative to preindustrial levels, physical and financial losses from floods, droughts, and cyclones could become more severe than they are today. So, too, could health impacts associated with the hazardous air pollution levels now affecting more than 90 percent of its population.  

    To address both climate and air pollution risks and meet its population’s escalating demand for energy, India will need to dramatically decarbonize its energy system in the coming decades. To that end, its initial Paris Agreement climate policy pledge calls for a reduction in carbon dioxide intensity of GDP by 33-35 percent by 2030 from 2005 levels, and an increase in non-fossil-fuel-based power to about 40 percent of cumulative installed capacity in 2030. At the COP26 international climate change conference, India announced more aggressive targets, including the goal of achieving net-zero emissions by 2070.

    Meeting its climate targets will require emissions reductions in every economic sector, including those where emissions are particularly difficult to abate. In such sectors, which involve energy-intensive industrial processes (production of iron and steel; nonferrous metals such as copper, aluminum, and zinc; cement; and chemicals), decarbonization options are limited and more expensive than in other sectors. Whereas replacing coal and natural gas with solar and wind could lower carbon dioxide emissions in electric power generation and transportation, no easy substitutes can be deployed in many heavy industrial processes that release CO2 into the air as a byproduct.

    However, other methods could be used to lower the emissions associated with these processes, which draw upon roughly 50 percent of India’s natural gas, 25 percent of its coal, and 20 percent of its oil. Evaluating the potential effectiveness of such methods in the next 30 years, a new study in the journal Energy Economics led by researchers at the MIT Joint Program on the Science and Policy of Global Change is the first to explicitly explore emissions-reduction pathways for India’s hard-to-abate sectors.

    Using an enhanced version of the MIT Economic Projection and Policy Analysis (EPPA) model, the study assesses existing emissions levels in these sectors and projects how much they can be reduced by 2030 and 2050 under different policy scenarios. Aimed at decarbonizing industrial processes, the scenarios include the use of subsidies to increase electricity use, incentives to replace coal with natural gas, measures to improve industrial resource efficiency, policies to put a price on carbon, carbon capture and storage (CCS) technology, and hydrogen in steel production.

    The researchers find that India’s 2030 Paris Agreement pledge may still drive up fossil fuel use and associated greenhouse gas emissions, with projected carbon dioxide emissions from hard-to-abate sectors rising by about 2.6 times from 2020 to 2050. But scenarios that also promote electrification, natural gas support, and resource efficiency in hard-to-abate sectors can lower their CO2 emissions by 15-20 percent.

    While appearing to move the needle in the right direction, those reductions are ultimately canceled out by increased demand for the products that emerge from these sectors. So what’s the best path forward?

    The researchers conclude that only the incentive of carbon pricing or the advance of disruptive technology can move hard-to-abate sector emissions below their current levels. To achieve significant emissions reductions, they maintain, the price of carbon must be high enough to make CCS economically viable. In that case, reductions of 80 percent below current levels could be achieved by 2050.

    “Absent major support from the government, India will be unable to reduce carbon emissions in its hard-to-abate sectors in alignment with its climate targets,” says MIT Joint Program deputy director Sergey Paltsev, the study’s lead author. “A comprehensive government policy could provide robust incentives for the private sector in India and generate favorable conditions for foreign investments and technology advances. We encourage decision-makers to use our findings to design efficient pathways to reduce emissions in those sectors, and thereby help lower India’s climate and air pollution-related health risks.”

  • in

    Kerry Emanuel: A climate scientist and meteorologist in the eye of the storm

    Kerry Emanuel once joked that whenever he retired, he would start a “hurricane safari” so other people could experience what it’s like to fly into the eye of a hurricane.

    “All of a sudden, the turbulence stops, the sun comes out, bright sunshine, and it’s amazingly calm. And you’re in this grand stadium [of clouds miles high],” he says. “It’s quite an experience.”

    While the hurricane safari is unlikely to come to fruition — “You can’t just conjure up a hurricane,” he explains — Emanuel, a world-leading expert on links between hurricanes and climate change, is retiring from teaching in the Department of Earth, Atmospheric and Planetary Sciences (EAPS) at MIT after a more than 40-year career.

    Best known for his foundational contributions to the science of tropical cyclones, climate, and links between them, Emanuel has also been a prominent voice in public debates on climate change, and what we should do about it.

    “Kerry has had an enormous effect on the world through the students and junior scientists he has trained,” says William Boos PhD ’08, an atmospheric scientist at the University of California at Berkeley. “He’s a brilliant enough scientist and theoretician that he didn’t need any of us to accomplish what he has, but he genuinely cares about educating new generations of scientists and helping to launch their careers.”

    In recognition of Emanuel’s teaching career and contributions to science, a symposium was held in his honor at MIT on June 21 and 22, organized by several of his former students and collaborators, including Boos. Research presented at the symposium focused on the many fields influenced by Emanuel’s more than 200 published research papers — on everything from forecasting the risks posed by tropical cyclones to understanding how rainfall is produced by continent-sized patterns of atmospheric circulation.

    Emanuel’s career observing perturbations of Earth’s atmosphere started earlier than he can remember. “According to my older brother, from the age of 2, I would crawl to the window whenever there was a thunderstorm,” he says. At first, those were the rolling thunderheads of the Midwest where he grew up, then it was the edges of hurricanes during a few teenage years in Florida. Eventually, he would find himself watching from the very eye of the storm, both physically and mathematically.

    Emanuel attended MIT both as an undergraduate studying Earth and planetary sciences, and for his PhD in meteorology, writing a dissertation on thunderstorms that form ahead of cold fronts. Within the department, he worked with some of the central figures of modern meteorology such as Jule Charney, Fred Sanders, and Edward Lorenz — the founder of chaos theory.

    After receiving his PhD in 1978, Emanuel joined the faculty of the University of California at Los Angeles. During this period, he also took a semester sabbatical to film the wind speeds of tornadoes in Texas and Oklahoma. After three years, he returned to MIT and joined the Department of Meteorology in 1981. Two years later, the department merged with Earth and Planetary Sciences to form EAPS as it is known today, and where Emanuel has remained ever since.

    At MIT, he shifted scales. The thunderstorms and tornadoes that had been the focus of Emanuel’s research up to then were local atmospheric phenomena, or “mesoscale” in the language of meteorologists. The larger “synoptic scale” storms that are hurricanes blew into Emanuel’s research when, as a young faculty member, he was asked to teach a class in tropical meteorology; in prepping for the class, Emanuel found his notes on hurricanes from graduate school no longer made sense.

    “I realized I didn’t understand them because they couldn’t have been correct,” he says. “And so I set out to try to find a much better theoretical formulation for hurricanes.”

    He soon made two important contributions. In 1986, his paper “An Air-Sea Interaction Theory for Tropical Cyclones. Part I: Steady-State Maintenance” developed a new theory for upper limits of hurricane intensity given atmospheric conditions. This work in turn led to even larger-scale questions to address. “That upper bound had to be dependent on climate, and it was likely to go up if we were to warm the climate,” Emanuel says — a phenomenon he explored in another paper, “The Dependence of Hurricane Intensity on Climate,” which showed how warming sea surface temperatures and changing atmospheric conditions from a warming climate would make hurricanes more destructive.

    “In my view, this is among the most remarkable achievements in theoretical geophysics,” says Adam Sobel PhD ’98, an atmospheric scientist at Columbia University who got to know Emanuel after he graduated and became interested in tropical meteorology. “From first principles, using only pencil-and-paper analysis and physical reasoning, he derives a quantitative bound on hurricane intensity that has held up well over decades of comparison to observations” and underpins current methods of predicting hurricane intensity and how it changes with climate.
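    The flavor of that upper bound can be illustrated with a highly simplified sketch: a Carnot-like efficiency, set by the sea surface and outflow temperatures, multiplied by an air-sea enthalpy disequilibrium. The functional form here is a pedagogical reduction, and the numbers (a 200 K outflow temperature, a 15 kJ/kg disequilibrium, an exchange-coefficient ratio of 0.9) are assumptions for illustration, not Emanuel's published formulation:

```python
import math

# Illustrative potential-intensity estimate in the spirit of an
# air-sea interaction theory: thermodynamic efficiency times an
# air-sea enthalpy disequilibrium. All parameter values are
# placeholders chosen for demonstration.

def potential_intensity(sst_k, outflow_temp_k, enthalpy_diseq_j_kg,
                        ck_over_cd=0.9):
    """Rough upper bound on surface wind speed (m/s)."""
    efficiency = (sst_k - outflow_temp_k) / outflow_temp_k
    v_squared = ck_over_cd * efficiency * enthalpy_diseq_j_kg
    return math.sqrt(max(v_squared, 0.0))

# Warmer sea surface -> higher bound, the qualitative behavior the
# theory predicts under climate warming.
for sst_c in (27.0, 29.0):
    v = potential_intensity(sst_c + 273.15, 200.0, 15_000.0)
    print(f"SST {sst_c} C -> potential intensity ~{v:.0f} m/s")
```

    Even this crude version captures the key qualitative result: raising sea surface temperature while holding the outflow temperature fixed raises the intensity ceiling.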

    This and diverse subsequent work led to numerous honors, including membership to the American Philosophical Society, the National Academy of Sciences, and the American Academy of Arts and Sciences.

    Emanuel’s research was never confined to academic circles, however; when politicians and industry leaders voiced loud opposition to the idea that human-caused climate change posed a threat, he spoke up.

    “I felt kind of a duty to try to counter that,” says Emanuel. “I thought it was an interesting challenge to see if you could go out and convince what some people call climate deniers, skeptics, that this was a serious risk and we had to treat it as such.”

    In addition to many public lectures and media appearances discussing climate change, Emanuel penned a book for general audiences titled “What We Know About Climate Change,” as well as a widely read primer on climate change and risk assessment designed to influence business leaders.

    “Kerry has an unmatched physical understanding of tropical climate phenomena,” says Emanuel’s colleague, Susan Solomon, the Lee and Geraldine Martin Professor of Environmental Studies at EAPS. “But he’s also a great communicator and has generously given his time to public outreach. His book ‘What We Know About Climate Change’ is a beautiful piece of work that is readily understandable and has captivated many a non-expert reader.”

    Along with a number of other prominent climate scientists, Emanuel also began advocating for expanding nuclear power as the most rapid path to decarbonizing the world’s energy systems.

    “I think the impediment to nuclear is largely irrational in the United States,” he says. “So, I’ve been trying to fight that just like I’ve been trying to fight climate denial.”

    One lesson Emanuel has taken from his public work on climate change is that skeptical audiences often respond better to issues framed in positive terms than to doom and gloom; he’s found emphasizing the potential benefits rather than the sacrifices involved in the energy transition can engage otherwise wary audiences.

    “It’s really not opposition to science, per se,” he says. “It’s fear of the societal changes they think are required to do something about it.”

    He has also worked to raise awareness about how insurance companies significantly underestimate climate risks in their policies, in particular by basing hurricane risk on unreliable historical data. One recent practical result has been a project by the First Street Foundation to assess the true flood risk of every property in the United States using hurricane models Emanuel developed.

    “I think it’s transformative,” Emanuel says of the project with First Street. “That may prove to be the most substantive research I’ve done.”

    Though Emanuel is retiring from teaching, he has no plans to stop working. “When I say ‘retire’ it’s in quotes,” he says. In 2011, Emanuel and Professor of Geophysics Daniel Rothman founded the Lorenz Center, a climate research center at MIT in honor of Emanuel’s mentor and friend Edward Lorenz. Emanuel will continue to participate in work at the center, which aims to counter what Emanuel describes as a trend away from “curiosity-driven” work in climate science.

    “Even if there were no such thing as global warming, [climate science] would still be a really, really exciting field,” says Emanuel. “There’s so much to understand about climate, about the climates of the past, about the climates of other planets.”

    In addition to work with the Lorenz Center, he’s become interested once again in tornadoes and severe local storms, and understanding whether climate also controls such local phenomena. He’s also involved in two of MIT’s Climate Grand Challenges projects focused on translating climate hazards to explicit financial and health risks — what will bring the dangers of climate change home to people, he says, is for the public to understand more concrete risks, like agricultural failure, water shortages, electricity shortages, and severe weather events. Capturing that will drive the next few years of his work.

    “I’m going to be stepping up research in some respects,” he says, now living full-time at his home in Maine.

    Of course, “retiring” does mean a bit more free time for new pursuits, like learning a language or an instrument, and “rediscovering the art of sailing,” says Emanuel. He’s looking forward to those days on the water, whatever storms are to come.

  • in

    Cracking the case of Arctic sea ice breakup

    Despite its below-freezing temperatures, the Arctic is warming twice as fast as the rest of the planet. As Arctic sea ice melts, fewer bright surfaces are available to reflect sunlight back into space. When fractures open in the ice cover, the water underneath gets exposed. Dark, ice-free water absorbs the sun’s energy, heating the ocean and driving further melting — a vicious cycle. This warming in turn melts glacial ice, contributing to rising sea levels.

    Warming climate and rising sea levels endanger the nearly 40 percent of the U.S. population living in coastal areas, the billions of people who depend on the ocean for food and their livelihoods, and species such as polar bears and Arctic foxes. Reduced ice coverage is also making the once-impassable region more accessible, opening up new shipping lanes and ports. Interest in using these emerging trans-Arctic routes for product transit, extraction of natural resources (e.g., oil and gas), and military activity is turning an area traditionally marked by low tension and cooperation into one of global geopolitical competition.

    As the Arctic opens up, predicting when and where the sea ice will fracture becomes increasingly important in strategic decision-making. However, huge gaps exist in our understanding of the physical processes contributing to ice breakup. Researchers at MIT Lincoln Laboratory seek to help close these gaps by turning a data-sparse environment into a data-rich one. They envision deploying a distributed set of unattended sensors across the Arctic that will persistently detect and geolocate ice fracturing events. Concurrently, the network will measure various environmental conditions, including water temperature and salinity, wind speed and direction, and ocean currents at different depths. By correlating these fracturing events and environmental conditions, they hope to discover meaningful insights about what is causing the sea ice to break up. Such insights could help predict the future state of Arctic sea ice to inform climate modeling, climate change planning, and policy decision-making at the highest levels.

    “We’re trying to study the relationship between ice cracking, climate change, and heat flow in the ocean,” says Andrew March, an assistant leader of Lincoln Laboratory’s Advanced Undersea Systems and Technology Group. “Do cracks in the ice cause warm water to rise and more ice to melt? Do undersea currents and waves cause cracking? Does cracking cause undersea waves? These are the types of questions we aim to investigate.”

    Arctic access

    In March 2022, Ben Evans and Dave Whelihan, both researchers in March’s group, traveled for 16 hours across three flights to Prudhoe Bay, located on the North Slope of Alaska. From there, they boarded a small specialized aircraft and flew another 90 minutes to a three-and-a-half-mile-long sheet of ice floating 160 nautical miles offshore in the Arctic Ocean. In the weeks before their arrival, the U.S. Navy’s Arctic Submarine Laboratory had transformed this inhospitable ice floe into a temporary operating base called Ice Camp Queenfish, named after the first Sturgeon-class submarine to operate under the ice and the fourth to reach the North Pole. The ice camp featured a 2,500-foot-long runway, a command center, sleeping quarters to accommodate up to 60 personnel, a dining tent, and an extremely limited internet connection.

    At Queenfish, for the next four days, Evans and Whelihan joined U.S. Navy, Army, Air Force, Marine Corps, and Coast Guard members, and members of the Royal Canadian Air Force and Navy and United Kingdom Royal Navy, who were participating in Ice Exercise (ICEX) 2022. Over the course of about three weeks, more than 200 personnel stationed at Queenfish, Prudhoe Bay, and aboard two U.S. Navy submarines participated in this biennial exercise. The goals of ICEX 2022 were to assess U.S. operational readiness in the Arctic; increase our country’s experience in the region; advance our understanding of the Arctic environment; and continue building relationships with other services, allies, and partner organizations to ensure a free and peaceful Arctic. The infrastructure provided for ICEX concurrently enables scientists to conduct research in an environment — either in person or by sending their research equipment for exercise organizers to deploy on their behalf — that would be otherwise extremely difficult and expensive to access.

    In the Arctic, windchill temperatures can plummet to as low as 60 degrees Fahrenheit below zero, cold enough to freeze exposed skin within minutes. Winds and ocean currents can carry the entire camp beyond the reach of nearby emergency rescue aircraft, and the ice can crack at any moment. To ensure the safety of participants, a team of Navy meteorological specialists continually monitors the ever-changing conditions. The original camp location for ICEX 2022 had to be evacuated and relocated after a massive crack formed in the ice, delaying Evans’ and Whelihan’s trip. Even at the newly selected site, a large crack formed behind the camp, and another necessitated moving a number of tents.

    “Such cracking events are only going to increase as the climate warms, so it’s more critical now than ever to understand the physical processes behind them,” Whelihan says. “Such an understanding will require building technology that can persist in the environment despite these incredibly harsh conditions. So, it’s a challenge not only from a scientific perspective but also an engineering one.”

    “The weather always gets a vote, dictating what you’re able to do out here,” adds Evans. “The Arctic Submarine Laboratory does a lot of work to construct the camp and make it a safe environment where researchers like us can come to do good science. ICEX is really the only opportunity we have to go onto the sea ice in a place this remote to collect data.”

    A legacy of sea ice experiments

    Though this trip was Whelihan’s and Evans’ first to the Arctic region, staff from the laboratory’s Advanced Undersea Systems and Technology Group have been conducting experiments at ICEX since 2018. However, because of the Arctic’s remote location and extreme conditions, data collection has rarely been continuous over long periods of time or widespread across large areas. The team now hopes to change that by building low-cost, expendable sensing platforms consisting of co-located devices that can be left unattended for automated, persistent, near-real-time monitoring. 

    “The laboratory’s extensive expertise in rapid prototyping, seismo-acoustic signal processing, remote sensing, and oceanography makes us a natural fit to build this sensor network,” says Evans.

    In the months leading up to the Arctic trip, the team collected seismometer data at Firepond, part of the laboratory’s Haystack Observatory site in Westford, Massachusetts. Through this local data collection, they aimed to gain a sense of what anthropogenic (human-induced) noise would look like so they could begin to anticipate the kinds of signatures they might see in the Arctic. They also collected ice melting/fracturing data during a thaw cycle and correlated these data with the weather conditions (air temperature, humidity, and pressure). Through this analysis, they detected an increase in seismic signals as the temperature rose above 32 F — an indication that air temperature and ice cracking may be related.
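    The kind of correlation check described here can be illustrated with a short sketch. The data below are synthetic stand-ins (the rates, threshold behavior, and noise are invented for demonstration), but the workflow, binning detections by air temperature and comparing event rates across the freezing point, mirrors the analysis described:

```python
# Sketch of a temperature/event-rate comparison on synthetic data:
# does the seismic detection rate jump once air temperature crosses
# the freezing point?
import random

random.seed(0)
freezing_f = 32.0
observations = []  # (air_temp_f, detected_events_per_hour)
for _ in range(200):
    temp = random.uniform(10, 50)
    base_rate = 2 if temp < freezing_f else 8  # assumed: thaw -> more cracking
    observations.append((temp, base_rate + random.randint(-1, 1)))

below = [n for t, n in observations if t < freezing_f]
above = [n for t, n in observations if t >= freezing_f]
print(f"mean events/hour below 32 F: {sum(below) / len(below):.1f}")
print(f"mean events/hour at/above 32 F: {sum(above) / len(above):.1f}")
```

    A real analysis would of course work from detected seismic signals and logged weather data rather than simulated rates, but the comparison across the 32 F threshold is the same.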

    A sensing network

    At ICEX, the team deployed various commercial off-the-shelf sensors and new sensors developed by the laboratory and University of New Hampshire (UNH) to assess their resiliency in the frigid environment and to collect an initial dataset.

    “One aspect that differentiates these experiments from those of the past is that we concurrently collected seismo-acoustic data and environmental parameters,” says Evans.

    The commercial technologies were seismometers to detect the vibrational energy released when sea ice fractures or collides with other ice floes; a hydrophone (underwater microphone) array to record the acoustic energy created by ice-fracturing events; a sound speed profiler to measure the speed of sound through the water column; and a conductivity, temperature, and depth (CTD) profiler to measure the salinity (related to conductivity), temperature, and pressure (related to depth) throughout the water column. The speed of sound in the ocean primarily depends on these three quantities. 
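    The dependence of sound speed on the three quantities a CTD cast measures is commonly captured with empirical fits. This sketch uses Medwin's simplified formula, a real and widely used approximation valid over roughly 0-35 degrees C; the water-column values plugged in are illustrative, not measurements from ICEX:

```python
def sound_speed(temp_c, salinity_psu, depth_m):
    """Approximate speed of sound in seawater (m/s) using Medwin's
    simplified empirical formula (roughly valid for 0-35 C, 0-45 PSU,
    0-1000 m)."""
    t = temp_c
    return (1449.2 + 4.6 * t - 0.055 * t**2 + 0.00029 * t**3
            + (1.34 - 0.010 * t) * (salinity_psu - 35.0)
            + 0.016 * depth_m)

# Illustrative cold, moderately fresh water near the surface vs. a
# slightly warmer, saltier layer 40 m down:
print(f"{sound_speed(0.0, 30.0, 5.0):.1f} m/s")
print(f"{sound_speed(2.0, 32.0, 40.0):.1f} m/s")
```

    The formula makes the profiler measurements concrete: warmer, saltier, or deeper water all raise the sound speed, so a CTD cast translates directly into a sound speed profile.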

    To precisely measure the temperature across the entire water column at one location, they deployed an array of transistor-based temperature sensors developed by the laboratory’s Advanced Materials and Microsystems Group in collaboration with the Advanced Functional Fabrics of America Manufacturing Innovation Institute. The small temperature sensors run along the length of a thread-like polymer fiber embedded with multiple conductors. This fiber platform, which can support a broad range of sensors, can be unspooled hundreds of feet below the water’s surface to concurrently measure temperature or other water properties — the fiber deployed in the Arctic also contained accelerometers to measure depth — at many points in the water column. Traditionally, temperature profiling has required moving a device up and down through the water column.

    The team also deployed a high-frequency echosounder supplied by Anthony Lyons and Larry Mayer, collaborators at UNH’s Center for Coastal and Ocean Mapping. This active sonar uses acoustic energy to detect internal waves, or waves occurring beneath the ocean’s surface.

    “You may think of the ocean as a homogenous body of water, but it’s not,” Evans explains. “Different currents can exist as you go down in depth, much like how you can get different winds when you go up in altitude. The UNH echosounder allows us to see the different currents in the water column, as well as ice roughness when we turn the sensor to look upward.”

    “The reason we care about currents is that we believe they will tell us something about how warmer water from the Atlantic Ocean is coming into contact with sea ice,” adds Whelihan. “Not only is that water melting ice but it also has lower salt content, resulting in oceanic layers and affecting how long ice lasts and where it lasts.”

    Back home, the team has begun analyzing their data. For the seismic data, this analysis involves distinguishing any ice events from various sources of anthropogenic noise, including generators, snowmobiles, footsteps, and aircraft. Similarly, the researchers know their hydrophone array acoustic data are contaminated by energy from a sound source that another research team participating in ICEX placed in the water. Based on their physics, icequakes — the seismic events that occur when ice cracks — have characteristic signatures that can be used to identify them. One approach is to manually find an icequake and use that signature as a guide for finding other icequakes in the dataset.
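    The template-matching approach described here (take one hand-picked icequake waveform and scan the record for lookalikes) can be sketched with normalized cross-correlation; the waveforms below are synthetic placeholders for real seismometer traces:

```python
# Template matching with normalized cross-correlation: slide a
# hand-picked event waveform along the record and flag windows whose
# correlation with the template exceeds a threshold.
import math

def normalized_xcorr(template, window):
    n = len(template)
    mt = sum(template) / n
    mw = sum(window) / n
    num = sum((a - mt) * (b - mw) for a, b in zip(template, window))
    den = math.sqrt(sum((a - mt) ** 2 for a in template)
                    * sum((b - mw) ** 2 for b in window))
    return num / den if den else 0.0

def match_template(record, template, threshold=0.8):
    n = len(template)
    return [i for i in range(len(record) - n + 1)
            if normalized_xcorr(template, record[i:i + n]) >= threshold]

template = [0.0, 1.0, -1.0, 0.5, 0.0]  # stand-in for a picked icequake
record = [0.0] * 10 + template + [0.0] * 10 + template + [0.0] * 5
print(match_template(record, template))  # prints [10, 25]
```

    In practice this scan would run with vectorized signal-processing libraries over continuous, noisy data, but the threshold-on-correlation logic for finding repeat events is the same.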

    From their water column profiling sensors, they identified an interesting evolution in the sound speed profile 30 to 40 meters below the ocean surface, related to a mass of colder water moving in later in the day. The group’s physical oceanographer believes this change in the profile is due to water coming up from the Bering Sea, water that initially comes from the Atlantic Ocean. The UNH-supplied echosounder also generated an interesting signal at a similar depth.

    “Our supposition is that this result has something to do with the large sound speed variation we detected, either directly because of reflections off that layer or because of plankton, which tend to rise on top of that layer,” explains Evans.  

    A future predictive capability

    Going forward, the team will continue mining their collected data and use these data to begin building algorithms capable of automatically detecting and localizing — and ultimately predicting — ice events correlated with changes in environmental conditions. To complement their experimental data, they have initiated conversations with organizations that model the physical behavior of sea ice, including the National Oceanic and Atmospheric Administration and the National Ice Center. Merging the laboratory’s expertise in sensor design and signal processing with their expertise in ice physics would provide a more complete understanding of how the Arctic is changing.

    The laboratory team will also start exploring cost-effective engineering approaches for integrating the sensors into packages hardened for deployment in the harsh environment of the Arctic.

    “Until these sensors are truly unattended, the human factor of usability is front and center,” says Whelihan. “Because it’s so cold, equipment can break accidentally. For example, at ICEX 2022, our waterproof enclosure for the seismometers survived, but the enclosure for its power supply, which was made out of a cheaper plastic, shattered in my hand when I went to pick it up.”

    The sensor packages will not only need to withstand the frigid environment but also be able to “phone home” over some sort of satellite data link and sustain their power. The team plans to investigate whether waste heat from processing can keep the instruments warm and how energy could be harvested from the Arctic environment.

    Before the next ICEX scheduled for 2024, they hope to perform preliminary testing of their sensor packages and concepts in Arctic-like environments. While attending ICEX 2022, they engaged with several other attendees — including the U.S. Navy, Arctic Submarine Laboratory, National Ice Center, and University of Alaska Fairbanks (UAF) — and identified cold room experimentation as one area of potential collaboration. Testing can also be performed at outdoor locations a bit closer to home and more easily accessible, such as the Great Lakes in Michigan and a UAF-maintained site in Barrow, Alaska. In the future, the laboratory team may have an opportunity to accompany U.S. Coast Guard personnel on ice-breaking vessels traveling from Alaska to Greenland. The team is also thinking about possible venues for collecting data far removed from human noise sources.

    “Since I’ve told colleagues, friends, and family I was going to the Arctic, I’ve had a lot of interesting conversations about climate change and what we’re doing there and why we’re doing it,” Whelihan says. “People don’t have an intrinsic, automatic understanding of this environment and its impact because it’s so far removed from us. But the Arctic plays a crucial role in helping to keep the global climate in balance, so it’s imperative we understand the processes leading to sea ice fractures.”

    This work is funded through Lincoln Laboratory’s internally administered R&D portfolio on climate.


    Lama Willa Baker challenges MIT audience to look beyond technology to solve the climate crisis

    Speaking to an MIT audience on May 5, Buddhist teacher Willa Blythe Baker called for an “embodied revolution” to create a world in which we realize we are connected to and interdependent with one another and with our natural environment. She envisioned a world in which we always ask: “How will this affect our bodies, and the trees, plants, mosses, water, and air around us?”

    Authorized as a dharma teacher and lineage holder (lama) in the Kagyu lineage of Tibetan Buddhism, Baker holds a PhD in religion from Harvard University and is founder and spiritual co-director of the Natural Dharma Fellowship in Boston. As experts warn of warming oceans, rising sea levels, turbulent weather, mass extinctions, droughts, hunger, and global pandemics, she said, “Much is made of what we must do, but little is made of how we must live and who we must become.”

    The climate crisis has been “framed as a set of problems that need to be solved through intellectual ingenuity, engineering, and technology. These solutions are critical, but they do not require grappling with the underlying issue … They do not look beyond doing, to being.”

    Part of the problem, Baker pointed out, is that in discussing climate change, we frequently approach it in terms of what we must give up to live more sustainably — but not in terms of what we gain by living simply and mindfully.


    Baker outlined her view that “disembodiment” is a key underlying cause of the global environmental crisis. Disembodiment is the state of being “up in the head” and out of touch with the body, disconnected from the here and now. This disembodied state causes us to feel separate from our ecosystem, from one another, and from our own bodies, leading to constant worry about the past or the future and a constant desire for more.

    The climate crisis, Baker put forward, is in part a result of society’s long journey away from the embodied ways of being in earlier agrarian societies in which there was a more intimate relationship between humans and their natural world.

    The contemplative tradition

    Baker said the contemplative perspective, and the practices of meditation and mindfulness, have much to offer climate activists. Rather than viewing meditation, prayer, or contemplation as passive acts, these practices are active pursuits, according to Baker, as “engagements of attention and embodiment that steward novel ways of knowing and being.”

    She explained further how an “embodied contemplative perspective” re-frames the climate crisis. Instead of viewing the crisis as external, the climate crisis calls for us to look inward to our motivations and values. “It is asking us to inquire into who and what we are, not just what we do.” Rather than seeing ourselves as “stewards” of the planet, we should see ourselves as part of the planet.

    “The idea of embodiment gets us to explore who we are in the deepest sense … Embodiment is a journey from our isolated sense of separateness, our sense of limited cognitive identity, back to the body and senses, back to our animal wisdom, back to the earthly organic identity of being bound by gravity.”

    Baker pointed to the central Buddhist tenet that we live with the illusion of separateness, and, she said, “the task of this human life is to see beyond the veil of that illusion.”

    These embodied wisdoms, she said, remind us of who we are — that we are of the Earth.

    How much is enough?

    A lively discussion was held following the presentation. One audience member asked how to reconcile the idea of looking to the body for wisdom, when some of the climate crisis is fueled by our need for bodily comfort. Baker replied, “We have started to associate comfort with plenty … That’s a point of reflection. How much is enough?” She said that part of the Buddhist path is the cultivation of knowing that whatever you have is enough.

    One MIT student studying mechanical engineering asked how to reconcile these ideas with a capitalistic society. He pointed out that “a lot of industry is driven by the need to accumulate more capital … Every year, you want to increase your balance sheet … How do you tell companies that what you have is enough?”

    Baker agreed that our current economic system constantly encourages us to want “more.” “Human happiness is at stake, in addition to our planet’s survival. If we’re told that the ‘next thing’ will make us happy, we will be seeking happiness externally. I think the system will change eventually. I don’t think we have any choice. The planet cannot sustain a world where we’re producing and producing more and more stuff for us to need and want.”

    One audience member asked how to meet the challenge of being embodied in our busy world. Baker said that “embodiment and disembodiment is a continuum. Sometimes we have to be in our head. We’re taking a test, or writing a paper. But we can get ‘up there’ so much that we forget we have a body.” She called for “bringing your attention down. Pausing and bringing attention all the way down, and feeling the Earth below your feet … There’s a calming and centering that comes with coming down and connecting with the Earth below. Being present and grounded and in tune.”

    Baker said the body can show us, “Just here. Just now. Just this.”

    The speaker was introduced by Professor Emma J. Teng, the T.T. and Wei Fong Chao Professor of Asian Civilizations at MIT. This spring, Teng introduced a new class 21G.015 (Introduction to Buddhism, Mindfulness, and Meditation), a half-term subject that met with the class PE.0534 (Fitness and Meditation), taught by Sarah Johnson, so that students learned basic ideas of Buddhism and its history while having a chance to learn and practice mindfulness and meditation techniques.

    This event was the latest in the T.T. and W.F. Chao Distinguished Buddhist Lecture Series. This series engages the rich history of Buddhist thought and ethical action to advance critical dialogues on ethics, humanity, and MIT’s mission “to develop in each member of the MIT community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.”

    Baker’s books include “Essence of Ambrosia” (2005), “Everyday Dharma” (2009), “The Arts of Contemplative Care” (2012), and “The Wakeful Body” (2021). Her guided meditations are available online.