More stories

  • MIT students combat climate anxiety through extracurricular teams

    Climate anxiety affects nearly half of young people aged 16-25. Students like second-year Rachel Mohammed find hope and inspiration through their involvement in innovative climate solutions, working alongside peers who share their determination. “I’ve met so many people at MIT who are dedicated to finding climate solutions in ways that I had never imagined, dreamed of, or heard of. That is what keeps me going, and I’m doing my part,” she says.

    Hydrogen-fueled engines

    Hydrogen offers the potential for zero or near-zero emissions, with the ability to reduce greenhouse gases and pollution by 29 percent. However, the hydrogen industry faces many challenges related to storage solutions and costs.

    Mohammed leads the hydrogen team on MIT’s Electric Vehicle Team (EVT), which is dedicated to harnessing hydrogen power to build a cleaner, more sustainable future. EVT is one of several student-led build teams at the Edgerton Center focused on innovative climate solutions. Since its founding in 1992, the Edgerton Center has been a hub for MIT students to bring their ideas to life.

    Hydrogen is mostly used in large vehicles like trucks and planes because it requires a lot of storage space. EVT is building its second iteration of a motorcycle based on what Mohammed calls a “goofy hypothesis” that you can use hydrogen to power a small vehicle. The team employs a hydrogen fuel cell system, which generates electricity by combining hydrogen with oxygen. However, the technology faces challenges, particularly in storage, which EVT is tackling with innovative designs for smaller vehicles.

    Presenting at the 2024 World Hydrogen Summit reaffirmed Mohammed’s confidence in this project. “I often encounter skepticism, with people saying it’s not practical. Seeing others actively working on similar initiatives made me realize that we can do it too,” Mohammed says.

    The team’s first successful track test last October allowed them to evaluate the real-world performance of their hydrogen-powered motorcycle, marking a crucial step in proving the feasibility and efficiency of their design.

    MIT’s Sustainable Engine Team (SET), founded by junior Charles Yong, uses combustion to generate energy from hydrogen. This is a promising technology route for high-power-density applications, like aviation, but Yong believes it hasn’t received enough attention. Yong explains, “In the hydrogen power industry, startups choose fuel cell routes instead of combustion because gas turbine industry giants are 50 years ahead. However, these giants are moving very slowly toward hydrogen due to its not-yet-fully-developed infrastructure. Working under the Edgerton Center allows us to take risks and explore advanced tech directions to demonstrate that hydrogen combustion can be readily available.”

    Both EVT and SET are publishing their research and providing detailed instructions for anyone interested in replicating their results.

    Running on sunshine

    The Solar Electric Vehicle Team powers a car built from scratch with 100 percent solar energy. The team’s single-occupancy car Nimbus won the American Solar Challenge two years in a row. This year, the team pushed boundaries further with Gemini, a multiple-occupancy vehicle that challenges conventional perceptions of solar-powered cars.

    Senior Andre Greene explains, “The challenge comes from minimizing how much energy you waste, because you work with such little energy. It’s like the equivalent power of a toaster.”

    Gemini looks more like a regular car and less like a “spaceship,” as NBC’s 1st Look affectionately called Nimbus. “It more resembles what a fully solar-powered car could look like versus the single-seaters. You don’t see a lot of single-seater cars on the market, so it’s opening people’s minds,” says rising junior Tessa Uviedo, team captain.

    All-electric since 2013

    The MIT Motorsports team switched to an all-electric powertrain in 2013. Captain Eric Zhou takes inspiration from China, the world’s largest market for electric vehicles. “In China, there is a large government push towards electric, but there are also five or six big companies, almost as large as Tesla, building out these electric vehicles. The competition drives the majority of vehicles in China to become electric.”

    The team is also switching to four-wheel drive and regenerative braking next year, which reduce the amount of energy needed to run. “This is more efficient and better for power consumption because the torque from the motors is applied straight to the tires. It’s more efficient than having a rear motor that must transfer torque to both rear tires. Also, you’re taking advantage of all four tires in terms of producing grip, while you can only rely on the back tires in a rear-wheel-drive car,” Zhou says.

    Zhou adds that Motorsports wants to help prepare students for the electric vehicle industry. “A large majority of upperclassmen on the team have worked, or are working, at Tesla or Rivian.”

    Former Motorsports powertrain lead Levi Gershon ’23, SM ’24 recently founded CRABI Robotics, which is developing a fully autonomous marine robotic system designed to conduct in-transit cleaning of marine vessels by removing biofouling, increasing vessels’ fuel efficiency.

    An Indigenous approach to sustainable rockets

    First Nations Launch, the all-Indigenous student rocket team, recently won the Grand Prize in the 2024 NASA First Nations Launch High-Power Rocket Competition. Using Indigenous methodologies, this team considers the environment in the materials and methods they employ.

    “The environmental impact is always something that we consider when we’re making design decisions and operational decisions. We’ve thought about things like biodegradable composites and parachutes,” says rising junior Hailey Polson, team captain. “Aerospace has been a very wasteful industry in the past. There are huge leaps and bounds being made with forward progress in regard to reusable rockets, which is definitely lowering the environmental impact.”

    Collecting climate change data with autonomous boats

    Arcturus, the recent first-place winner in design at the 16th Annual RoboBoat Competition, is developing autonomous surface vehicles that can greatly aid in marine research. “The ocean is one of our greatest resources to combat climate change; thus, the accessibility of data will help scientists understand climate patterns and predict future trends. This can help people learn how to prepare for potential disasters and how to reduce each of our carbon footprints,” says Arcturus captain and rising junior Amy Shi.

    “We are hoping to expand our outreach efforts to incorporate more sustainability-related programs. This can include more interactions with local students to introduce them to how engineering can make a positive impact in the climate space or other similar programs,” Shi says.

    Shi emphasizes that hope is a crucial force in the battle against climate change. “There are great steps being taken every day to combat this seemingly impending doom we call the climate crisis. It’s important to not give up hope, because this hope is what’s driving the leaps and bounds of innovation happening in the climate community. The mainstream media mostly reports on the negatives, but the truth is there is a lot of positive climate news every day. Being more intentional about where you seek your climate news can really help ease this feeling of doom about our planet.”

  • Tracking emissions to help companies reduce their environmental footprint

    Amidst a global wave of corporate pledges to decarbonize or reach net-zero emissions, a system for verifying actual greenhouse gas reductions has never been more important. Context Labs, founded by former MIT Sloan Fellow and serial entrepreneur Dan Harple SM ’13, is rising to meet that challenge with an analytics platform that brings more transparency to emissions data.

    The company’s platform adds context to data from sources like equipment sensors and satellites, provides third-party verification, and records all that information on a blockchain. Context Labs also provides an interactive view of emissions across every aspect of a company’s operations, allowing leaders to pinpoint the dirtiest parts of their business.

    “There’s an old adage: Unless you measure something, you can’t change it,” says Harple, who is the firm’s CEO. “I think of what we’re doing as an AI-driven digital lens into what’s happening across organizations. Our goal is to help the planet get better, faster.”

    Context Labs is already working with some of the largest energy companies in the world — including EQT, Williams Companies, and Coterra Energy — to verify emissions reductions. A partnership with Microsoft, announced at last year’s COP28 United Nations climate summit, allows any organization on Microsoft’s Azure cloud to integrate its sensor data into Context Labs’ platform to get a granular view of its environmental impact.

    Harple says the progress enables more informed sustainability initiatives at scale. He also sees the work as a way to combat overly vague statements about sustainable practices that don’t lead to actual emissions reductions, or what’s known as “greenwashing.”

    “Just producing data isn’t good enough, and our customers realize that, because they know even if they have good intentions to reduce emissions, no one is going to believe them,” Harple says. “One way to think about our platform is as anti-greenwashing insurance, because if you get attacked for your emissions, we unbundle the data like it’s in shrink-wrap and roll it back through time on the blockchain. You can click on it and see exactly where and how it was measured, monitored, timestamped, its serial number, everything. It’s really the gold standard of proof.”

    An unconventional master’s

    Harple came to MIT as a serial founder whose companies had pioneered several foundational internet technologies, including real-time video streaming technology still used in applications like Zoom and Netflix, as well as some of the core technology for the popular Chinese microblogging website Weibo.

    Harple’s introduction to MIT started with a paper he wrote for his venture capital contacts in the U.S. to make the case for investment in the Netherlands, where he was living with his family. The paper caught the attention of MIT Professor Stuart Madnick, the John Norris Maguire Professor of Information Technology at the MIT Sloan School of Management, who suggested Harple come to MIT as a Sloan Fellow to further develop his ideas about what makes a strong innovation ecosystem.

    Having successfully founded and exited multiple companies, Harple was not a typical MIT student when he began the Sloan Fellows program in 2011. At one point, he held a summit at MIT for a group of leading Dutch entrepreneurs and government officials that included tours of major labs and a meeting with former MIT President L. Rafael Reif.

    “Everyone was super enamored with MIT, and that kicked off what became a course that I started at MIT called REAL, the Regional Entrepreneurial Acceleration Lab,” Harple says. REAL was eventually absorbed by what is now REAP — the Regional Entrepreneurship Acceleration Program, which has worked with communities around the world.

    Harple describes REAL as a framework vehicle to put his theories on supporting innovation into action. Over his time at MIT, which also included collaborating with the Media Lab, he systematized those theories into what he calls pentalytics, a way to measure and predict the resilience of innovation ecosystems.

    “My sense was MIT should be analytical and data-driven,” Harple says. “The thesis I wrote was a framework for AI-driven network graph analytics. So, you can model things using analytics, and you can use AI to do predictive analytics to see where the innovation ecosystem is going to thrive.”

    Once Harple’s pentalytics theory was established, he wanted to put it to the test with a company. His initial idea for Context Labs was to build a verification platform to combat fake news, deepfakes, and other misinformation on the internet. Around 2018, Harple met climate investor Jeremy Grantham, who he says helped him realize the most important data are about the planet. Harple began to believe that U.S. Environmental Protection Agency (EPA) emissions estimates for things like driving a car or operating an oil rig were just that — estimates — and left room for improvement.

    “Our approach was very MIT-ish,” Harple says. “We said, ‘Let’s measure it, let’s monitor it, and then let’s contextualize that data so you can never go back and say they faked it.’ I think there’s a lot of fakery that’s happened, and that’s why the voluntary carbon markets cratered in the last year. Our view is they cratered because the data wasn’t empirical enough.”

    Context Labs’ solution starts with a technology platform it calls Immutably, which continuously combines disparate data streams, encrypts that information, and records it on a blockchain. Immutably also verifies the information with one or more third parties. (Context Labs has partnered with the global accounting firm KPMG.)

    On top of Immutably, Context Labs has built applications, including a product called Decarbonization-as-a-Service (DaaS), which uses Immutably’s data to give companies a digital twin of their entire operations. Customers can use DaaS to explore the emissions of their assets and create a certificate of verified CO2-equivalent emissions, which can be used in carbon credit markets.

    Putting emissions data into context

    Context Labs is working with oil and gas companies, utilities, data centers, and large industrial operators, some using the platform to analyze more than 3 billion data points each day. For instance, EQT, the largest natural gas producer in the U.S., uses Context Labs to verify its lower-emission products and create carbon credits. Other customers include the nonprofits Rocky Mountain Institute and the Environmental Defense Fund.

    “I often get asked how big the total addressable market is,” Harple says. “My view is it’s the largest market in history. Why? Because every country needs a decarbonization plan, along with instrumentation and a digital platform to execute, as does every company.”

    With its headquarters in Kendall Square in Cambridge, Massachusetts, Context Labs is also serving as a test of Harple’s pentalytics theory for innovation ecosystems. It also has operations in Houston and Amsterdam.

    “This company is a living lab for pentalytics,” Harple says. “I believe Kendall Square 1.0 was factory buildings, Kendall Square 2.0 is biotech, and Kendall Square 3.0 will be climate tech.”
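
    The article describes the Immutably platform only at a high level. As a rough, generic illustration of the idea it relies on (tamper-evident, timestamped records of sensor data), here is a minimal hash-chain sketch in Python; the field names and ledger structure are hypothetical and are not Context Labs’ actual schema or blockchain.

    ```python
    import hashlib
    import json
    import time

    def record_hash(record: dict, prev_hash: str) -> str:
        """Hash a reading together with the previous entry's hash, so editing
        any historical record changes every hash that follows it."""
        payload = json.dumps(record, sort_keys=True) + prev_hash
        return hashlib.sha256(payload.encode()).hexdigest()

    class EmissionsLedger:
        """Append-only ledger of sensor readings (a toy stand-in for a blockchain)."""

        def __init__(self):
            self.entries = []  # list of (record, hash) pairs

        def append(self, sensor_id: str, co2e_kg: float):
            record = {"sensor": sensor_id, "co2e_kg": co2e_kg, "timestamp": time.time()}
            prev = self.entries[-1][1] if self.entries else "genesis"
            self.entries.append((record, record_hash(record, prev)))

        def verify(self) -> bool:
            """Recompute the chain; returns False if any past record was altered."""
            prev = "genesis"
            for record, h in self.entries:
                if record_hash(record, prev) != h:
                    return False
                prev = h
            return True

    ledger = EmissionsLedger()
    ledger.append("compressor-7", 12.4)     # hypothetical sensor readings
    ledger.append("flare-2", 3.1)
    print(ledger.verify())                  # True: the chain is intact
    ledger.entries[0][0]["co2e_kg"] = 0.0   # tamper with history...
    print(ledger.verify())                  # ...and verification now fails: False
    ```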

  • Scientists find a human “fingerprint” in the upper troposphere’s increasing ozone

    Ozone can be an agent of good or harm, depending on where you find it in the atmosphere. Way up in the stratosphere, the colorless gas shields the Earth from the sun’s harsh ultraviolet rays. But closer to the ground, ozone is a harmful air pollutant that can trigger chronic health problems including chest pain, difficulty breathing, and impaired lung function.

    And somewhere in between, in the upper troposphere — the layer of the atmosphere just below the stratosphere, where most aircraft cruise — ozone contributes to warming the planet as a potent greenhouse gas.

    There are signs that ozone is continuing to rise in the upper troposphere despite efforts to reduce its sources at the surface in many nations. Now, MIT scientists confirm that much of ozone’s increase in the upper troposphere is likely due to humans.

    In a paper appearing today in the journal Environmental Science and Technology, the team reports that they detected a clear signal of human influence on upper tropospheric ozone trends in a 17-year satellite record starting in 2005.

    “We confirm that there’s a clear and increasing trend in upper tropospheric ozone in the northern midlatitudes due to human beings rather than climate noise,” says study lead author Xinyuan Yu, a graduate student in MIT’s Department of Earth, Atmospheric and Planetary Sciences (EAPS).

    “Now we can do more detective work and try to understand what specific human activities are leading to this ozone trend,” adds co-author Arlene Fiore, the Peter H. Stone and Paola Malanotte Stone Professor in Earth, Atmospheric and Planetary Sciences.

    The study’s MIT authors include Sebastian Eastham and Qindan Zhu, along with Benjamin Santer at the University of California at Los Angeles, Gustavo Correa of Columbia University, Jean-François Lamarque at the National Center for Atmospheric Research, and Jerald Zimeke at NASA Goddard Space Flight Center.

    Ozone’s tangled web

    Understanding ozone’s causes and influences is a challenging exercise. Ozone is not emitted directly, but instead is a product of “precursors” — starting ingredients, such as nitrogen oxides and volatile organic compounds (VOCs), that react in the presence of sunlight to form ozone. These precursors are generated from vehicle exhaust, power plants, chemical solvents, industrial processes, aircraft emissions, and other human activities.

    Whether and how long ozone lingers in the atmosphere depends on a tangle of variables, including the type and extent of human activities in a given area, as well as natural climate variability. For instance, a strong El Niño year could nudge the atmosphere’s circulation in a way that affects ozone’s concentrations, regardless of how much ozone humans are contributing to the atmosphere that year.

    Disentangling the human- versus climate-driven causes of ozone trends, particularly in the upper troposphere, is especially tricky. Complicating matters is the fact that in the lower troposphere — the lowest layer of the atmosphere, closest to ground level — ozone has stopped rising, and has even fallen in some regions at northern midlatitudes in the last few decades. This decrease in lower tropospheric ozone is mainly a result of efforts in North America and Europe to reduce industrial sources of air pollution.

    “Near the surface, ozone has been observed to decrease in some regions, and its variations are more closely linked to human emissions,” Yu notes. “In the upper troposphere, the ozone trends are less well-monitored but seem to decouple from those near the surface, and ozone there is more easily influenced by climate variability. So, we don’t know whether and how much of that increase in observed ozone in the upper troposphere is attributed to humans.”

    A human signal amid climate noise

    Yu and Fiore wondered whether a human “fingerprint” in ozone levels, caused directly by human activities, could be strong enough to be detectable in satellite observations in the upper troposphere. To see such a signal, the researchers would first have to know what to look for.

    For this, they looked to simulations of the Earth’s climate and atmospheric chemistry. Following approaches developed in climate science, they reasoned that if they could simulate a number of possible climate variations in recent decades, all with identical human-derived sources of ozone precursor emissions, but each starting with a slightly different climate condition, then any differences among these scenarios should be due to climate noise. By inference, any common signal that emerged when averaging over the simulated scenarios should be due to human-driven causes. Such a signal, then, would be a “fingerprint” revealing human-caused ozone, which the team could look for in actual satellite observations.

    With this strategy in mind, the team ran simulations using a state-of-the-art chemistry climate model. They ran multiple climate scenarios, each starting from the year 1950 and running through 2014.

    From their simulations, the team saw a clear and common signal across scenarios, which they identified as a human fingerprint. They then looked to tropospheric ozone products derived from multiple instruments aboard NASA’s Aura satellite.

    “Quite honestly, I thought the satellite data were just going to be too noisy,” Fiore admits. “I didn’t expect that the pattern would be robust enough.”

    But the satellite observations they used gave them a good enough shot. The team looked through the upper tropospheric ozone data derived from the satellite products, from the years 2005 to 2021, and found that, indeed, they could see the signal of human-caused ozone that their simulations predicted. The signal is especially pronounced over Asia, where industrial activity has risen significantly in recent decades and where abundant sunlight and frequent weather events loft pollution, including ozone and its precursors, to the upper troposphere.

    Yu and Fiore are now looking to identify the specific human activities that are leading to ozone’s increase in the upper troposphere.

    “Where is this increasing trend coming from? Is it the near-surface emissions from combusting fossil fuels in vehicle engines and power plants? Is it the aircraft that are flying in the upper troposphere? Is it the influence of wildland fires? Or some combination of all of the above?” Fiore says. “Being able to separate human-caused impacts from natural climate variations can help to inform strategies to address climate change and air pollution.”

    This research was funded, in part, by NASA.
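
    The fingerprinting logic described above (run an ensemble of simulations with identical human emissions but slightly different starting conditions, average them so internal variability cancels, then look for the resulting pattern in observations) can be sketched with synthetic numbers. This is a schematic illustration with made-up values, not the study’s model output or statistical method.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    years = np.arange(2005, 2022)        # 17-year satellite record, as in the study
    n = 10                               # ensemble size (illustrative)

    # Synthetic stand-ins (arbitrary units): each member = a common human-forced
    # ozone trend plus its own internal "climate noise".
    forced = 0.3 * (years - years[0])
    members = forced + rng.normal(0.0, 1.5, size=(n, years.size))

    # Fingerprint: the ensemble mean, in which internal variability largely cancels.
    fingerprint = members.mean(axis=0)

    # Noise benchmark: trends that internal variability alone could produce.
    noise_only = rng.normal(0.0, 1.5, size=(n, years.size))
    noise_trends = [np.polyfit(years, m, 1)[0] for m in noise_only]

    # "Observations": the forced trend plus one more realization of noise.
    obs = forced + rng.normal(0.0, 1.5, size=years.size)
    obs_trend = np.polyfit(years, obs, 1)[0]

    print(f"fingerprint trend: {np.polyfit(years, fingerprint, 1)[0]:.2f} per year")
    print(f"observed trend:    {obs_trend:.2f} per year")
    print(f"noise-only trends stay within +/- {np.max(np.abs(noise_trends)):.2f} per year")
    # Detection, in this toy setting, means the observed trend falls well outside
    # the range explainable by climate noise alone.
    ```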

  • China-based emissions of three potent climate-warming greenhouse gases spiked in past decade

    When it comes to heating up the planet, not all greenhouse gases are created equal. They vary widely in their global warming potential (GWP), a measure of how much infrared thermal radiation a greenhouse gas would absorb over a given time frame once it enters the atmosphere. For example, measured over a 100-year period, the GWP of methane is about 28 times that of carbon dioxide (CO2), and the GWPs of a class of greenhouse gases known as perfluorocarbons (PFCs) are thousands of times that of CO2. The lifespans in the atmosphere of different greenhouse gases also vary widely. Methane persists in the atmosphere for around 10 years, CO2 for over 100 years, and PFCs for up to tens of thousands of years.

    Given the high GWPs and long lifespans of PFCs, their emissions could pose a major roadblock to achieving the aspirational goal of the Paris Agreement on climate change — to limit the increase in global average surface temperature to 1.5 degrees Celsius above preindustrial levels. Now, two new studies based on atmospheric observations inside China and high-resolution atmospheric models show a rapid rise in Chinese emissions over the last decade (2011 to 2020 or 2021) of three PFCs: tetrafluoromethane (PFC-14) and hexafluoroethane (PFC-116) (results in PNAS), and perfluorocyclobutane (PFC-318) (results in Environmental Science & Technology). Both studies find that Chinese emissions have played a dominant role in driving up global emission levels for all three PFCs.

    The PNAS study identifies substantial PFC-14 and PFC-116 emission sources in the less-populated western regions of China from 2011 to 2021, likely due to the heavy presence of the aluminum industry in these regions. The semiconductor industry also contributes to some of the emissions detected in the more economically developed eastern regions. These emissions are byproducts of aluminum smelting, or occur during the use of the two PFCs in the production of semiconductors and flat panel displays. During the observation period, emissions of both gases in China rose by 78 percent, accounting for most of the increase in global emissions of these gases.

    The ES&T study finds that during 2011-20, Chinese PFC-318 emissions rose by 70 percent (contributing more than half of the global increase in emissions of this gas) and originated primarily in eastern China. The regions with high emissions of PFC-318 in China overlap with geographical areas densely populated with factories that produce polytetrafluoroethylene (PTFE, commonly used for nonstick cookware coatings), implying that PTFE factories are major sources of PFC-318 emissions in China. In these factories, PFC-318 is formed as a byproduct.

    “Using atmospheric observations from multiple monitoring sites, we not only determined the magnitudes of PFC emissions, but also pinpointed the possible locations of their sources,” says Minde An, a postdoc at the MIT Center for Global Change Science (CGCS), and corresponding author of both studies. “Identifying the actual source industries contributing to these PFC emissions, and understanding the reasons for these largely byproduct emissions, can provide guidance for developing region- or industry-specific mitigation strategies.”

    “These three PFCs are largely produced as unwanted byproducts during the manufacture of otherwise widely used industrial products,” says MIT professor of atmospheric sciences Ronald Prinn, director of both the MIT Joint Program on the Science and Policy of Global Change and CGCS, and a co-author of both studies. “Phasing out emissions of PFCs as early as possible is highly beneficial for achieving global climate mitigation targets and is likely achievable through recycling programs and targeted technological improvements in these industries.”

    Findings in both studies were obtained, in part, from atmospheric observations collected from nine stations within a Chinese network, including one station from the Advanced Global Atmospheric Gases Experiment (AGAGE) network. For comparison, global total emissions were determined from five globally distributed, relatively unpolluted “background” AGAGE stations, as reported in the latest United Nations Environment Programme and World Meteorological Organization Ozone Assessment report.
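
    The global warming potential bookkeeping described at the top of this story amounts to a simple multiplication. In the sketch below, the methane GWP of 28 is the value quoted above; the PFC-14 value of roughly 7,000 and all of the emission quantities are made-up illustrative numbers, not figures from the two studies.

    ```python
    # CO2-equivalent emissions: mass of each gas multiplied by its 100-year GWP.
    GWP_100 = {
        "CO2": 1,
        "CH4": 28,        # quoted above
        "PFC-14": 7000,   # illustrative stand-in for "thousands of times CO2"
    }

    emissions_tonnes = {"CO2": 1000.0, "CH4": 10.0, "PFC-14": 0.1}  # hypothetical

    for gas, mass in emissions_tonnes.items():
        print(f"{gas}: {mass * GWP_100[gas]:,.0f} tonnes CO2e")

    # Even 0.1 tonne of PFC-14 (700 t CO2e) outweighs 10 tonnes of methane
    # (280 t CO2e), which is why small byproduct PFC streams matter for
    # climate targets despite their tiny mass.
    ```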

  • Q&A: What past environmental success can teach us about solving the climate crisis

    Susan Solomon, MIT professor of Earth, atmospheric, and planetary sciences (EAPS) and of chemistry, played a critical role in understanding how a class of chemicals known as chlorofluorocarbons were creating a hole in the ozone layer. Her research was foundational to the creation of the Montreal Protocol, an international agreement established in the 1980s that phased out products releasing chlorofluorocarbons. Since then, scientists have documented signs that the ozone hole is recovering thanks to these measures.

    Having witnessed this historical process first-hand, Solomon, the Lee and Geraldine Martin Professor of Environmental Studies, is aware of how people can come together to make successful environmental policy happen. Using her story, as well as other examples of success — including combating smog, getting rid of DDT, and more — Solomon draws parallels from then to now as the climate crisis comes into focus in her new book, “Solvable: How We Healed the Earth and How We Can Do It Again.”

    Solomon took a moment to talk about why she picked the stories in her book, the students who inspired her, and why we need hope and optimism now more than ever.

    Q: You have first-hand experience seeing how we’ve altered the Earth, as well as the process of creating international environmental policy. What prompted you to write a book about your experiences?

    A: Lots of things, but one of the main ones is what I see in teaching. I have taught a class called Science, Politics and Environmental Policy for many years here at MIT. Because my emphasis is always on how we’ve actually fixed problems, students come away from that class feeling hopeful, like they really want to stay engaged with the problem.

    It strikes me that students today have grown up in a very contentious and difficult era in which they feel like nothing ever gets done. But stuff does get done, even now. Looking at how we did things so far really helps you to see how we can do things in the future.

    Q: In the book, you use five different stories as examples of successful environmental policy, and then end by talking about how we can apply these lessons to climate change. Why did you pick these five stories?

    A: I picked some of them because I’m closer to those problems in my own professional experience, like ozone depletion and smog. I did other issues partly because I wanted to show that even in the 21st century, we’ve actually got some stuff done — that’s the story of the Kigali Amendment to the Montreal Protocol, which is a binding international agreement on some greenhouse gases.

    Another chapter is on DDT. One of the reasons I included that is because it had an enormous effect on the birth of the environmental movement in the United States. Plus, that story allows you to see how important the environmental groups can be.

    Lead in gasoline and paint is the other one. I find it a very moving story, because the idea that we were poisoning millions of children and not even realizing it is so very, very sad. But it’s so uplifting that we did figure out the problem, and it happened partly because of the civil rights movement, which made us aware that the problem was striking minority communities much more than non-minority communities.

    Q: What surprised you the most during your research for the book?

    A: One of the things that I didn’t realize, and should have, was the outsized role played by one single senator, Ed Muskie of Maine. He made pollution control his big issue and devoted incredible energy to it. He clearly had the passion and wanted to do it for many years, but until other factors helped him, he couldn’t. That’s where I began to understand the role of public opinion and the way in which policy is only possible when public opinion demands change.

    Another thing about Muskie was the way in which his engagement with these issues demanded that science be strong. When I read what he put into congressional testimony, I realized how highly he valued the science. Science alone is never enough, but it’s always necessary. Over the years, science got a lot stronger, and we developed ways of evaluating what the scientific wisdom across many different studies and many different views actually is. That’s what scientific assessment is all about, and it’s crucial to environmental progress.

    Q: Throughout the book you argue that for environmental action to succeed, three conditions must be met, which you call the three Ps: a threat must be personal, perceptible, and practical. Where did this idea come from?

    A: My observations. You have to perceive the threat: In the case of the ozone hole, you could perceive it because those false-color images of the ozone loss were so easy to understand, and it was personal because few things are scarier than cancer, and a reduced ozone layer leads to too much sun, increasing skin cancers. Science plays a role in communicating what can be readily understood by the public, and that’s important to them perceiving it as a serious problem.

    Nowadays, we certainly perceive the reality of climate change. We also see that it’s personal. People are dying because of heat waves in much larger numbers than they used to; there are horrible problems in the Boston area, for example, with flooding and sea level rise. People perceive the reality of the problem and they feel personally threatened.

    The third P is practical: People have to believe that there are practical solutions. It’s interesting to watch how the battle for hearts and minds has shifted. There was a time when the skeptics would just attack the whole idea that the climate was changing. Eventually, they decided ‘we better accept that because people perceive it, so let’s tell them that it’s not caused by human activity.’ But it’s clear enough now that human activity does play a role. So they’ve moved on to attacking that third P, that somehow it’s not practical to have any kind of solutions. This is progress! So what about that third P?

    What I tried to do in the book is to point out some of the ways in which the problem has also become eminently practical to deal with in the last 10 years, and will continue to move in that direction. We’re right on the cusp of success, and we just have to keep going. People should not give in to eco-despair; that’s the worst thing you could do, because then nothing will happen. If we continue to move at the rate we have, we will certainly get to where we need to be.

    Q: That ties in very nicely with my next question. The book is very optimistic; what gives you hope?

    A: I’m optimistic because I’ve seen so many examples of where we have succeeded, and because I see so many signs of movement right now that are going to push us in the same direction.

    If we had kept conducting business as usual as we were in the year 2000, we’d be looking at 4 degrees of future warming. Right now, I think we’re looking at 3 degrees. I think we can get to 2 degrees. We have to really work on it, and we have to get going seriously in the next decade, but globally right now over 30 percent of our energy is from renewables. That’s fantastic! Let’s just keep going.

    Q: Throughout the book, you show that environmental problems won’t be solved by individual actions alone, but require policy and technology to drive change. What individual actions can people take to help push for those bigger changes?

    A: A big one is choosing to eat more sustainably, and choosing alternative transportation methods like public transportation, or reducing the number of trips that you make. Older people usually have retirement investments; you can shift them over to social-choice funds and away from index funds that end up funding companies that you might not be interested in. You can use your money to put pressure: Amazon has been under a huge amount of pressure to cut down on their plastic packaging, mainly coming from consumers. They’ve just announced they’re not going to use those plastic pillows anymore. I think you can see lots of ways in which people really do matter, and we can matter more.

    Q: What do you hope people take away from the book?

    A: Hope for their future, and resolve to do the best they can by getting engaged with it.

  • Study finds health risks in switching ships from diesel to ammonia fuel

    As container ships the size of city blocks cross the oceans to deliver cargo, their huge diesel engines emit large quantities of air pollutants that drive climate change and have human health impacts. It has been estimated that maritime shipping accounts for almost 3 percent of global carbon dioxide emissions, and the industry’s negative impacts on air quality cause about 100,000 premature deaths each year.

    Decarbonizing shipping to reduce these detrimental effects is a goal of the International Maritime Organization, a U.N. agency that regulates maritime transport. One potential solution is switching the global fleet from fossil fuels to sustainable fuels such as ammonia, which could be nearly carbon-free when considering its production and use.

    But in a new study, an interdisciplinary team of researchers from MIT and elsewhere caution that burning ammonia for maritime fuel could worsen air quality further and lead to devastating public health impacts, unless it is adopted alongside strengthened emissions regulations.

    Ammonia combustion generates nitrous oxide (N2O), a greenhouse gas that is about 300 times more potent than carbon dioxide. It also emits nitrogen in the form of nitrogen oxides (NO and NO2, referred to as NOx), and unburnt ammonia may slip out, which eventually forms fine particulate matter in the atmosphere. These tiny particles can be inhaled deep into the lungs, causing health problems like heart attacks, strokes, and asthma.

    The new study indicates that, under current legislation, switching the global fleet to ammonia fuel could cause up to about 600,000 additional premature deaths each year. However, with stronger regulations and cleaner engine technology, the switch could lead to about 66,000 fewer premature deaths than currently caused by maritime shipping emissions, with far less impact on global warming.

    “Not all climate solutions are created equal. There is almost always some price to pay. We have to take a more holistic approach and consider all the costs and benefits of different climate solutions, rather than just their potential to decarbonize,” says Anthony Wong, a postdoc in the MIT Center for Global Change Science and lead author of the study.

    His co-authors include Noelle Selin, an MIT professor in the Institute for Data, Systems, and Society and the Department of Earth, Atmospheric and Planetary Sciences (EAPS); Sebastian Eastham, a former principal research scientist who is now a senior lecturer at Imperial College London; Christine Mounaïm-Rouselle, a professor at the University of Orléans in France; Yiqi Zhang, a researcher at the Hong Kong University of Science and Technology; and Florian Allroggen, a research scientist in the MIT Department of Aeronautics and Astronautics. The research appears this week in Environmental Research Letters.

    Greener, cleaner ammonia

    Traditionally, ammonia is made by stripping hydrogen from natural gas and then combining it with nitrogen at extremely high temperatures. This process is often associated with a large carbon footprint. The maritime shipping industry is betting on the development of “green ammonia,” which is produced by using renewable energy to make hydrogen via electrolysis and to generate heat.

    “In theory, if you are burning green ammonia in a ship engine, the carbon emissions are almost zero,” Wong says.

    But even the greenest ammonia generates nitrous oxide (N2O) and nitrogen oxides (NOx) when combusted, and some of the ammonia may slip out, unburnt. This nitrous oxide would escape into the atmosphere, where the greenhouse gas would remain for more than 100 years. At the same time, the nitrogen emitted as NOx and ammonia would fall to Earth, damaging fragile ecosystems. As these emissions are digested by bacteria, additional N2O is produced.

    NOx and ammonia also mix with gases in the air to form fine particulate matter. A primary contributor to air pollution, fine particulate matter kills an estimated 4 million people each year.

    “Saying that ammonia is a ‘clean’ fuel is a bit of an overstretch. Just because it is carbon-free doesn’t necessarily mean it is clean and good for public health,” Wong says.

    A multifaceted model

    The researchers wanted to paint the whole picture, capturing the environmental and public health impacts of switching the global fleet to ammonia fuel. To do so, they designed scenarios to measure how pollutant impacts change under certain technology and policy assumptions.

    From a technological point of view, they considered two ship engines. The first burns pure ammonia, which generates higher levels of unburnt ammonia but emits fewer nitrogen oxides. The second engine technology involves mixing ammonia with hydrogen to improve combustion and optimize the performance of a catalytic converter, which controls both nitrogen oxides and unburnt ammonia pollution.

    They also considered three policy scenarios: current regulations, which only limit NOx emissions in some parts of the world; a scenario that adds ammonia emission limits over North America and Western Europe; and a scenario that adds global limits on ammonia and NOx emissions.

    The researchers used a ship track model to calculate how pollutant emissions change under each scenario and then fed the results into an air quality model. The air quality model calculates the impact of ship emissions on particulate matter and ozone pollution. Finally, they estimated the effects on global public health.

    One of the biggest challenges came from a lack of real-world data, since no ammonia-powered ships are yet sailing the seas. Instead, the researchers relied on experimental ammonia combustion data from collaborators to build their model.

    “We had to come up with some clever ways to make that data useful and informative to both the technology and regulatory situations,” he says.

    A range of outcomes

    In the end, they found that with no new regulations and ship engines that burn pure ammonia, switching the entire fleet would cause 681,000 additional premature deaths each year.

    “While a scenario with no new regulations is not very realistic, it serves as a good warning of how dangerous ammonia emissions could be. And unlike NOx, ammonia emissions from shipping are currently unregulated,” Wong says.

    However, even without new regulations, using cleaner engine technology would cut the number of premature deaths down to about 80,000, which is about 20,000 fewer than are currently attributed to maritime shipping emissions. With stronger global regulations and cleaner engine technology, the number of people killed by air pollution from shipping could be reduced by about 66,000.

    “The results of this study show the importance of developing policies alongside new technologies,” Selin says. “There is a potential for ammonia in shipping to be beneficial for both climate and air quality, but that requires that regulations be designed to address the entire range of potential impacts, including both climate and air quality.”

    Ammonia’s air quality impacts would not be felt uniformly across the globe, and addressing them fully would require coordinated strategies across very different contexts. Most premature deaths would occur in East Asia, since air quality regulations are less stringent in this region. Higher levels of existing air pollution cause the formation of more particulate matter from ammonia emissions. In addition, shipping volume over East Asia is far greater than elsewhere on Earth, compounding these negative effects.

    In the future, the researchers want to continue refining their analysis. They hope to use these findings as a starting point to urge the marine industry to share engine data they can use to better evaluate air quality and climate impacts. They also hope to inform policymakers about the importance and urgency of updating shipping emission regulations.

    This research was funded by the MIT Climate and Sustainability Consortium.
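
    One compact way to read the scenario results above is to line them up against the roughly 100,000 premature deaths per year currently attributed to shipping emissions. The changes below are the figures quoted in the article; only the tabulation is added here.

    ```python
    # Change in premature deaths per year relative to today's diesel-powered fleet,
    # using the figures quoted above (current toll: roughly 100,000 per year).
    BASELINE = 100_000

    change_vs_today = {
        "pure-ammonia engines, current regulations": +681_000,
        "ammonia-hydrogen engines with catalyst, current regulations": -20_000,
        "ammonia-hydrogen engines with catalyst, global NOx and NH3 limits": -66_000,
    }

    for scenario, delta in change_vs_today.items():
        print(f"{scenario}: {delta:+,} per year (about {BASELINE + delta:,} in total)")
    ```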

  • Study: Weaker ocean circulation could enhance CO2 buildup in the atmosphere

    As climate change advances, the ocean’s overturning circulation is predicted to weaken substantially. With such a slowdown, scientists estimate the ocean will pull down less carbon dioxide from the atmosphere. However, a slower circulation should also dredge up less carbon from the deep ocean that would otherwise be released back into the atmosphere. On balance, the ocean should maintain its role in reducing carbon emissions from the atmosphere, if at a slower pace.

    However, a new study by an MIT researcher finds that scientists may have to rethink the relationship between the ocean’s circulation and its long-term capacity to store carbon. As the circulation weakens, the ocean could instead release more carbon from the deep ocean into the atmosphere.

    The reason has to do with a previously uncharacterized feedback between the ocean’s available iron, upwelling carbon and nutrients, surface microorganisms, and a little-known class of molecules known generally as “ligands.” When the ocean circulates more slowly, all these players interact in a self-perpetuating cycle that ultimately increases the amount of carbon that the ocean outgases back to the atmosphere.

    “By isolating the impact of this feedback, we see a fundamentally different relationship between ocean circulation and atmospheric carbon levels, with implications for the climate,” says study author Jonathan Lauderdale, a research scientist in MIT’s Department of Earth, Atmospheric, and Planetary Sciences. “What we thought is going on in the ocean is completely overturned.”

    Lauderdale says the findings show that “we can’t count on the ocean to store carbon in the deep ocean in response to future changes in circulation. We must be proactive in cutting emissions now, rather than relying on these natural processes to buy us time to mitigate climate change.”

    His study appears today in the journal Nature Communications.

    Box flow

    In 2020, Lauderdale led a study that explored ocean nutrients, marine organisms, and iron, and how their interactions influence the growth of phytoplankton around the world. Phytoplankton are microscopic, plant-like organisms that live on the ocean surface and consume a diet of carbon and nutrients that upwell from the deep ocean and iron that drifts in from desert dust.

    The more phytoplankton that can grow, the more carbon dioxide they can absorb from the atmosphere via photosynthesis, and this plays a large role in the ocean’s ability to sequester carbon.

    For the 2020 study, the team developed a simple “box” model, representing conditions in different parts of the ocean as general boxes, each with a different balance of nutrients, iron, and ligands — organic molecules that are thought to be byproducts of phytoplankton. The team modeled a general flow between the boxes to represent the ocean’s larger circulation — the way seawater sinks, then is buoyed back up to the surface in different parts of the world.

    This modeling revealed that, even if scientists were to “seed” the oceans with extra iron, that iron wouldn’t have much of an effect on global phytoplankton growth. The reason was due to a limit set by ligands. It turns out that, if left on its own, iron is insoluble in the ocean and therefore unavailable to phytoplankton. Iron only becomes soluble at “useful” levels when linked with ligands, which keep iron in a form that plankton can consume. Lauderdale found that adding iron to one ocean region to consume additional nutrients robs other regions of nutrients that phytoplankton there need to grow. This lowers the production of ligands and the supply of iron back to the original ocean region, limiting the amount of extra carbon that would be taken up from the atmosphere.

    Unexpected switch

    Once the team published their study, Lauderdale worked the box model into a form that he could make publicly accessible, including ocean and atmosphere carbon exchange and extending the boxes to represent more diverse environments, such as conditions similar to the Pacific, the North Atlantic, and the Southern Ocean. In the process, he tested other interactions within the model, including the effect of varying ocean circulation.

    He ran the model with different circulation strengths, expecting to see less atmospheric carbon dioxide with weaker ocean overturning — a relationship that previous studies have supported, dating back to the 1980s. But what he found instead was a clear and opposite trend: The weaker the ocean’s circulation, the more CO2 built up in the atmosphere.

    “I thought there was some mistake,” Lauderdale recalls. “Why were atmospheric carbon levels trending the wrong way?”

    When he checked the model, he found that the parameter describing ocean ligands had been left “on” as a variable. In other words, the model was calculating ligand concentrations as changing from one ocean region to another.

    On a hunch, Lauderdale turned this parameter “off,” which set ligand concentrations as constant in every modeled ocean environment, an assumption that many ocean models typically make. That one change reversed the trend, back to the assumed relationship: A weaker circulation led to reduced atmospheric carbon dioxide. But which trend was closer to the truth?

    Lauderdale looked to the scant available data on ocean ligands to see whether their concentrations were more constant or variable in the actual ocean. He found confirmation in GEOTRACES, an international study that coordinates measurements of trace elements and isotopes across the world’s oceans, which scientists can use to compare concentrations from region to region. Indeed, the molecules’ concentrations varied. If ligand concentrations do change from one region to another, then his surprising new result was likely representative of the real ocean: A weaker circulation leads to more carbon dioxide in the atmosphere.

    “It’s this one weird trick that changed everything,” Lauderdale says. “The ligand switch has revealed this completely different relationship between ocean circulation and atmospheric CO2 that we thought we understood pretty well.”

    Slow cycle

    To see what might explain the overturned trend, Lauderdale analyzed biological activity and carbon, nutrient, iron, and ligand concentrations from the ocean model under different circulation strengths, comparing scenarios where ligands were variable or constant across the various boxes.

    This revealed a new feedback: The weaker the ocean’s circulation, the less carbon and nutrients the ocean pulls up from the deep. Any phytoplankton at the surface would then have fewer resources to grow and would produce fewer byproducts (including ligands) as a result. With fewer ligands available, less iron at the surface would be usable, further reducing the phytoplankton population. There would then be fewer phytoplankton available to absorb carbon dioxide from the atmosphere and consume upwelled carbon from the deep ocean.

    “My work shows that we need to look more carefully at how ocean biology can affect the climate,” Lauderdale points out. “Some climate models predict a 30 percent slowdown in the ocean circulation due to melting ice sheets, particularly around Antarctica. This huge slowdown in overturning circulation could actually be a big problem: In addition to a host of other climate issues, not only would the ocean take up less anthropogenic CO2 from the atmosphere, but that could be amplified by a net outgassing of deep ocean carbon, leading to an unanticipated increase in atmospheric CO2 and unexpected further climate warming.”
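
    The feedback chain described above can be caricatured in a few lines of code. This is a deliberately crude sketch with made-up coefficients, not Lauderdale’s published box model; it is meant only to show why letting ligand concentrations vary with phytoplankton growth can flip the sign of the circulation-CO2 relationship.

    ```python
    def co2_tendency(circulation, variable_ligands=True, n_iter=200):
        """Toy feedback between overturning strength and the biological pump
        (arbitrary units). Weaker circulation -> less upwelled nutrient ->
        less phytoplankton growth -> fewer ligand byproducts -> less usable
        iron -> even less growth -> less CO2 drawn out of the atmosphere."""
        utilization = 1.0  # fraction of upwelled nutrient that phytoplankton use
        for _ in range(n_iter):
            growth = utilization * circulation        # growth needs upwelled nutrients
            ligands = (0.2 + 0.8 * growth) if variable_ligands else 0.4
            iron = ligands                            # iron is usable only when ligand-bound
            utilization = min(1.0, iron)              # iron limits nutrient utilization
        upwelled_carbon = circulation
        outgassing = upwelled_carbon * (1.0 - utilization)  # deep carbon escaping to the air
        uptake = growth                                     # CO2 fixed by phytoplankton
        return outgassing - uptake                          # net push on atmospheric CO2

    for c in (1.0, 0.7, 0.4):
        print(f"circulation {c:.1f}:  variable ligands {co2_tendency(c, True):+.2f}   "
              f"constant ligands {co2_tendency(c, False):+.2f}")
    ```

    With these toy numbers, the constant-ligand column falls as circulation weakens (the textbook expectation), while the variable-ligand column rises, mirroring the reversal described in the study.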

  • Reducing carbon emissions from long-haul trucks

    People around the world rely on trucks to deliver the goods they need, and so-called long-haul trucks play a critical role in those supply chains. In the United States, long-haul trucks moved 71 percent of all freight in 2022. But those long-haul trucks are heavy polluters, especially of the carbon emissions that threaten the global climate. According to U.S. Environmental Protection Agency estimates, in 2022 more than 3 percent of all carbon dioxide (CO2) emissions came from long-haul trucks.

    The problem is that long-haul trucks run almost exclusively on diesel fuel, and burning diesel releases high levels of CO2 and other carbon emissions. Global demand for freight transport is projected to as much as double by 2050, so it’s critical to find another source of energy that will meet the needs of long-haul trucks while also reducing their carbon emissions. And conversion to the new fuel must not be costly. “Trucks are an indispensable part of the modern supply chain, and any increase in the cost of trucking will be felt universally,” notes William H. Green, the Hoyt Hottel Professor in Chemical Engineering and director of the MIT Energy Initiative.

    For the past year, Green and his research team have been seeking a low-cost, cleaner alternative to diesel. Finding a replacement is difficult because diesel meets the needs of the trucking industry so well. For one thing, diesel has a high energy density — that is, energy content per pound of fuel. There’s a legal limit on the total weight of a truck and its contents, so using an energy source with a lower weight allows the truck to carry more payload — an important consideration, given the low profit margin of the freight industry. In addition, diesel fuel is readily available at retail refueling stations across the country — a critical resource for drivers, who may travel 600 miles in a day and sleep in their truck rather than returning to their home depot. Finally, diesel fuel is a liquid, so it’s easy to distribute to refueling stations and then pump into trucks.

    Past studies have examined numerous alternative technology options for powering long-haul trucks, but no clear winner has emerged. Now, Green and his team have evaluated the available options based on consistent and realistic assumptions about the technologies involved and the typical operation of a long-haul truck, and assuming no subsidies to tip the cost balance. Their in-depth analysis of converting long-haul trucks to battery electric — summarized below — found a high cost and negligible emissions gains in the near term. Studies of methanol and other liquid fuels from biomass are ongoing, but already a major concern is whether the world can plant and harvest enough biomass for biofuels without destroying the ecosystem. An analysis of hydrogen — also summarized below — highlights specific challenges with using that clean-burning fuel, which is a gas at normal temperatures.

    Finally, the team identified an approach that could make hydrogen a promising, low-cost option for long-haul trucks. And, says Green, “it’s an option that most people are probably unaware of.” It involves a novel way of using materials that can pick up hydrogen, store it, and then release it when and where it’s needed to serve as a clean-burning fuel.

    Defining the challenge: A realistic drive cycle, plus diesel values to beat

    The MIT researchers believe that the lack of consensus on the best way to clean up long-haul trucking may have a simple explanation: Different analyses are based on different assumptions about the driving behavior of long-haul trucks. Indeed, some of them don’t accurately represent actual long-haul operations. So the first task for the MIT team was to define a representative — and realistic — “drive cycle” for actual long-haul truck operations in the United States. Then the MIT researchers — and researchers elsewhere — can assess potential replacement fuels and engines based on a consistent set of assumptions in modeling and simulation analyses.

    To define the drive cycle for long-haul operations, the MIT team used a systematic approach to analyze many hours of real-world driving data covering 58,000 miles. They examined 10 features and identified three — daily range, vehicle speed, and road grade — that have the greatest impact on energy demand and thus on fuel consumption and carbon emissions. The representative drive cycle that emerged covers a distance of 600 miles, an average vehicle speed of 55 miles per hour, and a road grade ranging from negative 6 percent to positive 6 percent.

    The next step was to generate key values for the performance of the conventional diesel “powertrain,” that is, all the components involved in creating power in the engine and delivering it to the wheels on the ground. Based on their defined drive cycle, the researchers simulated the performance of a conventional diesel truck, generating “benchmarks” for fuel consumption, CO2 emissions, cost, and other performance parameters.

    Now they could perform parallel simulations — based on the same drive-cycle assumptions — of possible replacement fuels and powertrains to see how the cost, carbon emissions, and other performance parameters would compare to the diesel benchmarks.

    The battery electric option

    When considering how to decarbonize long-haul trucks, a natural first thought is battery power. After all, battery electric cars and pickup trucks are proving highly successful. Why not switch to battery electric long-haul trucks? “Again, the literature is very divided, with some studies saying that this is the best idea ever, and other studies saying that this makes no sense,” says Sayandeep Biswas, a graduate student in chemical engineering.

    To assess the battery electric option, the MIT researchers used a physics-based vehicle model plus well-documented estimates for the efficiencies of key components such as the battery pack, generators, motor, and so on. Assuming the previously described drive cycle, they determined operating parameters, including how much power the battery-electric system needs. From there they could calculate the size and weight of the battery required to satisfy the power needs of the battery electric truck.

    The outcome was disheartening. Providing enough energy to travel 600 miles without recharging would require a 2 megawatt-hour battery. “That’s a lot,” notes Kariana Moreno Sader, a graduate student in chemical engineering. “It’s the same as what two U.S. households consume per month on average.” And the weight of such a battery would significantly reduce the amount of payload that could be carried. An empty diesel truck typically weighs 20,000 pounds. With a legal limit of 80,000 pounds, there’s room for 60,000 pounds of payload. The 2 MWh battery would weigh roughly 27,000 pounds — significantly reducing the allowable capacity for carrying payload.

    Accounting for that “payload penalty,” the researchers calculated that roughly four electric trucks would be required to replace every three of today’s diesel-powered trucks. Furthermore, each added truck would require an additional driver. The impact on operating expenses would be significant.

    Analyzing the emissions reductions that might result from shifting to battery electric long-haul trucks also brought disappointing results. One might assume that using electricity would eliminate CO2 emissions. But when the researchers included emissions associated with making that electricity, that wasn’t true.

    “Battery electric trucks are only as clean as the electricity used to charge them,” notes Moreno Sader. Most of the time, drivers of long-haul trucks will be charging from national grids rather than dedicated renewable energy plants. According to Energy Information Administration statistics, fossil fuels make up more than 60 percent of the current U.S. power grid, so electric trucks would still be responsible for significant levels of carbon emissions. Manufacturing batteries for the trucks would generate additional CO2 emissions.

    Building the charging infrastructure would require massive upfront capital investment, as would upgrading the existing grid to reliably meet additional energy demand from the long-haul sector. Accomplishing those changes would be costly and time-consuming, which raises further concern about electrification as a means of decarbonizing long-haul freight.

    In short, switching today’s long-haul diesel trucks to battery electric power would bring major increases in costs for the freight industry and negligible carbon emissions benefits in the near term. Analyses assuming various types of batteries as well as other drive cycles produced comparable results.

    However, the researchers are optimistic about where the grid is going in the future. “In the long term, say by around 2050, emissions from the grid are projected to be less than half what they are now,” says Moreno Sader. “When we do our calculations based on that prediction, we find that emissions from battery electric trucks would be around 40 percent lower than our calculated emissions based on today’s grid.”

    For Moreno Sader, the goal of the MIT research is to help “guide the sector on what would be the best option.” With that goal in mind, she and her colleagues are now examining the battery electric option under different scenarios — for example, assuming battery swapping (a depleted battery isn’t recharged but replaced by a fully charged one), short-haul trucking, and other applications that might produce a more cost-competitive outcome, even for the near term.
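
    The battery sizing above can be checked back-of-the-envelope style. The 600-mile drive cycle, the roughly 2 MWh and 27,000-pound pack, and the 20,000/80,000-pound truck weights are from the article; the per-mile consumption and pack specific energy used to connect them are assumed, illustrative figures.

    ```python
    # Back-of-the-envelope check of the battery-electric numbers quoted above.
    range_miles = 600            # representative drive cycle (from the article)
    kwh_per_mile = 3.3           # assumed energy consumption of a loaded tractor-trailer
    pack_wh_per_lb = 75          # assumed pack-level specific energy

    pack_kwh = range_miles * kwh_per_mile        # ~2,000 kWh, i.e. ~2 MWh
    pack_lb = pack_kwh * 1000 / pack_wh_per_lb   # ~26,000-27,000 lb

    empty_truck_lb = 20_000                      # from the article
    legal_limit_lb = 80_000                      # from the article
    diesel_payload_lb = legal_limit_lb - empty_truck_lb
    battery_payload_lb = diesel_payload_lb - pack_lb

    print(f"battery pack: ~{pack_kwh:,.0f} kWh, ~{pack_lb:,.0f} lb")
    print(f"payload: {diesel_payload_lb:,} lb (diesel) vs ~{battery_payload_lb:,.0f} lb (battery)")
    print(f"payload lost to the pack: ~{pack_lb / diesel_payload_lb:.0%}")
    ```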
A promising option: hydrogen

As the world looks to end its reliance on fossil fuels for all uses, much attention is focusing on hydrogen. Could hydrogen be a good alternative for today’s diesel-burning long-haul trucks?

To find out, the MIT team performed a detailed analysis of the hydrogen option. “We thought that hydrogen would solve a lot of the problems we had with battery electric,” says Biswas. It doesn’t have associated CO2 emissions, and its energy density is far higher, so it doesn’t create the weight problem posed by heavy batteries. In addition, existing compression technology can get enough hydrogen fuel into a regular-sized tank to cover the needed distance and range. “You can actually give drivers the range they want,” he says. “There’s no issue with ‘range anxiety.’”

But while using hydrogen for long-haul trucking would reduce carbon emissions, it would cost far more than diesel. Based on their detailed analysis, the researchers concluded that the main source of the added cost is transporting the hydrogen. Hydrogen can be made in a chemical facility, but it then needs to be distributed to refueling stations across the country. Conventionally, there have been two main ways of transporting hydrogen: as a compressed gas and as a cryogenic liquid. As Biswas notes, the former is “super high pressure,” and the latter is “super cold.” The researchers’ calculations show that as much as 80 percent of the cost of delivered hydrogen is due to transportation and refueling, plus there’s the need to build dedicated refueling stations that can meet new environmental and safety standards for handling hydrogen as a compressed gas or a cryogenic liquid.

Having dismissed the conventional options for shipping hydrogen, the team turned to a less common approach: transporting hydrogen using “liquid organic hydrogen carriers” (LOHCs), special organic (carbon-containing) chemical compounds that can, under certain conditions, absorb hydrogen atoms and, under other conditions, release them.

LOHCs are in use today to deliver small amounts of hydrogen for commercial use. Here’s how the process works: In a chemical plant, the carrier compound is brought into contact with hydrogen in the presence of a catalyst under elevated temperature and pressure, and the compound picks up the hydrogen. The “hydrogen-loaded” compound — still a liquid — is then transported under atmospheric conditions. When the hydrogen is needed, the compound is exposed to a temperature increase and a different catalyst, and the hydrogen is released.

LOHCs thus appear to be ideal hydrogen carriers for long-haul trucking. They’re liquid, so they can easily be delivered to existing refueling stations, where the hydrogen would be released; and they contain at least as much energy per gallon as hydrogen in cryogenic liquid or compressed gas form. However, a detailed analysis of using hydrogen carriers showed that the approach would decrease emissions, but at a considerable cost.

The problem begins with the “dehydrogenation” step at the retail station. Releasing the hydrogen from the chemical carrier requires heat, which is generated by burning some of the hydrogen being carried by the LOHC. The researchers calculate that generating the needed heat takes 36 percent of that hydrogen. (In theory, the process would require only 27 percent — but in practice, that efficiency won’t be achieved.) So out of every 100 units of starting hydrogen, 36 units are now gone.

But that’s not all. The hydrogen that comes out is at near-ambient pressure, so the facility dispensing it will need to compress it — a process that the team calculates would consume another 20 to 30 percent of the starting hydrogen.

Because of the heat and compression requirements, less than half of the starting hydrogen is left to be delivered to the truck — and as a result, the hydrogen fuel becomes twice as expensive. The bottom line is that the technology works, but “when it comes to really beating diesel, the economics don’t work. It’s quite a bit more expensive,” says Biswas. In addition, the refueling stations would require expensive compressors and auxiliary units such as cooling systems. The capital investment and the operating and maintenance costs together imply that the market penetration of hydrogen refueling stations would be slow.
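The hydrogen bookkeeping above can be written down in a few lines. In the sketch below, the dehydrogenation enthalpy (about 65 kilojoules per mole of hydrogen, typical of common carrier compounds) is an assumed value used only to illustrate the theoretical 27 percent figure; the 36 percent heat demand and 20 to 30 percent compression loss are the figures from the MIT analysis quoted above.

```python
# Hydrogen bookkeeping for conventional station-side dehydrogenation of an LOHC.
# The dehydrogenation enthalpy is an assumed, carrier-dependent value for illustration.

H2_LHV_KJ_PER_MOL = 242.0      # lower heating value of hydrogen
DEHYDRO_KJ_PER_MOL = 65.0      # assumed heat of dehydrogenation for a typical carrier

# Ideal case: burn just enough hydrogen to supply the reaction heat.
ideal_fraction = DEHYDRO_KJ_PER_MOL / H2_LHV_KJ_PER_MOL
print(f"Theoretical heat demand: {ideal_fraction:.0%} of the hydrogen")   # ~27%

# Figures from the MIT analysis: heat demand in practice, plus station recompression.
HEAT_FRACTION = 0.36
for compression_fraction in (0.20, 0.30):
    delivered = 1.0 - HEAT_FRACTION - compression_fraction
    print(f"compression loss {compression_fraction:.0%}: "
          f"{delivered:.0%} of the starting hydrogen reaches the truck "
          f"({1 / delivered:.1f} kg consumed per kg delivered)")
```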
A better strategy: onboard release of hydrogen from LOHCs

Given the potential benefits of LOHCs, the researchers focused on how to deal with both the heat needed to release the hydrogen and the energy needed to compress it. “That’s when we had the idea,” says Biswas. “Instead of doing the dehydrogenation [hydrogen release] at the refueling station and then loading the truck with hydrogen, why don’t we just take the LOHC and load that onto the truck?” Like diesel, the LOHC is a liquid, so it’s easily transported and pumped into trucks at existing refueling stations. “We’ll then make hydrogen as it’s needed based on the power demands of the truck — and we can capture waste heat from the engine exhaust and use it to power the dehydrogenation process,” says Biswas.

In their proposed plan, hydrogen-loaded LOHC is created at a chemical “hydrogenation” plant and then delivered to a retail refueling station, where it’s pumped into a long-haul truck. Onboard the truck, the loaded LOHC goes into the fuel-storage tank. From there it moves to the “dehydrogenation unit” — the reactor where heat and a catalyst together promote the chemical reactions that separate the hydrogen from the LOHC. The hydrogen is sent to the powertrain, where it burns, producing energy that propels the truck forward.

Hot exhaust from the powertrain goes to a “heat-integration unit,” where its waste heat energy is captured and returned to the reactor to help drive the reaction that releases hydrogen from the loaded LOHC. The unloaded LOHC is pumped back into the fuel-storage tank, where it’s kept in a separate compartment so it doesn’t mix with the loaded LOHC. From there, it’s pumped back out at the retail refueling station and then transported back to the hydrogenation plant to be loaded with more hydrogen.

Switching to onboard dehydrogenation brings down costs by eliminating the need for extra hydrogen compression and by using waste heat in the engine exhaust to drive the hydrogen-release process. So how does the proposed strategy compare to diesel? Based on a detailed analysis, the researchers determined that their strategy would be 18 percent more expensive than using diesel, while emissions would drop by 71 percent.

But those results need some context. The 18 percent cost premium of using LOHC with onboard hydrogen release is based on the price of diesel fuel in 2020. In the spring of 2023, the price was about 30 percent higher; assuming the 2023 diesel price, the LOHC option is actually cheaper than using diesel.

Both the cost and emissions outcomes are affected by another assumption: the use of “blue hydrogen,” which is hydrogen produced from natural gas with carbon capture and storage. Another option is to assume the use of “green hydrogen,” which is hydrogen produced using electricity generated from renewable sources, such as wind and solar. Green hydrogen is currently much more expensive than blue hydrogen, so assuming it would increase costs dramatically.

If the price of green hydrogen drops in the future, the researchers’ proposed plan could shift to green hydrogen — and then the decline in emissions would no longer be 71 percent but rather close to 100 percent. There would be almost no emissions associated with the proposed plan for using LOHCs with onboard hydrogen release.
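The diesel-price sensitivity described above is simple arithmetic, sketched below. The 18 percent premium and the roughly 30 percent diesel price increase are the figures cited in this article; the absolute 2020 diesel price cancels out of the ratio, so it is left symbolic.

```python
# Sensitivity of the LOHC-vs-diesel cost comparison to the diesel price baseline.
# The 18% premium (at 2020 diesel prices) and ~30% price rise by spring 2023 are the
# figures cited in the article; the absolute 2020 price cancels out of the ratio.

LOHC_VS_2020_DIESEL = 1.18    # per-mile cost of the LOHC option relative to 2020 diesel
DIESEL_2023_VS_2020 = 1.30    # spring-2023 diesel price relative to 2020

relative_cost_2023 = LOHC_VS_2020_DIESEL / DIESEL_2023_VS_2020
print(f"LOHC cost relative to 2023 diesel: {relative_cost_2023:.2f}")
# -> about 0.91, i.e., roughly 9 percent cheaper than diesel at 2023 prices
```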
Comparing the options on cost and emissions

To compare the options, Moreno Sader prepared bar charts showing the per-mile cost of shipping by truck in the United States and the resulting CO2 emissions for each of the fuels and approaches discussed above: diesel fuel, battery electric, hydrogen as a cryogenic liquid or compressed gas, and LOHC with onboard hydrogen release. The LOHC strategy with onboard dehydrogenation looked promising on both the cost and the emissions charts. In addition to such quantitative measures, the researchers believe that their strategy addresses two other, less obvious challenges in finding a less-polluting fuel for long-haul trucks.

First, the introduction of the new fuel and the trucks that use it must not disrupt the current freight-delivery system. “You have to keep the old trucks running while you’re introducing the new ones,” notes Green. “You cannot have even a day when the trucks aren’t running, because it’d be like the end of the economy. Your supermarket shelves would all be empty; your factories wouldn’t be able to run.” The researchers’ plan would be completely compatible with the existing diesel supply infrastructure and would require relatively minor retrofits to today’s long-haul trucks, so the current supply chains would continue to operate while the new fuel and retrofitted trucks are introduced.

Second, the strategy has the potential to be adopted globally. Long-haul trucking is important in other parts of the world, and Moreno Sader thinks that “making this approach a reality is going to have a lot of impact, not only in the United States but also in other countries,” including her own country of origin, Colombia. “This is something I think about all the time.” Because the approach is compatible with the current diesel infrastructure, the only requirement for adoption is to build the chemical hydrogenation plants. “And I think the capital expenditure related to that will be less than the cost of building a new fuel-supply infrastructure throughout the country,” says Moreno Sader.

Testing in the lab

“We’ve done a lot of simulations and calculations to show that this is a great idea,” notes Biswas. “But there’s only so far that math can go to convince people.” The next step is to demonstrate the concept in the lab.

To that end, the researchers are now assembling all the core components of the onboard hydrogen-release reactor as well as the heat-integration unit that’s key to transferring heat from the engine exhaust to the hydrogen-release reactor. They estimate that this spring they’ll be ready to demonstrate their ability to release hydrogen and confirm the rate at which it’s formed. And — guided by their modeling work — they’ll be able to fine-tune critical components for maximum efficiency and best performance.

The next step will be to add an appropriate engine, specially equipped with sensors to provide the critical readings needed to optimize the performance of all the core components working together.
By the end of 2024, the researchers hope to achieve their goal: the first experimental demonstration of a power-dense, robust onboard hydrogen-release system with highly efficient heat integration.

In the meantime, they believe that results from their work to date should help spread the word, bringing their novel approach to the attention of other researchers and experts in the trucking industry who are now searching for ways to decarbonize long-haul trucking.

Financial support for development of the representative drive cycle and the diesel benchmarks, as well as the analysis of the battery electric option, was provided by the MIT Mobility Systems Center of the MIT Energy Initiative. Analysis of LOHC-powered trucks with onboard dehydrogenation was supported by the MIT Climate and Sustainability Consortium. Sayandeep Biswas is supported by a fellowship from the Martin Family Society of Fellows for Sustainability, and Kariana Moreno Sader received fellowship funding from MathWorks through the MIT School of Science.