More stories

  • MIT Maritime Consortium sets sail

    Around 11 billion tons of goods, or about 1.5 tons per person worldwide, are transported by sea each year, representing about 90 percent of global trade by volume. Internationally, the merchant shipping fleet numbers around 110,000 vessels. These ships, and the ports that service them, are significant contributors to the local and global economy — and they’re significant contributors to greenhouse gas emissions.

    A new consortium, formalized in a signing ceremony at MIT last week, aims to address climate-harming emissions in the maritime shipping industry, while supporting efforts for environmentally friendly operation in compliance with the decarbonization goals set by the International Maritime Organization.

    “This is a timely collaboration with key stakeholders from the maritime industry with a very bold and interdisciplinary research agenda that will establish new technologies and evidence-based standards,” says Themis Sapsis, the William Koch Professor of Marine Technology at MIT and the director of MIT’s Center for Ocean Engineering. “It aims to bring the best from MIT in key areas for commercial shipping, such as nuclear technology for commercial settings, autonomous operation and AI methods, improved hydrodynamics and ship design, cybersecurity, and manufacturing.”

    Co-led by Sapsis and Fotini Christia, the Ford International Professor of the Social Sciences; director of the Institute for Data, Systems, and Society (IDSS); and director of the MIT Sociotechnical Systems Research Center, the newly launched MIT Maritime Consortium (MC) brings together MIT collaborators from across campus, including the Center for Ocean Engineering, which is housed in the Department of Mechanical Engineering; IDSS, which is housed in the MIT Schwarzman College of Computing; the departments of Nuclear Science and Engineering and Civil and Environmental Engineering; MIT Sea Grant; and others, with a national and an international community of industry experts.

    The Maritime Consortium’s founding members are the American Bureau of Shipping (ABS), Capital Clean Energy Carriers Corp., and HD Korea Shipbuilding and Offshore Engineering. Innovation members are Foresight-Group, Navios Maritime Partners L.P., Singapore Maritime Institute, and Dorian LPG.

    “The challenges the maritime industry faces are challenges that no individual company or organization can address alone,” says Christia. “The solution involves almost every discipline from the School of Engineering, as well as AI and data-driven algorithms, and policy and regulation — it’s a true MIT problem.”

    Researchers will explore new designs for nuclear systems consistent with the techno-economic needs and constraints of commercial shipping, economic and environmental feasibility of alternative fuels, new data-driven algorithms and rigorous evaluation criteria for autonomous platforms in the maritime space, cyber-physical situational awareness and anomaly detection, as well as 3D printing technologies for onboard manufacturing. Collaborators will also advise on research priorities toward evidence-based standards related to MIT presidential priorities around climate, sustainability, and AI.

    MIT has been a leading center of ship research and design for over a century, and is widely recognized for contributions to hydrodynamics, ship structural mechanics and dynamics, propeller design, and overall ship design, as well as for its unique educational program for U.S. Navy officers, the Naval Construction and Engineering Program. Research today is at the forefront of ocean science and engineering, with significant efforts in fluid mechanics and hydrodynamics, acoustics, offshore mechanics, marine robotics and sensors, and ocean sensing and forecasting. The consortium’s academic home at MIT also opens the door to cross-departmental collaboration across the Institute.

    The MC will launch multiple research projects designed to tackle challenges from a variety of angles, all united by cutting-edge data analysis and computation techniques. Collaborators will research new designs and methods that improve efficiency and reduce greenhouse gas emissions, explore feasibility of alternative fuels, and advance data-driven decision-making, manufacturing and materials, hydrodynamic performance, and cybersecurity.

    “This consortium brings a powerful collection of significant companies that, together, has the potential to be a global shipping shaper in itself,” says Christopher J. Wiernicki SM ’85, chair and chief executive officer of ABS. “The strength and uniqueness of this consortium is the members, which are all world-class organizations and real difference makers. The ability to harness the members’ experience and know-how, along with MIT’s technology reach, creates real jet fuel to drive progress,” Wiernicki says. “As well as researching key barriers, bottlenecks, and knowledge gaps in the emissions challenge, the consortium looks to enable development of the novel technology and policy innovation that will be key. Long term, the consortium hopes to provide the gravity we will need to bend the curve.”

  • Study: Climate change will reduce the number of satellites that can safely orbit in space

    MIT aerospace engineers have found that greenhouse gas emissions are changing the environment of near-Earth space in ways that, over time, will reduce the number of satellites that can sustainably operate there.

    In a study appearing today in Nature Sustainability, the researchers report that carbon dioxide and other greenhouse gases can cause the upper atmosphere to shrink. An atmospheric layer of special interest is the thermosphere, where the International Space Station and most satellites orbit today. When the thermosphere contracts, the decreasing density reduces atmospheric drag — a force that pulls old satellites and other debris down to altitudes where they will encounter air molecules and burn up.

    Less drag therefore means extended lifetimes for space junk, which will litter sought-after regions for decades and increase the potential for collisions in orbit.

    The team carried out simulations of how carbon emissions affect the upper atmosphere and orbital dynamics, in order to estimate the “satellite carrying capacity” of low Earth orbit. These simulations predict that by the year 2100, the carrying capacity of the most popular regions could be reduced by 50 to 66 percent due to the effects of greenhouse gases.

    “Our behavior with greenhouse gases here on Earth over the past 100 years is having an effect on how we operate satellites over the next 100 years,” says study author Richard Linares, associate professor in MIT’s Department of Aeronautics and Astronautics (AeroAstro).

    “The upper atmosphere is in a fragile state as climate change disrupts the status quo,” adds lead author William Parker, a graduate student in AeroAstro. “At the same time, there’s been a massive increase in the number of satellites launched, especially for delivering broadband internet from space. If we don’t manage this activity carefully and work to reduce our emissions, space could become too crowded, leading to more collisions and debris.”

    The study includes co-author Matthew Brown of the University of Birmingham.

    Sky fall

    The thermosphere naturally contracts and expands every 11 years in response to the sun’s regular activity cycle. When the sun’s activity is low, the Earth receives less radiation, and its outermost atmosphere temporarily cools and contracts before expanding again during solar maximum.

    In the 1990s, scientists wondered what response the thermosphere might have to greenhouse gases. Their preliminary modeling showed that, while the gases trap heat in the lower atmosphere, where we experience global warming and weather, the same gases radiate heat at much higher altitudes, effectively cooling the thermosphere. With this cooling, the researchers predicted that the thermosphere should shrink, reducing atmospheric density at high altitudes.

    In the last decade, scientists have been able to measure changes in drag on satellites, which has provided some evidence that the thermosphere is contracting in response to something more than the sun’s natural, 11-year cycle.

    “The sky is quite literally falling — just at a rate that’s on the scale of decades,” Parker says. “And we can see this by how the drag on our satellites is changing.”

    The MIT team wondered how that response will affect the number of satellites that can safely operate in Earth’s orbit. Today, there are over 10,000 satellites drifting through low Earth orbit, which describes the region of space up to 1,200 miles (2,000 kilometers) from Earth’s surface. These satellites deliver essential services, including internet, communications, navigation, weather forecasting, and banking. The satellite population has ballooned in recent years, requiring operators to perform regular collision-avoidance maneuvers to keep safe. Any collisions that do occur can generate debris that remains in orbit for decades or centuries, increasing the chance for follow-on collisions with satellites, both old and new.

    “More satellites have been launched in the last five years than in the preceding 60 years combined,” Parker says. “One of the key things we’re trying to understand is whether the path we’re on today is sustainable.”

    Crowded shells

    In their new study, the researchers simulated different greenhouse gas emissions scenarios over the next century to investigate impacts on atmospheric density and drag. For each “shell,” or altitude range of interest, they then modeled the orbital dynamics and the risk of satellite collisions based on the number of objects within the shell. They used this approach to identify each shell’s “carrying capacity” — a term that is typically used in studies of ecology to describe the number of individuals that an ecosystem can support.

    “We’re taking that carrying capacity idea and translating it to this space sustainability problem, to understand how many satellites low Earth orbit can sustain,” Parker explains.

    The team compared several scenarios: one in which greenhouse gas concentrations remain at their level from the year 2000 and others where emissions change according to the Intergovernmental Panel on Climate Change (IPCC) Shared Socioeconomic Pathways (SSPs). They found that scenarios with continuing increases in emissions would lead to a significantly reduced carrying capacity throughout low Earth orbit.

    In particular, the team estimates that by the end of this century, the number of satellites safely accommodated between the altitudes of 200 and 1,000 kilometers could be reduced by 50 to 66 percent compared with a scenario in which emissions remain at year-2000 levels. If satellite capacity is exceeded, even in a local region, the researchers predict that the region will experience a “runaway instability,” or a cascade of collisions that would create so much debris that satellites could no longer safely operate there.

    Their predictions forecast out to the year 2100, but the team says that certain shells in the atmosphere today are already crowding up with satellites, particularly from recent “megaconstellations” such as SpaceX’s Starlink, which comprises fleets of thousands of small internet satellites.

    “The megaconstellation is a new trend, and we’re showing that because of climate change, we’re going to have a reduced capacity in orbit,” Linares says. “And in local regions, we’re close to approaching this capacity value today.”

    “We rely on the atmosphere to clean up our debris. If the atmosphere is changing, then the debris environment will change too,” Parker adds. “We show the long-term outlook on orbital debris is critically dependent on curbing our greenhouse gas emissions.”

    This research is supported, in part, by the U.S. National Science Foundation, the U.S. Air Force, and the U.K. Natural Environment Research Council.
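    The “carrying capacity” logic above can be illustrated with a deliberately crude scaling argument: for a fixed ballistic coefficient, drag-limited debris lifetime grows roughly in inverse proportion to air density, and the number of satellites a shell can host at a fixed collision risk shrinks as debris lingers longer. The sketch below is a toy model with invented placeholder numbers, not the researchers’ simulation framework, which couples emissions scenarios, upper-atmosphere models, and orbital dynamics.

```python
# Toy illustration (not the MIT model): how a less-dense thermosphere
# stretches debris lifetimes and shrinks a shell's sustainable satellite count.
# All numbers are invented placeholders.

def debris_lifetime(base_lifetime_yr: float, density_ratio: float) -> float:
    """Drag-limited decay time scales roughly inversely with air density
    for a fixed ballistic coefficient (a first-order approximation)."""
    return base_lifetime_yr / density_ratio

def carrying_capacity(base_capacity: int, base_lifetime_yr: float,
                      density_ratio: float) -> float:
    """If steady-state debris grows in proportion to how long fragments
    linger, the satellite count a shell can host at fixed collision risk
    falls in proportion to the longer lifetime."""
    lifetime = debris_lifetime(base_lifetime_yr, density_ratio)
    return base_capacity * base_lifetime_yr / lifetime

BASE_CAPACITY = 10_000   # hypothetical satellites a shell supports today
BASE_LIFETIME = 25.0     # hypothetical years for debris to decay today

for density_ratio in (1.0, 0.8, 0.5, 0.33):  # future density vs. today's
    cap = carrying_capacity(BASE_CAPACITY, BASE_LIFETIME, density_ratio)
    print(f"density at {density_ratio:.0%} of today -> "
          f"~{cap:,.0f} satellites ({1 - cap / BASE_CAPACITY:.0%} reduction)")
```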

  • Study: The ozone hole is healing, thanks to global reduction of CFCs

    A new MIT-led study confirms that the Antarctic ozone layer is healing, as a direct result of global efforts to reduce ozone-depleting substances.

    Scientists including the MIT team have observed signs of ozone recovery in the past. But the new study is the first to show, with high statistical confidence, that this recovery is due primarily to the reduction of ozone-depleting substances, versus other influences such as natural weather variability or increased greenhouse gas emissions to the stratosphere.

    “There’s been a lot of qualitative evidence showing that the Antarctic ozone hole is getting better. This is really the first study that has quantified confidence in the recovery of the ozone hole,” says study author Susan Solomon, the Lee and Geraldine Martin Professor of Environmental Studies and Chemistry. “The conclusion is, with 95 percent confidence, it is recovering. Which is awesome. And it shows we can actually solve environmental problems.”

    The new study appears today in the journal Nature. Graduate student Peidong Wang from the Solomon group in the Department of Earth, Atmospheric and Planetary Sciences (EAPS) is the lead author. His co-authors include Solomon and EAPS Research Scientist Kane Stone, along with collaborators from multiple other institutions.

    Roots of ozone recovery

    Within the Earth’s stratosphere, ozone is a naturally occurring gas that acts as a sort of sunscreen, protecting the planet from the sun’s harmful ultraviolet radiation. In 1985, scientists discovered a “hole” in the ozone layer over Antarctica that opened up during the austral spring, between September and December. This seasonal ozone depletion was suddenly allowing UV rays to filter down to the surface, leading to skin cancer and other adverse health effects.

    In 1986, Solomon, who was then working at the National Oceanic and Atmospheric Administration (NOAA), led expeditions to the Antarctic, where she and her colleagues gathered evidence that quickly confirmed the ozone hole’s cause: chlorofluorocarbons, or CFCs — chemicals that were then used in refrigeration, air conditioning, insulation, and aerosol propellants. When CFCs drift up into the stratosphere, they can break down ozone under certain seasonal conditions.

    The following year, those revelations led to the drafting of the Montreal Protocol — an international treaty that aimed to phase out the production of CFCs and other ozone-depleting substances, in hopes of healing the ozone hole.

    In 2016, Solomon led a study reporting key signs of ozone recovery. The ozone hole seemed to be shrinking with each year, especially in September, the time of year when it opens up. Still, these observations were qualitative. The study showed large uncertainties regarding how much of this recovery was due to concerted efforts to reduce ozone-depleting substances, or if the shrinking ozone hole was a result of other “forcings,” such as year-to-year weather variability from El Niño, La Niña, and the polar vortex.

    “While detecting a statistically significant increase in ozone is relatively straightforward, attributing these changes to specific forcings is more challenging,” says Wang.

    Anthropogenic healing

    In their new study, the MIT team took a quantitative approach to identify the cause of Antarctic ozone recovery. The researchers borrowed a method from the climate change community, known as “fingerprinting,” which was pioneered by Klaus Hasselmann, who was awarded the Nobel Prize in Physics in 2021 for the technique. In the context of climate, fingerprinting refers to a method that isolates the influence of specific climate factors, apart from natural, meteorological noise. Hasselmann applied fingerprinting to identify, confirm, and quantify the anthropogenic fingerprint of climate change.

    Solomon and Wang looked to apply the fingerprinting method to identify another anthropogenic signal: the effect of human reductions in ozone-depleting substances on the recovery of the ozone hole.

    “The atmosphere has really chaotic variability within it,” Solomon says. “What we’re trying to detect is the emerging signal of ozone recovery against that kind of variability, which also occurs in the stratosphere.”

    The researchers started with simulations of the Earth’s atmosphere and generated multiple “parallel worlds,” or simulations of the same global atmosphere, under different starting conditions. For instance, they ran simulations under conditions that assumed no increase in greenhouse gases or ozone-depleting substances. Under these conditions, any changes in ozone should be the result of natural weather variability. They also ran simulations with only increasing greenhouse gases, as well as only decreasing ozone-depleting substances.

    They compared these simulations to observe how ozone in the Antarctic stratosphere changed, both with season, and across different altitudes, in response to different starting conditions. From these simulations, they mapped out the times and altitudes where ozone recovered from month to month, over several decades, and identified a key “fingerprint,” or pattern, of ozone recovery that was specifically due to conditions of declining ozone-depleting substances.

    The team then looked for this fingerprint in actual satellite observations of the Antarctic ozone hole from 2005 to the present day. They found that, over time, the fingerprint that they identified in simulations became clearer and clearer in observations. In 2018, the fingerprint was at its strongest, and the team could say with 95 percent confidence that ozone recovery was due mainly to reductions in ozone-depleting substances.

    “After 15 years of observational records, we see this signal to noise with 95 percent confidence, suggesting there’s only a very small chance that the observed pattern similarity can be explained by variability noise,” Wang says. “This gives us confidence in the fingerprint. It also gives us confidence that we can solve environmental problems. What we can learn from ozone studies is how different countries can swiftly follow these treaties to decrease emissions.”

    If the trend continues, and the fingerprint of ozone recovery grows stronger, Solomon anticipates that soon there will be a year, here and there, when the ozone layer stays entirely intact. And eventually, the ozone hole should stay shut for good.

    “By something like 2035, we might see a year when there’s no ozone hole depletion at all in the Antarctic. And that will be very exciting for me,” she says. “And some of you will see the ozone hole go away completely in your lifetimes. And people did that.”

    This research was supported, in part, by the National Science Foundation and NASA.
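    For readers curious what the “fingerprinting” step looks like in practice, here is a generic, toy-data sketch of the core idea: project observations onto a pattern derived from forced simulations and ask whether the match stands out against unforced variability. The data, dimensions, and thresholds below are synthetic placeholders, not the study’s model output or satellite records.

```python
# Generic fingerprint-detection sketch with synthetic data (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
n_cells = 200  # e.g., a flattened (month x altitude) grid

# Fingerprint: the ozone-response pattern expected from declining
# ozone-depleting substances; here just a random unit vector as a stand-in.
fingerprint = rng.normal(size=n_cells)
fingerprint /= np.linalg.norm(fingerprint)

# Null distribution: project many segments of unforced "control run"
# variability onto the fingerprint to estimate the noise level.
control_segments = rng.normal(size=(1000, n_cells))
noise_sigma = (control_segments @ fingerprint).std()

# "Observations": the fingerprint buried in weather-like noise.
true_amplitude = 2.5
observations = true_amplitude * fingerprint + rng.normal(size=n_cells)

signal = observations @ fingerprint
print(f"projection = {signal:.2f}, noise sigma = {noise_sigma:.2f}, "
      f"S/N = {signal / noise_sigma:.2f}")
# A signal-to-noise ratio above roughly 1.7-2 corresponds to ~95 percent
# confidence in a simple one- or two-sided test; the study reports that the
# observed ozone fingerprint cleared the 95 percent level by 2018.
```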

  • Reducing carbon emissions from residential heating: A pathway forward

    In the race to reduce climate-warming carbon emissions, the buildings sector is falling behind. While carbon dioxide (CO2) emissions in the U.S. electric power sector dropped by 34 percent between 2005 and 2021, emissions in the building sector declined by only 18 percent in that same time period. Moreover, in extremely cold locations, burning natural gas to heat houses can make up a substantial share of the emissions portfolio. Therefore, steps to electrify buildings in general, and residential heating in particular, are essential for decarbonizing the U.S. energy system.

    But that change will increase demand for electricity and decrease demand for natural gas. What will be the net impact of those two changes on carbon emissions and on the cost of decarbonizing? And how will the electric power and natural gas sectors handle the new challenges involved in their long-term planning for future operations and infrastructure investments?

    A new study by MIT researchers with support from the MIT Energy Initiative (MITEI) Future Energy Systems Center unravels the impacts of various levels of electrification of residential space heating on the joint power and natural gas systems. A specially devised modeling framework enabled them to estimate not only the added costs and emissions for the power sector to meet the new demand, but also any changes in costs and emissions that result for the natural gas sector.

    The analyses brought some surprising outcomes. For example, they show that — under certain conditions — switching 80 percent of homes to heating by electricity could cut carbon emissions and at the same time significantly reduce costs over the combined natural gas and electric power sectors relative to the case in which there is only modest switching. That outcome depends on two changes: Consumers must install high-efficiency heat pumps plus take steps to prevent heat losses from their homes, and planners in the power and the natural gas sectors must work together as they make long-term infrastructure and operations decisions. Based on their findings, the researchers stress the need for strong state, regional, and national policies that encourage and support the steps that homeowners and industry planners can take to help decarbonize today’s building sector.

    A two-part modeling approach

    To analyze the impacts of electrification of residential heating on costs and emissions in the combined power and gas sectors, a team of MIT experts in building technology, power systems modeling, optimization techniques, and more developed a two-part modeling framework. Team members included Rahman Khorramfar, a senior postdoc in MITEI and the Laboratory for Information and Decision Systems (LIDS); Morgan Santoni-Colvin SM ’23, a former MITEI graduate research assistant, now an associate at Energy and Environmental Economics, Inc.; Saurabh Amin, a professor in the Department of Civil and Environmental Engineering and principal investigator in LIDS; Audun Botterud, a principal research scientist in LIDS; Leslie Norford, a professor in the Department of Architecture; and Dharik Mallapragada, a former MITEI principal research scientist, now an assistant professor at New York University, who led the project. They describe their new methods and findings in a paper published in the journal Cell Reports Sustainability on Feb. 6.

    The first model in the framework quantifies how various levels of electrification will change end-use demand for electricity and for natural gas, and the impacts of possible energy-saving measures that homeowners can take to help. “To perform that analysis, we built a ‘bottom-up’ model — meaning that it looks at electricity and gas consumption of individual buildings and then aggregates their consumption to get an overall demand for power and for gas,” explains Khorramfar. By assuming a wide range of building “archetypes” — that is, groupings of buildings with similar physical characteristics and properties — coupled with trends in population growth, the team could explore how demand for electricity and for natural gas would change under each of five assumed electrification pathways: “business as usual” with modest electrification, medium electrification (about 60 percent of homes are electrified), high electrification (about 80 percent of homes make the change), and medium and high electrification with “envelope improvements,” such as sealing up heat leaks and adding insulation.

    The second part of the framework consists of a model that takes the demand results from the first model as inputs and “co-optimizes” the overall electricity and natural gas system to minimize annual investment and operating costs while adhering to any constraints, such as limits on emissions or on resource availability. The modeling framework thus enables the researchers to explore the impact of each electrification pathway on the infrastructure and operating costs of the two interacting sectors.

    The New England case study: A challenge for electrification

    As a case study, the researchers chose New England, a region where the weather is sometimes extremely cold and where burning natural gas to heat houses contributes significantly to overall emissions. “Critics will say that electrification is never going to happen [in New England]. It’s just too expensive,” comments Santoni-Colvin. But he notes that most studies focus on the electricity sector in isolation. The new framework considers the joint operation of the two sectors and then quantifies their respective costs and emissions. “We know that electrification will require large investments in the electricity infrastructure,” says Santoni-Colvin. “But what hasn’t been well quantified in the literature is the savings that we generate on the natural gas side by doing that — so, the system-level savings.”

    Using their framework, the MIT team performed model runs aimed at an 80 percent reduction in building-sector emissions relative to 1990 levels — a target consistent with regional policy goals for 2050. The researchers defined parameters including details about building archetypes, the regional electric power system, existing and potential renewable generating systems, battery storage, availability of natural gas, and other key factors describing New England.

    They then performed analyses assuming various scenarios with different mixes of home improvements. While most studies assume typical weather, they instead developed 20 projections of annual weather data based on historical weather patterns and adjusted for the effects of climate change through 2050. They then analyzed their five levels of electrification.

    Relative to business-as-usual projections, results from the framework showed that high electrification of residential heating could more than double the demand for electricity during peak periods and increase overall electricity demand by close to 60 percent. Assuming that building-envelope improvements are deployed in parallel with electrification reduces the magnitude and weather sensitivity of peak loads and creates overall efficiency gains that reduce the combined demand for electricity plus natural gas for home heating by up to 30 percent relative to the present day. Notably, a combination of high electrification and envelope improvements resulted in the lowest average cost for the overall electric power-natural gas system in 2050.

    Lessons learned

    Replacing existing natural gas-burning furnaces and boilers with heat pumps reduces overall energy consumption. Santoni-Colvin calls it “something of an intuitive result” that could be expected because heat pumps are “just that much more efficient than old, fossil fuel-burning systems. But even so, we were surprised by the gains.”

    Other unexpected results include the importance of homeowners making more traditional energy efficiency improvements, such as adding insulation and sealing air leaks — steps supported by recent rebate policies. Those changes are critical to reducing costs that would otherwise be incurred for upgrading the electricity grid to accommodate the increased demand. “You can’t just go wild dropping heat pumps into everybody’s houses if you’re not also considering other ways to reduce peak loads. So it really requires an ‘all of the above’ approach to get to the most cost-effective outcome,” says Santoni-Colvin.

    Testing a range of weather outcomes also provided important insights. Demand for heating fuel is very weather-dependent, yet most studies are based on a limited set of weather data — often a “typical year.” The researchers found that electrification can lead to extended peak electric load events that can last for a few days during cold winters. Accordingly, the researchers conclude that there will be a continuing need for a “firm, dispatchable” source of electricity; that is, a power-generating system that can be relied on to produce power any time it’s needed — unlike solar and wind systems. As examples, they modeled some possible technologies, including power plants fired by a low-carbon fuel or by natural gas equipped with carbon capture equipment. But they point out that there’s no way of knowing what types of firm generators will be available in 2050. It could be a system that’s not yet mature, or perhaps doesn’t even exist today.

    In presenting their findings, the researchers note several caveats. For one thing, their analyses don’t include the estimated cost to homeowners of installing heat pumps. While that cost is widely discussed and debated, that issue is outside the scope of their current project.

    In addition, the study doesn’t specify what happens to existing natural gas pipelines. “Some homes are going to electrify and get off the gas system and not have to pay for it, leaving other homes with increasing rates because the gas system cost now has to be divided among fewer customers,” says Khorramfar. “That will inevitably raise equity questions that need to be addressed by policymakers.”

    Finally, the researchers note that policies are needed to drive residential electrification. Current financial support for installation of heat pumps and steps to make homes more thermally efficient are a good start. But such incentives must be coupled with a new approach to planning energy infrastructure investments. Traditionally, electric power planning and natural gas planning are performed separately. However, to decarbonize residential heating, the two sectors should coordinate when planning future operations and infrastructure needs. Results from the MIT analysis indicate that such cooperation could significantly reduce both emissions and costs for residential heating — a change that would yield a much-needed step toward decarbonizing the buildings sector as a whole.
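    As a rough illustration of the “bottom-up” demand idea described in this story (aggregate archetype-level heating demand, then convert an electrified share of it into heat-pump load), here is a minimal sketch. The archetypes, furnace efficiency, and heat-pump coefficient of performance are invented placeholders, and the actual framework goes much further, co-optimizing power and gas infrastructure over full weather years.

```python
# Minimal bottom-up heating-demand sketch (illustrative placeholders only).
from dataclasses import dataclass

@dataclass
class Archetype:
    count: int               # number of homes of this type
    heat_demand_kwh: float   # useful heat needed in one hour, per home

def hourly_loads(archetypes, electrified_frac, envelope_factor=1.0,
                 furnace_eff=0.92, heat_pump_cop=2.5):
    """Return (electric_kwh, gas_kwh) for one hour, summed over all homes."""
    electric = gas = 0.0
    for a in archetypes:
        heat = a.count * a.heat_demand_kwh * envelope_factor
        electric += electrified_frac * heat / heat_pump_cop       # heat pumps
        gas += (1.0 - electrified_frac) * heat / furnace_eff      # furnaces
    return electric, gas

stock = [Archetype(400_000, 6.0), Archetype(250_000, 9.0)]  # hypothetical region

# Pathways loosely analogous to the study's: modest, high, high + envelope.
for frac, envelope in [(0.1, 1.0), (0.8, 1.0), (0.8, 0.7)]:
    e_kwh, g_kwh = hourly_loads(stock, electrified_frac=frac,
                                envelope_factor=envelope)
    print(f"electrified {frac:.0%}, envelope factor {envelope}: "
          f"electric {e_kwh / 1e3:,.0f} MWh, gas {g_kwh / 1e3:,.0f} MWh")
```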

  • Pivot Bio is using microbial nitrogen to make agriculture more sustainable

    The Haber-Bosch process, which converts atmospheric nitrogen into ammonia fertilizer, revolutionized agriculture and helped feed the world’s growing population, but it also created huge environmental problems. It is one of the most energy-intensive chemical processes in the world, responsible for 1-2 percent of global energy consumption. It also releases nitrous oxide, a potent greenhouse gas that harms the ozone layer. Excess nitrogen also routinely runs off farms into waterways, harming marine life and polluting groundwater.

    In place of synthetic fertilizer, Pivot Bio has engineered nitrogen-producing microbes to make farming more sustainable. The company, which was co-founded by Professor Chris Voigt, Karsten Temme, and Alvin Tamsir, has engineered its microbes to grow on plant roots, where they feed on the root’s sugars and precisely deliver nitrogen in return.

    Pivot’s microbial colonies grow with the plant and produce more nitrogen at exactly the time the plant needs it, minimizing nitrogen runoff.

    “The way we have delivered nutrients to support plant growth historically is fertilizer, but that’s an inefficient way to get all the nutrients you need,” says Temme, Pivot’s chief innovation officer. “We have the ability now to help farmers be more efficient and productive with microbes.”

    Farmers can replace up to 40 pounds per acre of traditional nitrogen with Pivot’s product, which amounts to about a quarter of the total nitrogen needed for a crop like corn.

    Pivot’s products are already being used to grow corn, wheat, barley, oats, and other grains across millions of acres of American farmland, eliminating hundreds of thousands of tons of CO2 equivalent in the process. The company’s impact is even more striking given its unlikely origins, which trace back to one of the most challenging times of Voigt’s career.

    A Pivot from despair

    The beginning of every faculty member’s career can be a sink-or-swim moment, and by Voigt’s own account, he was drowning. As a freshly minted assistant professor at the University of California at San Francisco, Voigt was struggling to stand up his lab, attract funding, and get experiments started.

    Around 2008, Voigt joined a research group out of the University of California at Berkeley that was writing a grant proposal focused on photovoltaic materials. His initial role was minor, but a senior researcher pulled out of the group a week before the proposal had to be submitted, so Voigt stepped up.

    “I said ‘I’ll finish this section in a week,’” Voigt recalls. “It was my big chance.”

    For the proposal, Voigt detailed an ambitious plan to rearrange the genetics of biologic photosynthetic systems to make them more efficient. He barely submitted it in time.

    A few months went by, then the proposal reviews finally came back. Voigt hurried to the meeting with some of the most senior researchers at UC Berkeley to discuss the responses.

    “My part of the proposal got completely slammed,” Voigt says. “There were something like 15 reviews on it — they were longer than the actual grant — and it’s just one after another tearing into my proposal. All the most famous people are in this meeting, future energy secretaries, future leaders of the university, and it was totally embarrassing. After that meeting, I was considering leaving academia.”

    A few discouraging months later, Voigt got a call from Paul Ludden, the dean of the School of Science at UC Berkeley. He wanted to talk.

    “As I walk into Paul’s office, he’s reading my proposal,” Voigt recalls. “He sits me down and says, ‘Everybody’s telling me how terrible this is.’ I’m thinking, ‘Oh my God.’ But then he says, ‘I think there’s something here. Your idea is good, you just picked the wrong system.’”

    Ludden went on to explain to Voigt that he should apply his gene-swapping idea to nitrogen fixation. He even offered to send Voigt a postdoc from his lab, Dehua Zhao, to help. Voigt paired Zhao with Temme, and sure enough, the resulting 2011 paper describing their work was well received by the nitrogen fixation community.

    “Nitrogen fixation has been a holy grail for scientists, agronomists, and farmers for almost a century, ever since somebody discovered the first microbe that can fix nitrogen for legumes like soybeans,” Temme says. “Everybody always said that someday we’ll be able to do this for the cereal crops. The excitement with Pivot was this is the first time that technology became accessible.”

    Voigt had moved to MIT in 2010. When the paper came out, he founded Pivot Bio with Temme and another Berkeley researcher, Alvin Tamsir. Since then, Voigt, who is the Daniel I.C. Wang Professor at MIT and the head of the Department of Biological Engineering, has continued collaborating with Pivot on things like increasing nitrogen production, making strains more stable, and making them inducible by different signals from the plant. Pivot has licensed technology from MIT, and the research has also received support from MIT’s Abdul Latif Jameel Water and Food Systems Lab (J-WAFS).

    Pivot’s first goals were to gain regulatory approval and prove themselves in the marketplace. To gain approval in the U.S., Pivot’s team focused on using DNA from within the same organism rather than bringing in totally new DNA, which simplified the approval process. It also partnered with independent corn seed dealers to get its product to farms. Early deployments occurred in 2019.

    Farmers apply Pivot’s product at planting, either as a liquid that gets sprayed on the soil or as a dry powder that is rehydrated and applied to the seeds as a coating. The microbes live on the surface of the growing root system, eating plant sugars and releasing nitrogen throughout the plant’s life cycle.

    “Today, our microbes colonize just a fraction of the total sugars provided by the plant,” Temme explains. “They’re also sharing ammonia with the plant, and all of those things are just a portion of what’s possible technically. Our team is always trying to figure out how to make those microbes more efficient at getting the energy they need to grow or at fixing nitrogen and sharing it with the crop.”

    In 2023, Pivot started the N-Ovator program to connect companies with growers who practice sustainable farming using Pivot’s microbial nitrogen. Through the program, companies buy nitrogen credits and farmers can get paid by verifying their practices. The program was named one of the Inventions of the Year by Time Magazine last year and has paid out millions of dollars to farmers to date.

    Microbial nitrogen and beyond

    Pivot is currently selling to farmers across the U.S. and working with smallholder farmers in Kenya. It’s also hoping to gain approval for its microbial solution in Brazil and Canada, which it expects to be its next markets.

    “How do we get the economics to make sense for everybody — the farmers, our partners, and the company?” Temme says of Pivot’s mission. “Because this truly can be a deflationary technology that upends the very expensive traditional way of making fertilizer.”

    Pivot’s team is also extending the product to cotton, and Temme says microbes can be a nitrogen source for any type of plant on the planet. Further down the line, the company believes it can help farmers with other nutrients essential to help their crops grow.

    “Now that we’ve established our technology, how can Pivot help farmers overcome all the other limitations they face with crop nutrients to maximize yields?” Temme asks. “That really starts to change the way a farmer thinks about managing the entire acre from a price, productivity, and sustainability perspective.”

  • Puzzling out climate change

    Shreyaa Raghavan’s journey into solving some of the world’s toughest challenges started with a simple love for puzzles. By high school, her knack for problem-solving naturally drew her to computer science. Through her participation in an entrepreneurship and leadership program, she built apps and twice made it to the semifinals of the program’s global competition.

    Her early successes made a computer science career seem like an obvious choice, but Raghavan says a significant competing interest left her torn.

    “Computer science sparks that puzzle- and problem-solving part of my brain,” says Raghavan ’24, an Accenture Fellow and a PhD candidate in MIT’s Institute for Data, Systems, and Society. “But while I always felt like building mobile apps was a fun little hobby, it didn’t feel like I was directly solving societal challenges.”

    Her perspective shifted when, as an MIT undergraduate, Raghavan participated in an Undergraduate Research Opportunity in the Photovoltaic Research Laboratory, now known as the Accelerated Materials Laboratory for Sustainability. There, she discovered how computational techniques like machine learning could optimize materials for solar panels — a direct application of her skills toward mitigating climate change.

    “This lab had a very diverse group of people, some from a computer science background, some from a chemistry background, some who were hardcore engineers. All of them were communicating effectively and working toward one unified goal — building better renewable energy systems,” Raghavan says. “It opened my eyes to the fact that I could use very technical tools that I enjoy building and find fulfillment in that by helping solve major climate challenges.”

    With her sights set on applying machine learning and optimization to energy and climate, Raghavan joined Cathy Wu’s lab when she started her PhD in 2023. The lab focuses on building more sustainable transportation systems, a field that resonated with Raghavan due to its universal impact and its outsized role in climate change — transportation accounts for roughly 30 percent of greenhouse gas emissions.

    “If we were to throw all of the intelligent systems we are exploring into the transportation networks, by how much could we reduce emissions?” she asks, summarizing a core question of her research.

    Wu, an associate professor in the Department of Civil and Environmental Engineering, stresses the value of Raghavan’s work. “Transportation is a critical element of both the economy and climate change, so potential changes to transportation must be carefully studied,” Wu says. “Shreyaa’s research into smart congestion management is important because it takes a data-driven approach to add rigor to the broader research supporting sustainability.”

    Raghavan’s contributions have been recognized with the Accenture Fellowship, a cornerstone of the MIT-Accenture Convergence Initiative for Industry and Technology. As an Accenture Fellow, she is exploring the potential impact of technologies for avoiding stop-and-go traffic and its emissions, using systems such as networked autonomous vehicles and digital speed limits that vary according to traffic conditions — solutions that could advance decarbonization in the transportation sector at relatively low cost and in the near term.

    Raghavan says she appreciates the Accenture Fellowship not only for the support it provides, but also because it demonstrates industry involvement in sustainable transportation solutions.

    “It’s important for the field of transportation, and also energy and climate as a whole, to synergize with all of the different stakeholders,” she says. “I think it’s important for industry to be involved in this issue of incorporating smarter transportation systems to decarbonize transportation.”

    Raghavan has also received a fellowship supporting her research from the U.S. Department of Transportation.

    “I think it’s really exciting that there’s interest from the policy side with the Department of Transportation and from the industry side with Accenture,” she says.

    Raghavan believes that addressing climate change requires collaboration across disciplines. “I think with climate change, no one industry or field is going to solve it on its own. It’s really got to be each field stepping up and trying to make a difference,” she says. “I don’t think there’s any silver-bullet solution to this problem. It’s going to take many different solutions from different people, different angles, different disciplines.”

    With that in mind, Raghavan has been very active in the MIT Energy and Climate Club since joining about three years ago, which, she says, “was a really cool way to meet lots of people who were working toward the same goal, the same climate goals, the same passions, but from completely different angles.”

    This year, Raghavan is on the community and education team, which works to build the community at MIT that is working on climate and energy issues. As part of that work, Raghavan is launching a mentorship program for undergraduates, pairing them with graduate students who help the undergrads develop ideas about how they can work on climate using their unique expertise.

    “I didn’t foresee myself using my computer science skills in energy and climate,” Raghavan says, “so I really want to give other students a clear pathway, or a clear sense of how they can get involved.”

    Raghavan has embraced her area of study even in terms of where she likes to think. “I love working on trains, on buses, on airplanes,” she says. “It’s really fun to be in transit and working on transportation problems.”

    Anticipating a trip to New York to visit a cousin, she holds no dread for the long train trip. “I know I’m going to do some of my best work during those hours,” she says. “Four hours there. Four hours back.”

  • 3 Questions: What the laws of physics tell us about CO2 removal

    Human activities continue to pump billions of tons of carbon dioxide into the atmosphere each year, raising global temperatures and driving extreme weather events. As countries grapple with climate impacts and ways to significantly reduce carbon emissions, there have been various efforts to advance carbon dioxide removal (CDR) technologies that directly remove carbon dioxide from the air and sequester it for long periods of time.

    Unlike carbon capture and storage technologies, which are designed to remove carbon dioxide at point sources such as fossil-fuel plants, CDR aims to remove carbon dioxide molecules that are already circulating in the atmosphere.

    A new report by the American Physical Society and led by an MIT physicist provides an overview of the major experimental CDR approaches and determines their fundamental physical limits. The report focuses on methods that have the biggest potential for removing carbon dioxide, at the scale of gigatons per year, which is the magnitude that would be required to have a climate-stabilizing impact.

    The new report was commissioned by the American Physical Society’s Panel on Public Affairs, and appeared last week in the journal PRX. The report was chaired by MIT professor of physics Washington Taylor, who spoke with MIT News about CDR’s physical limitations and why it’s worth pursuing in tandem with global efforts to reduce carbon emissions.

    Q: What motivated you to look at carbon dioxide removal systems from a physical science perspective?

    A: The number one thing driving climate change is the fact that we’re taking carbon that has been stuck in the ground for 100 million years, and putting it in the atmosphere, and that’s causing warming. In the last few years there’s been a lot of interest both by the government and private entities in finding technologies to directly remove the CO2 from the air.

    How to manage atmospheric carbon is the critical question in dealing with our impact on Earth’s climate. So, it’s very important for us to understand whether we can affect the carbon levels not just by changing our emissions profile but also by directly taking carbon out of the atmosphere. Physics has a lot to say about this because the possibilities are very strongly constrained by thermodynamics, mass issues, and things like that.

    Q: What carbon dioxide removal methods did you evaluate?

    A: They’re all at an early stage. It’s kind of the Wild West out there in terms of the different ways in which companies are proposing to remove carbon from the atmosphere. In this report, we break down CDR processes into two classes: cyclic and once-through.

    Imagine we are in a boat that has a hole in the hull and is rapidly taking on water. Of course, we want to plug the hole as quickly as we can. But even once we have fixed the hole, we need to get the water out so we aren’t in danger of sinking or getting swamped. And this is particularly urgent if we haven’t completely fixed the hole so we still have a slow leak. Now, imagine we have a couple of options for how to get the water out so we don’t sink.

    The first is a sponge that we can use to absorb water, then squeeze out and reuse. That’s a cyclic process in the sense that we have some material that we’re using over and over. There are cyclic CDR processes like chemical “direct air capture” (DAC), which acts basically like a sponge. You set up a big system with fans that blow air past some material that captures carbon dioxide. When the material is saturated, you close off the system and then use energy to essentially squeeze out the carbon and store it in a deep repository. Then you can reuse the material, in a cyclic process.

    The second class of approaches is what we call “once-through.” In the boat analogy, it would be as if you try to soak up the water using rolls of paper towels. You let them saturate and then throw them overboard, and you use each roll once.

    There are once-through CDR approaches, like enhanced rock weathering, that are designed to accelerate a natural process, by which certain rocks, when exposed to air, will absorb carbon from the atmosphere. Worldwide, this natural rock weathering is estimated to remove about 1 gigaton of carbon each year. “Enhanced rock weathering” is a CDR approach where you would dig up a lot of this rock, grind it up really small, to less than the width of a human hair, to get the process to happen much faster. The idea is, you dig up something, spread it out, and absorb CO2 in one go.

    The key difference between these two processes is that the cyclic process is subject to the second law of thermodynamics and there’s an energy constraint. You can set an actual limit from physics, saying any cyclic process is going to take a certain amount of energy, and that cannot be avoided. For example, we find that for cyclic direct-air-capture (DAC) plants, based on second law limits, the absolute minimum amount of energy you would need to capture a gigaton of carbon is comparable to the total yearly electric energy consumption of the state of Virginia. Systems currently under development use at least three to 10 times this much energy on a per ton basis (and capture tens of thousands, not billions, of tons). Such systems also need to move a lot of air; the air that would need to pass through a DAC system to capture a gigaton of CO2 is comparable to the amount of air that passes through all the air cooling systems on the planet.

    On the other hand, if you have a once-through process, you could in some respects avoid the energy constraint, but now you’ve got a materials constraint due to the central laws of chemistry. For once-through processes like enhanced rock weathering, that means that if you want to capture a gigaton of CO2, roughly speaking, you’re going to need a billion tons of rock.

    So, to capture gigatons of carbon through engineered methods requires tremendous amounts of physical material, air movement, and energy. On the other hand, everything we’re doing to put that CO2 in the atmosphere is extensive too, so large-scale emissions reductions face comparable challenges.

    Q: What does the report conclude, in terms of whether and how to remove carbon dioxide from the atmosphere?

    A: Our initial prejudice was, CDR is just going to take so much energy, and there’s no way around that because of the second law of thermodynamics, regardless of the method.

    But as we discussed, there is this nuance about cyclic versus once-through systems. And there are two points of view that we ended up threading a needle between. One is the view that CDR is a silver bullet, and we’ll just do CDR and not worry about emissions — we’ll just suck it all out of the atmosphere. And that’s not the case. It will be really expensive, and will take a lot of energy and materials to do large-scale CDR. But there’s another view, where people say, don’t even think about CDR. Even thinking about CDR will compromise our efforts toward emissions reductions. The report comes down somewhere in the middle, saying that CDR is not a magic bullet, but also not a no-go.

    If we are serious about managing climate change, we will likely want substantial CDR in addition to aggressive emissions reductions. The report concludes that research and development on CDR methods should be selectively and prudently pursued despite the expected cost and energy and material requirements.

    At a policy level, the main message is that we need an economic and policy framework that incentivizes emissions reductions and CDR in a common framework; this would naturally allow the market to optimize climate solutions. Since in many cases it is much easier and cheaper to cut emissions than it will likely ever be to remove atmospheric carbon, clearly understanding the challenges of CDR should help motivate rapid emissions reductions.

    For me, I’m optimistic in the sense that scientifically we understand what it will take to reduce emissions and to use CDR to bring CO2 levels down to a slightly lower level. Now, it’s really a societal and economic problem. I think humanity has the potential to solve these problems. I hope that we can find common ground so that we can take actions as a society that will benefit both humanity and the broader ecosystems on the planet, before we end up having bigger problems than we already have.
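    The second-law energy floor that Taylor mentions can be estimated with a short back-of-the-envelope calculation: the minimum reversible work needed to separate a dilute gas from a mixture. The sketch below is illustrative only and is not the report’s calculation; the 420 ppm concentration and the state-level comparison are assumptions made for the example, and the result is computed per gigaton of CO2.

```python
# Back-of-the-envelope second-law minimum for direct air capture (illustrative).
import math

R = 8.314        # J/(mol*K), gas constant
T = 298.0        # K, assumed ambient temperature
x_co2 = 420e-6   # assumed atmospheric CO2 mole fraction (~420 ppm)

# Ideal minimum work to pull 1 mol of CO2 out of air, keeping only the
# dominant dilute-mixture term of the mixing free energy:
#   w_min ~= R * T * ln(1 / x_co2)   per mole of captured CO2
w_min = R * T * math.log(1.0 / x_co2)          # roughly 1.9e4 J/mol

# Scale to one gigaton (1e12 kg) of CO2.
moles_per_gt = 1e12 / 0.044                    # molar mass of CO2 ~0.044 kg/mol
energy_joules = w_min * moles_per_gt
energy_twh = energy_joules / 3.6e15            # 1 TWh = 3.6e15 J

print(f"minimum work: ~{w_min / 1000:.0f} kJ per mol CO2")
print(f"minimum energy per Gt CO2: ~{energy_twh:.0f} TWh")
# Comes out near 19 kJ/mol and roughly 120 TWh per gigaton of CO2 -- on the
# order of one mid-size U.S. state's annual electricity use, consistent with
# the comparison in the interview. Real DAC plants need several times this,
# since they operate far from the reversible limit (the report cites 3-10x).
```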

  • Seeking climate connections among the oceans’ smallest organisms

    Andrew Babbin tries to pack light for work trips. Along with the travel essentials, though, he also brings a roll each of electrical tape, duct tape, lab tape, a pack of cable ties, and some bungee cords.

    “It’s my MacGyver kit: You never know when you have to rig something on the fly in the field or fix a broken bag,” Babbin says.

    The trips Babbin takes are far out to sea, on month-long cruises, where he works to sample waters off the Pacific coast and out in the open ocean. In remote locations, repair essentials often come in handy, as when Babbin had to zip-tie a wrench to a sampling device to help it sink through an icy Antarctic lake.

    Babbin is an oceanographer and marine biogeochemist who studies marine microbes and the ways in which they control the cycling of nitrogen between the ocean and the atmosphere. This exchange helps maintain healthy ocean ecosystems and supports the ocean’s capacity to store carbon.

    By combining measurements that he takes in the ocean with experiments in his MIT lab, Babbin is working to understand the connections between microbes and ocean nitrogen, which could in turn help scientists identify ways to maintain the ocean’s health and productivity. His work has taken him to many coastal and open-ocean regions around the globe.

    “You really become an oceanographer and an Earth scientist to see the world,” says Babbin, who recently earned tenure as the Cecil and Ida Green Career Development Professor in MIT’s Department of Earth, Atmospheric and Planetary Sciences. “We embrace the diversity of places and cultures on this planet. To see just a small fraction of that is special.”

    A powerful cycle

    The ocean has been a constant presence for Babbin since childhood. His family is from Monmouth County, New Jersey, where he and his twin sister grew up playing along the Jersey shore. When they were teenagers, their parents took the kids on family cruise vacations.

    “I always loved being on the water,” he says. “My favorite parts of any of those cruises were the days at sea, where you were just in the middle of some ocean basin with water all around you.”

    In school, Babbin gravitated to the sciences, and chemistry in particular. After high school, he attended Columbia University, where a visit to the school’s Earth and environmental engineering department catalyzed a realization.

    “For me, it was always this excitement about the water and about chemistry, and it was this pop of, ‘Oh wow, it doesn’t have to be one or the other,’” Babbin says.

    He chose to major in Earth and environmental engineering, with a concentration in water resources and climate risks. After graduating in 2008, Babbin returned to his home state, where he attended Princeton University and set a course for a PhD in geosciences, with a focus on chemical oceanography and environmental microbiology. His advisor, oceanographer Bess Ward, took Babbin on as a member of her research group and invited him on several month-long cruises to various parts of the eastern tropical Pacific.

    “I still remember that first trip,” Babbin recalls. “It was a whirlwind. Everyone else had been to sea a gazillion times and was loading the boat and strapping things down, and I had no idea of anything. And within a few hours, I was doing an experiment as the ship rocked back and forth!”

    Babbin learned to deploy sampling canisters overboard, then haul them back up and analyze the seawater inside for signs of nitrogen — an essential nutrient for all living things on Earth.

    As it turns out, the plants and animals that depend on nitrogen to survive are unable to take it up from the atmosphere themselves. They require a sort of go-between, in the form of microbes that “fix” nitrogen, converting it from nitrogen gas to more digestible forms. In the ocean, this nitrogen fixation is done by highly specialized microbial species, which work to make nitrogen available to phytoplankton — microscopic plant-like organisms that are the foundation of the marine food chain. Phytoplankton are also a main route by which the ocean absorbs carbon dioxide from the atmosphere.

    Microorganisms may also use these biologically available forms of nitrogen for energy under certain conditions, returning nitrogen to the atmosphere. These microbes can also release nitrous oxide as a byproduct, a potent greenhouse gas that can also catalyze ozone loss in the stratosphere.

    Through his graduate work, at sea and in the lab, Babbin became fascinated with the cycling of nitrogen and the role that nitrogen-fixing microbes play in supporting the ocean’s ecosystems and the climate overall. A balance of nitrogen inputs and outputs sustains phytoplankton and maintains the ocean’s ability to soak up carbon dioxide.

    “Some of the really pressing questions in ocean biogeochemistry pertain to this cycling of nitrogen,” Babbin says. “Understanding the ways in which this one element cycles through the ocean, and how it is central to ecosystem health and the planet’s climate, has been really powerful.”

    In the lab and out to sea

    After completing his PhD in 2014, Babbin arrived at MIT as a postdoc in the Department of Civil and Environmental Engineering.

    “My first feeling when I came here was, wow, this really is a nerd’s playground,” Babbin says. “I embraced being part of a culture where we seek to understand the world better, while also doing the things we really want to do.”

    In 2017, he accepted a faculty position in MIT’s Department of Earth, Atmospheric and Planetary Sciences. He set up his laboratory space, painted in his favorite brilliant orange, on the top floor of the Green Building.

    His group uses 3D printers to fabricate microfluidic devices in which they reproduce the conditions of the ocean environment and study microbe metabolism and its effects on marine chemistry. In the field, Babbin has led research expeditions to the Galapagos Islands and parts of the eastern Pacific, where he has collected and analyzed samples of air and water for signs of nitrogen transformations and microbial activity. His new measuring station in the Galapagos is able to infer marine emissions of nitrous oxide across a large swath of the eastern tropical Pacific Ocean. His group has also sailed to southern Cuba, where the researchers studied interactions of microbes in coral reefs.

    Most recently, Babbin traveled to Antarctica, where he set up camp next to frozen lakes and plumbed for samples of pristine ice water that he will analyze for genetic remnants of ancient microbes. Such preserved bacterial DNA could help scientists understand how microbes evolved and influenced the Earth’s climate over billions of years.

    “Microbes are the terraformers,” Babbin notes. “They have been, since life evolved more than 3 billion years ago. We have to think about how they shape the natural world and how they will respond to the Anthropocene as humans monkey with the planet ourselves.”

    Collective action

    Babbin is now charting new research directions. In addition to his work at sea and in the lab, he is venturing into engineering, with a new project to design denitrifying capsules. While nitrogen is an essential nutrient for maintaining a marine ecosystem, too much nitrogen, such as from fertilizer that runs off into lakes and streams, can generate blooms of toxic algae. Babbin is looking to design eco-friendly capsules that scrub excess anthropogenic nitrogen from local waterways.

    He’s also beginning the process of designing a new sensor to measure low-oxygen concentrations in the ocean. As the planet warms, the oceans are losing oxygen, creating “dead zones” where fish cannot survive. While others including Babbin have tried to map these oxygen minimum zones, or OMZs, they have done so sporadically, by dropping sensors into the ocean over limited range, depth, and times. Babbin’s sensors could potentially provide a more complete map of OMZs, as they would be deployed on wide-ranging, deep-diving, and naturally propulsive vehicles: sharks.

    “We want to measure oxygen. Sharks need oxygen. And if you look at where the sharks don’t go, you might have a sense of where the oxygen is not,” says Babbin, who is working with marine biologists on ways to tag sharks with oxygen sensors. “A number of these large pelagic fish move up and down the water column frequently, so you can map the depth to which they dive to, and infer something about the behavior. And my suggestion is, you might also infer something about the ocean’s chemistry.”

    When he reflects on what stimulates new ideas and research directions, Babbin credits working with others, in his own group and across MIT.

    “My best thoughts come from this collective action,” Babbin says. “Particularly because we all have different upbringings and approach things from a different perspective.”

    He’s bringing this collaborative spirit to his new role, as a mission director for MIT’s Climate Project. Along with Jesse Kroll, who is a professor of civil and environmental engineering and of chemical engineering, Babbin co-leads one of the project’s six missions: Restoring the Atmosphere, Protecting the Land and Oceans. Babbin and Kroll are planning a number of workshops across campus that they hope will generate new connections, and spark new ideas, particularly around ways to evaluate the effectiveness of different climate mitigation strategies and better assess the impacts of climate on society.

    “One area we want to promote is thinking of climate science and climate interventions as two sides of the same coin,” Babbin says. “There’s so much action that’s trying to be catalyzed. But we want it to be the best action. Because we really have one shot at doing this. Time is of the essence.”