More stories


    With new heat treatment, 3D-printed metals can withstand extreme conditions

    A new MIT-developed heat treatment transforms the microscopic structure of 3D-printed metals, making the materials stronger and more resilient in extreme thermal environments. The technique could make it possible to 3D print high-performance blades and vanes for power-generating gas turbines and jet engines, which would enable new designs with improved fuel consumption and energy efficiency.

    Today’s gas turbine blades are manufactured through conventional casting processes in which molten metal is poured into complex molds and directionally solidified. These components are made from some of the most heat-resistant metal alloys on Earth, as they are designed to rotate at high speeds in extremely hot gas, extracting work to generate electricity in power plants and thrust in jet engines.

    There is growing interest in manufacturing turbine blades through 3D-printing, which, in addition to its environmental and cost benefits, could allow manufacturers to quickly produce more intricate, energy-efficient blade geometries. But efforts to 3D-print turbine blades have yet to clear a big hurdle: creep.

    In metallurgy, creep refers to a metal’s tendency to permanently deform in the face of persistent mechanical stress and high temperatures. While researchers have explored printing turbine blades, they have found that the printing process produces fine grains on the order of tens to hundreds of microns in size — a microstructure that is especially vulnerable to creep.

    “In practice, this would mean a gas turbine would have a shorter life or less fuel efficiency,” says Zachary Cordero, the Boeing Career Development Professor in Aeronautics and Astronautics at MIT. “These are costly, undesirable outcomes.”

    Cordero and his colleagues found a way to improve the structure of 3D-printed alloys by adding a heat-treating step that transforms the as-printed material’s fine grains into much larger “columnar” grains, a sturdier microstructure that should minimize the material’s creep potential, since the columns are aligned with the axis of greatest stress. The researchers say the method, outlined today in Additive Manufacturing, clears the way for industrial 3D-printing of gas turbine blades.

    “In the near future, we envision gas turbine manufacturers will print their blades and vanes at large-scale additive manufacturing plants, then post-process them using our heat treatment,” Cordero says. “3D-printing will enable new cooling architectures that can improve the thermal efficiency of a turbine, so that it produces the same amount of power while burning less fuel and ultimately emits less carbon dioxide.”

    Cordero’s co-authors on the study are lead author Dominic Peachey, Christopher Carter, and Andres Garcia-Jimenez at MIT, Anugrahaprada Mukundan and Marie-Agathe Charpagne of the University of Illinois at Urbana-Champaign, and Donovan Leonard of Oak Ridge National Laboratory.

    Triggering a transformation

    The team’s new method is a form of directional recrystallization, a heat treatment that passes a material through a hot zone at a precisely controlled speed to meld its many microscopic grains into larger, sturdier, and more uniform crystals.

    Directional recrystallization was invented more than 80 years ago and has been applied to wrought materials. In their new study, the MIT team adapted directional recrystallization for 3D-printed superalloys.

    The team tested the method on 3D-printed nickel-based superalloys — metals that are typically cast and used in gas turbines. In a series of experiments, the researchers placed 3D-printed samples of rod-shaped superalloys in a room-temperature water bath placed just below an induction coil. They slowly drew each rod out of the water and through the coil at various speeds, dramatically heating the rods to temperatures varying between 1,200 and 1,245 degrees Celsius.

    They found that drawing the rods at a particular speed (2.5 millimeters per hour) and at a specific temperature (1,235 degrees Celsius) created a steep thermal gradient that triggered a transformation in the material’s printed, fine-grained microstructure.
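    As a back-of-the-envelope check, the reported draw speed implies long treatment times for blade-scale parts. The sketch below assumes a hypothetical 100-millimeter part length purely for illustration; the article does not specify part dimensions.

```python
# Rough arithmetic on the directional recrystallization draw. Only the
# 2.5 mm/hour draw speed comes from the article; the part length is an
# assumed, illustrative value.

DRAW_SPEED_MM_PER_HR = 2.5  # draw speed reported in the article

def draw_time_hours(part_length_mm: float,
                    speed_mm_per_hr: float = DRAW_SPEED_MM_PER_HR) -> float:
    """Hours for the hot zone to traverse the full part length."""
    return part_length_mm / speed_mm_per_hr

print(draw_time_hours(100.0))  # 40.0 hours for an assumed 100 mm part
```

    At that rate, industrial throughput would hinge on the faster draw rates the team says it is exploring.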

    “The material starts as small grains with defects called dislocations that are like mangled spaghetti,” Cordero explains. “When you heat this material up, those defects can annihilate and reconfigure, and the grains are able to grow. We’re continuously elongating the grains by consuming the defective material and smaller grains — a process termed recrystallization.”

    Creep away

    After cooling the heat-treated rods, the researchers examined their microstructure using optical and electron microscopy, and found that the material’s printed microscopic grains were replaced with “columnar” grains, or long crystal-like regions that were significantly larger than the original grains.

    “We’ve completely transformed the structure,” says lead author Dominic Peachey. “We show we can increase the grain size by orders of magnitude, to massive columnar grains, which theoretically should lead to dramatic improvements in creep properties.”

    The team also showed they could manipulate the draw speed and temperature of the rod samples to tailor the material’s growing grains, creating regions of specific grain size and orientation. This level of control, Cordero says, can enable manufacturers to print turbine blades with site-specific microstructures that are resilient to specific operating conditions.

    Cordero plans to test the heat treatment on 3D-printed geometries that more closely resemble turbine blades. The team is also exploring ways to speed up the draw rate and to test a heat-treated structure’s resistance to creep. The researchers envision that the heat treatment could then enable the practical use of 3D-printing to produce industrial-grade turbine blades with more complex shapes and patterns.

    “New blade and vane geometries will enable more energy-efficient land-based gas turbines, as well as, eventually, aeroengines,” Cordero notes. “This could, from a baseline perspective, lead to lower carbon dioxide emissions, just through improved efficiency of these devices.”

    This research was supported, in part, by the U.S. Office of Naval Research.


    MIT PhD students shed light on important water and food research

    One glance at the news lately will reveal countless headlines on the dire state of global water and food security. Pollution, supply chain disruptions, and the war in Ukraine are all threatening water and food systems, compounding climate change impacts from heat waves, drought, floods, and wildfires.

    Every year, MIT’s Abdul Latif Jameel Water and Food Systems Lab (J-WAFS) offers fellowships to outstanding MIT graduate students who are working on innovative ways to secure water and food supplies in light of these urgent worldwide threats. J-WAFS announced this year’s fellowship recipients last April. Aditya Ghodgaonkar and Devashish Gokhale were awarded Rasikbhai L. Meswani Fellowships for Water Solutions, which are made possible by a generous gift from Elina and Nikhil Meswani and family. James Zhang, Katharina Fransen, and Linzixuan (Rhoda) Zhang were awarded J-WAFS Fellowships for Water and Food Solutions. The J-WAFS Fellowship for Water and Food Solutions is funded in part by J-WAFS Research Affiliate companies: Xylem, Inc., a water technology company, and GoAigua, a company leading the digital transformation of the water industry.

    The five fellows were each awarded a stipend and full tuition for one semester. They also benefit from mentorship, networking connections, and opportunities to showcase their research.

    “This year’s cohort of J-WAFS fellows shows an indefatigable drive to explore, create, and push back boundaries,” says John H. Lienhard, director of J-WAFS. “Their passion and determination to create positive change for humanity are evident in these unique video portraits, which describe their solutions-oriented research in water and food,” Lienhard adds.

    J-WAFS funder Community Jameel recently commissioned video portraits of each student that highlight their work and their inspiration to solve challenges in water and food. More about each J-WAFS fellow and their research follows.


    Katharina Fransen

    In Professor Bradley Olsen’s lab in the Department of Chemical Engineering, Katharina Fransen works to develop biologically based, biodegradable plastics that can be used for food packaging and won’t pollute the environment. Fransen, a third-year PhD student, is motivated by the challenge of protecting the most vulnerable global communities from waste generated by the materials that are essential to connecting them to the global food supply. “We can’t ensure that all of our plastic waste gets recycled or reused, and so we want to make sure that if it does escape into the environment it can degrade, and that’s kind of where a lot of my research really comes in,” says Fransen. Most of her work involves creating polymers, or “really long chains of chemicals,” kind of like the paper rings a lot of us looped into chains as kids, Fransen explains. The polymers are optimized for food packaging applications to keep food fresher for longer, preventing food waste. Fransen says she finds the work “really interesting from the scientific perspective as well as from the idea that [she’s] going to make the world a little better with these new materials.” She adds, “I think it is both really fulfilling and really exciting and engaging.”


    Aditya Ghodgaonkar

    “When I went to Kenya this past spring break, I had an opportunity to meet a lot of farmers and talk to them about what kind of maintenance issues they face,” says Aditya Ghodgaonkar, PhD candidate in the Department of Mechanical Engineering. Ghodgaonkar works with Associate Professor Amos Winter in the Global Engineering and Research (GEAR) Lab, where he designs hydraulic components for drip irrigation systems to make them water-efficient, off-grid, inexpensive, and low-maintenance. On his trip to Kenya, Ghodgaonkar gained firsthand knowledge from farmers about a common problem they encounter: clogging of drip irrigation emitters. He learned that clogging can be an expensive technical challenge to diagnose, mitigate, and resolve. He decided to focus his attention on designing emitters that are resistant to clogging, testing with sand and passive hydrodynamic filtration back in the lab at MIT. “I got into this from an academic standpoint,” says Ghodgaonkar. “It is only once I started working on the emitters, spoke with industrial partners that make these emitters, spoke with farmers, that I really truly appreciated the impact of what we’re doing.”


    Devashish Gokhale

    Devashish Gokhale is a PhD student advised by Professor Patrick Doyle in the Department of Chemical Engineering. Gokhale’s commitment to global water security stems from his childhood in Pune, India, where both flooding and drought can occur depending on the time of year. “I’ve had these experiences where there’s been too much water and also too little water,” he recalls. At MIT, Gokhale is developing cost-effective, sustainable, and reusable materials for water treatment with a focus on the elimination of emerging contaminants and low-concentration pollutants like heavy metals. Specifically, he works on making and optimizing polymeric hydrogel microparticles that can absorb micropollutants. “I know how important it is to do something which is not just scientifically interesting, but something which is impactful in a real way,” says Gokhale. Before starting a research project he asks himself, “are people going to be able to afford this? Is it really going to reach the people who need it the most?” Adding these constraints at the beginning of the research process sometimes makes the problem more difficult to solve, but Gokhale notes that in the end, the solution is much more promising.


    James Zhang

    “We don’t really think much about it, it’s transparent, odorless, we just turn on our sink in many parts of the world and it just flows through,” says James Zhang when talking about water. Yet he notes that “many other parts of the world face water scarcity and this will only get worse due to global climate change.” A PhD student in the Department of Mechanical Engineering, Zhang works in the Nano Engineering Laboratory with Professor Gang Chen. Zhang is working on a technology that uses light-induced evaporation to clean water. He is currently investigating the fundamental properties of how light at different wavelengths interacts with liquids at the surface, particularly with brackish water surfaces. With strong theoretical and experimental components, his research could lead to innovations in desalinating water at high energy efficiencies. Zhang hopes that the technology can one day “produce lots of clean water for communities around the world that currently don’t have access to fresh water,” and create a new appreciation for this common liquid that many of us might not think about on a day-to-day basis.


    Linzixuan (Rhoda) Zhang

    “Around the world there are about 2 billion people currently suffering from micronutrient deficiency because they do not have access to very healthy, very fresh food,” says chemical engineering PhD candidate Linzixuan (Rhoda) Zhang. This fact led Zhang to develop a micronutrient delivery platform that fortifies foods with essential vitamins and nutrients. With her advisors, Professor Robert Langer and Research Scientist Ana Jaklenec, Zhang brings biomedical engineering approaches to global health issues. Zhang says that “one of the most serious problems is vitamin A deficiency, because vitamin A is not very stable.” She goes on to explain that although vitamin A is present in different vegetables, when the vegetables are cooked, vitamin A can easily degrade. Zhang helped develop a group of biodegradable polymers that can stabilize micronutrients under cooking and storage conditions. With this technology, vitamin A, for example, could be encapsulated and effectively stabilized under boiling water. The platform has also shown efficient release in a simulation of the stomach environment. Zhang says it is the “little, tiny steps every day that are pushing us forward to the final impactful product.”


    Advancing the energy transition amidst global crises

    “The past six years have been the warmest on the planet, and our track record on climate change mitigation is drastically short of what it needs to be,” said Robert C. Armstrong, MIT Energy Initiative (MITEI) director and the Chevron Professor of Chemical Engineering, introducing MITEI’s 15th Annual Research Conference.

    At the symposium, participants from academia, industry, and finance acknowledged the deepening difficulties of decarbonizing a world rocked by geopolitical conflicts and suffering from supply chain disruptions, energy insecurity, inflation, and a persistent pandemic. In spite of this grim backdrop, the conference offered evidence of significant progress in the energy transition. Researchers provided glimpses of a low-carbon future, presenting advances in such areas as long-duration energy storage, carbon capture, and renewable technologies.

    In his keynote remarks, Ernest J. Moniz, the Cecil and Ida Green Professor of Physics and Engineering Systems Emeritus, founding director of MITEI, and former U.S. secretary of energy, highlighted “four areas that have materially changed in the last year” that could shake up, and possibly accelerate, efforts to address climate change.

    Extreme weather seems to be propelling the public and policy makers of both U.S. parties toward “convergence … at least in recognition of the challenge,” Moniz said. He perceives a growing consensus that climate goals will require — in diminishing order of certainty — firm (always-on) power to complement renewable energy sources, a fuel (such as hydrogen) flowing alongside electricity, and removal of atmospheric carbon dioxide (CO2).

    Russia’s invasion of Ukraine, with its “weaponization of natural gas” and global energy impacts, underscores the idea that climate, energy security, and geopolitics “are now more or less recognized widely as one conversation.” Moniz pointed as well to new U.S. laws on climate change and infrastructure that will amplify the role of science and technology and “address the drive to technological dominance by China.”

    The rapid transformation of energy systems will require a comprehensive industrial policy, Moniz said. Government and industry must select and rapidly develop low-carbon fuels, firm power sources (possibly including nuclear power), CO2 removal systems, and long-duration energy storage technologies. “We will need to make progress on all fronts literally in this decade to come close to our goals for climate change mitigation,” he concluded.

    Global cooperation?

    Over two days, conference participants delved into many of the issues Moniz raised. In one of the first panels, scholars pondered whether the international community could forge a coordinated climate change response. The United States’ rift with China, especially over technology trade policies, loomed large.

    “Hatred of China is a bipartisan hobby and passion, but a blanket approach isn’t right, even for the sake of national security,” said Yasheng Huang, the Epoch Foundation Professor of Global Economics and Management at the MIT Sloan School of Management. “Although the United States and China working together would have huge effects for both countries, it is politically unpalatable in the short term,” said F. Taylor Fravel, the Arthur and Ruth Sloan Professor of Political Science and director of the MIT Security Studies Program. John E. Parsons, deputy director for research at the MIT Center for Energy and Environmental Policy Research, suggested that the United States should use this moment “to get our own act together … and start doing things,” such as building nuclear power plants in a cost-effective way.

    Debating carbon removal

    Several panels took up the matter of carbon emissions and the most promising technologies for contending with them. Charles Harvey, MIT professor of civil and environmental engineering, and Howard Herzog, a senior research engineer at MITEI, set the stage early, debating whether capturing carbon was essential to reaching net-zero targets.

    “I have no trouble getting to net zero without carbon capture and storage,” said David Keith, the Gordon McKay Professor of Applied Physics at Harvard University, in a subsequent roundtable. Carbon capture seems riskier to Keith than solar geoengineering, which involves injecting reflective sulfur particles into the stratosphere to offset the heat-trapping impact of CO2.

    There are new ways of moving carbon from where it’s a problem to where it’s safer. Kripa K. Varanasi, MIT professor of mechanical engineering, described a process for modulating the pH of ocean water to remove CO2. Timothy Krysiek, managing director for Equinor Ventures, talked about construction of a 900-kilometer pipeline transporting CO2 from northern Germany to a large-scale storage site located in Norwegian waters 3,000 meters below the seabed. “We can use these offshore Norwegian assets as a giant carbon sink for Europe,” he said.

    A startup showcase featured additional approaches to the carbon challenge. Mantel, which received MITEI Seed Fund money, is developing molten salt material to capture carbon for long-term storage or for use in generating electricity. Verdox has come up with an electrochemical process for capturing dilute CO2 from the atmosphere.

    But while much of the global warming discussion focuses on CO2, other greenhouse gases are menacing. Another panel discussed measuring and mitigating these pollutants. “Methane has 82 times more warming power than CO2 from the point of emission,” said Desirée L. Plata, MIT associate professor of civil and environmental engineering. “Cutting methane is the strongest lever we have to slow climate change in the next 25 years — really the only lever.”

    Steven Hamburg, chief scientist and senior vice president of the Environmental Defense Fund, cautioned that emission of hydrogen molecules into the atmosphere can cause increases in other greenhouse gases such as methane, ozone, and water vapor. As researchers and industry turn to hydrogen as a fuel or as a feedstock for commercial processes, “we will need to minimize leakage … or risk increasing warming,” he said.

    Supply chains, markets, and new energy ventures

    In panels on energy storage and the clean energy supply chain, there were interesting discussions of challenges ahead. High-density energy materials such as lithium, cobalt, nickel, copper, and vanadium for grid-scale energy storage, electric vehicles (EVs), and other clean energy technologies can be difficult to source. “These often come from water-stressed regions, and we need to be super thoughtful about environmental stresses,” said Elsa Olivetti, the Esther and Harold E. Edgerton Associate Professor in Materials Science and Engineering. She also noted that in light of the explosive growth in demand for metals such as lithium, recycling EVs won’t be of much help. “The amount of material coming back from end-of-life batteries is minor,” she said, until EVs are much further along in their adoption cycle.

    Arvind Sanger, founder and managing partner of Geosphere Capital, said that the United States should be developing its own rare earths and minerals, although gaining the know-how will take time, and overcoming “NIMBYism” (not in my backyard-ism) is a challenge. Sanger emphasized that we must continue to use “denser sources of energy” to catalyze the energy transition over the next decade. In particular, Sanger noted that “for every transition technology, steel is needed,” and steel is made in furnaces that use coal and natural gas. “It’s completely woolly-headed to think we can just go to a zero-fossil fuel future in a hurry,” he said.

    The topic of power markets occupied another panel, which focused on ways to ensure the distribution of reliable and affordable zero-carbon energy. Integrating intermittent resources such as wind and solar into the grid requires a suite of retail markets and new digital tools, said Anuradha Annaswamy, director of MIT’s Active-Adaptive Control Laboratory. Tim Schittekatte, a postdoc at the MIT Sloan School of Management, proposed auctions as a way of insuring consumers against periods of high market costs.

    Another panel described the very different investment needs of new energy startups, such as longer research and development phases. Hooisweng Ow, technology principal at Eni Next LLC Ventures, which is developing drilling technology for geothermal energy, recommends joint development and partnerships to reduce risk. Michael Kearney SM ’11, PhD ’19, SM ’19 is a partner at The Engine, a venture firm built by MIT that invests in path-breaking technology to solve the toughest challenges in climate and other areas. Kearney believes the emergence of new technologies and markets will bring on “a labor transition on an order of magnitude never seen before in this country.” “Workforce development is not a natural zone for startups … and this will have to change,” he said.

    Supporting the global South

    The opportunities and challenges of the energy transition look quite different in the developing world. In conversation with Robert Armstrong, Luhut Binsar Pandjaitan, the coordinating minister for maritime affairs and investment of the Republic of Indonesia, reported that his “nation is rich with solar, wind, and energy transition minerals like nickel and copper,” but that it cannot on its own develop renewable energy, reduce carbon emissions, and improve grid infrastructure. “Education is a top priority, and we are very far behind in high technologies,” he said. “We need help and support from MIT to achieve our target.”

    Technologies that could springboard Indonesia and other nations of the global South toward their climate goals are emerging in MITEI-supported projects and at young companies MITEI helped spawn. Among the promising innovations unveiled at the conference are new materials and designs for cooling buildings in hot climates and reducing the environmental costs of construction, and a sponge-like substance that passively sucks moisture out of the air to lower the energy required for running air conditioners in humid climates.

    Other ideas on the move from lab to market have great potential for industrialized nations as well, such as a computational framework for maximizing the energy output of ocean-based wind farms; a process for using ammonia as a renewable fuel with no CO2 emissions; long-duration energy storage derived from the oxidation of iron; and a laser-based method for unlocking geothermal steam to drive power plants.


    Ocean microbes get their diet through a surprising mix of sources, study finds

    One of the smallest and mightiest organisms on the planet is a plant-like bacterium known to marine biologists as Prochlorococcus. The green-tinted microbe measures less than a micron across, and its populations suffuse the upper layers of the ocean, where a single teaspoon of seawater can hold millions of the tiny organisms.

    Prochlorococcus grows through photosynthesis, using sunlight to convert the atmosphere’s carbon dioxide into organic carbon molecules. The microbe is responsible for 5 percent of the world’s photosynthetic activity, and scientists have assumed that photosynthesis is the microbe’s go-to strategy for acquiring the carbon it needs to grow.

    But a new MIT study in Nature Microbiology today has found that Prochlorococcus relies on another carbon-feeding strategy more than previously thought.

    Organisms that use a mix of strategies to acquire carbon are known as mixotrophs. Most marine plankton are mixotrophs. And while Prochlorococcus is known to occasionally dabble in mixotrophy, scientists have assumed the microbe primarily lives a phototrophic lifestyle.

    The new MIT study shows that in fact, Prochlorococcus may be more of a mixotroph than it lets on. The microbe may get as much as one-third of its carbon through a second strategy: consuming the dissolved remains of other dead microbes.

    The new estimate may have implications for climate models, as the microbe is a significant force in capturing and “fixing” carbon in the Earth’s atmosphere and ocean.

    “If we wish to predict what will happen to carbon fixation in a different climate, or predict where Prochlorococcus will or will not live in the future, we probably won’t get it right if we’re missing a process that accounts for one-third of the population’s carbon supply,” says Mick Follows, a professor in MIT’s Department of Earth, Atmospheric and Planetary Sciences (EAPS), and its Department of Civil and Environmental Engineering.

    The study’s co-authors include first author and MIT postdoc Zhen Wu, along with collaborators from the University of Haifa, the Leibniz-Institute for Baltic Sea Research, the Leibniz-Institute of Freshwater Ecology and Inland Fisheries, and Potsdam University.

    Persistent plankton

    Since Prochlorococcus was first discovered in the Sargasso Sea in 1986, by MIT Institute Professor Sallie “Penny” Chisholm and others, the microbe has been observed throughout the world’s oceans, inhabiting the upper sunlit layers ranging from the surface down to about 160 meters. Within this range, light levels vary, and the microbe has evolved a number of ways to photosynthesize carbon in even low-lit regions.

    The organism has also evolved ways to consume organic compounds including glucose and certain amino acids, which could help the microbe survive for limited periods of time in dark ocean regions. But surviving on organic compounds alone is a bit like only eating junk food, and there is evidence that Prochlorococcus will die after a week in regions where photosynthesis is not an option.

    And yet, researchers including Daniel Sher of the University of Haifa, who is a co-author of the new study, have observed healthy populations of Prochlorococcus that persist deep in the sunlit zone, where the light intensity should be too low to maintain a population. This suggests that the microbes must be switching to a non-photosynthesizing, mixotrophic lifestyle in order to consume other organic sources of carbon.

    “It seems that at least some Prochlorococcus are using existing organic carbon in a mixotrophic way,” Follows says. “That stimulated the question: How much?”

    What light cannot explain

    In their new paper, Follows, Wu, Sher, and their colleagues looked to quantify the amount of carbon that Prochlorococcus is consuming through processes other than photosynthesis.

    The team looked first to measurements taken by Sher’s team, which previously took ocean samples at various depths in the Mediterranean Sea and measured the concentration of phytoplankton, including Prochlorococcus, along with the associated intensity of light and the concentration of nitrogen — an essential nutrient that is richly available in deeper layers of the ocean and that plankton can assimilate to make proteins.

    Wu and Follows used this data, and similar information from the Pacific Ocean, along with previous work from Chisholm’s lab, which established the rate of photosynthesis that Prochlorococcus could carry out in a given intensity of light.

    “We converted that light intensity profile into a potential growth rate — how fast the population of Prochlorococcus could grow if it was acquiring all its carbon by photosynthesis, and light is the limiting factor,” Follows explains.

    The team then compared this calculated rate to growth rates that were previously observed in the Pacific Ocean by several other research teams.

    “This data showed that, below a certain depth, there’s a lot of growth happening that photosynthesis simply cannot explain,” Follows says. “Some other process must be at work to make up the difference in carbon supply.”

    The researchers inferred that, in deeper, darker regions of the ocean, Prochlorococcus populations are able to survive and thrive by resorting to mixotrophy, including consuming organic carbon from detritus. Specifically, the microbe may be carrying out osmotrophy — a process by which an organism passively absorbs organic carbon molecules via osmosis.

    Judging by how fast the microbe is estimated to be growing below the sunlit zone, the team calculates that Prochlorococcus obtains up to one-third of its carbon diet through mixotrophic strategies.
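    The inference behind that number can be written as a simple carbon budget: if observed growth outpaces what light-limited photosynthesis can support, the shortfall must come from another carbon source. A minimal sketch, using illustrative numbers rather than the study’s data:

```python
def mixotrophic_fraction(mu_observed: float, mu_photo: float) -> float:
    """Fraction of the carbon supply that photosynthesis cannot explain,
    assuming carbon supply scales linearly with growth rate (an
    illustrative simplification, not the study's full model)."""
    if mu_observed <= 0:
        raise ValueError("observed growth rate must be positive")
    shortfall = max(mu_observed - mu_photo, 0.0)
    return shortfall / mu_observed

# Hypothetical rates (per day) for a population below the well-lit zone:
# if growth is observed at 0.30/day but light supports only 0.20/day,
# one-third of the carbon must come from mixotrophy.
print(round(mixotrophic_fraction(0.30, 0.20), 3))  # 0.333
```

    The actual study works from measured light profiles and lab-calibrated photosynthesis rates, but the logic of the comparison is the same.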

    “It’s kind of like going from a specialist to a generalist lifestyle,” Follows says. “If I only eat pizza, then if I’m 20 miles from a pizza place, I’m in trouble, whereas if I eat burgers as well, I could go to the nearby McDonald’s. People had thought of Prochlorococcus as a specialist, where they do this one thing (photosynthesis) really well. But it turns out they may have more of a generalist lifestyle than we previously thought.”

    Chisholm, who has both literally and figuratively written the book on Prochlorococcus, says the group’s findings “expand the range of conditions under which their populations can not only survive, but also thrive. This study changes the way we think about the role of Prochlorococcus in the microbial food web.”

    This research was supported, in part, by the Israel Science Foundation, the U.S. National Science Foundation, and the Simons Foundation.

  • in

    Methane research takes on new urgency at MIT

    One of the most notable climate change provisions in the 2022 Inflation Reduction Act is the first U.S. federal tax on a greenhouse gas (GHG). That the fee targets methane (CH4), rather than carbon dioxide (CO2), emissions is indicative of the urgency the scientific community has placed on reducing this short-lived but powerful gas. Methane persists in the air about 12 years — compared to more than 1,000 years for CO2 — yet it immediately causes about 120 times more warming upon release. The gas is responsible for at least a quarter of today’s gross warming. 
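As a rough illustration of why methane's warming influence is front-loaded, the decay of a methane pulse can be modeled with its roughly 12-year e-folding lifetime; this is a toy model, not an official greenhouse-gas metric.

```python
import math

# Toy model (not an official GWP calculation): a pulse of methane decays
# with an e-folding lifetime of ~12 years, so most of its warming happens
# soon after release, while CO2 persists for centuries.

METHANE_LIFETIME_YEARS = 12.0

def fraction_remaining(years):
    """Fraction of a methane pulse still in the atmosphere after `years`."""
    return math.exp(-years / METHANE_LIFETIME_YEARS)

for t in (10, 20, 50, 100):
    print(f"after {t:3d} years: {fraction_remaining(t):.1%} of the pulse remains")
```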

    “Methane has a disproportionate effect on near-term warming,” says Desiree Plata, the director of MIT Methane Network. “CH4 does more damage than CO2 no matter how long you run the clock. By removing methane, we could potentially avoid critical climate tipping points.” 

    Because GHGs have a runaway effect on climate, reductions made now will have a far greater impact than the same reductions made in the future. Cutting methane emissions will slow the thawing of permafrost, which could otherwise lead to massive methane releases, as well as reduce increasing emissions from wetlands.  

    “The goal of MIT Methane Network is to reduce methane emissions by 45 percent by 2030, which would save up to 0.5 degree C of warming by 2100,” says Plata, an associate professor of civil and environmental engineering at MIT and director of the Plata Lab. “When you consider that governments are trying for a 1.5-degree reduction of all GHGs by 2100, this is a big deal.” 

    At normal concentrations, methane, like CO2, poses no direct health risks. Yet methane contributes to the formation of high levels of ozone, which in the lower atmosphere is a key component of air pollution, leading to “higher rates of asthma and increased emergency room visits,” says Plata. 

    Methane-related projects at the Plata Lab include a filter made of zeolite — the same clay-like material used in cat litter — designed to convert methane into CO2 at dairy farms and coal mines. At first glance, the technology would appear to be a bit of a hard sell, since it converts one GHG into another. Yet the zeolite filter’s low carbon and dollar costs, combined with the disproportionate warming impact of methane, make it a potential game-changer.

    The sense of urgency about methane has been amplified by recent studies that show humans are generating far more methane emissions than previously estimated, and that the rates are rising rapidly. Exactly how much methane is in the air is uncertain. Current methods for measuring atmospheric methane, such as ground, drone, and satellite sensors, “are not readily abundant and do not always agree with each other,” says Plata.  

    The Plata Lab is collaborating with Tim Swager in the MIT Department of Chemistry to develop low-cost methane sensors. “We are developing chemiresistive sensors that cost about a dollar that you could place near energy infrastructure to back-calculate where leaks are coming from,” says Plata.  
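In the simplest possible form, back-calculating a leak's origin from several cheap sensors could look something like the weighted-centroid sketch below; the approach and readings are purely hypothetical stand-ins for a real atmospheric-dispersion inversion.

```python
# Hypothetical sketch of "back-calculating where leaks are coming from":
# with several low-cost sensors at known positions, a crude first guess at
# the source is a concentration-weighted centroid of the sensor locations.
# A real system would invert a dispersion model; this only illustrates
# the idea.

def estimate_leak_location(sensors):
    """sensors: list of ((x, y), concentration) readings."""
    total = sum(c for _, c in sensors)
    x = sum(pos[0] * c for pos, c in sensors) / total
    y = sum(pos[1] * c for pos, c in sensors) / total
    return (x, y)

readings = [((0, 0), 1.0), ((10, 0), 4.0), ((0, 10), 1.0)]
print(estimate_leak_location(readings))  # pulled toward the strongest reading
```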

    The researchers are working on improving the accuracy of the sensors using machine learning techniques and are planning to integrate internet-of-things technology to transmit alerts. Plata and Swager are not alone in focusing on data collection: the Inflation Reduction Act adds significant funding for methane sensor research. 

    Other research at the Plata Lab includes the development of nanomaterials and heterogeneous catalysis techniques for environmental applications. The lab also explores mitigation solutions for industrial waste, particularly those related to the energy transition. Plata is the co-founder of a lithium-ion battery recycling startup called Nth Cycle. 

    On a more fundamental level, the Plata Lab is exploring how to develop products with environmental and social sustainability in mind. “Our overarching mission is to change the way that we invent materials and processes so that environmental objectives are incorporated along with traditional performance and cost metrics,” says Plata. “It is important to do that rigorous assessment early in the design process.”

    MIT amps up methane research 

    The MIT Methane Network brings together 26 researchers from MIT along with representatives of other institutions “that are dedicated to the idea that we can reduce methane levels in our lifetime,” says Plata. The organization supports research such as Plata’s zeolite and sensor projects, as well as designing pipeline-fixing robots, developing methane-based fuels for clean hydrogen, and researching the capture and conversion of methane into liquid chemical precursors for pharmaceuticals and plastics. Other members are researching policies to encourage more sustainable agriculture and land use, as well as methane-related social justice initiatives. 

    “Methane is an especially difficult problem because it comes from all over the place,” says Plata. A recent Global Carbon Project study estimated that half of methane emissions are caused by humans. This is led by waste and agriculture (28 percent), including cow and sheep belching, rice paddies, and landfills.  

    Fossil fuels represent 18 percent of the total budget. Of this, about 63 percent is derived from oil and gas production and pipelines, 33 percent from coal mining activities, and 5 percent from industry and transportation. Human-caused biomass burning, primarily from slash-and-burn agriculture, emits about 4 percent of the global total.  

    The other half of the methane budget includes natural methane emissions from wetlands (20 percent) and other natural sources (30 percent). The latter includes permafrost melting and natural biomass burning, such as forest fires started by lightning.  
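The budget shares quoted over the last few paragraphs can be gathered for a quick consistency check (values are the article's percentages of total global emissions):

```python
# Methane budget shares as quoted in this article (percent of total).
methane_budget = {
    "waste and agriculture": 28,
    "fossil fuels": 18,
    "human-caused biomass burning": 4,
    "wetlands (natural)": 20,
    "other natural sources": 30,
}

anthropogenic = 28 + 18 + 4   # roughly half the budget
natural = 20 + 30
print(f"anthropogenic: {anthropogenic}%  natural: {natural}%  "
      f"total: {anthropogenic + natural}%")
```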

    With increases in global warming and population, the line between anthropogenic and natural causes is getting fuzzier. “Human activities are accelerating natural emissions,” says Plata. “Climate change increases the release of methane from wetlands and permafrost and leads to larger forest and peat fires.”  

    The calculations can get complicated. For example, wetlands provide benefits from CO2 capture, biological diversity, and sea level rise resiliency that more than compensate for methane releases. Meanwhile, draining swamps for development increases emissions. 

    Over 100 nations have signed onto the U.N.’s Global Methane Pledge, committing to cut anthropogenic methane emissions by at least 30 percent within the next 10 years. A U.N. assessment estimates that this goal can be achieved using proven technologies and that about 60 percent of these reductions can be accomplished at low cost. 

    Much of the savings would come from greater efficiencies in fossil fuel extraction, processing, and delivery. The methane fees in the Inflation Reduction Act are primarily focused on encouraging fossil fuel companies to accelerate ongoing efforts to cap old wells, flare off excess emissions, and tighten pipeline connections.  

    Fossil fuel companies have already made far stronger pledges to reduce methane emissions than to reduce CO2, which is central to their business. This is due, in part, to the potential cost savings, as well as preparation for methane regulations expected from the Environmental Protection Agency in late 2022. The regulations build upon existing EPA oversight of drilling operations, and will likely be exempt from the U.S. Supreme Court’s ruling that limits the federal government’s ability to regulate GHGs. 

    Zeolite filter targets methane in dairy and coal 

    The “low-hanging fruit” of gas stream mitigation addresses most of the 20 percent of total methane emissions in which the gas is released in sufficiently high concentrations for flaring. Plata’s zeolite filter aims to address the thornier challenge of reducing the 80 percent of non-flammable dilute emissions. 

    Plata found inspiration in decades-old catalysis research for turning methane into methanol. One strategy has been to use an abundant, low-cost aluminosilicate clay called zeolite.  

    “The methanol creation process is challenging because you need to separate a liquid, and it has very low efficiency,” says Plata. “Yet zeolite can be very efficient at converting methane into CO2, and it is much easier because it does not require liquid separation. Converting methane to CO2 sounds like a bad thing, but there is a major anti-warming benefit. And because methane is much more dilute than CO2, the relative CO2 contribution is minuscule.”  

    Using zeolite to create methanol requires highly concentrated methane, high temperatures and pressures, and industrial processing conditions. Yet Plata’s process, which dopes the zeolite with copper, operates in the presence of oxygen at much lower temperatures under typical pressures. “We let the methane proceed the way it wants from a thermodynamic perspective from methane to methanol down to CO2,” says Plata. 

    Researchers around the world are working on other dilute methane removal technologies. Projects include spraying iron salt aerosols into sea air where they react with natural chlorine or bromine radicals, thereby capturing methane. Most of these geoengineering solutions, however, are difficult to measure and would require massive scale to make a difference.  

    Plata is focusing her zeolite filters on environments where concentrations are high, but not so high as to be flammable. “We are trying to scale zeolite into filters that you could snap onto the side of a cross-ventilation fan in a dairy barn or in a ventilation air shaft in a coal mine,” says Plata. “For every packet of air we bring in, we take a lot of methane out, so we get more bang for our buck.”  

    The major challenge is creating a filter that can handle high flow rates without getting clogged or falling apart. Dairy barn air handlers can push air at up to 5,000 cubic feet per minute and coal mine handlers can approach 500,000 CFM. 
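To get a feel for those flow rates, here is a back-of-envelope calculation of the methane mass a filter would see; the 0.5 percent concentration and standard-condition gas density are assumed values, not figures from the lab.

```python
# Back-of-envelope sketch of what those flow rates mean for the filter.
# All figures are assumptions for illustration, not the lab's numbers.

CFM_TO_M3_PER_S = 0.000471947  # 1 cubic foot per minute in m^3/s
METHANE_DENSITY = 0.657        # kg/m^3 near room temperature, 1 atm

def methane_mass_flow_kg_per_hour(cfm, methane_fraction):
    """Methane mass passing through a ventilation stream each hour."""
    air_m3_per_s = cfm * CFM_TO_M3_PER_S
    return air_m3_per_s * methane_fraction * METHANE_DENSITY * 3600

# A coal-mine ventilation shaft at 500,000 CFM and 0.5% methane:
print(f"{methane_mass_flow_kg_per_hour(500_000, 0.005):.0f} kg of methane per hour")
```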

    Plata is exploring engineering options including fluidized bed reactors with floating catalyst particles. Another filter solution, based in part on catalytic converters, features “higher-order geometric structures where you have a porous material with a long path length where the gas can interact with the catalyst,” says Plata. “This avoids the challenge with fluidized beds of containing catalyst particles in the reactor. Instead, they are fixed within a structured material.”  

    Competing technologies for removing methane from mine shafts “operate at temperatures of 1,000 to 1,200 degrees C, requiring a lot of energy and risking explosion,” says Plata. “Our technology avoids safety concerns by operating at 300 to 400 degrees C. It reduces energy use and provides more tractable deployment costs.” 

    Potentially, energy and dollar costs could be further reduced in coal mines by capturing the heat generated by the conversion process. “In coal mines, you have enrichments above a half-percent methane, but below the 4 percent flammability threshold,” says Plata. “The excess heat from the process could be used to generate electricity using off-the-shelf converters.” 

    Plata’s dairy barn research is funded by the Gerstner Family Foundation and the coal mining project by the U.S. Department of Energy. “The DOE would like us to spin out the technology for scale-up within three years,” says Plata. “We cannot guarantee we will hit that goal, but we are trying to develop this as quickly as possible. Our society needs to start reducing methane emissions now.”  More

  • in

    Machine learning facilitates “turbulence tracking” in fusion reactors

    Fusion, which promises practically unlimited, carbon-free energy using the same processes that power the sun, is at the heart of a worldwide research effort that could help mitigate climate change.

    A multidisciplinary team of researchers is now bringing tools and insights from machine learning to aid this effort. Scientists from MIT and elsewhere have used computer-vision models to identify and track turbulent structures that appear under the conditions needed to facilitate fusion reactions.

    Monitoring the formation and movements of these structures, called filaments or “blobs,” is important for understanding the heat and particle flows exiting from the reacting fuel, which ultimately determine the engineering requirements for the reactor walls to meet those flows. However, scientists typically study blobs using averaging techniques, which trade details of individual structures in favor of aggregate statistics. Tracking individual blobs requires marking them manually in video data. 

    The researchers built a synthetic video dataset of plasma turbulence to make this process more effective and efficient. They used it to train four computer vision models, each of which identifies and tracks blobs. They trained the models to pinpoint blobs in the same ways that humans would.

    When the researchers tested the trained models using real video clips, the models could identify blobs with high accuracy — more than 80 percent in some cases. The models were also able to effectively estimate the size of blobs and the speeds at which they moved.

    Because millions of video frames are captured during just one fusion experiment, using machine-learning models to track blobs could give scientists much more detailed information.

    “Before, we could get a macroscopic picture of what these structures are doing on average. Now, we have a microscope and the computational power to analyze one event at a time. If we take a step back, what this reveals is the power available from these machine-learning techniques, and ways to use these computational resources to make progress,” says Theodore Golfinopoulos, a research scientist at the MIT Plasma Science and Fusion Center and co-author of a paper detailing these approaches.

    His fellow co-authors include lead author Woonghee “Harry” Han, a physics PhD candidate; senior author Iddo Drori, a visiting professor in the Computer Science and Artificial Intelligence Laboratory (CSAIL), faculty associate professor at Boston University, and adjunct at Columbia University; as well as others from the MIT Plasma Science and Fusion Center, the MIT Department of Civil and Environmental Engineering, and the Swiss Federal Institute of Technology at Lausanne in Switzerland. The research appears today in Nature Scientific Reports.

    Heating things up

    For more than 70 years, scientists have sought to use controlled thermonuclear fusion reactions to develop an energy source. To reach the conditions necessary for a fusion reaction, fuel must be heated to temperatures above 100 million degrees Celsius. (The core of the sun is about 15 million degrees Celsius.)

    A common method for containing this super-hot fuel, called plasma, is to use a tokamak. These devices utilize extremely powerful magnetic fields to hold the plasma in place and control the interaction between the exhaust heat from the plasma and the reactor walls.

    However, blobs appear like filaments falling out of the plasma at the very edge, between the plasma and the reactor walls. These random, turbulent structures affect how energy flows between the plasma and the reactor.

    “Knowing what the blobs are doing strongly constrains the engineering performance that your tokamak power plant needs at the edge,” adds Golfinopoulos.

    Researchers use a unique imaging technique to capture video of the plasma’s turbulent edge during experiments. An experimental campaign may last months; a typical day will produce about 30 seconds of data, corresponding to roughly 60 million video frames, with thousands of blobs appearing each second. This makes it impossible to track all blobs manually, so researchers rely on average sampling techniques that only provide broad characteristics of blob size, speed, and frequency.

    “On the other hand, machine learning provides a solution to this by blob-by-blob tracking for every frame, not just average quantities. This gives us much more knowledge about what is happening at the boundary of the plasma,” Han says.

    He and his co-authors took four well-established computer vision models, which are commonly used for applications like autonomous driving, and trained them to tackle this problem.

    Simulating blobs

    To train these models, they created a vast dataset of synthetic video clips that captured the blobs’ random and unpredictable nature.

    “Sometimes they change direction or speed, sometimes multiple blobs merge, or they split apart. These kinds of events were not considered before with traditional approaches, but we could freely simulate those behaviors in the synthetic data,” Han says.

    Creating synthetic data also allowed them to label each blob, which made the training process more effective, Drori adds.

    Using these synthetic data, they trained the models to draw boundaries around blobs, teaching them to closely mimic what a human scientist would draw.

    Then they tested the models using real video data from experiments. First, they measured how closely the boundaries the models drew matched up with actual blob contours.

    But they also wanted to see if the models predicted objects that humans would identify. They asked three human experts to pinpoint the centers of blobs in video frames and checked to see if the models predicted blobs in those same locations.

    The models were able to draw accurate blob boundaries, overlapping with the brightness contours that are considered ground truth, about 80 percent of the time. Their evaluations were similar to those of the human experts, and the models successfully predicted the theory-defined regime of the blobs, which agrees with the results from a traditional method.
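The boundary-overlap scoring described here is, at its core, an intersection-over-union comparison. A minimal sketch, assuming blobs are represented as sets of pixel coordinates (real evaluations use full segmentation pipelines):

```python
# Minimal intersection-over-union (IoU) sketch for comparing a predicted
# blob mask against a ground-truth mask, each given as a set of pixels.

def iou(pred_pixels, truth_pixels):
    """IoU of two blob masks given as sets of (row, col) pixels."""
    pred, truth = set(pred_pixels), set(truth_pixels)
    union = pred | truth
    if not union:
        return 1.0  # two empty masks agree perfectly
    return len(pred & truth) / len(union)

truth = {(r, c) for r in range(10) for c in range(10)}    # 10x10 blob
pred = {(r, c) for r in range(1, 11) for c in range(10)}  # shifted one row
print(f"IoU = {iou(pred, truth):.2f}")  # → 0.82
```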

    Now that they have shown the success of using synthetic data and computer vision models for tracking blobs, the researchers plan to apply these techniques to other problems in fusion research, such as estimating particle transport at the boundary of a plasma, Han says.

    They also made the dataset and models publicly available, and look forward to seeing how other research groups apply these tools to study the dynamics of blobs, says Drori.

    “Prior to this, there was a barrier to entry that mostly the only people working on this problem were plasma physicists, who had the datasets and were using their methods. There is a huge machine-learning and computer-vision community. One goal of this work is to encourage participation in fusion research from the broader machine-learning community toward the broader goal of helping solve the critical problem of climate change,” he adds.

    This research is supported, in part, by the U.S. Department of Energy and the Swiss National Science Foundation. More

  • in

    In nanotube science, is boron nitride the new carbon?

    Engineers at MIT and the University of Tokyo have produced centimeter-scale structures, large enough for the eye to see, that are packed with hundreds of billions of hollow aligned fibers, or nanotubes, made from hexagonal boron nitride.

    Hexagonal boron nitride, or hBN, is a single-atom-thin material that has been dubbed “white graphene” for its transparent appearance and its similarity to carbon-based graphene in molecular structure and strength. It can also withstand higher temperatures than graphene, and is electrically insulating, rather than conductive. When hBN is rolled into nanometer-scale tubes, or nanotubes, its exceptional properties are significantly enhanced.

    The team’s results, published today in the journal ACS Nano, provide a route toward fabricating aligned boron nitride nanotubes (A-BNNTs) in bulk. The researchers plan to harness the technique to fabricate bulk-scale arrays of these nanotubes, which can then be combined with other materials to make stronger, more heat-resistant composites, for instance to shield space structures and hypersonic aircraft.

    As hBN is transparent and electrically insulating, the team also envisions incorporating the BNNTs into transparent windows and using them to electrically insulate sensors within electronic devices. The team is also investigating ways to weave the nanofibers into membranes for water filtration and for “blue energy” — a concept for renewable energy in which electricity is produced from the ionic filtering of salt water into fresh water.

    Brian Wardle, professor of aeronautics and astronautics at MIT, likens the team’s results to scientists’ decades-long, ongoing pursuit of manufacturing bulk-scale carbon nanotubes.

    “In 1991, a single carbon nanotube was identified as an interesting thing, but it’s been 30 years getting to bulk aligned carbon nanotubes, and the world’s not even fully there yet,” Wardle says. “With the work we’re doing, we’ve just short-circuited about 20 years in getting to bulk-scale versions of aligned boron nitride nanotubes.”

    Wardle is the senior author of the new study, which includes lead author and MIT research scientist Luiz Acauan, former MIT postdoc Haozhe Wang, and collaborators at the University of Tokyo.

    A vision, aligned

    Like graphene, hexagonal boron nitride has a molecular structure resembling chicken wire. In graphene, this chicken wire configuration is made entirely of carbon atoms, arranged in a repeating pattern of hexagons. For hBN, the hexagons are composed of alternating atoms of boron and nitrogen. In recent years, researchers have found that two-dimensional sheets of hBN exhibit exceptional properties of strength, stiffness, and resilience at high temperatures. When sheets of hBN are rolled into nanotube form, these properties are further enhanced, particularly when the nanotubes are aligned, like tiny trees in a densely packed forest.

    But finding ways to synthesize stable, high quality BNNTs has proven challenging. A handful of efforts to do so have produced low-quality, nonaligned fibers.

    “If you can align them, you have much better chance of harnessing BNNTs properties at the bulk scale to make actual physical devices, composites, and membranes,” Wardle says.

    In 2020, Rong Xiang and colleagues at the University of Tokyo found they could produce high-quality boron nitride nanotubes by first using a conventional approach of chemical vapor deposition to grow a forest of short, few micron-long carbon nanotubes. They then coated the carbon-based forest with “precursors” of boron and nitrogen gas, which when baked in an oven at high temperatures crystallized onto the carbon nanotubes to form high-quality nanotubes of hexagonal boron nitride with carbon nanotubes inside.

    Burning scaffolds

    In the new study, Wardle and Acauan extended and scaled Xiang’s approach, essentially removing the underlying carbon nanotubes and leaving the long boron nitride nanotubes to stand on their own. The team drew on the expertise of Wardle’s group, which has focused for years on fabricating high-quality aligned arrays of carbon nanotubes. With their current work, the researchers looked for ways to tweak the temperatures and pressures of the chemical vapor deposition process in order to remove the carbon nanotubes while leaving the boron nitride nanotubes intact.

    “The first few times we did it, it was completely ugly garbage,” Wardle recalls. “The tubes curled up into a ball, and they didn’t work.”

    Eventually, the team hit on a combination of temperatures, pressures, and precursors that did the trick. With this combination of processes, the researchers first reproduced the steps that Xiang took to synthesize the boron-nitride-coated carbon nanotubes. As hBN is resistant to higher temperatures than graphene, the team then cranked up the heat to burn away the underlying black carbon nanotube scaffold, while leaving the transparent, freestanding boron nitride nanotubes intact.
    By using carbon nanotubes as a scaffold, MIT engineers grow forests of “white graphene” that emerge (in MIT pattern) after burning away the black carbon scaffold. Courtesy of the researchers

    In microscopic images, the team observed clear crystalline structures — evidence that the boron nitride nanotubes are of high quality. The structures were also dense: within a square centimeter, the researchers were able to synthesize a forest of more than 100 billion aligned boron nitride nanotubes that measured about a millimeter in height — large enough to be visible by eye. By nanotube engineering standards, these dimensions are considered “bulk” in scale.
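A quick geometry check of that density figure (an estimate derived from the article's numbers, not a measured value) shows the average tube-to-tube spacing such a forest implies:

```python
import math

# 100 billion aligned tubes in one square centimeter implies an average
# tube-to-tube spacing of tens of nanometers. This is an estimate from
# the article's figures, not a measurement.

tubes_per_cm2 = 100e9
area_per_tube_cm2 = 1.0 / tubes_per_cm2
spacing_nm = math.sqrt(area_per_tube_cm2) * 1e7  # 1 cm = 1e7 nm

print(f"average spacing ~ {spacing_nm:.0f} nm")
```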

    “We are now able to make these nanoscale fibers at bulk scale, which has never been shown before,” Acauan says.

    To demonstrate the flexibility of their technique, the team synthesized larger carbon-based structures, including a weave of carbon fibers, a mat of “fuzzy” carbon nanotubes, and sheets of randomly oriented carbon nanotubes known as “buckypaper.” They coated each carbon-based sample with boron and nitrogen precursors, then went through their process to burn away the underlying carbon. In each demonstration, they were left with a boron-nitride replica of the original black carbon scaffold.

    They also were able to “knock down” the forests of BNNTs, producing horizontally aligned fiber films that are a preferred configuration for incorporating into composite materials.

    “We are now working toward fibers to reinforce ceramic matrix composites, for hypersonic and space applications where there are very high temperatures, and for windows for devices that need to be optically transparent,” Wardle says. “You could make transparent materials that are reinforced with these very strong nanotubes.”

    This research was supported, in part, by Airbus, ANSYS, Boeing, Embraer, Lockheed Martin, Saab AB, and Teijin Carbon America through MIT’s Nano-Engineered Composite aerospace STructures (NECST) Consortium. More

  • in

    Coordinating climate and air-quality policies to improve public health

    As America’s largest investment to fight climate change, the Inflation Reduction Act positions the country to reduce its greenhouse gas emissions by an estimated 40 percent below 2005 levels by 2030. But as it edges the United States closer to achieving its international climate commitment, the legislation is also expected to yield significant — and more immediate — improvements in the nation’s health. If successful in accelerating the transition from fossil fuels to clean energy alternatives, the IRA will sharply reduce atmospheric concentrations of fine particulates known to exacerbate respiratory and cardiovascular disease and cause premature deaths, along with other air pollutants that degrade human health. One recent study shows that eliminating air pollution from fossil fuels in the contiguous United States would prevent more than 50,000 premature deaths and avoid more than $600 billion in health costs each year.

    While national climate policies such as those advanced by the IRA can simultaneously help mitigate climate change and improve air quality, their results may vary widely when it comes to improving public health. That’s because the potential health benefits associated with air quality improvements are much greater in some regions and economic sectors than in others. Those benefits can be maximized, however, through a prudent combination of climate and air-quality policies.

    Several past studies have evaluated the likely health impacts of various policy combinations, but their usefulness has been limited due to a reliance on a small set of standard policy scenarios. More versatile tools are needed to model a wide range of climate and air-quality policy combinations and assess their collective effects on air quality and human health. Now researchers at the MIT Joint Program on the Science and Policy of Global Change and MIT Institute for Data, Systems and Society (IDSS) have developed a publicly available, flexible scenario tool that does just that.

    In a study published in the journal Geoscientific Model Development, the MIT team introduces its Tool for Air Pollution Scenarios (TAPS), which can be used to estimate the likely air-quality and health outcomes of a wide range of climate and air-quality policies at the regional, sectoral, and fuel-based level. 

    “This tool can help integrate the siloed sustainability issues of air pollution and climate action,” says the study’s lead author William Atkinson, who recently served as a Biogen Graduate Fellow and research assistant at the IDSS Technology and Policy Program’s (TPP) Research to Policy Engagement Initiative. “Climate action does not guarantee a clean air future, and vice versa — but the issues have similar sources that imply shared solutions if done right.”

    The study’s initial application of TAPS shows that with current air-quality policies and near-term Paris Agreement climate pledges alone, short-term pollution reductions give way to long-term increases — given the expected growth of emissions-intensive industrial and agricultural processes in developing regions. More ambitious climate and air-quality policies could be complementary, each reducing different pollutants substantially to give tremendous near- and long-term health benefits worldwide.

    “The significance of this work is that we can more confidently identify the long-term emission reduction strategies that also support air quality improvements,” says MIT Joint Program Deputy Director C. Adam Schlosser, a co-author of the study. “This is a win-win for setting climate targets that are also healthy targets.”

    TAPS projects air quality and health outcomes based on three integrated components: a recent global inventory of detailed emissions resulting from human activities (e.g., fossil fuel combustion, land-use change, industrial processes); multiple scenarios of emissions-generating human activities between now and the year 2100, produced by the MIT Economic Projection and Policy Analysis model; and emissions intensity (emissions per unit of activity) scenarios based on recent data from the Greenhouse Gas and Air Pollution Interactions and Synergies model.
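Schematically, projections built from those components multiply an activity scenario by an emissions-intensity scenario and sum over sectors; the sector names and numbers below are invented for illustration and do not come from TAPS itself.

```python
# Schematic of the emissions = activity x intensity structure described
# above, summed over sectors. All names and values are invented.

activity = {           # projected activity in a scenario year (arbitrary units)
    "power": 120.0,
    "agriculture": 80.0,
}
intensity = {          # emissions per unit of activity under a policy scenario
    "power": 0.5,
    "agriculture": 0.9,
}

total_emissions = sum(activity[s] * intensity[s] for s in activity)
print(f"projected emissions: {total_emissions}")  # 120*0.5 + 80*0.9 = 132.0
```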

    “We see the climate crisis as a health crisis, and believe that evidence-based approaches are key to making the most of this historic investment in the future, particularly for vulnerable communities,” says Johanna Jobin, global head of corporate reputation and responsibility at Biogen. “The scientific community has spoken with unanimity and alarm that not all climate-related actions deliver equal health benefits. We’re proud of our collaboration with the MIT Joint Program to develop this tool that can be used to bridge research-to-policy gaps, support policy decisions to promote health among vulnerable communities, and train the next generation of scientists and leaders for far-reaching impact.”

    The tool can inform decision-makers about a wide range of climate and air-quality policies. Policy scenarios can be applied to specific regions, sectors, or fuels to investigate policy combinations at a more granular level, or to target short-term actions with high-impact benefits.

    TAPS could be further developed to account for additional emissions sources and trends.

    “Our new tool could be used to examine a large range of both climate and air quality scenarios. As the framework is expanded, we can add detail for specific regions, as well as additional pollutants such as air toxics,” says study supervising co-author Noelle Selin, professor at IDSS and the MIT Department of Earth, Atmospheric and Planetary Sciences, and director of TPP.    

    This research was supported by the U.S. Environmental Protection Agency and its Science to Achieve Results (STAR) program; Biogen; TPP’s Leading Technology and Policy Initiative; and TPP’s Research to Policy Engagement Initiative. More