More stories

  • J-WAFS announces 2023 seed grant recipients

    Today, the Abdul Latif Jameel Water and Food Systems Lab (J-WAFS) announced its ninth round of seed grants to support innovative research projects at MIT. The grants are designed to fund research efforts that tackle challenges related to water and food for human use, with the ultimate goal of creating meaningful impact as the world population continues to grow and the planet undergoes significant climate and environmental changes.

    Ten new projects led by 15 researchers from seven different departments will be supported this year. The projects address a range of challenges by employing advanced materials, technology innovations, and new approaches to resource management. The new projects aim to remove harmful chemicals from water sources, develop monitoring and other systems to help manage various aquaculture industries, optimize water purification materials, and more.

    “The seed grant program is J-WAFS’ flagship grant initiative,” says J-WAFS executive director Renee J. Robins. “The funding is intended to spur groundbreaking MIT research addressing complex issues that are challenging our water and food systems. The 10 projects selected this year show great promise, and we look forward to the progress and accomplishments these talented researchers will make,” she adds.

    The 2023 J-WAFS seed grant researchers and their projects are:

    Sara Beery, an assistant professor in the Department of Electrical Engineering and Computer Science (EECS), is building the first completely automated system to estimate the size of salmon populations in the Pacific Northwest (PNW).

    Salmon are a keystone species in the PNW, having fed human populations for at least the last 7,500 years. However, overfishing, habitat loss, and climate change threaten salmon populations across the region with extinction. Accurate salmon counts during their seasonal migration to their natal rivers to spawn are essential for fisheries regulation and management but are limited by human capacity.
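    In toy form, the automated-counting task such a system must solve amounts to linking per-frame fish detections into tracks and counting the tracks that move upstream. The sketch below is an illustrative simplification, not the team's actual pipeline; the detection format, the greedy nearest-neighbor linker, and all thresholds are hypothetical.

```python
# Toy sketch of the counting stage of an automated fish-monitoring pipeline.
# Detections are {frame: [x_position, ...]} from a hypothetical detector; we
# link them into tracks by nearest-neighbor matching across consecutive
# frames, then count tracks whose net motion is upstream (+x).

def link_tracks(detections, max_jump=5.0):
    """Greedily link per-frame detections into tracks by x-proximity."""
    tracks = []  # each track is a list of (frame, x)
    for frame, xs in sorted(detections.items()):
        unmatched = list(xs)
        for track in tracks:
            last_frame, last_x = track[-1]
            if frame - last_frame != 1:
                continue  # only extend tracks seen in the previous frame
            best = min(unmatched, key=lambda x: abs(x - last_x), default=None)
            if best is not None and abs(best - last_x) <= max_jump:
                track.append((frame, best))
                unmatched.remove(best)
        # any detection left unmatched starts a new track
        tracks.extend([[(frame, x)] for x in unmatched])
    return tracks

def count_upstream(tracks, min_length=3):
    """Count tracks long enough to trust whose net motion is upstream."""
    return sum(
        1 for t in tracks
        if len(t) >= min_length and t[-1][1] > t[0][1]
    )
```

    A real system must additionally handle missed detections, crossing fish, and changing sonar conditions, which is where the machine-learning components carry the load.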
    Fish population monitoring is a widespread challenge in the United States and worldwide. By combining sonar sensors with computer vision and machine learning (CVML) techniques, Beery and her team are working to build a system that will provide a detailed picture of the state of salmon populations at unprecedented spatial and temporal resolution. The sonar will capture individual fish as they swim upstream, and CVML will be used to train accurate algorithms that interpret the sonar video to detect, track, and count fish automatically while adapting to changing river conditions and fish densities.

    Another aquaculture project is being led by Michael Triantafyllou, the Henry L. and Grace Doherty Professor in Ocean Science and Engineering in the Department of Mechanical Engineering, and Robert Vincent, the assistant director of MIT’s Sea Grant Program. They are working with Otto Cordero, an associate professor in the Department of Civil and Environmental Engineering, to control harmful bacterial blooms in aquaculture algae feed production.

    Aquaculture in the United States represents a $1.5 billion industry annually and helps support 1.7 million jobs, yet many American hatcheries are not able to keep up with demand. One barrier to aquaculture production is the high degree of variability in survival rates, most likely caused by a poorly controlled microbiome that leads to bacterial infections and sub-optimal feed efficiency. Triantafyllou, Vincent, and Cordero plan to monitor the microbiome composition of a shellfish hatchery in order to identify possible causative agents of mortality, as well as beneficial microbes. They hope to pair microbe data with detailed phenotypic information about the animal population to generate rapid diagnostic tests and explore the potential for microbiome therapies to protect larvae and prevent future outbreaks. The researchers plan to transfer their findings and technology to the local and regional aquaculture community to ensure healthy aquaculture production that will support the expansion of the U.S. aquaculture industry.

    David Des Marais is the Cecil and Ida Green Career Development Professor in the Department of Civil and Environmental Engineering. His 2023 J-WAFS project seeks to understand plant growth responses to elevated carbon dioxide (CO2) in the atmosphere, in the hopes of identifying breeding strategies that maximize crop yield under future CO2 scenarios.

    Today’s crop plants experience higher atmospheric CO2 than those of 20 or 30 years ago. Crops such as wheat, oat, barley, and rice typically increase their growth rate and biomass when grown at experimentally elevated atmospheric CO2, a phenomenon known as the “CO2 fertilization effect.” However, not all plant species respond to rising atmospheric CO2 with increased growth, and for those that do, increased growth doesn’t necessarily correspond to increased crop yield. Using specially built plant growth chambers that can control the concentration of CO2, Des Marais will explore how CO2 availability impacts the development of tillers (branches) in the grass species Brachypodium. He will study how gene expression controls tiller development, and whether this is affected by the growing environment. The tillering response refers to how many branches a plant produces, which sets a limit on how much grain it can yield; optimizing that response to elevated CO2 could therefore greatly increase yield. Des Marais will also examine the complete genome sequences of Brachypodium, wheat, oat, and barley to help identify genes relevant to branch growth.

    Darcy McRose, an assistant professor in the Department of Civil and Environmental Engineering, is researching whether a combination of plant metabolites and soil bacteria can be used to make mineral-associated phosphorus more bioavailable.

    The nutrient phosphorus is essential for agricultural plant growth, but when added as a fertilizer, it sticks to the surface of soil minerals, decreasing bioavailability, limiting plant growth, and accumulating as residual phosphorus.
    Heavily fertilized agricultural soils often harbor large reservoirs of this type of mineral-associated “legacy” phosphorus. Redox transformations are one chemical process that can liberate mineral-associated phosphorus. However, this needs to be carefully controlled, as overly mobile phosphorus can lead to runoff and pollution of natural waters. Ideally, phosphorus would be made bioavailable when plants need it and immobile when they don’t. Many plants make small metabolites called coumarins that might be able to solubilize mineral-adsorbed phosphorus and be activated and inactivated under different conditions. McRose will use laboratory experiments to determine whether a combination of plant metabolites and soil bacteria can serve as a highly efficient and tunable system for phosphorus solubilization. She also aims to develop an imaging platform to investigate exchanges of phosphorus between plants and soil microbes.

    Many of the 2023 seed grants will support innovative technologies to monitor, quantify, and remediate various kinds of pollutants found in water. Two of the new projects address the problem of per- and polyfluoroalkyl substances (PFAS), human-made chemicals that have recently emerged as a global health threat. Known as “forever chemicals,” PFAS are used in many manufacturing processes. These chemicals are known to cause significant health issues, including cancer, and they have become pervasive in soil, dust, air, groundwater, and drinking water. Unfortunately, the physical and chemical properties of PFAS make them difficult to detect and remove.

    Aristide Gumyusenge, the Merton C. Flemings Career Development Assistant Professor of Materials Science and Engineering, is using metal-organic frameworks for low-cost sensing and capture of PFAS. Most metal-organic frameworks (MOFs) are synthesized as particles, which makes high-accuracy sensing difficult due to defects such as intergranular boundaries.
    Thin-film electronic devices could enable the use of MOFs for many applications, especially chemical sensing. Gumyusenge’s project aims to design test kits based on two-dimensional conductive MOF films for detecting PFAS in drinking water. In early demonstrations, Gumyusenge and his team showed that these MOF films can sense PFAS at low concentrations. They will continue to iterate using a computation-guided approach to tune the sensitivity and selectivity of the kits, with the goal of deploying them in real-world scenarios.

    Carlos Portela, the Brit (1961) and Alex (1949) d’Arbeloff Career Development Professor in the Department of Mechanical Engineering, and Ariel Furst, the Cook Career Development Professor in the Department of Chemical Engineering, are building novel architected materials to act as filters for the removal of PFAS from water. Portela and Furst will design and fabricate nanoscale materials that use activated carbon and porous polymers to create a physical adsorption system. They will engineer the materials to have tunable porosities and morphologies that can maximize interactions between contaminated water and functionalized surfaces, while providing a mechanically robust system.

    Rohit Karnik is a Tata Professor and interim co-department head of the Department of Mechanical Engineering. He is working on another technology, this one based on microbead sensors, to rapidly measure and monitor trace contaminants in water.

    Water pollution from both biological and chemical contaminants contributes to an estimated 1.36 million deaths annually. Chemical contaminants include pesticides and herbicides, heavy metals like lead, and compounds used in manufacturing. These emerging contaminants can be found throughout the environment, including in water supplies.
    The Environmental Protection Agency (EPA) in the United States sets recommended water quality standards, but states are responsible for developing their own monitoring criteria and systems, which must be approved by the EPA every three years. However, the availability of data on regulated chemicals and on candidate pollutants is limited by current testing methods, which are either insensitive or expensive and laboratory-based, requiring trained scientists and technicians. Karnik’s project proposes a simple, self-contained, portable system for monitoring trace and emerging pollutants in water, making it suitable for field studies. The concept is based on multiplexed microbead-based sensors that use thermal or gravitational actuation to generate a signal. His proposed sandwich assay, a testing format that is appealing for environmental sensing, will enable both single-use and continuous monitoring. The hope is that the bead-based assays will increase the ease and reach of detecting and quantifying trace contaminants in water for both personal and industrial-scale applications.

    Alexander Radosevich, a professor in the Department of Chemistry, and Timothy Swager, the John D. MacArthur Professor of Chemistry, are teaming up to create rapid, cost-effective, and reliable techniques for on-site arsenic detection in water.

    Arsenic contamination of groundwater affects as many as 500 million people worldwide. Arsenic poisoning can lead to a range of severe health problems, from cancer to cardiovascular and neurological impacts. Both the EPA and the World Health Organization have established 10 parts per billion as a practical threshold for arsenic in drinking water, but measuring arsenic at such low levels is challenging, especially in resource-limited environments where sensitive laboratory equipment may not be available.
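    For scale, the 10 parts-per-billion guideline is tiny: in dilute water (density roughly 1 kg/L), 1 ppb by mass corresponds to about 1 microgram of arsenic per liter. The snippet below only illustrates that unit relationship and a threshold check; it is not part of any project described here.

```python
# Back-of-the-envelope illustration of the ~10 ppb arsenic guideline.
# For dilute aqueous solutions (density ~1 kg/L), 1 ppb by mass is
# approximately 1 microgram of solute per liter of water.

ARSENIC_LIMIT_UG_PER_L = 10.0  # WHO/EPA practical threshold, ~10 ppb

def ppb_from_ug_per_l(ug_per_l):
    """In dilute water, ug/L and ppb (by mass) are numerically equal."""
    return ug_per_l

def exceeds_guideline(arsenic_ug_per_l):
    """True if a measured arsenic concentration exceeds the 10 ppb guideline."""
    return arsenic_ug_per_l > ARSENIC_LIMIT_UG_PER_L
```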
    Radosevich and Swager plan to develop reaction-based chemical sensors that bind and extract electrons from aqueous arsenic. In this way, they will exploit the inherent reactivity of aqueous arsenic to selectively detect and quantify it. This work will establish the chemical basis for a new method of detecting trace arsenic in drinking water.

    Rajeev Ram is a professor in the Department of Electrical Engineering and Computer Science. His J-WAFS research will advance a robust technology for monitoring nitrogen-containing pollutants, which threaten over 15,000 bodies of water in the United States alone.

    Nitrogen in the form of nitrate, nitrite, ammonia, and urea can run off from agricultural fertilizer and lead to harmful algal blooms that jeopardize human health. Unfortunately, monitoring these contaminants in the environment is challenging, as sensors are difficult to maintain and expensive to deploy. Ram and his students will work to establish limits of detection for nitrate, nitrite, ammonia, and urea in environmental, industrial, and agricultural samples using swept-source Raman spectroscopy, a method of detecting the presence of a chemical by illuminating a sample with a tunable, single-mode laser. This method does not require costly, high-power lasers or a spectrometer. Ram will then develop and demonstrate a portable system that is capable of achieving chemical specificity in complex, natural environments.
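    Downstream of the optics, identifying which species are present reduces to matching measured peak positions against known reference Raman shifts. The sketch below is illustrative only: the band positions are approximate literature values for the strongest symmetric-stretch bands, and the actual signal analysis in such a system would be far more sophisticated.

```python
# Illustrative sketch (not the project's actual method): identify nitrogen
# species from peaks in a Raman spectrum by matching measured Raman shifts
# (in cm^-1) against known reference bands. Band positions are approximate
# literature values and serve only as examples.

REFERENCE_BANDS = {
    "nitrate": 1047.0,  # NO3- symmetric stretch, ~1047 cm^-1
    "nitrite": 1326.0,  # NO2- symmetric stretch, ~1326 cm^-1
    "urea": 1003.0,     # urea C-N symmetric stretch, ~1003 cm^-1
}

def identify_species(measured_peaks, tolerance=5.0):
    """Return the set of species whose reference band lies within
    `tolerance` cm^-1 of any measured peak position."""
    found = set()
    for name, ref in REFERENCE_BANDS.items():
        if any(abs(peak - ref) <= tolerance for peak in measured_peaks):
            found.add(name)
    return found
```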
    Data generated by such a system should help regulate polluters and guide remediation.

    Kripa Varanasi, a professor in the Department of Mechanical Engineering, and Angela Belcher, the James Mason Crafts Professor and head of the Department of Biological Engineering, will join forces to develop an affordable water disinfection technology that selectively identifies, adsorbs, and kills “superbugs” in domestic and industrial wastewater.

    Recent research predicts that antibiotic-resistant bacteria (superbugs) will cause $100 trillion in health care expenses and 10 million deaths annually by 2050. The prevalence of superbugs in our water systems has increased due to corroded pipes, contamination, and climate change. Current drinking water disinfection technologies are designed to kill all types of bacteria before human consumption. However, for certain domestic and industrial applications there is a need to protect the good bacteria required for ecological processes that contribute to soil and plant health. Varanasi and Belcher will combine material, biological, process, and system engineering principles to design a sponge-based water disinfection technology that can identify and destroy harmful bacteria while leaving the good bacteria unharmed. By modifying the sponge surface with specialized nanomaterials, their approach will be able to kill superbugs faster and more efficiently. The sponge filters can be deployed under very low pressure, making them an affordable technology, especially for resource-constrained communities.

    In addition to the 10 seed grant projects, J-WAFS will also fund a research initiative led by Greg Sixt. Sixt is the research manager for climate and food systems at J-WAFS, and the director of the J-WAFS-led Food and Climate Systems Transformation (FACT) Alliance. His project focuses on the Lake Victoria Basin (LVB) of East Africa.
    The second-largest freshwater lake in the world by surface area, Lake Victoria straddles three countries (Uganda, Tanzania, and Kenya) and has a catchment area that encompasses two more (Rwanda and Burundi). Sixt will collaborate with Michael Hauser of the University of Natural Resources and Life Sciences, Vienna, and Paul Kariuki of the Lake Victoria Basin Commission.

    The group will study how to adapt food systems to climate change in the Lake Victoria Basin. The basin is facing a range of climate threats that could significantly impact livelihoods and food systems in the expansive region. For example, extreme weather events like droughts and floods are negatively affecting agricultural production and freshwater resources. Across the LVB, current approaches to land and water management are unsustainable and threaten future food and water security. The Lake Victoria Basin Commission (LVBC), a specialized institution of the East African Community, wants to play a more vital role in coordinating transboundary land and water management to support transitions toward more resilient, sustainable, and equitable food systems. The primary goal of this research will be to support the LVBC’s transboundary land and water management efforts, specifically as they relate to sustainability and climate change adaptation in food systems. The research team will work with key stakeholders in Kenya, Uganda, and Tanzania to identify specific capacity needs to facilitate land and water management transitions. The two-year project will produce actionable recommendations for the LVBC.

  • Four researchers with MIT ties earn 2023 Schmidt Science Fellowships

    Four researchers with ties to MIT have been named Schmidt Science Fellows this year. Lillian Chin ’17, SM ’19; Neil Dalvie PD ’22, PhD ’22; Suong Nguyen; and Yirui Zhang SM ’19, PhD ’23 are among the 32 exceptional early-career scientists worldwide chosen to receive the prestigious fellowships.

    “History provides powerful examples of what happens when scientists are given the freedom to ask big questions which can achieve real breakthroughs across disciplines,” says Wendy Schmidt, co-founder of Schmidt Futures and president of the Schmidt Family Foundation. “Schmidt Science Fellows are tackling climate destruction, discovering new drugs against disease, developing novel materials, using machine learning to understand the drivers of human health, and much more. This new cohort will add to this legacy in applying scientific discovery to improve human health and opportunity, and preserve and restore essential planetary systems.”

    Schmidt Futures is a philanthropic initiative that brings talented people together in networks to prove out their ideas and solve hard problems in science and society. Schmidt Science Fellows receive a stipend of $100,000 a year for up to two years of postdoctoral research in a discipline different from their PhD at a world-leading lab anywhere across the globe.

    Lillian Chin ’17, SM ’19 is currently pursuing her PhD in the Department of Electrical Engineering and Computer Science. Her research focuses on creating new materials for robots. By designing the geometry of a material, Chin creates new “meta-materials” that have different properties from the original. Using this technique, she has created robot balls that dramatically expand in volume and soft grippers that can work in dangerous environments. All of these robots are built out of a single material, letting the researchers 3D print them with extra internal features like channels. These channels help to measure the deformation of metamaterials, enabling Chin and her collaborators to create robots that are strong, can move, and sense their own shape, like muscles do.
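    To illustrate the shape-sensing idea in the simplest possible terms: for a channel filled with an incompressible conductive material, stretching conserves volume, so electrical resistance R = ρL/A grows as the square of the channel's length, and strain can be read back from a resistance ratio. This is a generic, textbook liquid-conductor strain-gauge model offered purely as an analogy, not Chin's actual design.

```python
# Hypothetical illustration of sensing deformation via an internal channel.
# For an incompressible conductor, volume V = L * A is conserved under
# stretch, so A = V / L and R = rho * L / A is proportional to L^2.
# Inverting: L / L0 = sqrt(R / R0), and strain = L / L0 - 1.

def strain_from_resistance(r_measured, r_rest):
    """Estimate axial strain from channel resistance, assuming an
    incompressible conductor (R proportional to L^2)."""
    stretch = (r_measured / r_rest) ** 0.5  # L / L0
    return stretch - 1.0
```

    Under this model, a channel whose resistance quadruples has doubled in length (100 percent strain); a robot with several such channels can triangulate its own shape.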

    “I feel very honored to have been chosen for this fellowship,” says Chin. “I feel like I proposed a very risky pivot, since my background is only in engineering, with very limited exposure to neuroscience. I’m very excited to be given the opportunity to learn best practices for interacting with patients and be able to translate my knowledge from robotics to biology.”

    With the Schmidt Fellowship, Chin plans to pursue new frontiers for custom materials with internal sensors, which can measure force and deformation and can be placed anywhere within the material. “I want to use these materials to make tools for clinicians and neuroscientists to better understand how humans touch and grasp objects around them,” says Chin. “I’m especially interested in seeing how my materials could help in diagnosing motor-related diseases or improve rehab outcomes by providing the patient with feedback. This will help me create robots that have a better sense of touch and learn how to move objects around like humans do.”

    Neil Dalvie PD ’22, PhD ’22 is a graduate of the Department of Chemical Engineering, where he worked with Professor J. Christopher Love on manufacturing of therapeutic proteins. Dalvie developed molecular biology techniques for manufacturing high-quality proteins in yeast, which enables rapid testing of new products and low-cost manufacturing at large scales. During the pandemic, he led a team that applied these lessons to develop a Covid-19 vaccine that was deployed in multiple low-income countries. After graduating, Dalvie wanted to apply the precision biological engineering that is routinely deployed in medicinal manufacturing to other large-scale bioprocesses.

    “It’s rare for scientists to cross large technical gaps after so many years of specific training to get a PhD — you get comfy being an expert in your field,” says Dalvie. “I was definitely intimidated by the giant leap from vaccine manufacturing to the natural rock cycle. The fellowship has allowed me to dive into the new field by removing immediate pressure to publish or find my next job. I am excited for what commonalities we will find between biomanufacturing and biogeochemistry.”

    As a Schmidt Science Fellow, Dalvie will work with Professor Pamela Silver at Harvard Medical School on engineering microorganisms for enhanced rock weathering and carbon sequestration to combat climate change. They are applying modern molecular biology to enhance natural biogeochemical processes at gigaton scales.

    Suong (Su) Nguyen, a postdoctoral researcher in Professor Jeremiah Johnson’s lab in the Department of Chemistry, earned her PhD from Princeton University, where she developed light-driven, catalytic methodologies for organic synthesis, biomass valorization, plastic waste recycling, and functionalization of quantum sensing materials.

    As a Schmidt Science Fellow, Nguyen will pivot from organic chemistry to nanomaterials. Biological systems are able to synthesize macromolecules whose precise structures are essential for their biological function. Scientists have long dreamed of achieving similar control over synthetic materials, but existing methods are inefficient and limited in scope. Nguyen hopes to develop new strategies to achieve such a high level of control over the structure and properties of nanomaterials and to explore their potential use in therapeutic applications.

    “I feel extremely honored and grateful to receive the Schmidt Science Fellowship,” says Nguyen. “The fellowship will provide me with a unique opportunity to engage with scientists from a very wide range of research backgrounds. I believe this will significantly shape the research objectives for my future career.”

    Yirui Zhang SM ’19, PhD ’22 is a graduate of the Department of Mechanical Engineering. Zhang’s research focuses on electrochemical energy storage and conversion, including lithium-ion batteries and electrocatalysis. She has developed in situ spectroscopy and electrochemical methods to probe the electrode-electrolyte interface, understand the interfacial molecular structures, and unravel the fundamental thermodynamics and kinetics of (electro)chemical reactions in energy storage. Further, she has leveraged the physical chemistry of liquids and tuned the molecular structures at the interface to improve the stability and kinetics of electrochemical reactions. 

    “I am honored and thrilled to have been named a Schmidt Science Fellow,” says Zhang. “The fellowship will not only provide me with the unique opportunity to broaden my scientific perspectives and pursue pivoting research, but also create a lifelong network for us to collaborate across diverse fields and become scientific and societal thought leaders. I look forward to pushing the boundaries of my research and advancing technologies to tackle global challenges in energy storage and health care with interdisciplinary efforts!”

    As a Schmidt Science Fellow, Zhang will work across disciplines and pivot to biosensing. She plans to combine spectroscopy, electrokinetics, and machine learning to develop a fast and cost-effective technique for monitoring and understanding infectious disease. The innovations will benefit next-generation point-of-care medical devices and wastewater-based epidemiology to provide timely diagnosis and help protect humans against deadly infections and antimicrobial resistance.

  • Inaugural J-WAFS Grand Challenge aims to develop enhanced crop variants and move them from lab to land

    According to MIT’s charter, established in 1861, part of the Institute’s mission is to advance the “development and practical application of science in connection with arts, agriculture, manufactures, and commerce.” Today, the Abdul Latif Jameel Water and Food Systems Lab (J-WAFS) is one of the driving forces behind water and food-related research on campus, much of which relates to agriculture. In 2022, J-WAFS established the Water and Food Grand Challenge Grant to inspire MIT researchers to work toward a water-secure and food-secure future for our changing planet. Not unlike MIT’s Climate Grand Challenges, the J-WAFS Grand Challenge seeks to leverage multiple areas of expertise, programs, and Institute resources. The initial call for statements of interest returned 23 letters from MIT researchers spanning 18 departments, labs, and centers. J-WAFS hosted workshops for the proposers to present and discuss their initial ideas. These were winnowed down to a smaller set of invited concept papers, followed by the final proposal stage.

    Today, J-WAFS is delighted to report that the inaugural J-WAFS Grand Challenge Grant has been awarded to a team of researchers led by Professor Matt Shoulders and research scientist Robert Wilson of the Department of Chemistry. A panel of expert external reviewers highly endorsed their proposal, which tackles a longstanding problem in crop biology: how to make photosynthesis more efficient. The team will receive $1.5 million over three years to facilitate a multistage research project that combines cutting-edge innovations in synthetic and computational biology. If successful, this project could create major benefits for agriculture and food systems worldwide.

    “Food systems are a major source of global greenhouse gas emissions, and they are also increasingly vulnerable to the impacts of climate change. That’s why when we talk about climate change, we have to talk about food systems, and vice versa,” says Maria T. Zuber, MIT’s vice president for research. “J-WAFS is central to MIT’s efforts to address the interlocking challenges of climate, water, and food. This new grant program aims to catalyze innovative projects that will have real and meaningful impacts on water and food. I congratulate Professor Shoulders and the rest of the research team on being the inaugural recipients of this grant.”

    Shoulders will work with Bryan Bryson, associate professor of biological engineering, as well as Bin Zhang, associate professor of chemistry, and Mary Gehring, a professor in the Department of Biology and the Whitehead Institute for Biomedical Research. Robert Wilson from the Shoulders lab will be coordinating the research effort. The team at MIT will work with outside collaborators Spencer Whitney, a professor from the Australian National University, and Ahmed Badran, an assistant professor at the Scripps Research Institute. A milestone-based collaboration will also take place with Stephen Long, a professor from the University of Illinois at Urbana-Champaign. The group consists of experts in continuous directed evolution, machine learning, molecular dynamics simulations, translational plant biochemistry, and field trials.

    “This project seeks to fundamentally improve the RuBisCO enzyme that plants use to convert carbon dioxide into the energy-rich molecules that constitute our food,” says J-WAFS Director John H. Lienhard V. “This difficult problem is a true grand challenge, calling for extensive resources. With J-WAFS’ support, this long-sought goal may finally be achieved through MIT’s leading-edge research,” he adds.

    RuBisCO: No, it’s not a new breakfast cereal; it just might be the key to an agricultural revolution

    A growing global population, the effects of climate change, and social and political conflicts like the war in Ukraine are all threatening food supplies, particularly grain crops. Current projections estimate that crop production must increase by at least 50 percent over the next 30 years to meet food demands. One key barrier to increased crop yields is a photosynthetic enzyme called Ribulose-1,5-Bisphosphate Carboxylase/Oxygenase (RuBisCO). During photosynthesis, crops use energy gathered from light to draw carbon dioxide (CO2) from the atmosphere and transform it into sugars and cellulose for growth, a process known as carbon fixation. RuBisCO is essential for capturing the CO2 from the air to initiate conversion of CO2 into energy-rich molecules like glucose. This reaction occurs during the second stage of photosynthesis, also known as the Calvin cycle. Without RuBisCO, the chemical reactions that account for virtually all carbon acquisition in life could not occur.

    Unfortunately, RuBisCO has biochemical shortcomings. Notably, the enzyme acts slowly. Many other enzymes can process a thousand molecules per second, but RuBisCO in chloroplasts fixes fewer than six carbon dioxide molecules per second, often limiting the rate of plant photosynthesis. Another problem is that oxygen (O2) molecules and carbon dioxide molecules are relatively similar in shape and chemical properties, and RuBisCO is unable to fully discriminate between the two. The inadvertent fixation of oxygen by RuBisCO leads to energy and carbon loss. What’s more, at higher temperatures RuBisCO reacts even more frequently with oxygen, which will contribute to decreased photosynthetic efficiency in many staple crops as our climate warms.
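    The cost of this O2/CO2 confusion can be roughed out with the enzyme's "specificity factor" S, which compares its catalytic efficiency for CO2 versus O2: the ratio of carboxylation to oxygenation rates is approximately S × [CO2]/[O2]. The numbers below are order-of-magnitude textbook values for a C3 plant at about 25 C, used only for illustration.

```python
# Rough illustration of why RuBisCO's imperfect CO2/O2 discrimination
# matters. With specificity factor S, the carboxylation:oxygenation rate
# ratio is ~ S * [CO2] / [O2]. Default concentrations are approximate
# dissolved values in a C3 chloroplast at ~25 C (illustrative only).

def carboxylation_fraction(specificity, co2_um=8.0, o2_um=250.0):
    """Fraction of RuBisCO catalytic events that fix CO2 rather than O2,
    given dissolved CO2 and O2 concentrations (micromolar)."""
    ratio = specificity * co2_um / o2_um  # carboxylations per oxygenation
    return ratio / (1.0 + ratio)
```

    With a typical C3-plant specificity factor near 90, roughly a quarter of catalytic events fix oxygen instead of carbon, and the fraction worsens as warming shifts these concentrations and kinetics.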

    The scientific consensus is that genetic engineering and synthetic biology approaches could revolutionize photosynthesis and offer protection against crop losses. To date, crop RuBisCO engineering has been impaired by technological obstacles that have limited any success in significantly enhancing crop production. Excitingly, genetic engineering and synthetic biology tools are now at a point where they can be applied and tested with the aim of creating crops with new or improved biological pathways for producing more food for the growing population.

    An epic plan for fighting food insecurity

    The 2023 J-WAFS Grand Challenge project will use state-of-the-art, transformative protein engineering techniques drawn from biomedicine to improve the biochemistry of photosynthesis, specifically focusing on RuBisCO. Shoulders and his team are planning to build what they call the Enhanced Photosynthesis in Crops (EPiC) platform. The project will evolve and design better crop RuBisCO in the laboratory, followed by validation of the improved enzymes in plants, ultimately resulting in the deployment of enhanced RuBisCO in field trials to evaluate the impact on crop yield. 

    Several recent developments make high-throughput engineering of crop RuBisCO possible. RuBisCO requires a complex chaperone network for proper assembly and function in plants. Chaperones are like helpers that guide proteins during their maturation process, shielding them from aggregation while coordinating their correct assembly. Wilson and his collaborators previously unlocked the ability to recombinantly produce plant RuBisCO outside of plant chloroplasts by reconstructing this chaperone network in Escherichia coli (E. coli). Whitney has now established that the RuBisCO enzymes from a range of agriculturally relevant crops, including potato, carrot, strawberry, and tobacco, can also be expressed using this technology. Whitney and Wilson have further developed a range of RuBisCO-dependent E. coli screens that can identify improved RuBisCO from complex gene libraries. Moreover, Shoulders and his lab have developed sophisticated in vivo mutagenesis technologies that enable efficient continuous directed evolution campaigns. Continuous directed evolution refers to a protein engineering process that can accelerate the steps of natural evolution simultaneously in an uninterrupted cycle in the lab, allowing for rapid testing of protein sequences. While Shoulders and Badran both have prior experience with cutting-edge directed evolution platforms, this will be the first time directed evolution is applied to RuBisCO from plants.
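    In spirit, a directed-evolution campaign is a mutate-screen-select loop. The toy below captures only that loop; the fitness function (standing in for a RuBisCO-dependent growth screen), the single-substitution mutagenesis, and every parameter are illustrative stand-ins for what is, in the Shoulders lab's system, a continuous in vivo process.

```python
import random

# Toy directed-evolution loop (a sketch of the general technique, not the
# lab's actual platform): each round mutates a population of sequences,
# "screens" them with a fitness function, and keeps the best performers.

def evolve(fitness, seed, rounds=20, pop_size=50, keep=5, rng=None):
    rng = rng or random.Random(0)
    alphabet = "ACDEFGHIKLMNPQRSTVWY"  # the 20 amino acids
    population = [seed]
    for _ in range(rounds):
        # mutagenesis: each survivor spawns variants with one substitution;
        # parents are retained, so the best fitness never decreases
        variants = list(population)
        for parent in population:
            for _ in range(pop_size // len(population)):
                i = rng.randrange(len(parent))
                variants.append(parent[:i] + rng.choice(alphabet) + parent[i + 1:])
        # screening and selection: keep the top `keep` variants
        population = sorted(variants, key=fitness, reverse=True)[:keep]
    return population[0]

# Example: evolve toward a hypothetical 8-residue target sequence.
target = "MKVLAARG"
best = evolve(lambda s: sum(a == b for a, b in zip(s, target)), seed="AAAAAAAA")
```

    Continuous systems collapse these discrete rounds into one uninterrupted cycle inside living cells, which is what makes campaigns on a protein as large as RuBisCO tractable.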

    Artificial intelligence is changing the way enzyme engineering is undertaken by researchers. Principal investigators Zhang and Bryson will leverage modern computational methods to simulate the dynamics of RuBisCO structure and explore its evolutionary landscape. Specifically, Zhang will use molecular dynamics simulations to simulate and monitor the conformational dynamics of the atoms in a protein and its programmed environment over time. This approach will help the team evaluate the effect of mutations and new chemical functionalities on the properties of RuBisCO. Bryson will employ artificial intelligence and machine learning to search the RuBisCO activity landscape for optimal sequences. The computational and biological arms of the EPiC platform will work together to both validate and inform each other’s approaches to accelerate the overall engineering effort.

    Shoulders and the group will deploy their designed enzymes in tobacco plants to evaluate their effects on growth and yield relative to natural RuBisCO. Gehring, a plant biologist, will assist with screening improved RuBisCO variants using the tobacco variety Nicotiana benthamiana, where transient expression can be deployed. Transient expression is a speedy approach to test whether novel engineered RuBisCO variants can be correctly synthesized in leaf chloroplasts. Variants that pass this quality-control checkpoint at MIT will be passed to the Whitney Lab at the Australian National University for stable transformation into Nicotiana tabacum (tobacco), enabling robust measurements of photosynthetic improvement. In a final step, Professor Long at the University of Illinois at Urbana-Champaign will perform field trials of the most promising variants.

    Even small improvements could have a big impact

    A common criticism of efforts to improve RuBisCO is that natural evolution has not already identified a better enzyme, possibly implying that none will be found. The traditional view posits a catalytic trade-off between RuBisCO’s CO2/O2 specificity factor and its CO2 fixation efficiency, leading to the belief that gains in specificity would be offset by even slower carbon fixation, or vice versa. This trade-off has been suggested to explain why natural evolution has been slow to achieve a better RuBisCO. But Shoulders and the team are convinced that the EPiC platform can unlock significant overall improvements to plant RuBisCO. This view is supported by the fact that Wilson and Whitney have previously used directed evolution to improve CO2 fixation efficiency by 50 percent in RuBisCO from cyanobacteria (the ancient progenitors of plant chloroplasts) while simultaneously increasing the specificity factor.

    The EPiC researchers anticipate that their initial variants could yield 20 percent increases in RuBisCO’s specificity factor without impairing other aspects of catalysis. More sophisticated variants could lift RuBisCO out of its evolutionary trap and display attributes not currently observed in nature. “If we achieve anywhere close to such an improvement and it translates to crops, the results could help transform agriculture,” Shoulders says. “If our accomplishments are more modest, it will still recruit massive new investments to this essential field.”

    Successful engineering of RuBisCO would be a scientific feat in its own right and would ignite renewed enthusiasm for improving plant CO2 fixation. Combined with other advances in photosynthetic engineering, such as improved light usage, it could help bring about a new green revolution in agriculture. Long-term impacts of the technology’s success will be measured in improvements to crop yield and grain availability, as well as resilience against yield losses under higher field temperatures. Moreover, improved land productivity together with policy initiatives would assist in reducing the environmental footprint of agriculture. With more “crop per drop,” reductions in water consumption from agriculture would be a major boost to sustainable farming practices.

    “Our collaborative team of biochemists and synthetic biologists, computational biologists, and chemists is deeply integrated with plant biologists and field trial experts, yielding a robust feedback loop for enzyme engineering,” Shoulders adds. “Together, this team will be able to make a concerted effort using the most modern, state-of-the-art techniques to engineer crop RuBisCO with an eye to helping make meaningful gains in securing a stable crop supply, hopefully with accompanying improvements in both food and water security.”

  • in

    Moving perovskite advancements from the lab to the manufacturing floor

    The following was issued as a joint announcement from MIT.nano and the MIT Research Laboratory for Electronics; CubicPV; Verde Technologies; Princeton University; and the University of California at San Diego.

    Tandem solar cells are made of stacked materials — such as silicon paired with perovskites — that together absorb more of the solar spectrum than single materials, resulting in a dramatic increase in efficiency. Their potential to generate significantly more power than conventional cells could make a meaningful difference in the race to combat climate change and speed the transition to a clean-energy future.

    However, current methods to create stable and efficient perovskite layers require time-consuming, painstaking rounds of design iteration and testing, inhibiting their development for commercial use. Today, the U.S. Department of Energy Solar Energy Technologies Office (SETO) announced that MIT has been selected to receive an $11.25 million cost-shared award to establish a new research center to address this challenge by using a co-optimization framework guided by machine learning and automation.

    A collaborative effort with lead industry participant CubicPV, solar startup Verde Technologies, and academic partners Princeton University and the University of California San Diego (UC San Diego), the center will bring together teams of researchers to support the creation of perovskite-silicon tandem solar modules that are co-designed for both stability and performance, with goals to significantly accelerate R&D and the transfer of these achievements into commercial environments.

    “Urgent challenges demand rapid action. This center will accelerate the development of tandem solar modules by bringing academia and industry into closer partnership,” says MIT professor of mechanical engineering Tonio Buonassisi, who will direct the center. “We’re grateful to the Department of Energy for supporting this powerful new model and excited to get to work.”

    Adam Lorenz, CTO of solar energy technology company CubicPV, stresses the importance of thinking about scale, alongside quality and efficiency, to accelerate the perovskite effort into the commercial environment. “Instead of chasing record efficiencies with tiny pixel-sized devices and later attempting to stabilize them, we will simultaneously target stability, reproducibility, and efficiency,” he says. “It’s a module-centric approach that creates a direct channel for R&D advancements into industry.”

    The center will be named Accelerated Co-Design of Durable, Reproducible, and Efficient Perovskite Tandems, or ADDEPT. The grant will be administered through the MIT Research Laboratory for Electronics (RLE).

    David Fenning, associate professor of nanoengineering at UC San Diego, has worked with Buonassisi since 2014 on the idea of merging materials, automation, and computation, specifically in the field of artificial intelligence for solar. Now, a central thrust of the ADDEPT project will be to deploy machine learning and robotic screening to optimize processing of perovskite-based solar materials for efficiency and durability.

    “We have already seen early indications of successful technology transfer between our UC San Diego robot PASCAL and industry,” says Fenning. “With this new center, we will bring research labs and the emerging perovskite industry together to improve reproducibility and reduce time to market.”

    “Our generation has an obligation to work collaboratively in the fight against climate change,” says Skylar Bagdon, CEO of Verde Technologies, which received the American-Made Perovskite Startup Prize. “Throughout the course of this center, Verde will do everything in our power to help this brilliant team transition lab-scale breakthroughs into the world where they can have an impact.”

    Several of the academic partners echoed the importance of the joint effort between academia and industry. Barry Rand, professor of electrical and computer engineering at the Andlinger Center for Energy and the Environment at Princeton University, pointed to the intersection of scientific knowledge and market awareness. “Understanding how chemistry affects films and interfaces will empower us to co-design for stability and performance,” he says. “The center will accelerate this use-inspired science, with close guidance from our end customers, the industry partners.”

    A critical resource for the center will be MIT.nano, a 200,000-square-foot research facility set in the heart of the campus. MIT.nano Director Vladimir Bulović, the Fariborz Maseeh (1990) Professor of Emerging Technology, says he envisions MIT.nano as a hub for industry and academic partners, facilitating technology development and transfer through shared lab space, open-access equipment, and streamlined intellectual property frameworks.

    “MIT has a history of groundbreaking innovation using perovskite materials for solar applications,” says Bulović. “We’re thrilled to help build on that history by anchoring ADDEPT at MIT.nano and working to help the nation advance the future of these promising materials.”

    MIT was selected as a part of the SETO Fiscal Year 2022 Photovoltaics (PV) funding program, an effort to reduce costs and supply chain vulnerabilities, further develop durable and recyclable solar technologies, and advance perovskite PV technologies toward commercialization. ADDEPT is one of the funded projects and will tackle perovskite durability, which will extend module life. The overarching goal of these projects is to lower the levelized cost of electricity generated by PV.

    Research groups involved with the ADDEPT project at MIT include Buonassisi’s Accelerated Materials Laboratory for Sustainability (AMLS), Bulović’s Organic and Nanostructured Electronics (ONE) Lab, and the Bawendi Group led by Lester Wolfe Professor in Chemistry Moungi Bawendi. Also working on the project is Jeremiah Mwaura, research scientist in the ONE Lab.

  • in

    Study: Shutting down nuclear power could increase air pollution

    Nearly 20 percent of today’s electricity in the United States comes from nuclear power. The U.S. has the largest nuclear fleet in the world, with 92 reactors scattered around the country. Many of these power plants have run for more than half a century and are approaching the end of their expected lifetimes.

    Policymakers are debating whether to retire the aging reactors or reinforce their structures to continue producing nuclear energy, which many consider a low-carbon alternative to climate-warming coal, oil, and natural gas.

    Now, MIT researchers say there’s another factor to consider in weighing the future of nuclear power: air quality. In addition to being a low carbon-emitting source, nuclear power is relatively clean in terms of the air pollution it generates. Without nuclear power, how would the pattern of air pollution shift, and who would feel its effects?

    The MIT team took on these questions in a new study appearing today in Nature Energy. They lay out a scenario in which every nuclear power plant in the country has shut down, and consider how other sources such as coal, natural gas, and renewable energy would fill the resulting energy needs throughout an entire year.

    Their analysis reveals that indeed, air pollution would increase, as coal, gas, and oil sources ramp up to compensate for nuclear power’s absence. This in itself may not be surprising, but the team has put numbers to the prediction, estimating that the increase in air pollution would have serious health effects, resulting in an additional 5,200 pollution-related deaths over a single year.

    If, however, more renewable energy sources become available to supply the energy grid, as they are expected to by the year 2030, air pollution would be curtailed, though not entirely. The team found that even under this heartier renewable scenario, there is still a slight increase in air pollution in some parts of the country, resulting in a total of 260 pollution-related deaths over one year.

    When they looked at the populations directly affected by the increased pollution, they found that Black or African American communities — a disproportionate number of whom live near fossil-fuel plants — experienced the greatest exposure.

    “This adds one more layer to the environmental health and social impacts equation when you’re thinking about nuclear shutdowns, where the conversation often focuses on local risks due to accidents and mining or long-term climate impacts,” says lead author Lyssa Freese, a graduate student in MIT’s Department of Earth, Atmospheric and Planetary Sciences (EAPS).

    “In the debate over keeping nuclear power plants open, air quality has not been a focus of that discussion,” adds study author Noelle Selin, a professor in MIT’s Institute for Data, Systems, and Society (IDSS) and EAPS. “What we found was that air pollution from fossil fuel plants is so damaging, that anything that increases it, such as a nuclear shutdown, is going to have substantial impacts, and for some people more than others.”

    The study’s MIT-affiliated co-authors also include Principal Research Scientist Sebastian Eastham and Guillaume Chossière SM ’17, PhD ’20, along with Alan Jenn of the University of California at Davis.

    Future phase-outs

    When nuclear power plants have closed in the past, fossil fuel use increased in response. In 1985, the closure of reactors in the Tennessee Valley prompted a spike in coal use, while the 2012 shutdown of a plant in California led to an increase in natural gas. In Germany, where nuclear power has almost completely been phased out, coal-fired power increased initially to fill the gap.

    Noting these trends, the MIT team wondered how the U.S. energy grid would respond if nuclear power were completely phased out.

    “We wanted to think about what future changes were expected in the energy grid,” Freese says. “We knew that coal use was declining, and there was a lot of work already looking at the impact of what that would have on air quality. But no one had looked at air quality and nuclear power, which we also noticed was on the decline.”

    In the new study, the team used an energy grid dispatch model developed by Jenn to assess how the U.S. energy system would respond to a shutdown of nuclear power. The model simulates the production of every power plant in the country and runs continuously to estimate, hour by hour, the energy demands in 64 regions across the country.

    Much like the way the actual energy market operates, the model chooses to turn a plant’s production up or down based on cost: Plants producing the cheapest energy at any given time are given priority to supply the grid over more costly energy sources.
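
    The cost-ordered logic described above can be sketched in a few lines. This is a toy merit-order dispatch, not Jenn's actual grid model; the plant names, marginal costs, and capacities below are hypothetical, illustrative numbers.

```python
# Toy merit-order dispatch: cheapest plants are dispatched first each hour
# until demand is met. Illustrative only; not the study's model.

def dispatch(plants, demand_mw):
    """plants: list of (name, marginal_cost_usd_per_mwh, capacity_mw).
    Returns {name: mw_supplied} for the plants that run this hour."""
    supplied = {}
    remaining = demand_mw
    for name, cost, capacity in sorted(plants, key=lambda p: p[1]):
        mw = min(capacity, remaining)
        if mw > 0:
            supplied[name] = mw
        remaining -= mw
        if remaining <= 0:
            break
    return supplied

fleet = [("nuclear", 10.0, 1000), ("coal", 25.0, 800), ("gas", 35.0, 1200)]
print(dispatch(fleet, 2000))
# Removing nuclear shifts its output onto the costlier fossil plants,
# mirroring the grid response the study examines:
print(dispatch([p for p in fleet if p[0] != "nuclear"], 2000))
```

    Repeating this choice for every hour and region, with real cost and emissions data, is essentially what the full dispatch model does at scale.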

    The team fed the model available data on each plant’s changing emissions and energy costs throughout an entire year. They then ran the model under different scenarios, including: an energy grid with no nuclear power, a baseline grid similar to today’s that includes nuclear power, and a grid with no nuclear power that also incorporates the additional renewable sources that are expected to be added by 2030.

    They combined each simulation with an atmospheric chemistry model to simulate how each plant’s various emissions travel around the country and to overlay these tracks onto maps of population density. For populations in the path of pollution, they calculated the risk of premature death based on their degree of exposure.
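
    The final step, converting exposure into premature deaths, is typically done with a log-linear concentration-response function of the kind used in standard air-quality health impact assessments (e.g., BenMAP-style analyses). The sketch below shows that general form; the function name and every number in it are illustrative assumptions, not the study's inputs or results.

```python
import math

def attributable_deaths(population, baseline_mortality, beta, delta_pm25):
    """Excess annual deaths attributable to a pollution increase.

    population: exposed population
    baseline_mortality: baseline annual deaths per person
    beta: concentration-response coefficient per ug/m3 (illustrative)
    delta_pm25: increase in annual-average PM2.5, in ug/m3
    """
    relative_increase = 1.0 - math.exp(-beta * delta_pm25)
    return population * baseline_mortality * relative_increase

# Illustrative: 1 million people, 0.8% baseline mortality, 2 ug/m3 increase.
print(round(attributable_deaths(1_000_000, 0.008, 0.006, 2.0)))  # ~95 deaths/year
```

    Summing such estimates over every affected grid cell and population group yields national totals like the study's 5,200 figure.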

    System response

    [Video: courtesy of the researchers, edited by MIT News]

    Their analysis showed a clear pattern: Without nuclear power, air pollution worsened in general, mainly affecting regions on the East Coast, where nuclear power plants are mostly concentrated. Without those plants, the team observed an uptick in production from coal and gas plants, resulting in 5,200 pollution-related deaths across the country, compared to the baseline scenario.

    They also calculated that more people are likely to die prematurely due to climate impacts from the increase in carbon dioxide emissions, as the grid compensates for nuclear power’s absence. The climate-related effects from this additional influx of carbon dioxide could lead to 160,000 additional deaths over the next century.

    “We need to be thoughtful about how we’re retiring nuclear power plants if we are trying to think about them as part of an energy system,” Freese says. “Shutting down something that doesn’t have direct emissions itself can still lead to increases in emissions, because the grid system will respond.”

    “This might mean that we need to deploy even more renewables, in order to fill the hole left by nuclear, which is essentially a zero-emissions energy source,” Selin adds. “Otherwise we will have a reduction in air quality that we weren’t necessarily counting on.”

    This study was supported, in part, by the U.S. Environmental Protection Agency.

  • in

    Study: Smoke particles from wildfires can erode the ozone layer

    A wildfire can pump smoke up into the stratosphere, where the particles drift for over a year. A new MIT study has found that while suspended there, these particles can trigger chemical reactions that erode the protective ozone layer shielding the Earth from the sun’s damaging ultraviolet radiation.

    The study, which appears today in Nature, focuses on the smoke from the “Black Summer” megafire in eastern Australia, which burned from December 2019 into January 2020. The fires — the country’s most devastating on record — scorched tens of millions of acres and pumped more than 1 million tons of smoke into the atmosphere.

    The MIT team identified a new chemical reaction by which smoke particles from the Australian wildfires made ozone depletion worse. By triggering this reaction, the fires likely contributed to a 3-5 percent depletion of total ozone at mid-latitudes in the Southern Hemisphere, in regions overlying Australia, New Zealand, and parts of Africa and South America.

    The researchers’ model also indicates the fires had an effect in the polar regions, eating away at the edges of the ozone hole over Antarctica. By late 2020, smoke particles from the Australian wildfires widened the Antarctic ozone hole by 2.5 million square kilometers — 10 percent of its area compared to the previous year.

    It’s unclear what long-term effect wildfires will have on ozone recovery. The United Nations recently reported that the ozone hole, and ozone depletion around the world, is on a recovery track, thanks to a sustained international effort to phase out ozone-depleting chemicals. But the MIT study suggests that as long as these chemicals persist in the atmosphere, large fires could spark a reaction that temporarily depletes ozone.

    “The Australian fires of 2020 were really a wake-up call for the science community,” says Susan Solomon, the Lee and Geraldine Martin Professor of Environmental Studies at MIT and a leading climate scientist who first identified the chemicals responsible for the Antarctic ozone hole. “The effect of wildfires was not previously accounted for in [projections of] ozone recovery. And I think that effect may depend on whether fires become more frequent and intense as the planet warms.”

    The study is led by Solomon and MIT research scientist Kane Stone, along with collaborators from the Institute for Environmental and Climate Research in Guangzhou, China; the U.S. National Oceanic and Atmospheric Administration; the U.S. National Center for Atmospheric Research; and Colorado State University.

    Chlorine cascade

    The new study expands on a 2022 discovery by Solomon and her colleagues, in which they first identified a chemical link between wildfires and ozone depletion. The researchers found that chlorine-containing compounds, originally emitted by factories in the form of chlorofluorocarbons (CFCs), could react with the surface of fire aerosols. This interaction, they found, set off a chemical cascade that produced chlorine monoxide — the ultimate ozone-depleting molecule. Their results showed that the Australian wildfires likely depleted ozone through this newly identified chemical reaction.

    “But that didn’t explain all the changes that were observed in the stratosphere,” Solomon says. “There was a whole bunch of chlorine-related chemistry that was totally out of whack.”

    In the new study, the team took a closer look at the composition of molecules in the stratosphere following the Australian wildfires. They combed through three independent sets of satellite data and observed that in the months following the fires, concentrations of hydrochloric acid dropped significantly at mid-latitudes, while chlorine monoxide spiked.

    Hydrochloric acid (HCl) is present in the stratosphere as CFCs break down naturally over time. As long as chlorine is bound in the form of HCl, it doesn’t have a chance to destroy ozone. But if HCl breaks apart, the freed chlorine can react with ozone to form ozone-depleting chlorine monoxide.
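
    The destruction step is the standard chlorine catalytic cycle of textbook stratospheric chemistry (general background, not a result of this study): once freed from HCl, a single chlorine atom can be regenerated and destroy many ozone molecules.

```latex
\begin{aligned}
\mathrm{Cl} + \mathrm{O_3} &\longrightarrow \mathrm{ClO} + \mathrm{O_2} \\
\mathrm{ClO} + \mathrm{O} &\longrightarrow \mathrm{Cl} + \mathrm{O_2} \\
\text{net:}\quad \mathrm{O_3} + \mathrm{O} &\longrightarrow 2\,\mathrm{O_2}
\end{aligned}
```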

    In the polar regions, HCl can break apart when it interacts with the surface of cloud particles at frigid temperatures of about 155 kelvins. However, this reaction was not expected to occur at mid-latitudes, where temperatures are much warmer.

    “The fact that HCl at mid-latitudes dropped by this unprecedented amount was to me kind of a danger signal,” Solomon says.

    She wondered: What if HCl could also interact with smoke particles, at warmer temperatures and in a way that released chlorine to destroy ozone? If such a reaction was possible, it would explain the imbalance of molecules and much of the ozone depletion observed following the Australian wildfires.

    Smoky drift

    Solomon and her colleagues dug through the chemical literature to see what sort of organic molecules could react with HCl at warmer temperatures to break it apart.

    “Lo and behold, I learned that HCl is extremely soluble in a whole broad range of organic species,” Solomon says. “It likes to glom on to lots of compounds.”

    The question then, was whether the Australian wildfires released any of those compounds that could have triggered HCl’s breakup and any subsequent depletion of ozone. When the team looked at the composition of smoke particles in the first days after the fires, the picture was anything but clear.

    “I looked at that stuff and threw up my hands and thought, there’s so much stuff in there, how am I ever going to figure this out?” Solomon recalls. “But then I realized it had actually taken some weeks before you saw the HCl drop, so you really need to look at the data on aged wildfire particles.”

    When the team expanded their search, they found that smoke particles persisted over months, circulating in the stratosphere at mid-latitudes, in the same regions and times when concentrations of HCl dropped.

    “It’s the aged smoke particles that really take up a lot of the HCl,” Solomon says. “And then you get, amazingly, the same reactions that you get in the ozone hole, but over mid-latitudes, at much warmer temperatures.”

    When the team incorporated this new chemical reaction into a model of atmospheric chemistry, and simulated the conditions of the Australian wildfires, they observed a 5 percent depletion of ozone throughout the stratosphere at mid-latitudes, and a 10 percent widening of the ozone hole over Antarctica.

    The reaction with HCl is likely the main pathway by which wildfires can deplete ozone. But Solomon suspects there may be other chlorine-containing compounds drifting in the stratosphere that wildfires could unlock.

    “There’s now sort of a race against time,” Solomon says. “Hopefully, chlorine-containing compounds will have been destroyed, before the frequency of fires increases with climate change. This is all the more reason to be vigilant about global warming and these chlorine-containing compounds.”

    This research was supported, in part, by NASA and the U.S. National Science Foundation.

  • in

    A more sustainable way to generate phosphorus

    Phosphorus is an essential ingredient in thousands of products, including herbicides, lithium-ion batteries, and even soft drinks. Most of this phosphorus comes from an energy-intensive process that contributes significantly to global carbon emissions.

    In an effort to reduce that carbon footprint, MIT chemists have devised an alternative way to generate white phosphorus, a critical intermediate in the manufacture of those phosphorus-containing products. Their approach, which uses electricity to speed up a key chemical reaction, could reduce the carbon emissions of the process by half or even more, the researchers say.

    “White phosphorus is currently an indispensable intermediate, and our process dramatically reduces the carbon footprint of converting phosphate to white phosphorus,” says Yogesh Surendranath, an associate professor of chemistry at MIT and the senior author of the study.

    The new process reduces the carbon footprint of white phosphorus production in two ways: It reduces the temperatures required for the reaction, and it generates significantly less carbon dioxide as a waste product.

    Recent MIT graduate Jonathan “Jo” Melville PhD ’21 and MIT graduate student Andrew Licini are the lead authors of the paper, which appears today in ACS Central Science.

    Purifying phosphorus

    When phosphorus is mined out of the ground, it is in the form of phosphate, a mineral whose basic unit comprises one atom of phosphorus bound to four oxygen atoms. About 95 percent of this phosphate ore is used to make fertilizer. The remaining phosphate ore is processed separately into white phosphorus, a molecule composed of four phosphorus atoms bound to each other. White phosphorus is then fed into a variety of chemical processes that are used to manufacture many different products, such as lithium battery electrolytes and semiconductor dopants.

    Converting those mined phosphates into white phosphorus accounts for a substantial fraction of the carbon footprint of the entire phosphorus industry, Surendranath says. The most energy-intensive part of the process is breaking the bonds between phosphorus and oxygen, which are very stable.

    Using the traditional “thermal process,” those bonds are broken by heating carbon coke and phosphate rock to a temperature of 1,500 degrees Celsius. In this process, the carbon serves to strip away the oxygen atoms from phosphorus, leading to the eventual generation of CO2 as a byproduct. In addition, sustaining those temperatures requires a great deal of energy, adding to the carbon footprint of the process.
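
    For reference, the thermal process is conventionally written with the textbook stoichiometry below (standard chemistry, not taken from the paper); silica is also charged into the furnace as a flux, and the CO byproduct is typically burned, ending up as CO2.

```latex
2\,\mathrm{Ca_3(PO_4)_2} + 6\,\mathrm{SiO_2} + 10\,\mathrm{C}
  \;\xrightarrow{\;\approx 1500\,^{\circ}\mathrm{C}\;}\;
  \mathrm{P_4} + 6\,\mathrm{CaSiO_3} + 10\,\mathrm{CO}
```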

    “That process hasn’t changed substantially since its inception over a century ago. Our goal was to figure out how we could develop a process that would substantially lower the carbon footprint of this process,” Surendranath says. “The idea was to combine it with renewable electricity and drive that conversion of phosphate to white phosphorus with electrons rather than using carbon.”

    To do that, the researchers had to come up with an alternative way to weaken the strong phosphorus-oxygen bonds found in phosphates. They achieved this by controlling the environment in which the reaction occurs. The researchers found that the reaction could be promoted using a dehydrated form of phosphoric acid, which contains long chains of phosphate salts held together by bonds called phosphoryl anhydrides. These bonds help to weaken the phosphorus-oxygen bonds.

    When the researchers run an electric current through these salts, electrons break the weakened bonds, allowing the phosphorus atoms to break free and bind to each other to form white phosphorus. At the temperatures needed for this system (about 800 C), phosphorus exists as a gas, so it can bubble out of the solution and be collected in an external chamber.

    Decarbonization

    The electrode that the researchers used for this demonstration relies on carbon as a source of electrons, so the process generates some carbon dioxide as a byproduct. However, they are now working on swapping that electrode out for one that would use phosphate itself as the electron source, which would further reduce the carbon footprint by cleanly separating phosphate into phosphorus and oxygen.

    With the process reported in this paper, the researchers have reduced the overall carbon footprint for generating white phosphorus by about 50 percent. With future modifications, they hope to bring the carbon emissions down to nearly zero, in part by using renewable energy such as solar or wind power to drive the electric current required.

    If the researchers succeed in scaling up their process and making it widely available, it could allow industrial users to generate white phosphorus on site instead of having it shipped from the few places in the world where it is currently manufactured. That would cut down on the risks of transporting white phosphorus, which is an explosive material.

    “We’re excited about the prospect of doing on-site generation of this intermediate, so you don’t have to do the transportation and distribution,” Surendranath says. “If you could decentralize this production, the end user could make it on site and use it in an integrated fashion.”

    In order to do this study, the researchers had to develop new tools for controlling the electrolytes (such as salts and acids) present in the environment, and for measuring how those electrolytes affect the reaction. Now, they plan to use the same approach to try to develop lower-carbon processes for isolating other industrially important elements, such as silicon and iron.

    “This work falls within our broader interests in decarbonizing these legacy industrial processes that have a huge carbon footprint,” Surendranath says. “The basic science that leads us there is understanding how you can tailor the electrolytes to foster these processes.”

    The research was funded by the UMRP Partnership for Progress on Sustainable Development in Africa, a fellowship from the MIT Tata Center for Technology and Design, and a National Defense Science and Engineering Graduate Fellowship.

  • in

    Methane research takes on new urgency at MIT

    One of the most notable climate change provisions in the 2022 Inflation Reduction Act is the first U.S. federal tax on a greenhouse gas (GHG). That the fee targets methane (CH4), rather than carbon dioxide (CO2), emissions is indicative of the urgency the scientific community has placed on reducing this short-lived but powerful gas. Methane persists in the air about 12 years — compared to more than 1,000 years for CO2 — yet it immediately causes about 120 times more warming upon release. The gas is responsible for at least a quarter of today’s gross warming. 

    “Methane has a disproportionate effect on near-term warming,” says Desiree Plata, the director of MIT Methane Network. “CH4 does more damage than CO2 no matter how long you run the clock. By removing methane, we could potentially avoid critical climate tipping points.” 

    Because GHGs have a runaway effect on climate, reductions made now will have a far greater impact than the same reductions made in the future. Cutting methane emissions will slow the thawing of permafrost, which could otherwise lead to massive methane releases, as well as reduce increasing emissions from wetlands.  

    “The goal of MIT Methane Network is to reduce methane emissions by 45 percent by 2030, which would save up to 0.5 degree C of warming by 2100,” says Plata, an associate professor of civil and environmental engineering at MIT and director of the Plata Lab. “When you consider that governments are trying for a 1.5-degree reduction of all GHGs by 2100, this is a big deal.” 

    At normal concentrations, methane, like CO2, poses no direct health risks. Methane does, however, contribute to the formation of ozone, which in the lower atmosphere is a key component of air pollution, leading to “higher rates of asthma and increased emergency room visits,” says Plata. 

    Methane-related projects at the Plata Lab include a filter made of zeolite — the same clay-like material used in cat litter — designed to convert methane into CO2 at dairy farms and coal mines. At first glance, the technology would appear to be a bit of a hard sell, since it converts one GHG into another. Yet the zeolite filter’s low carbon and dollar costs, combined with the disproportionate warming impact of methane, make it a potential game-changer.

    The sense of urgency about methane has been amplified by recent studies that show humans are generating far more methane emissions than previously estimated, and that the rates are rising rapidly. Exactly how much methane is in the air is uncertain. Current methods for measuring atmospheric methane, such as ground, drone, and satellite sensors, “are not readily abundant and do not always agree with each other,” says Plata.  

    The Plata Lab is collaborating with Tim Swager in the MIT Department of Chemistry to develop low-cost methane sensors. “We are developing chemiresistive sensors that cost about a dollar that you could place near energy infrastructure to back-calculate where leaks are coming from,” says Plata.  

    The researchers are working on improving the accuracy of the sensors using machine learning techniques and are planning to integrate internet-of-things technology to transmit alerts. Plata and Swager are not alone in focusing on data collection: the Inflation Reduction Act adds significant funding for methane sensor research. 

    Other research at the Plata Lab includes the development of nanomaterials and heterogeneous catalysis techniques for environmental applications. The lab also explores mitigation solutions for industrial waste, particularly those related to the energy transition. Plata is the co-founder of a lithium-ion battery recycling startup called Nth Cycle. 

    On a more fundamental level, the Plata Lab is exploring how to develop products with environmental and social sustainability in mind. “Our overarching mission is to change the way that we invent materials and processes so that environmental objectives are incorporated along with traditional performance and cost metrics,” says Plata. “It is important to do that rigorous assessment early in the design process.”

    Video: MIT amps up methane research 

    The MIT Methane Network brings together 26 researchers from MIT along with representatives of other institutions “that are dedicated to the idea that we can reduce methane levels in our lifetime,” says Plata. The organization supports research such as Plata’s zeolite and sensor projects, as well as designing pipeline-fixing robots, developing methane-based fuels for clean hydrogen, and researching the capture and conversion of methane into liquid chemical precursors for pharmaceuticals and plastics. Other members are researching policies to encourage more sustainable agriculture and land use, as well as methane-related social justice initiatives. 

    “Methane is an especially difficult problem because it comes from all over the place,” says Plata. A recent Global Carbon Project study estimated that half of methane emissions are caused by humans. This is led by waste and agriculture (28 percent), including cow and sheep belching, rice paddies, and landfills.  

    Fossil fuels represent 18 percent of the total budget. Of this, about 63 percent is derived from oil and gas production and pipelines, 33 percent from coal mining activities, and 5 percent from industry and transportation. Human-caused biomass burning, primarily from slash-and-burn agriculture, emits about 4 percent of the global total.  

    The other half of the methane budget includes natural methane emissions from wetlands (20 percent) and other natural sources (30 percent). The latter includes permafrost melting and natural biomass burning, such as forest fires started by lightning.  
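    As a quick consistency check, the budget shares quoted above can be tallied in a short script (the numbers are the article's rounded percentages from the Global Carbon Project estimate, not independent data):

    ```python
    # Shares of the global methane budget as quoted in the article,
    # as percentages of total emissions.
    budget = {
        "waste and agriculture": 28,   # cow/sheep belching, rice paddies, landfills
        "fossil fuels": 18,            # oil, gas, coal, industry, transportation
        "human biomass burning": 4,    # primarily slash-and-burn agriculture
        "wetlands": 20,                # natural
        "other natural sources": 30,   # permafrost melt, lightning-started fires
    }

    anthropogenic = {"waste and agriculture", "fossil fuels", "human biomass burning"}
    human_share = sum(v for k, v in budget.items() if k in anthropogenic)
    natural_share = sum(v for k, v in budget.items() if k not in anthropogenic)

    # Matches the article's statement that half of emissions are human-caused.
    print(human_share, natural_share)  # 50 50
    ```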

    With increases in global warming and population, the line between anthropogenic and natural causes is getting fuzzier. “Human activities are accelerating natural emissions,” says Plata. “Climate change increases the release of methane from wetlands and permafrost and leads to larger forest and peat fires.”  

    The calculations can get complicated. For example, wetlands provide benefits, including CO2 capture, biological diversity, and resilience to sea level rise, that more than compensate for their methane releases. Meanwhile, draining swamps for development increases emissions. 

    Over 100 nations have signed on to the U.N.’s Global Methane Pledge, committing to cut anthropogenic emissions by at least 30 percent within the next 10 years. The U.N. report estimates that this goal can be achieved using proven technologies and that about 60 percent of these reductions can be accomplished at low cost. 

    Much of the savings would come from greater efficiencies in fossil fuel extraction, processing, and delivery. The methane fees in the Inflation Reduction Act are primarily focused on encouraging fossil fuel companies to accelerate ongoing efforts to cap old wells, flare off excess emissions, and tighten pipeline connections.  

    Fossil fuel companies have already pledged far greater reductions for methane than for CO2, which is central to their business. This is due in part to the potential savings, as well as preparation for methane regulations expected from the Environmental Protection Agency in late 2022. The regulations build upon existing EPA oversight of drilling operations and will likely be exempt from the U.S. Supreme Court ruling that limits the federal government’s ability to regulate GHGs. 

    Zeolite filter targets methane in dairy and coal 

    The “low-hanging fruit” of gas-stream mitigation addresses the roughly 20 percent of total methane emissions in which the gas is released at concentrations high enough to flare. Plata’s zeolite filter aims at the thornier challenge: the remaining 80 percent of emissions, which are too dilute to burn. 

    Plata found inspiration in decades-old catalysis research for turning methane into methanol. One strategy has been to use an abundant, low-cost aluminosilicate clay called zeolite.  

    “The methanol creation process is challenging because you need to separate a liquid, and it has very low efficiency,” says Plata. “Yet zeolite can be very efficient at converting methane into CO2, and it is much easier because it does not require liquid separation. Converting methane to CO2 sounds like a bad thing, but there is a major anti-warming benefit. And because methane is much more dilute than CO2, the relative CO2 contribution is minuscule.”  

    Using zeolite to create methanol requires highly concentrated methane, high temperatures and pressures, and industrial processing conditions. Yet Plata’s process, which dopes the zeolite with copper, operates in the presence of oxygen at much lower temperatures under typical pressures. “We let the methane proceed the way it wants from a thermodynamic perspective from methane to methanol down to CO2,” says Plata. 

    Researchers around the world are working on other dilute methane removal technologies. Projects include spraying iron salt aerosols into sea air, where they react with natural chlorine or bromine radicals to break down methane. Most of these geoengineering solutions, however, are difficult to measure and would require massive scale to make a difference.  

    Plata is focusing her zeolite filters on environments where concentrations are high, but not so high as to be flammable. “We are trying to scale zeolite into filters that you could snap onto the side of a cross-ventilation fan in a dairy barn or in a ventilation air shaft in a coal mine,” says Plata. “For every packet of air we bring in, we take a lot of methane out, so we get more bang for our buck.”  

    The major challenge is creating a filter that can handle high flow rates without getting clogged or falling apart. Dairy barn air handlers can push air at up to 5,000 cubic feet per minute (CFM), and coal mine handlers can approach 500,000 CFM. 

    Plata is exploring engineering options including fluidized bed reactors with floating catalyst particles. Another filter solution, based in part on catalytic converters, features “higher-order geometric structures where you have a porous material with a long path length where the gas can interact with the catalyst,” says Plata. “This avoids the challenge with fluidized beds of containing catalyst particles in the reactor. Instead, they are fixed within a structured material.”  

    Competing technologies for removing methane from mine shafts “operate at temperatures of 1,000 to 1,200 degrees C, requiring a lot of energy and risking explosion,” says Plata. “Our technology avoids safety concerns by operating at 300 to 400 degrees C. It reduces energy use and provides more tractable deployment costs.” 

    Potentially, energy and dollar costs could be further reduced in coal mines by capturing the heat generated by the conversion process. “In coal mines, you have enrichments above a half-percent methane, but below the 4 percent flammability threshold,” says Plata. “The excess heat from the process could be used to generate electricity using off-the-shelf converters.” 

    Plata’s dairy barn research is funded by the Gerstner Family Foundation and the coal mining project by the U.S. Department of Energy. “The DOE would like us to spin out the technology for scale-up within three years,” says Plata. “We cannot guarantee we will hit that goal, but we are trying to develop this as quickly as possible. Our society needs to start reducing methane emissions now.”