More stories

  • Methane research takes on new urgency at MIT

    One of the most notable climate change provisions in the 2022 Inflation Reduction Act is the first U.S. federal tax on a greenhouse gas (GHG). That the fee targets methane (CH4), rather than carbon dioxide (CO2), emissions is indicative of the urgency the scientific community has placed on reducing this short-lived but powerful gas. Methane persists in the air about 12 years — compared to more than 1,000 years for CO2 — yet it immediately causes about 120 times more warming upon release. The gas is responsible for at least a quarter of today’s gross warming. 
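    The arithmetic behind those figures can be checked directly. The sketch below is a minimal, back-of-the-envelope calculation (our assumptions: a single 12-year exponential decay for methane, a standard multi-exponential CO2 impulse response from Joos et al. 2013, and the roughly 120-fold per-kilogram potency quoted above); it recovers methane's commonly cited 100-year global warming potential of about 28.

```python
import numpy as np

# Time grid over a 100-year horizon (years)
t = np.linspace(0.0, 100.0, 100_001)

# Fraction of a methane pulse remaining: single exponential, ~12-year lifetime
ch4_remaining = np.exp(-t / 12.0)

# Fraction of a CO2 pulse remaining: multi-exponential impulse response
# (coefficients from Joos et al., 2013)
a = [0.217, 0.224, 0.282, 0.276]
tau = [np.inf, 394.4, 36.5, 4.304]
co2_remaining = sum(ai * np.exp(-t / ti) for ai, ti in zip(a, tau))

# Methane is ~120x more potent per kilogram at the moment of release
RADIATIVE_RATIO = 120.0

# GWP-100: ratio of time-integrated warming from 1 kg CH4 vs. 1 kg CO2
gwp100 = RADIATIVE_RATIO * np.trapz(ch4_remaining, t) / np.trapz(co2_remaining, t)
print(f"GWP-100 of methane: {gwp100:.0f}")  # ~28, matching IPCC values
```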

    “Methane has a disproportionate effect on near-term warming,” says Desiree Plata, director of the MIT Methane Network. “CH4 does more damage than CO2 no matter how long you run the clock. By removing methane, we could potentially avoid critical climate tipping points.” 

    Because GHGs have a runaway effect on climate, reductions made now will have a far greater impact than the same reductions made later. Cutting methane emissions will slow the thawing of permafrost, which could otherwise lead to massive methane releases, and curb rising emissions from wetlands.  

    “The goal of the MIT Methane Network is to reduce methane emissions by 45 percent by 2030, which would save up to 0.5 degree C of warming by 2100,” says Plata, an associate professor of civil and environmental engineering at MIT and director of the Plata Lab. “When you consider that governments are trying to limit warming from all GHGs to 1.5 degrees C by 2100, this is a big deal.” 

    At normal concentrations, methane, like CO2, poses no direct health risks. But methane contributes to the formation of ozone, which in the lower atmosphere is a key component of air pollution, leading to “higher rates of asthma and increased emergency room visits,” says Plata. 

    Methane-related projects at the Plata Lab include a filter made of zeolite — the same clay-like material used in cat litter — designed to convert methane into CO2 at dairy farms and coal mines. At first glance, the technology would appear to be a bit of a hard sell, since it converts one GHG into another. Yet the zeolite filter’s low carbon and dollar costs, combined with the disproportionate warming impact of methane, make it a potential game-changer.

    The sense of urgency about methane has been amplified by recent studies that show humans are generating far more methane emissions than previously estimated, and that the rates are rising rapidly. Exactly how much methane is in the air is uncertain. Current methods for measuring atmospheric methane, such as ground, drone, and satellite sensors, “are not readily abundant and do not always agree with each other,” says Plata.  

    The Plata Lab is collaborating with Tim Swager in the MIT Department of Chemistry to develop low-cost methane sensors. “We are developing chemiresistive sensors that cost about a dollar that you could place near energy infrastructure to back-calculate where leaks are coming from,” says Plata.  
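    The “back-calculate” remark describes an inverse problem: given concentration readings at known sensor positions, infer where the leak most likely sits. The sketch below is purely illustrative, not the lab's method; the inverse-square dispersion model, sensor layout, and readings are all invented for demonstration.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical sensor positions (meters) and methane readings (ppm above background)
sensors = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]])
readings = np.array([4.1, 1.0, 1.1, 0.5])

def predicted(params, positions):
    """Toy dispersion model: concentration falls off as 1/r^2 from a point leak."""
    x, y, strength = params
    r2 = np.sum((positions - [x, y]) ** 2, axis=1) + 1.0  # +1 avoids r = 0 blowup
    return strength / r2

def misfit(params):
    """Sum of squared differences between modeled and observed readings."""
    return np.sum((predicted(params, sensors) - readings) ** 2)

# Least-squares fit for leak location and strength
result = minimize(misfit, x0=[50.0, 50.0, 1000.0], method="Nelder-Mead")
x, y, strength = result.x
print(f"Estimated leak near ({x:.0f} m, {y:.0f} m)")
```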

    The researchers are working on improving the accuracy of the sensors using machine learning techniques and are planning to integrate internet-of-things technology to transmit alerts. Plata and Swager are not alone in focusing on data collection: the Inflation Reduction Act adds significant funding for methane sensor research. 

    Other research at the Plata Lab includes the development of nanomaterials and heterogeneous catalysis techniques for environmental applications. The lab also explores mitigation solutions for industrial waste, particularly those related to the energy transition. Plata is the co-founder of a lithium-ion battery recycling startup called Nth Cycle. 

    On a more fundamental level, the Plata Lab is exploring how to develop products with environmental and social sustainability in mind. “Our overarching mission is to change the way that we invent materials and processes so that environmental objectives are incorporated along with traditional performance and cost metrics,” says Plata. “It is important to do that rigorous assessment early in the design process.”

    Video: MIT amps up methane research 

    The MIT Methane Network brings together 26 researchers from MIT along with representatives of other institutions “that are dedicated to the idea that we can reduce methane levels in our lifetime,” says Plata. The organization supports research such as Plata’s zeolite and sensor projects, as well as efforts to design pipeline-fixing robots, develop methane-based fuels for clean hydrogen, and capture and convert methane into liquid chemical precursors for pharmaceuticals and plastics. Other members are researching policies to encourage more sustainable agriculture and land use, as well as methane-related social justice initiatives. 

    “Methane is an especially difficult problem because it comes from all over the place,” says Plata. A recent Global Carbon Project study estimated that half of methane emissions are caused by humans. That half is led by waste and agriculture (28 percent), including cow and sheep belching, rice paddies, and landfills.  

    Fossil fuels represent 18 percent of the total budget. Of this, about 63 percent is derived from oil and gas production and pipelines, 33 percent from coal mining activities, and 5 percent from industry and transportation. Human-caused biomass burning, primarily from slash-and-burn agriculture, emits about 4 percent of the global total.  

    The other half of the methane budget includes natural methane emissions from wetlands (20 percent) and other natural sources (30 percent). The latter includes permafrost melting and natural biomass burning, such as forest fires started by lightning.  
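    Tallying the shares quoted above confirms the even split between anthropogenic and natural sources; a quick sanity check (values as cited in this article):

```python
# Approximate shares of the global methane budget cited above (percent of total)
budget = {
    "waste and agriculture": 28,   # cow/sheep belching, rice paddies, landfills
    "fossil fuels": 18,            # oil and gas, coal mining, industry/transport
    "human biomass burning": 4,    # slash-and-burn agriculture
    "wetlands": 20,                # natural emissions
    "other natural sources": 30,   # permafrost melt, lightning-started fires
}
anthropogenic = 28 + 18 + 4   # = 50: half the budget is human-caused
natural = 20 + 30             # = 50
assert anthropogenic + natural == sum(budget.values()) == 100
```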

    With increases in global warming and population, the line between anthropogenic and natural causes is getting fuzzier. “Human activities are accelerating natural emissions,” says Plata. “Climate change increases the release of methane from wetlands and permafrost and leads to larger forest and peat fires.”  

    The calculations can get complicated. For example, wetlands provide benefits in CO2 capture, biological diversity, and resilience to sea level rise that more than compensate for their methane releases. Draining swamps for development, meanwhile, increases emissions. 

    Over 100 nations have signed onto the U.N.’s Global Methane Pledge, committing to cut anthropogenic methane emissions by at least 30 percent within the next 10 years. A U.N. report estimates that this goal can be achieved using proven technologies, and that about 60 percent of the reductions can be accomplished at low cost. 

    Much of the savings would come from greater efficiencies in fossil fuel extraction, processing, and delivery. The methane fees in the Inflation Reduction Act are primarily focused on encouraging fossil fuel companies to accelerate ongoing efforts to cap old wells, flare off excess emissions, and tighten pipeline connections.  

    Fossil fuel companies have already made far greater pledges to reduce methane than they have for CO2, which is central to their business. This is due in part to the potential savings, and in part to preparation for methane regulations expected from the Environmental Protection Agency in late 2022. The regulations build upon existing EPA oversight of drilling operations and will likely be exempt from the U.S. Supreme Court’s ruling that limits the federal government’s ability to regulate GHGs. 

    Zeolite filter targets methane in dairy and coal 

    The “low-hanging fruit” of gas stream mitigation addresses most of the 20 percent of total methane emissions in which the gas is released at concentrations high enough for flaring. Plata’s zeolite filter aims at the thornier challenge of the other 80 percent: emissions too dilute to burn. 

    Plata found inspiration in decades-old catalysis research for turning methane into methanol. One strategy has been to use an abundant, low-cost aluminosilicate clay called zeolite.  

    “The methanol creation process is challenging because you need to separate a liquid, and it has very low efficiency,” says Plata. “Yet zeolite can be very efficient at converting methane into CO2, and it is much easier because it does not require liquid separation. Converting methane to CO2 sounds like a bad thing, but there is a major anti-warming benefit. And because methane is much more dilute than CO2, the relative CO2 contribution is minuscule.”  
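    Chemically, the filter drives complete oxidation, with each methane molecule ending up as one CO2 molecule plus water:

```latex
\mathrm{CH_4} + 2\,\mathrm{O_2} \longrightarrow \mathrm{CO_2} + 2\,\mathrm{H_2O}
```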

    Using zeolite to create methanol requires highly concentrated methane, high temperatures and pressures, and industrial processing conditions. Yet Plata’s process, which dopes the zeolite with copper, operates in the presence of oxygen at much lower temperatures under typical pressures. “We let the methane proceed the way it wants from a thermodynamic perspective from methane to methanol down to CO2,” says Plata. 

    Researchers around the world are working on other dilute methane removal technologies. Projects include spraying iron salt aerosols into sea air, where they react with natural chlorine or bromine radicals to oxidize methane. Most of these geoengineering solutions, however, are difficult to measure and would require massive scale to make a difference.  

    Plata is focusing her zeolite filters on environments where concentrations are high, but not so high as to be flammable. “We are trying to scale zeolite into filters that you could snap onto the side of a cross-ventilation fan in a dairy barn or in a ventilation air shaft in a coal mine,” says Plata. “For every packet of air we bring in, we take a lot of methane out, so we get more bang for our buck.”  

    The major challenge is creating a filter that can handle high flow rates without clogging or falling apart. Dairy barn air handlers can push air at up to 5,000 cubic feet per minute (CFM), and coal mine handlers can approach 500,000 CFM. 

    Plata is exploring engineering options including fluidized bed reactors with floating catalyst particles. Another filter solution, based in part on catalytic converters, features “higher-order geometric structures where you have a porous material with a long path length where the gas can interact with the catalyst,” says Plata. “This avoids the challenge with fluidized beds of containing catalyst particles in the reactor. Instead, they are fixed within a structured material.”  

    Competing technologies for removing methane from mine shafts “operate at temperatures of 1,000 to 1,200 degrees C, requiring a lot of energy and risking explosion,” says Plata. “Our technology avoids safety concerns by operating at 300 to 400 degrees C. It reduces energy use and provides more tractable deployment costs.” 

    Potentially, energy and dollar costs could be further reduced in coal mines by capturing the heat generated by the conversion process. “In coal mines, you have enrichments above a half-percent methane, but below the 4 percent flammability threshold,” says Plata. “The excess heat from the process could be used to generate electricity using off-the-shelf converters.” 
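    Rough numbers show why that heat could be worth harvesting. A back-of-the-envelope sketch, with our own illustrative assumptions (a 500,000 CFM ventilation stream at 0.5 percent methane by volume, complete conversion, and methane's standard lower heating value of about 50 MJ/kg):

```python
# Back-of-the-envelope thermal power from oxidizing dilute mine methane
CFM_TO_M3_PER_S = 0.000471947

flow_m3_s = 500_000 * CFM_TO_M3_PER_S        # ~236 m^3/s of ventilation air
ch4_fraction = 0.005                         # 0.5% methane by volume
ch4_density = 0.657                          # kg/m^3 at ~25 C
heat_of_combustion = 50.0e6                  # J/kg (lower heating value)

ch4_mass_rate = flow_m3_s * ch4_fraction * ch4_density   # kg/s of methane
thermal_power_mw = ch4_mass_rate * heat_of_combustion / 1e6
print(f"~{thermal_power_mw:.0f} MW of heat")  # on the order of tens of megawatts
```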

    Plata’s dairy barn research is funded by the Gerstner Family Foundation and the coal mining project by the U.S. Department of Energy. “The DOE would like us to spin out the technology for scale-up within three years,” says Plata. “We cannot guarantee we will hit that goal, but we are trying to develop this as quickly as possible. Our society needs to start reducing methane emissions now.”

  • Machine learning facilitates “turbulence tracking” in fusion reactors

    Fusion, which promises practically unlimited, carbon-free energy using the same processes that power the sun, is at the heart of a worldwide research effort that could help mitigate climate change.

    A multidisciplinary team of researchers is now bringing tools and insights from machine learning to aid this effort. Scientists from MIT and elsewhere have used computer-vision models to identify and track turbulent structures that appear under the conditions needed to facilitate fusion reactions.

    Monitoring the formation and movement of these structures, called filaments or “blobs,” is important for understanding the heat and particle flows exiting from the reacting fuel, which ultimately determine the engineering requirements for the reactor walls. However, scientists typically study blobs using averaging techniques, which trade details of individual structures for aggregate statistics; tracking individual blobs requires marking them manually in video data. 

    To make this process more effective and efficient, the researchers built a synthetic video dataset of plasma turbulence and used it to train four computer vision models, each of which identifies and tracks blobs. The models were trained to pinpoint blobs the same way a human would.

    When the researchers tested the trained models using real video clips, the models could identify blobs with high accuracy — more than 80 percent in some cases. The models were also able to effectively estimate the size of blobs and the speeds at which they moved.

    Because millions of video frames are captured during just one fusion experiment, using machine-learning models to track blobs could give scientists much more detailed information.

    “Before, we could get a macroscopic picture of what these structures are doing on average. Now, we have a microscope and the computational power to analyze one event at a time. If we take a step back, what this reveals is the power available from these machine-learning techniques, and ways to use these computational resources to make progress,” says Theodore Golfinopoulos, a research scientist at the MIT Plasma Science and Fusion Center and co-author of a paper detailing these approaches.

    His fellow co-authors include lead author Woonghee “Harry” Han, a physics PhD candidate; senior author Iddo Drori, a visiting professor in the Computer Science and Artificial Intelligence Laboratory (CSAIL), faculty associate professor at Boston University, and adjunct at Columbia University; as well as others from the MIT Plasma Science and Fusion Center, the MIT Department of Civil and Environmental Engineering, and the Swiss Federal Institute of Technology at Lausanne in Switzerland. The research appears today in Nature Scientific Reports.

    Heating things up

    For more than 70 years, scientists have sought to use controlled thermonuclear fusion reactions to develop an energy source. To reach the conditions necessary for a fusion reaction, fuel must be heated to temperatures above 100 million degrees Celsius. (The core of the sun is about 15 million degrees Celsius.)

    A common method for containing this super-hot fuel, called plasma, is to use a tokamak. These devices utilize extremely powerful magnetic fields to hold the plasma in place and control the interaction between the exhaust heat from the plasma and the reactor walls.

    However, at the very edge, between the plasma and the reactor walls, blobs appear as filaments falling out of the plasma. These random, turbulent structures affect how energy flows between the plasma and the reactor.

    “Knowing what the blobs are doing strongly constrains the engineering performance that your tokamak power plant needs at the edge,” adds Golfinopoulos.

    Researchers use a unique imaging technique to capture video of the plasma’s turbulent edge during experiments. An experimental campaign may last months; a typical day will produce about 30 seconds of data, corresponding to roughly 60 million video frames, with thousands of blobs appearing each second. This makes it impossible to track all blobs manually, so researchers rely on average sampling techniques that provide only broad characteristics of blob size, speed, and frequency.

    “Machine learning, on the other hand, provides a solution by tracking blobs individually in every frame, not just average quantities. This gives us much more knowledge about what is happening at the boundary of the plasma,” Han says.

    He and his co-authors took four well-established computer vision models, which are commonly used for applications like autonomous driving, and trained them to tackle this problem.

    Simulating blobs

    To train these models, they created a vast dataset of synthetic video clips that captured the blobs’ random and unpredictable nature.

    “Sometimes they change direction or speed, sometimes multiple blobs merge, or they split apart. These kinds of events were not considered before with traditional approaches, but we could freely simulate those behaviors in the synthetic data,” Han says.
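    The paper's dataset is far richer than anything shown here, but a toy version conveys the idea: render blobs as bright Gaussian spots with random positions, sizes, and velocities, so every frame comes with exact labels for free. All parameters below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
H = W = 64                 # frame size in pixels
n_frames, n_blobs = 16, 3

# Random initial positions, velocities, and sizes for each blob
pos = rng.uniform(10, 54, size=(n_blobs, 2))
vel = rng.uniform(-1.5, 1.5, size=(n_blobs, 2))
sigma = rng.uniform(2.0, 4.0, size=n_blobs)

yy, xx = np.mgrid[0:H, 0:W]
frames, labels = [], []
for _ in range(n_frames):
    frame = np.zeros((H, W))
    for (y, x), s in zip(pos, sigma):
        # Each blob is a Gaussian intensity bump at its current position
        frame += np.exp(-((yy - y) ** 2 + (xx - x) ** 2) / (2 * s ** 2))
    frames.append(frame + 0.05 * rng.standard_normal((H, W)))  # camera noise
    labels.append(pos.copy())   # exact blob centers: ground truth for free
    pos += vel                  # blobs drift; real data also merges and splits
```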

    Creating synthetic data also allowed them to label each blob, which made the training process more effective, Drori adds.

    Using these synthetic data, they trained the models to draw boundaries around blobs, teaching them to closely mimic what a human scientist would draw.

    Then they tested the models using real video data from experiments. First, they measured how closely the boundaries the models drew matched up with actual blob contours.

    But they also wanted to see if the models predicted objects that humans would identify. They asked three human experts to pinpoint the centers of blobs in video frames and checked to see if the models predicted blobs in those same locations.

    The models were able to draw accurate blob boundaries, overlapping with the brightness contours considered ground truth, about 80 percent of the time. Their evaluations were similar to those of human experts, and the models successfully predicted the theory-defined regime of each blob, which agrees with results from a traditional method.
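    A standard way to score how closely a predicted boundary matches a ground-truth contour is intersection-over-union (IoU) between the corresponding masks. The sketch below illustrates the computation; it is our choice of metric for demonstration, not necessarily the paper's exact evaluation.

```python
import numpy as np

def iou(pred_mask: np.ndarray, true_mask: np.ndarray) -> float:
    """Intersection-over-union of two boolean masks."""
    inter = np.logical_and(pred_mask, true_mask).sum()
    union = np.logical_or(pred_mask, true_mask).sum()
    return inter / union if union else 1.0

# Toy example: model mask vs. mask enclosed by the ground-truth brightness contour
frame = np.random.default_rng(1).random((64, 64))
true_mask = frame > 0.8                    # stand-in for a brightness contour
pred_mask = np.roll(true_mask, 1, axis=0)  # a slightly shifted prediction
print(f"IoU = {iou(pred_mask, true_mask):.2f}")
```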

    Now that they have shown the success of using synthetic data and computer vision models for tracking blobs, the researchers plan to apply these techniques to other problems in fusion research, such as estimating particle transport at the boundary of a plasma, Han says.

    They also made the dataset and models publicly available, and look forward to seeing how other research groups apply these tools to study the dynamics of blobs, says Drori.

    “Prior to this, there was a barrier to entry: mostly, the only people working on this problem were plasma physicists, who had the datasets and were using their own methods. There is a huge machine-learning and computer-vision community. One goal of this work is to encourage participation in fusion research from the broader machine-learning community toward the broader goal of helping solve the critical problem of climate change,” he adds.

    This research is supported, in part, by the U.S. Department of Energy and the Swiss National Science Foundation.

  • In nanotube science, is boron nitride the new carbon?

    Engineers at MIT and the University of Tokyo have produced centimeter-scale structures, large enough for the eye to see, that are packed with hundreds of billions of hollow aligned fibers, or nanotubes, made from hexagonal boron nitride.

    Hexagonal boron nitride, or hBN, is a single-atom-thin material that has been dubbed “white graphene” for its transparent appearance and its similarity to carbon-based graphene in molecular structure and strength. It can also withstand higher temperatures than graphene, and it is electrically insulating rather than conductive. When hBN is rolled into nanometer-scale tubes, or nanotubes, these exceptional properties are significantly enhanced.

    The team’s results, published today in the journal ACS Nano, provide a route toward fabricating aligned boron nitride nanotubes (A-BNNTs) in bulk. The researchers plan to harness the technique to fabricate bulk-scale arrays of these nanotubes, which can then be combined with other materials to make stronger, more heat-resistant composites, for instance to shield space structures and hypersonic aircraft.

    As hBN is transparent and electrically insulating, the team also envisions incorporating the BNNTs into transparent windows and using them to electrically insulate sensors within electronic devices. The team is also investigating ways to weave the nanofibers into membranes for water filtration and for “blue energy” — a concept for renewable energy in which electricity is produced from the ionic filtering of salt water into fresh water.

    Brian Wardle, professor of aeronautics and astronautics at MIT, likens the team’s results to scientists’ decades-long, ongoing pursuit of manufacturing bulk-scale carbon nanotubes.

    “In 1991, a single carbon nanotube was identified as an interesting thing, but it’s been 30 years getting to bulk aligned carbon nanotubes, and the world’s not even fully there yet,” Wardle says. “With the work we’re doing, we’ve just short-circuited about 20 years in getting to bulk-scale versions of aligned boron nitride nanotubes.”

    Wardle is the senior author of the new study, which includes lead author and MIT research scientist Luiz Acauan, former MIT postdoc Haozhe Wang, and collaborators at the University of Tokyo.

    A vision, aligned

    Like graphene, hexagonal boron nitride has a molecular structure resembling chicken wire. In graphene, this chicken wire configuration is made entirely of carbon atoms, arranged in a repeating pattern of hexagons. For hBN, the hexagons are composed of alternating atoms of boron and nitrogen. In recent years, researchers have found that two-dimensional sheets of hBN exhibit exceptional properties of strength, stiffness, and resilience at high temperatures. When sheets of hBN are rolled into nanotube form, these properties are further enhanced, particularly when the nanotubes are aligned, like tiny trees in a densely packed forest.

    But finding ways to synthesize stable, high-quality BNNTs has proven challenging. The handful of efforts to do so have produced low-quality, nonaligned fibers.

    “If you can align them, you have a much better chance of harnessing BNNTs’ properties at the bulk scale to make actual physical devices, composites, and membranes,” Wardle says.

    In 2020, Rong Xiang and colleagues at the University of Tokyo found they could produce high-quality boron nitride nanotubes by first using a conventional chemical vapor deposition approach to grow a forest of short carbon nanotubes, each a few microns long. They then coated the carbon-based forest with boron and nitrogen gas “precursors,” which, when baked in an oven at high temperatures, crystallized onto the carbon nanotubes to form high-quality nanotubes of hexagonal boron nitride with carbon nanotubes inside.

    Burning scaffolds

    In the new study, Wardle and Acauan have extended and scaled Xiang’s approach, essentially removing the underlying carbon nanotubes and leaving the long boron nitride nanotubes to stand on their own. The team drew on the expertise of Wardle’s group, which has focused for years on fabricating high-quality aligned arrays of carbon nanotubes. In the current work, the researchers looked for ways to tweak the temperatures and pressures of the chemical vapor deposition process in order to remove the carbon nanotubes while leaving the boron nitride nanotubes intact.

    “The first few times we did it, it was completely ugly garbage,” Wardle recalls. “The tubes curled up into a ball, and they didn’t work.”

    Eventually, the team hit on a combination of temperatures, pressures, and precursors that did the trick. With this combination of processes, the researchers first reproduced the steps that Xiang took to synthesize the boron-nitride-coated carbon nanotubes. As hBN is resistant to higher temperatures than graphene, the team then cranked up the heat to burn away the underlying black carbon nanotube scaffold, while leaving the transparent, freestanding boron nitride nanotubes intact.
    Image: By using carbon nanotubes as a scaffold, MIT engineers grow forests of “white graphene” that emerge (in an MIT pattern) after burning away the black carbon scaffold. (Courtesy of the researchers)

    In microscopic images, the team observed clear crystalline structures — evidence that the boron nitride nanotubes have a high quality. The structures were also dense: Within a square centimeter, the researchers were able to synthesize a forest of more than 100 billion aligned boron nitride nanotubes that measured about a millimeter in height — large enough to be visible by eye. By nanotube engineering standards, these dimensions are considered to be “bulk” in scale.

    “We are now able to make these nanoscale fibers at bulk scale, which has never been shown before,” Acauan says.

    To demonstrate the flexibility of their technique, the team synthesized larger carbon-based structures, including a weave of carbon fibers, a mat of “fuzzy” carbon nanotubes, and sheets of randomly oriented carbon nanotubes known as “buckypaper.” They coated each carbon-based sample with boron and nitrogen precursors, then went through their process to burn away the underlying carbon. In each demonstration, they were left with a boron-nitride replica of the original black carbon scaffold.

    They also were able to “knock down” the forests of BNNTs, producing horizontally aligned fiber films that are a preferred configuration for incorporating into composite materials.

    “We are now working toward fibers to reinforce ceramic matrix composites, for hypersonic and space applications where there are very high temperatures, and for windows for devices that need to be optically transparent,” Wardle says. “You could make transparent materials that are reinforced with these very strong nanotubes.”

    This research was supported, in part, by Airbus, ANSYS, Boeing, Embraer, Lockheed Martin, Saab AB, and Teijin Carbon America through MIT’s Nano-Engineered Composite aerospace STructures (NECST) Consortium.

  • Coordinating climate and air-quality policies to improve public health

    As America’s largest investment to fight climate change, the Inflation Reduction Act (IRA) positions the country to reduce its greenhouse gas emissions by an estimated 40 percent below 2005 levels by 2030. But as it edges the United States closer to achieving its international climate commitment, the legislation is also expected to yield significant — and more immediate — improvements in the nation’s health. If successful in accelerating the transition from fossil fuels to clean energy alternatives, the IRA will sharply reduce atmospheric concentrations of fine particulates known to exacerbate respiratory and cardiovascular disease and cause premature deaths, along with other air pollutants that degrade human health. One recent study shows that eliminating air pollution from fossil fuels in the contiguous United States would prevent more than 50,000 premature deaths and avoid more than $600 billion in health costs each year.

    While national climate policies such as those advanced by the IRA can simultaneously help mitigate climate change and improve air quality, their results may vary widely when it comes to improving public health. That’s because the potential health benefits associated with air quality improvements are much greater in some regions and economic sectors than in others. Those benefits can be maximized, however, through a prudent combination of climate and air-quality policies.

    Several past studies have evaluated the likely health impacts of various policy combinations, but their usefulness has been limited by a reliance on a small set of standard policy scenarios. More versatile tools are needed to model a wide range of climate and air-quality policy combinations and assess their collective effects on air quality and human health. Now researchers at the MIT Joint Program on the Science and Policy of Global Change and the MIT Institute for Data, Systems, and Society (IDSS) have developed a publicly available, flexible scenario tool that does just that.

    In a study published in the journal Geoscientific Model Development, the MIT team introduces its Tool for Air Pollution Scenarios (TAPS), which can be used to estimate the likely air-quality and health outcomes of a wide range of climate and air-quality policies at the regional, sectoral, and fuel-based level. 

    “This tool can help integrate the siloed sustainability issues of air pollution and climate action,” says the study’s lead author William Atkinson, who recently served as a Biogen Graduate Fellow and research assistant at the IDSS Technology and Policy Program’s (TPP) Research to Policy Engagement Initiative. “Climate action does not guarantee a clean air future, and vice versa — but the issues have similar sources that imply shared solutions if done right.”

    The study’s initial application of TAPS shows that with current air-quality policies and near-term Paris Agreement climate pledges alone, short-term pollution reductions give way to long-term increases — given the expected growth of emissions-intensive industrial and agricultural processes in developing regions. More ambitious climate and air-quality policies could be complementary, each reducing different pollutants substantially to give tremendous near- and long-term health benefits worldwide.

    “The significance of this work is that we can more confidently identify the long-term emission reduction strategies that also support air quality improvements,” says MIT Joint Program Deputy Director C. Adam Schlosser, a co-author of the study. “This is a win-win for setting climate targets that are also healthy targets.”

    TAPS projects air quality and health outcomes based on three integrated components: a recent global inventory of detailed emissions resulting from human activities (e.g., fossil fuel combustion, land-use change, industrial processes); multiple scenarios of emissions-generating human activities between now and the year 2100, produced by the MIT Economic Projection and Policy Analysis model; and emissions intensity (emissions per unit of activity) scenarios based on recent data from the Greenhouse Gas and Air Pollution Interactions and Synergies model.
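    Schematically, tools of this kind compose their inputs multiplicatively: emissions are activity levels times emissions intensity, summed over regions, sectors, and fuels. The sketch below is our own schematic of that composition; the names and numbers are invented for illustration and are not TAPS code.

```python
# Schematic of the emissions composition: activity x intensity, per scenario
# (region/sector/fuel keys and all values are invented for illustration)
activity = {            # projected activity levels in 2050, arbitrary units
    ("USA", "power", "coal"): 80.0,
    ("USA", "power", "gas"): 140.0,
    ("USA", "transport", "oil"): 210.0,
}
intensity = {           # kg of pollutant emitted per unit of activity
    ("USA", "power", "coal"): 3.1,
    ("USA", "power", "gas"): 1.2,
    ("USA", "transport", "oil"): 2.4,
}

emissions = {key: activity[key] * intensity[key] for key in activity}
total = sum(emissions.values())
print(f"Total 2050 emissions (illustrative): {total:.0f} kg")
```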

    “We see the climate crisis as a health crisis, and believe that evidence-based approaches are key to making the most of this historic investment in the future, particularly for vulnerable communities,” says Johanna Jobin, global head of corporate reputation and responsibility at Biogen. “The scientific community has spoken with unanimity and alarm that not all climate-related actions deliver equal health benefits. We’re proud of our collaboration with the MIT Joint Program to develop this tool that can be used to bridge research-to-policy gaps, support policy decisions to promote health among vulnerable communities, and train the next generation of scientists and leaders for far-reaching impact.”

    The tool can inform decision-makers about a wide range of climate and air-quality policies. Policy scenarios can be applied to specific regions, sectors, or fuels to investigate policy combinations at a more granular level, or to target short-term actions with high-impact benefits.

    TAPS could be further developed to account for additional emissions sources and trends.

    “Our new tool could be used to examine a large range of both climate and air quality scenarios. As the framework is expanded, we can add detail for specific regions, as well as additional pollutants such as air toxics,” says study supervising co-author Noelle Selin, professor at IDSS and the MIT Department of Earth, Atmospheric and Planetary Sciences, and director of TPP.    

    This research was supported by the U.S. Environmental Protection Agency and its Science to Achieve Results (STAR) program; Biogen; TPP’s Leading Technology and Policy Initiative; and TPP’s Research to Policy Engagement Initiative.

  • Two first-year students named Rise Global Winners for 2022

    In 2019, former Google CEO Eric Schmidt and his wife, Wendy, launched a $1 billion philanthropic commitment to identify global talent. Part of that effort is the Rise initiative, which selects 100 young scholars, ages 15-17, from around the world who show unusual promise and a drive to serve others. This year’s cohort of 100 Rise Global Winners includes two MIT first-year students, Jacqueline Prawira and Safiya Sankari.

    Rise intentionally targets younger students and focuses on identifying what the program terms “hidden brilliance” in any form, anywhere in the world, whether in a high school or a refugee camp. Another defining aspect of the program is that Rise winners receive sustained support — not just in secondary school, but throughout their lives.

    “We believe that the answers to the world’s toughest problems lie in the imagination of the world’s brightest minds,” says Eric Braverman, CEO of Schmidt Futures, which manages Rise along with the Rhodes Trust. “Rise is an integral part of our mission to create the best, largest, and most enduring pipeline of exceptional talent globally and match it to opportunities to serve others for life.”

    The Rise program creates this enduring pipeline by providing a lifetime of benefits, including funding, programming, and mentoring opportunities. These resources can be tailored to each person as they evolve throughout their career. In addition to a four-year college scholarship, winners receive mentoring and career services; networking opportunities with other Rise recipients and partner organizations; technical equipment such as laptops or tablets; courses on topics like leadership and human-centered design; and opportunities to apply for graduate scholarships and for funding throughout their careers to support their innovative ideas, such as grants or seed money to start a social enterprise.

    Prawira’s and Sankari’s winning service projects focus on global sustainability and global medical access, respectively. Prawira invented a way to use upcycled fish-scale waste to absorb heavy metals in wastewater. She first started experimenting with fish-scale waste in middle school, trying to find a bio-based alternative to plastic. More recently, she discovered that the calcium salts and collagen in fish scales can absorb up to 82 percent of heavy metals from water, and 91 percent if an electric current is passed through the water. Her work has global implications for treating contaminated water at wastewater plants and in developing countries.

    Prawira published her research in 2021 and has won awards from the U.S. Environmental Protection Agency and several other organizations. She’s planning to major in Course 3 (materials science and engineering), perhaps with an environmentally related minor. “I believe that sustainability and solving environmental problems require a multifaceted approach,” she says. “Creating greener materials for use in our daily lives will have a major impact in solving current environmental issues.”

    For Sankari’s service project, she developed an algorithm to analyze data from electronic nano-sensor devices, or e-noses, which can detect certain diseases from a patient’s breath. The devices are calibrated to detect volatile organic compound biosignatures that are indicative of diseases like diabetes and cancer. “E-nose disease detection is much faster and cheaper than traditional methods of diagnosis, making medical care more accessible to many,” she explains. The Python-based algorithm she created can translate raw data from e-noses into a result that the user can read.
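    The article describes a Python algorithm that turns raw e-nose traces into a readable result. A minimal sketch of that kind of pipeline (baseline correction, feature extraction, then a simple decision rule) is shown below; it is our illustration of the general approach, not Sankari's actual code, and every threshold and name is hypothetical.

```python
import numpy as np

def features(trace: np.ndarray) -> np.ndarray:
    """Reduce one sensor's time series to simple response features."""
    baseline = trace[:10].mean()      # pre-exposure baseline
    response = trace - baseline
    return np.array([response.max(), response.sum()])

def classify(sensor_traces: list[np.ndarray], threshold: float = 5.0) -> str:
    """Toy rule: flag a sample if the summed peak response is large."""
    peak_total = sum(features(t)[0] for t in sensor_traces)
    return "biosignature detected" if peak_total > threshold else "clear"

# Hypothetical readings from a 4-sensor e-nose array (ramping responses + noise)
rng = np.random.default_rng(2)
traces = [rng.normal(0.0, 0.1, 100) + np.linspace(0.0, 2.0, 100) for _ in range(4)]
print(classify(traces))
```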

    Sankari is a lifetime member of the American Junior Academy of Science and has been a finalist in several prestigious science competitions. She is considering a major in Course 6-7 (computer science and molecular biology) at MIT and hopes to continue to explore the intersection between nanotechnology and medicine.

    While the 2022 Rise recipients share a desire to tackle some of the world’s most intractable problems, their ideas and interests, as reflected by their service projects, are broad, innovative, and diverse. A winner from Belarus used bioinformatics to predict the molecular effect of a potential Alzheimer’s drug. A Romanian student created a magazine that aims to promote acceptance of transgender bodies. A Vietnamese teen created a prototype of a toothbrush that uses a nano chip to detect cancerous cells in saliva. And a recipient from the United States designed modular, tiny homes for the unhoused that are affordable and sustainable, as an alternative to homeless shelters.

    This year’s winners were selected from over 13,000 applicants from 47 countries, from Azerbaijan and Burkina Faso to Lebanon and Paraguay. The selection process includes group interviews, peer and expert review of each applicant’s service project, and formal talent assessments.

  • 3 Questions: Blue hydrogen and the world’s energy systems

    In the past several years, hydrogen energy has become an increasingly central part of the clean energy transition. Hydrogen can produce clean, on-demand energy that could complement variable renewable energy sources such as wind and solar power. That said, pathways for deploying hydrogen at scale have yet to be fully explored. In particular, the optimal form of hydrogen production remains in question.

    MIT Energy Initiative Research Scientist Emre Gençer and researchers from a wide range of global academic and research institutions recently published “On the climate impacts of blue hydrogen production,” a comprehensive life-cycle assessment analysis of blue hydrogen, a term referring to natural gas-based hydrogen production with carbon capture and storage. Here, Gençer describes blue hydrogen and the role that hydrogen will play more broadly in decarbonizing the world’s energy systems.

    Q: What are the differences between gray, green, and blue hydrogen?

    A: Though hydrogen does not generate any emissions directly when it is used, hydrogen production can have a huge environmental impact. Colors of hydrogen are increasingly used to distinguish different production methods and as a proxy to represent the associated environmental impact. Today, close to 95 percent of hydrogen production comes from fossil resources. As a result, the carbon dioxide (CO2) emissions from hydrogen production are quite high. Gray, black, and brown hydrogen refer to fossil-based production. Gray is the most common form of production and comes from natural gas, or methane, using steam methane reformation but without capturing CO2.

    There are two ways to move toward cleaner hydrogen production. One is applying carbon capture and storage to the fossil fuel-based hydrogen production processes. Natural gas-based hydrogen production with carbon capture and storage is referred to as blue hydrogen. If substantial amounts of CO2 from natural gas reforming are captured and permanently stored, such hydrogen could be a low-carbon energy carrier. The second way to produce cleaner hydrogen is by using electricity to produce hydrogen via electrolysis. In this case, the source of the electricity determines the environmental impact of the hydrogen, with the lowest impact being achieved when electricity is generated from renewable sources, such as wind and solar. This is known as green hydrogen.

    Q: What insights have you gleaned with a life cycle assessment (LCA) of blue hydrogen and other low-carbon energy systems?

    A: Mitigating climate change requires significant decarbonization of the global economy. Accurate estimation of cumulative greenhouse gas (GHG) emissions and their reduction pathways is critical irrespective of the source of emissions. An LCA quantifies the environmental impact of a commercial product, process, or service across all stages of its life cycle (cradle-to-grave). LCA-based comparison of alternative energy pathways, fuel options, and so on provides an apples-to-apples comparison of low-carbon energy choices. In the context of low-carbon hydrogen, it is essential to understand the GHG impact of supply chain options. Depending on the production method, the contribution of each life-cycle stage to total emissions can vary. For example, with natural gas-based hydrogen production, emissions associated with the production and transport of natural gas can be a significant contributor, depending on leakage and flaring rates. If these rates are not precisely accounted for, the environmental impact of blue hydrogen can be underestimated. The same rationale holds for electricity-based hydrogen production: if the electricity is not supplied from low-carbon sources such as wind, solar, or nuclear, the carbon intensity of hydrogen can be significantly underestimated. In the case of nuclear, there are also other environmental impact considerations.
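    That sensitivity can be made concrete with a toy carbon-intensity calculation. The numbers below are our own illustrative round figures, not results from the paper: roughly 9 kg of direct CO2 per kg of hydrogen from steam methane reforming, about 3.2 kg of natural gas consumed per kg of hydrogen, and a 100-year GWP of about 30 for fossil methane.

```python
def blue_h2_intensity(capture_rate: float, leak_rate: float) -> float:
    """Toy life-cycle CO2e per kg of H2 (illustrative round numbers only)."""
    DIRECT_CO2 = 9.0      # kg CO2 per kg H2 from reforming, before capture
    NG_USE = 3.2          # kg natural gas consumed per kg H2
    GWP100_CH4 = 30.0     # 100-year warming potential of fossil methane

    process = DIRECT_CO2 * (1.0 - capture_rate)          # uncaptured plant CO2
    upstream = NG_USE * leak_rate * GWP100_CH4           # leaked CH4, in CO2e
    return process + upstream

# High capture + tight supply chain vs. poor capture + leaky supply chain
print(blue_h2_intensity(capture_rate=0.95, leak_rate=0.002))  # ~0.6 kg CO2e/kg H2
print(blue_h2_intensity(capture_rate=0.55, leak_rate=0.03))   # ~6.9 kg CO2e/kg H2
```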

    An LCA approach — if performed with consistent system boundaries — can provide an accurate environmental impact comparison. It should also be noted that these estimations can only be as good as the assumptions and correlations used unless they are supported by measurements. 

    Q: What conditions are needed to make blue hydrogen production most effective, and how can it complement other decarbonization pathways?

    A: Hydrogen is considered one of the key vectors for the decarbonization of hard-to-abate sectors such as heavy-duty transportation. Currently, more than 95 percent of global hydrogen production is fossil-fuel based. In the next decade, massive amounts of hydrogen must be produced to meet this anticipated demand. It is very hard, if not impossible, to meet this demand without leveraging existing production assets. The immediate and relatively cost-effective option is to retrofit existing plants with carbon capture and storage (blue hydrogen).

    The environmental impact of blue hydrogen may vary over large ranges but depends on only a few key parameters: the methane emission rate of the natural gas supply chain, the CO2 removal rate at the hydrogen production plant, and the global warming metric applied. State-of-the-art reforming with high CO2 capture rates, combined with natural gas supply featuring low methane emissions, substantially reduces GHG emissions compared to conventional natural gas reforming. Under these conditions, blue hydrogen is compatible with low-carbon economies and exhibits climate change impacts at the upper end of the range of those caused by hydrogen production from renewable-based electricity. However, neither current blue nor green hydrogen production pathways render fully “net-zero” hydrogen without additional CO2 removal.

    This article appears in the Spring 2022 issue of Energy Futures, the magazine of the MIT Energy Initiative.

  • Studying floods to better predict their dangers

    “My job is basically flooding Cambridge,” says Katerina “Katya” Boukin, a graduate student in civil and environmental engineering at MIT and the MIT Concrete Sustainability Hub’s resident expert on flood simulations. 

    You can often find her fine-tuning high-resolution flood risk models for the City of Cambridge, Massachusetts, or talking about hurricanes with fellow researcher Ipek Bensu Manav.

    Flooding represents one of the world’s gravest natural hazards. Extreme climate events that induce flooding, like severe storms, winter storms, and tropical cyclones, caused an estimated $128.1 billion in damages in 2021 alone. 

    Climate simulation models suggest that severe storms will become more frequent in the coming years, necessitating a better understanding of which parts of cities are most vulnerable — an understanding that can be improved through modeling.

    A problem with current flood models is that they struggle to account for an oft-misunderstood type of flooding known as pluvial flooding. 

    “You might think of flooding as the overflowing of a body of water, like a river. This is fluvial flooding. This can be somewhat predictable, as you can think of proximity to water as a risk factor,” Boukin explains.

    However, the “flash flooding” that causes many deaths each year can happen even in places nowhere near a body of water. This is an example of pluvial flooding, which is affected by terrain, urban infrastructure, and the dynamic nature of storm loads.

    “If we don’t know how a flood is propagating, we don’t know the risk it poses to the urban environment. And if we don’t understand the risk, we can’t really discuss mitigation strategies,” says Boukin. “That’s why I pursue improving flood propagation models.”

    Boukin is leading the development of a new flood prediction method that seeks to address these shortcomings. By better representing the complex morphology of cities, Boukin’s approach may provide a clearer forecast of future urban flooding.

    Image: Katya Boukin developed this model of the City of Cambridge, Massachusetts. The base model was provided through a collaboration between MIT, the City of Cambridge, and Dewberry Engineering.

    “In contrast to the more traditional catchment model, our method has rainwater spread around the urban environment based on the city’s topography, below-the-surface features like sewer pipes, and the characteristics of local soils,” notes Boukin.

    “We can simulate the flooding of regions with local rain forecasts. Our results can show how flooding propagates by the foot and by the second,” she adds.
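    Boukin's model is far more detailed than anything we can show here, but the core idea of topography-driven pluvial routing can be sketched in a few lines: rain lands on a terrain grid, soil absorbs some of it, and surface water repeatedly flows toward lower neighboring cells. Everything below is a toy illustration, not the Cambridge model.

```python
import numpy as np

rng = np.random.default_rng(3)
elevation = rng.random((20, 20)) * 5.0     # toy terrain heights (meters)
infiltration = 0.002                       # soil uptake per step (meters)
water = np.zeros_like(elevation)

for _ in range(200):
    water += 0.001                                 # uniform rainfall each step
    water = np.maximum(water - infiltration, 0.0)  # losses to soil and drains
    surface = elevation + water
    # Move a fraction of each cell's water toward lower 4-neighbors
    for axis, shift in [(0, 1), (0, -1), (1, 1), (1, -1)]:
        neighbor = np.roll(surface, shift, axis=axis)
        drop = np.maximum(surface - neighbor, 0.0)   # downhill head difference
        flow = np.minimum(0.25 * drop, water)        # limited by water present
        water -= flow
        water += np.roll(flow, -shift, axis=axis)    # deliver to the neighbor
        surface = elevation + water

print(f"Deepest ponding: {water.max():.3f} m")  # water collects in low spots
```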

    While Boukin’s current focus is flood simulation, her unconventional academic career has taken her research in many directions, like examining structural bottlenecks in dense urban rail systems and forecasting ground displacement due to tunneling. 

    “I’ve always been interested in the messy side of problem-solving. I think that difficult problems present a real chance to gain a deeper understanding,” says Boukin.

    Boukin credits her upbringing for giving her this perspective. A native of Israel, Boukin says that civil engineering is the family business. “My parents are civil engineers, my mom’s parents are, too, her grandfather was a professor in civil engineering, and so on. Civil engineering is my bloodline.”

    However, the decision to follow the family tradition did not come so easily. “After I took the Israeli equivalent of the SAT, I was at a decision point: Should I go to engineering school or medical school?” she recalls.

    “I decided to go on a backpacking trip to help make up my mind. It’s sort of an Israeli rite to explore internationally, so I spent six months in South America. I think backpacking is something everyone should do.”

    After this soul searching, Boukin landed on engineering school, where she fell in love with structural engineering. “It was the option that felt most familiar and interesting. I grew up playing with AutoCAD on the family computer, and now I use AutoCAD professionally!” she notes.

    “For my master’s degree, I was looking to study in a department that would help me integrate knowledge from fields like climatology and civil engineering. I found the MIT Department of Civil and Environmental Engineering to be an excellent fit,” she says.

    “I am lucky that MIT has so many people that work together as well as they do. I ended up at the Concrete Sustainability Hub, where I’m working on projects which are the perfect fit between what I wanted to do and what the department wanted to do.” 

    Boukin’s move to Cambridge has given her a new perspective on her family and childhood. 

    “My parents brought me to Israel when I was just 1 year old. In moving here as a second-time immigrant, I have a new perspective on what my parents went through during the move to Israel. I moved when I was 27 years old, the same age as they were. They didn’t have a support network and worked any job they could find,” she explains.

    “I am incredibly grateful to them for the morals they instilled in my sister, who recently graduated from medical school, and me. I know I can call my parents if I ever need something, and they will do whatever they can to help.”

    Boukin hopes to honor her parents’ efforts through her research.

    “Not only do I want to help stakeholders understand flood risks, I want to make awareness of flooding more accessible. Each community needs different things to be resilient, and different cultures have different ways of delivering and receiving information,” she says.

    “Everyone should understand that they, in addition to the buildings and infrastructure around them, are part of a complex ecosystem. Any change to a city can affect the rest of it. If designers and residents are aware of this when considering flood mitigation strategies, we can better design cities and understand the consequences of damage.”

  • Simulating neutron behavior in nuclear reactors

    Amelia Trainer applied to MIT because she lost a bet.

    As part of what the fourth-year nuclear science and engineering (NSE) doctoral student labels her “teenage rebellious phase,” Trainer was quite convinced she would just be wasting the application fee were she to submit an application. She wasn’t even “super sure” she wanted to go to college. But a high-school friend was convinced Trainer would get into a “top school” if she only applied. A bet followed: If Trainer lost, she would have to apply to MIT. Trainer lost — and is glad she did.

    Growing up in Daytona Beach, Florida, good grades were Trainer’s thing. Seeing friends participate in interschool math competitions, Trainer decided she would tag along and soon found she loved them. She remembers being adept at reading the room: If teams were especially struggling over a problem, Trainer figured the answer had to be something easy, like zero or one. “The hardest problems would usually have the most goofball answers,” she laughs.

    Simulating neutron behavior

    As a doctoral student, hard problems in math, specifically computational reactor physics, continue to be Trainer’s forte.

    Her research, under the guidance of Professor Benoit Forget in MIT NSE’s Computational Reactor Physics Group (CRPG), focuses on modeling complicated neutron behavior in reactors. Simulation helps forecast the behavior of reactors before millions of dollars are sunk into the development of a potentially uneconomical unit. Using simulations, Trainer can see “where the neutrons are going, how much heat is being produced, and how much power the reactor can generate.” Her research helps form the foundation for the next generation of nuclear power plants.

    To simulate neutron behavior inside a nuclear reactor, you first need to know how neutrons will interact with the various materials in the system. These neutrons can have wildly different energies, making them susceptible to different physical phenomena. For the entirety of her graduate studies, Trainer has been primarily interested in the physics of slow-moving neutrons and their scattering behavior.

    When a slow neutron scatters off of a material, it can induce or cancel out molecular vibrations between the material’s atoms. The effect that material vibrations can have on neutron energies, and thereby on reactor behavior, has been heavily approximated over the years. Trainer is primarily interested in chipping away at these approximations by creating scattering data for materials that have historically been misrepresented and by exploring new techniques for preparing slow-neutron scattering data.

    Trainer remembers waiting for a simulation to complete in the early days of the Covid-19 pandemic, when she discovered a way to predict neutron behavior with limited input data. Traditionally, “people have to store large tables of what neutrons will do under specific circumstances,” she says. “I’m really happy about it because it’s this really cool method of sampling what your neutron does from very little information.”
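    Monte Carlo transport codes work exactly this way: at each collision they sample the outgoing neutron state from a distribution. For fast neutrons elastically scattering off a free nucleus of mass number A, standard two-body kinematics give the outgoing energy directly; the sketch below samples it. This textbook formula illustrates the sampling idea, not Trainer's thermal scattering method, which must also account for molecular binding.

```python
import numpy as np

def sample_elastic(E, A, rng):
    """Sample the outgoing neutron energy after elastic scattering.

    Standard two-body kinematics: isotropic scattering in the center-of-mass
    frame, with alpha = ((A - 1) / (A + 1))^2. Valid for fast neutrons; slow
    neutrons also feel molecular binding (the S(alpha, beta) physics).
    """
    alpha = ((A - 1.0) / (A + 1.0)) ** 2
    mu_cm = rng.uniform(-1.0, 1.0)   # isotropic cosine in the CM frame
    return E * ((1.0 + alpha) + (1.0 - alpha) * mu_cm) / 2.0

rng = np.random.default_rng(4)
E = 2.0e6  # eV: a fission-spectrum neutron
for _ in range(5):
    E = sample_elastic(E, A=1, rng=rng)   # successive collisions with hydrogen
print(f"Energy after 5 collisions: {E:.0f} eV")
```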


    As part of her research, Trainer often works closely with two software packages: OpenMC and NJOY. OpenMC is a Monte Carlo neutron transport simulation code that was developed in the CRPG and is used to simulate neutron behavior in reactor systems. NJOY is a nuclear data processing tool, and is used to create, augment, and prepare material data that is fed into tools like OpenMC. By editing both these codes to her specifications, Trainer is able to observe the effect that “upstream” material data has on the “downstream” reactor calculations. Through this, she hopes to identify additional problems: approximations that could lead to a noticeable misrepresentation of the physics.
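    In OpenMC's Python API, that "upstream" data dependence is visible right at the material definition: whether slow-neutron scattering accounts for molecular vibrations depends on attaching an S(α,β) thermal scattering table, of the kind NJOY produces, to the material. A minimal sketch follows; API details may vary across OpenMC versions.

```python
import openmc

# Light water, defined nuclide by nuclide
water = openmc.Material(name="water")
water.add_nuclide("H1", 2.0)
water.add_nuclide("O16", 1.0)
water.set_density("g/cm3", 1.0)

# Attach thermal scattering data: without this line, slow neutrons scatter
# off free hydrogen atoms; with it, off hydrogen bound in water molecules,
# including the vibrational effects discussed above.
water.add_s_alpha_beta("c_H_in_H2O")

materials = openmc.Materials([water])
materials.export_to_xml()
```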

    A love of geometry and poetry

    Trainer discovered the coolness of science as a child. Her mother, who cares for indoor plants and runs multiple greenhouses, and her father, a blacksmith and farrier who explored materials science through his craft, were self-taught inspirations.

    Trainer’s father urged his daughter to learn and pursue any topics that she found exciting and encouraged her to read poems from “Calvin and Hobbes” out loud when she struggled with a speech impediment in early childhood. Reading the same passages every day helped her memorize them. “The natural manifestation of that extended into [a love of] poetry,” Trainer says.

    A love of poetry, combined with Trainer’s propensity for fun, led her to compose an ode to pi as part of an MIT-sponsored event for alumni. “I was really only in it for the cupcake,” she laughs. (Participants received an indulgent treat).

    Video: MIT Matters: A Love Poem to Pi

    Computations and nuclear science

    After being accepted at MIT, Trainer knew she wanted to study in a field that would take her skills as they were: “my math skills were pretty underdeveloped in the grand scheme of things,” she says. An open-house weekend at MIT, where she met with faculty from the NSE department, and the opportunity to contribute to a discipline working toward clean energy cemented Trainer’s decision to join NSE.

    As a high schooler, Trainer won a scholarship to Embry-Riddle Aeronautical University to learn computer coding and knew computational physics might be more aligned with her interests. After she joined MIT as an undergraduate student in 2014, she realized that the CRPG, with its focus on coding and modeling, might be a good fit. Fortunately, a graduate student from Forget’s team welcomed Trainer’s enthusiasm for research even as an undergraduate first-year. She has stayed with the lab ever since. 

    Research internships at Los Alamos National Laboratory, the creators of NJOY, have furthered Trainer’s enthusiasm for modeling and computational physics. She met a Los Alamos scientist after he presented a talk at MIT, and the connection snowballed into a collaboration in which she worked on parts of the NJOY code. “It became a really cool collaboration which led me into a deep dive into physics and data preparation techniques, which was just so fulfilling,” Trainer says. As for what’s next, Trainer was awarded the Rickover fellowship in nuclear engineering by the Department of Energy’s Naval Reactors Division and will join the program in Pittsburgh after she graduates.

    For many years, Trainer’s cats, Jacques and Monster, have been constant companions. “Neutrons, computers, and cats, that’s my personality,” she laughs. Work continues to fuel her passion. To borrow a favorite phrase from Spaceman Spiff, Trainer’s favorite Calvin alter ego, her approach to research has invariably been: “Another day, another mind-boggling adventure.”