More stories

  • An AI dataset carves new paths to tornado detection

    The return of spring in the Northern Hemisphere touches off tornado season. A tornado’s twisting funnel of dust and debris seems an unmistakable sight. But that sight can be hidden from radar, the primary tool of meteorologists. It’s hard to know exactly when a tornado has formed, or even why.

    A new dataset could hold answers. It contains radar returns from thousands of tornadoes that have hit the United States in the past 10 years. Storms that spawned tornadoes are flanked by other severe storms, some with nearly identical conditions, that never did. MIT Lincoln Laboratory researchers who curated the dataset, called TorNet, have now released it open source. They hope to enable breakthroughs in detecting one of nature’s most mysterious and violent phenomena.

    “A lot of progress is driven by easily available, benchmark datasets. We hope TorNet will lay a foundation for machine learning algorithms to both detect and predict tornadoes,” says Mark Veillette, the project’s co-principal investigator with James Kurdzo. Both researchers work in the Air Traffic Control Systems Group. 

    Along with the dataset, the team is releasing models trained on it. The models show promise for machine learning’s ability to spot a twister. Building on this work could open new frontiers for forecasters, helping them provide more accurate warnings that might save lives. 

    Swirling uncertainty

    About 1,200 tornadoes occur in the United States every year, causing millions to billions of dollars in economic damage and claiming 71 lives on average. Last year, one unusually long-lasting tornado killed 17 people and injured at least 165 others along a 59-mile path in Mississippi.  

    Yet tornadoes are notoriously difficult to forecast because scientists don’t have a clear picture of why they form. “We can see two storms that look identical, and one will produce a tornado and one won’t. We don’t fully understand it,” Kurdzo says.

    A tornado’s basic ingredients are thunderstorms with instability caused by rapidly rising warm air and wind shear that causes rotation. Weather radar is the primary tool used to monitor these conditions. But tornadoes lie too low to be detected, even when moderately close to the radar. As the radar beam with a given tilt angle travels farther from the antenna, it gets higher above the ground, mostly seeing reflections from rain and hail carried in the “mesocyclone,” the storm’s broad, rotating updraft. A mesocyclone doesn’t always produce a tornado.
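    The overshooting effect described above follows from standard radar geometry. As a minimal sketch (not taken from the article), the textbook 4/3-effective-Earth approximation gives the beam height above the radar as a function of range and tilt angle:

```python
import math

# Standard radar-meteorology beam-height approximation:
#   h = r*sin(theta) + r^2 / (2 * (4/3) * R_earth)
# Even at the lowest operational tilt, the beam rises well above the
# ground at long range, which is why near-surface tornado circulations
# are often missed. This is a generic illustration, not TorNet code.
R_EARTH_KM = 6371.0

def beam_height_km(range_km, tilt_deg):
    """Height of the beam center above the radar, in kilometers."""
    ke = 4.0 / 3.0  # effective-Earth-radius factor for standard refraction
    return range_km * math.sin(math.radians(tilt_deg)) + range_km**2 / (2 * ke * R_EARTH_KM)

# At a 0.5-degree tilt, the beam is already ~0.6 km up at 50 km range
# and ~2.6 km up at 150 km range.
for r in (50, 100, 150):
    print(r, round(beam_height_km(r, 0.5), 2))
```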

    With this limited view, forecasters must decide whether or not to issue a tornado warning. They often err on the side of caution. As a result, the rate of false alarms for tornado warnings is more than 70 percent. “That can lead to boy-who-cried-wolf syndrome,” Kurdzo says.  

    In recent years, researchers have turned to machine learning to better detect and predict tornadoes. However, raw datasets and models have not always been accessible to the broader community, stifling progress. TorNet is filling this gap.

    The dataset contains more than 200,000 radar images, 13,587 of which depict tornadoes. The rest of the images are non-tornadic, taken from storms in one of two categories: randomly selected severe storms or false-alarm storms (those that led a forecaster to issue a warning but that didn’t produce a tornado).

    Each sample of a storm or tornado comprises two sets of six radar images. The two sets correspond to different radar sweep angles. The six images portray different radar data products, such as reflectivity (showing precipitation intensity) or radial velocity (indicating if winds are moving toward or away from the radar).
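    For illustration, the sample layout described above — two sweep angles, six data products per sweep — can be sketched as a single array. The product names, grid size, and field names below are assumptions for the sketch, not the actual TorNet schema:

```python
import numpy as np

# Hypothetical layout of one TorNet-style sample. Grid size and field
# names are illustrative assumptions, not the real dataset format.
N_SWEEPS = 2    # two radar sweep angles per sample
N_PRODUCTS = 6  # six radar data products per sweep
H, W = 120, 240  # illustrative image dimensions

PRODUCTS = [
    "reflectivity",                 # precipitation intensity
    "radial_velocity",              # motion toward/away from the radar
    "spectrum_width",
    "differential_reflectivity",
    "specific_differential_phase",
    "correlation_coefficient",
]

def make_sample(label):
    """Bundle the 2 x 6 radar images with a tornado/no-tornado label."""
    return {
        "images": np.zeros((N_SWEEPS, N_PRODUCTS, H, W), dtype=np.float32),
        "products": PRODUCTS,
        "label": label,  # 1 = tornadic, 0 = non-tornadic
    }

sample = make_sample(label=1)
print(sample["images"].shape)  # (2, 6, 120, 240)
```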

    A challenge in curating the dataset was first finding tornadoes. Within the corpus of weather radar data, tornadoes are extremely rare events. The team then had to balance those tornado samples with difficult non-tornado samples. If the dataset were too easy, say by comparing tornadoes to snowstorms, an algorithm trained on the data would likely over-classify storms as tornadic.

    “What’s beautiful about a true benchmark dataset is that we’re all working with the same data, with the same level of difficulty, and can compare results,” Veillette says. “It also makes meteorology more accessible to data scientists, and vice versa. It becomes easier for these two parties to work on a common problem.”

    Both researchers exemplify the progress that can come from cross-disciplinary collaboration. Veillette is a mathematician and algorithm developer who has long been fascinated by tornadoes. Kurdzo is a meteorologist by training and a signal processing expert. In grad school, he chased tornadoes with custom-built mobile radars, collecting data to analyze in new ways.

    “This dataset also means that a grad student doesn’t have to spend a year or two building a dataset. They can jump right into their research,” Kurdzo says.

    This project was funded by Lincoln Laboratory’s Climate Change Initiative, which aims to leverage the laboratory’s diverse technical strengths to help address climate problems threatening human health and global security.

    Chasing answers with deep learning

    Using the dataset, the researchers developed baseline artificial intelligence (AI) models. They were particularly eager to apply deep learning, a form of machine learning that excels at processing visual data. On its own, deep learning can extract features (key observations that an algorithm uses to make a decision) from images across a dataset. Other machine learning approaches require humans to first manually label features. 

    “We wanted to see if deep learning could rediscover what people normally look for in tornadoes and even identify new things that typically aren’t searched for by forecasters,” Veillette says.

    The results are promising. Their deep learning model performed similarly to, or better than, all tornado-detecting algorithms known in the literature. The trained algorithm correctly classified 50 percent of weaker EF-1 tornadoes and over 85 percent of tornadoes rated EF-2 or higher, the most devastating and costly of these storms.
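    The per-rating percentages quoted above are detection (recall) rates: the fraction of true tornadoes of each rating that the model flags. A toy sketch of the computation, with made-up predictions rather than the team's actual results:

```python
# Detection rate = correctly flagged tornadoes / all tornadoes of that
# rating. The prediction/label values below are fabricated solely to
# illustrate the arithmetic.
def detection_rate(predictions, labels):
    """Fraction of positive labels the predictions correctly flag."""
    hits = sum(p == l == 1 for p, l in zip(predictions, labels))
    total = sum(labels)
    return hits / total if total else 0.0

by_rating = {
    "EF-1": ([1, 0, 1, 0], [1, 1, 1, 1]),          # 2 of 4 detected
    "EF-2+": ([1, 1, 1, 1, 0], [1, 1, 1, 1, 1]),   # 4 of 5 detected
}
for rating, (preds, labels) in by_rating.items():
    print(rating, detection_rate(preds, labels))
```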

    They also evaluated two other types of machine-learning models and one traditional model for comparison. The source code and parameters of all these models are freely available. The models and dataset are also described in a paper submitted to a journal of the American Meteorological Society (AMS). Veillette presented this work at the AMS Annual Meeting in January.

    “The biggest reason for putting our models out there is for the community to improve upon them and do other great things,” Kurdzo says. “The best solution could be a deep learning model, or someone might find that a non-deep learning model is actually better.”

    TorNet could serve the weather community in other ways too, such as for conducting large-scale case studies on storms. It could also be augmented with other data sources, like satellite imagery or lightning maps. Fusing multiple types of data could improve the accuracy of machine learning models.

    Taking steps toward operations

    On top of detecting tornadoes, Kurdzo hopes that models might help unravel the science of why they form.

    “As scientists, we see all these precursors to tornadoes — an increase in low-level rotation, a hook echo in reflectivity data, specific differential phase (KDP) foot and differential reflectivity (ZDR) arcs. But how do they all go together? And are there physical manifestations we don’t know about?” he asks.

    Teasing out those answers might be possible with explainable AI. Explainable AI refers to methods that allow a model to provide its reasoning, in a format understandable to humans, of why it came to a certain decision. In this case, these explanations might reveal physical processes that happen before tornadoes. This knowledge could help train forecasters, and models, to recognize the signs sooner. 
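    One common explainable-AI method of the kind described above is occlusion sensitivity: mask parts of the input and watch how the model's score changes, so that regions the model relies on produce large drops. A minimal sketch with a toy stand-in model (the real TorNet models are not used here):

```python
import numpy as np

# Occlusion sensitivity: slide a zeroed patch over the input and record
# how much the model's score drops. `model` is any function mapping a
# 2D image to a scalar score -- an assumption for this sketch.
def occlusion_map(model, image, patch=4):
    """Return a coarse heat map of score drops, one cell per patch."""
    h, w = image.shape
    base = model(image)
    heat = np.zeros((h // patch, w // patch))
    for i in range(0, h, patch):
        for j in range(0, w, patch):
            masked = image.copy()
            masked[i:i + patch, j:j + patch] = 0.0  # occlude one patch
            heat[i // patch, j // patch] = base - model(masked)
    return heat

# Toy "model": the score is the mean of the top-left 4x4 region, so
# occluding that region lowers the score and lights up the heat map there.
toy_model = lambda img: img[:4, :4].mean()
img = np.ones((8, 8))
print(occlusion_map(toy_model, img, patch=4))  # only the [0, 0] cell is nonzero
```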

    “None of this technology is ever meant to replace a forecaster. But perhaps someday it could guide forecasters’ eyes in complex situations, and give a visual warning to an area predicted to have tornadic activity,” Kurdzo says.

    Such assistance could be especially useful as radar technology improves and future networks potentially grow denser. Radar data in a next-generation network are expected to refresh approximately every minute, rather than every five, perhaps faster than forecasters can interpret the new information. Because deep learning can process huge amounts of data quickly, it could be well-suited for monitoring radar returns in real time, alongside humans. Tornadoes can form and disappear in minutes.

    But the path to an operational algorithm is a long road, especially in safety-critical situations, Veillette says. “I think the forecaster community is still, understandably, skeptical of machine learning. One way to establish trust and transparency is to have public benchmark datasets like this one. It’s a first step.”

    The next steps, the team hopes, will be taken by researchers across the world who are inspired by the dataset and energized to build their own algorithms. Those algorithms will in turn go into test beds, where they’ll eventually be shown to forecasters, to start a process of transitioning into operations.

    In the end, the path could circle back to trust.

    “We may never get more than a 10- to 15-minute tornado warning using these tools. But if we could lower the false-alarm rate, we could start to make headway with public perception,” Kurdzo says. “People are going to use those warnings to take the action they need to save their lives.”

  • Two MIT teams selected for NSF sustainable materials grants

    Two teams led by MIT researchers were selected in December 2023 by the U.S. National Science Foundation (NSF) Convergence Accelerator, a part of the TIP Directorate, to receive awards of $5 million each over three years. The awards support research aimed at bringing cutting-edge sustainable materials and processes from the lab into practical, full-scale industrial production. The selection followed a round last year in which 16 teams from around the country received one-year grants to develop detailed plans for research on the sustainability and scalability of advanced electronic products.

    Of the two MIT-led teams chosen for this current round of funding, one team, Topological Electric, is led by Mingda Li, an associate professor in the Department of Nuclear Science and Engineering. This team will be finding pathways to scale up sustainable topological materials, which have the potential to revolutionize next-generation microelectronics by showing superior electronic performance, such as dissipationless states or high-frequency response. The other team, led by Anuradha Agarwal, a principal research scientist at MIT’s Materials Research Laboratory, will be focusing on developing new materials, devices, and manufacturing processes for microchips that minimize energy consumption using electronic-photonic integration, and that detect and avoid the toxic or scarce materials used in today’s production methods.

    Scaling the use of topological materials

    Li explains that some materials based on quantum effects have achieved successful transitions from lab curiosities to mass production, such as blue-light LEDs and giant magnetoresistance (GMR) devices used for magnetic data storage. But he says there are a variety of equally promising materials that have yet to make it into real-world applications.

    “What we really wanted to achieve is to bring newer-generation quantum materials into technology and mass production, for the benefit of broader society,” he says. In particular, he says, “topological materials are really promising to do many different things.”

    Topological materials are ones whose electronic properties are fundamentally protected against disturbance. For example, Li points to the fact that just in the last two years, it has been shown that some topological materials are even better electrical conductors than copper, which is typically used for the wires interconnecting electronic components. But unlike the blue-light LEDs or the GMR devices, which have been widely produced and deployed, when it comes to topological materials, “there’s no company, no startup, there’s really no business out there,” adds Tomas Palacios, the Clarence J. Lebel Professor in Electrical Engineering at MIT and co-principal investigator on Li’s team. Part of the reason is that many versions of such materials are studied “with a focus on fundamental exotic physical properties with little or no consideration on the sustainability aspects,” says Liang Fu, an MIT professor of physics and also a co-PI. Their team will be looking for alternative formulations that are more amenable to mass production.

    One possible application of these topological materials is for detecting terahertz radiation, explains Keith Nelson, an MIT professor of chemistry and co-PI. This extremely high-frequency radiation can carry far more information than conventional radio or microwaves, but at present there are no mature electronic devices available that are scalable at this frequency range. “There’s a whole range of possibilities for topological materials” that could work at these frequencies, he says. In addition, he says, “we hope to demonstrate an entire prototype system like this in a single, very compact solid-state platform.”

    Li says that among the many possible applications of topological devices for microelectronics devices of various kinds, “we don’t know which, exactly, will end up as a product, or will reach real industrial scaleup. That’s why this opportunity from NSF is like a bridge, which is precious, to allow us to dig deeper to unleash the true potential.”

    In addition to Li, Palacios, Fu, and Nelson, the Topological Electric team includes Qiong Ma, assistant professor of physics in Boston College; Farnaz Niroui, assistant professor of electrical engineering and computer science at MIT; Susanne Stemmer, professor of materials at the University of California at Santa Barbara; Judy Cha, professor of materials science and engineering at Cornell University; industrial partners including IBM, Analog Devices, and Raytheon; and professional consultants. “We are taking this opportunity seriously,” Li says. “We really want to see if the topological materials are as good as we show in the lab when being scaled up, and how far we can push to broadly industrialize them.”

    Toward sustainable microchip production and use

    The microchips behind everything from smartphones to medical imaging are associated with a significant percentage of greenhouse gas emissions today, and every year the world produces more than 50 million metric tons of electronic waste, the equivalent of about 5,000 Eiffel Towers. Further, the data centers necessary for complex computations and huge amounts of data transfer — think AI and on-demand video — are growing and are projected to require 10 percent of the world’s electricity by 2030.
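    The Eiffel Tower comparison above is simple arithmetic, assuming the commonly cited tower weight of roughly 10,000 metric tons (an assumption here, not stated in the article):

```python
# 50 million metric tons of e-waste per year, divided by an assumed
# Eiffel Tower mass of ~10,000 metric tons, gives the ~5,000 figure.
e_waste_tons = 50_000_000
eiffel_tons = 10_000  # commonly cited approximate mass; assumption
print(e_waste_tons / eiffel_tons)  # 5000.0
```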

    “The current microchip manufacturing supply chain, which includes production, distribution, and use, is neither scalable nor sustainable, and cannot continue. We must innovate our way out of this crisis,” says Agarwal.

    The name of Agarwal’s team, FUTUR-IC, is a reference to the future of integrated circuits, or chips, through a global alliance for sustainable microchip manufacturing. Says Agarwal, “We bring together stakeholders from industry, academia, and government to co-optimize across three dimensions: technology, ecology, and workforce. These were identified as key interrelated areas by some 140 stakeholders. With FUTUR-IC we aim to cut waste and CO2-equivalent emissions associated with electronics by 50 percent every 10 years.”

    The market for microelectronics in the next decade is predicted to be on the order of a trillion dollars, but most of the manufacturing for the industry occurs only in limited geographical pockets around the world. FUTUR-IC aims to diversify and strengthen the supply chain for manufacturing and packaging of electronics. The alliance has 26 collaborators and is growing. Current external collaborators include the International Electronics Manufacturing Initiative (iNEMI), Tyndall National Institute, SEMI, Hewlett Packard Enterprise, Intel, and the Rochester Institute of Technology.

    Agarwal leads FUTUR-IC in close collaboration with others, including, from MIT, Lionel Kimerling, the Thomas Lord Professor of Materials Science and Engineering; Elsa Olivetti, the Jerry McAfee Professor in Engineering; Randolph Kirchain, principal research scientist in the Materials Research Laboratory; and Greg Norris, director of MIT’s Sustainability and Health Initiative for NetPositive Enterprise (SHINE). All are affiliated with the Materials Research Laboratory. They are joined by Samuel Serna, an MIT visiting professor and assistant professor of physics at Bridgewater State University. Other key personnel include Sajan Saini, education director for the Initiative for Knowledge and Innovation in Manufacturing in MIT’s Department of Materials Science and Engineering; Peter O’Brien, a professor from Tyndall National Institute; and Shekhar Chandrashekhar, CEO of iNEMI.

    “We expect the integration of electronics and photonics to revolutionize microchip manufacturing, enhancing efficiency, reducing energy consumption, and paving the way for unprecedented advancements in computing speed and data-processing capabilities,” says Serna, who is the co-lead on the project’s technology “vector.”

    Common metrics for these efforts are needed, says Norris, co-lead for the ecology vector, adding, “The microchip industry must have transparent and open Life Cycle Assessment (LCA) models and data, which are being developed by FUTUR-IC.” This is especially important given that microelectronics production transcends industries. “Given the scale and scope of microelectronics, it is critical for the industry to lead in the transition to sustainable manufacture and use,” says Kirchain, another co-lead and the co-director of the Concrete Sustainability Hub at MIT. To bring about this cross-fertilization, co-lead Olivetti, also co-director of the MIT Climate and Sustainability Consortium (MCSC), will collaborate with FUTUR-IC to enhance the benefits from microchip recycling, leveraging the learning across industries.

    Saini, the co-lead for the workforce vector, stresses the need for agility. “With a workforce that adapts to a practice of continuous upskilling, we can help increase the robustness of the chip-manufacturing supply chain, and validate a new design for a sustainability curriculum,” he says.

    “We have become accustomed to the benefits forged by the exponential growth of microelectronic technology performance and market size,” says Kimerling, who is also director of MIT’s Materials Research Laboratory and co-director of the MIT Microphotonics Center. “The ecological impact of this growth in terms of materials use, energy consumption and end-of-life disposal has begun to push back against this progress. We believe that concurrently engineered solutions for these three dimensions will build a common learning curve to power the next 40 years of progress in the semiconductor industry.”

    The MIT teams are two of six that received awards addressing sustainable materials for global challenges through phase two of the NSF Convergence Accelerator program. Launched in 2019, the program targets solutions to especially compelling challenges at an accelerated pace by incorporating a multidisciplinary research approach.

  • Bringing an investigator’s eye to complex social challenges

    Anna Russo likes puzzles. They require patience, organization, and a view of the big picture. She brings an investigator’s eye to big institutional and societal challenges whose solutions can have wide-ranging, long-term impacts.

    Russo’s path to MIT began with questions. She didn’t have the whole picture yet. “I had no idea what I wanted to do with my life,” says Russo, who is completing her PhD in economics in 2024. “I was good at math and science and thought I wanted to be a doctor.”

    While completing her undergraduate studies at Yale University, where she double majored in economics and applied math, Russo discovered a passion for problem-solving, where she could apply an analytical lens to answering the kinds of thorny questions whose solutions could improve policy. “Empirical research is fun and exciting,” Russo says.

    After Yale, Russo considered what to do next. She worked as a full-time research assistant with MIT economist Amy Finkelstein. Russo’s work with Finkelstein led her toward identifying, studying, and developing answers to complex questions. 

    “My research combines ideas from two fields of economic inquiry — public finance and industrial organization — and applies them to questions about the design of environmental and health care policy,” Russo says. “I like the way economists think analytically about social problems.”

    Narrowing her focus

    Studying with and being advised by renowned economists as both an undergraduate and a doctoral student helped Russo narrow her research focus, fitting more pieces into the puzzle. “What drew me to MIT was its investment in its graduate students,” Russo says.

    Economic research meant digging into policy questions, identifying market failures, and proposing solutions. Doctoral study allowed Russo to assemble data to rigorously follow each line of inquiry.

    “Doctoral study means you get to write about something you’re really interested in,” Russo notes. This led her to study policy responses to climate change adaptation and mitigation. 

    “In my first year, I worked on a project exploring the notion that floodplain regulation design doesn’t do a good job of incentivizing the right level of development in flood-prone areas,” she says. “How can economists help governments convince people to act in society’s best interest?”

    It’s important to understand institutional details, Russo adds, which can help investigators identify and implement solutions. 

    “Feedback, advice, and support from faculty were crucial as I grew as a researcher at MIT,” she says. Beyond her two main MIT advisors, Finkelstein and economist Nikhil Agarwal — educators she describes as “phenomenal, dedicated advisors and mentors” — Russo interacted regularly with faculty across the department. 

    Russo later discovered another challenge she hoped to solve: inefficiencies in conservation and carbon offset programs. She set her sights on the United States Department of Agriculture’s Conservation Reserve Program because she believes it and programs like it can be improved. 

    The CRP is a land conservation plan administered by USDA’s Farm Service Agency. In exchange for a yearly rental payment, farmers enrolled in the program agree to remove environmentally sensitive land from agricultural production and plant species that will improve environmental health and quality.

    “I think we can tweak the program’s design to improve cost-effectiveness,” Russo says. “There’s a trove of data available.” The data include information like auction participants’ bids in response to well-specified auction rules, which Russo links to satellite data measuring land use outcomes. Understanding how landowners bid in CRP auctions can help identify and improve the program’s function. 

    “We may be able to improve targeting and achieve more cost-effective conservation by adjusting the CRP’s scoring system,” Russo argues. Opportunities may exist to scale the incremental changes under study for other conservation programs and carbon offset markets more generally.  

    Economics, Russo believes, can help us conceptualize problems and recommend effective alternative solutions.

    The next puzzle

    Russo wants to find her next challenge while continuing her research. She plans to continue her work as a junior fellow at the Harvard Society of Fellows, after which she’ll join the Harvard Department of Economics as an assistant professor. Russo also plans to continue helping other budding economists since she believes in the importance of supporting other students.   

    Russo’s advisors are some of her biggest supporters. 

    Finkelstein emphasizes Russo’s curiosity, enthusiasm, and energy as key drivers in her success. “Her genuine curiosity and interest in getting to the bottom of a problem with the data — with an econometric analysis, with a modeling issue — is the best antidote for [the stress that can be associated with research],” Finkelstein says. “It’s a key ingredient in her ability to produce important and credible work.”

    “She’s also incredibly generous with her time and advice,” Finkelstein continues, “whether it’s helping an undergraduate research assistant with her senior thesis, or helping an advisor such as myself navigate a data access process she’s previously been through.”

    “Instead of an advisor-advisee relationship, working with her on a thesis felt more like a collaboration between equals,” Agarwal adds. “[She] has the maturity and smarts to produce pathbreaking research.”

    “Doctoral study is an opportunity for students to find their paths collaboratively,” Russo says. “If I can help someone else solve a small piece of their puzzle, that’s a huge positive. Research is a series of many, many small steps forward.” 

    Identifying important causes for further investigation and study will always be important to Russo. “I also want to dig into some other market that’s not working well and figure out how to make it better,” she says. “Right now I’m really excited about understanding California wildfire mitigation.” 

    Puzzles are made to be solved, after all.

  • MIT announces 2024 Bose Grants

    MIT Provost Cynthia Barnhart announced four Professor Amar G. Bose Research Grants to support bold research projects across diverse areas of study: generating clean hydrogen from deep in the Earth, building an environmentally friendly house of basalt, designing maternity clothing that monitors fetal health, and recruiting sharks as ocean oxygen monitors.

    This year’s recipients are Iwnetim Abate, assistant professor of materials science and engineering; Andrew Babbin, the Cecil and Ida Green Associate Professor in Earth, Atmospheric and Planetary Sciences; Yoel Fink, professor of materials science and engineering and of electrical engineering and computer science; and Skylar Tibbits, associate professor of design research in the Department of Architecture.

    The program was named for the visionary founder of the Bose Corporation and MIT alumnus Amar G. Bose ’51, SM ’52, ScD ’56. After gaining admission to MIT, Bose became a top math student and a Fulbright Scholarship recipient. He spent 46 years as a professor at MIT, led innovations in sound design, and founded the Bose Corp. in 1964. MIT launched the Bose grant program 11 years ago to provide funding over a three-year period to MIT faculty who propose original, cross-disciplinary, and often risky research projects that would likely not be funded by conventional sources.

    “The promise of the Bose Fellowship is to help bold, daring ideas become realities, an approach that honors Amar Bose’s legacy,” says Barnhart. “Thanks to support from this program, these talented faculty members have the freedom to explore their bold and innovative ideas.”

    Deep and clean hydrogen futures

    A green energy future will depend on harnessing hydrogen as a clean energy source, sequestering polluting carbon dioxide, and mining the minerals essential to building clean energy technologies such as advanced batteries. Iwnetim Abate thinks he has a solution for all three challenges: an innovative hydrogen reactor.

    He plans to build a reactor that will create natural hydrogen from ultramafic mineral rocks in the crust. “The Earth is literally a giant hydrogen factory waiting to be tapped,” Abate explains. “A back-of-the-envelope calculation for the first seven kilometers of the Earth’s crust estimates that there is enough ultramafic rock to produce hydrogen for 250,000 years.”

    The reactor envisioned by Abate injects water to create a reaction that releases hydrogen, while also supporting the injection of climate-altering carbon dioxide into the rock, providing a global carbon capacity of 100 trillion tons. At the same time, the reactor process could provide essential elements such as lithium, nickel, and cobalt — some of the most important raw materials used in advanced batteries and electronics.

    “Ultimately, our goal is to design and develop a scalable reactor for simultaneously tapping into the trifecta from the Earth’s subsurface,” Abate says.

    Sharks as oceanographers

    If we want to understand more about how oxygen levels in the world’s seas are disturbed by human activities and climate change, we should turn to a sensing platform “that has been honed by 400 million years of evolution to perfectly sample the ocean: sharks,” says Andrew Babbin.

    As the planet warms, oceans are projected to contain less dissolved oxygen, with impacts on the productivity of global fisheries, natural carbon sequestration, and the flux of climate-altering greenhouse gases from the ocean to the air. While scientists know dissolved oxygen is important, it has proved difficult to track over seasons, decades, and underexplored regions both shallow and deep.

    Babbin’s goal is to develop a low-cost sensor for dissolved oxygen that can be integrated with preexisting electronic shark tags used by marine biologists. “This fleet of sharks … will finally enable us to measure the extent of the low-oxygen zones of the ocean, how they change seasonally and with El Niño/La Niña oscillation, and how they expand or contract into the future.”

    The partnership with sharks will also spotlight the importance of these often-maligned animals for global marine and fisheries health, Babbin says. “We hope in pursuing this work marrying microscopic and macroscopic life we will inspire future oceanographers and conservationists, and lead to a better appreciation for the chemistry that underlies global habitability.”

    Maternity wear that monitors fetal health

    There are 2 million stillbirths around the world each year, and in the United States alone, 21,000 families suffer this terrible loss. In many cases, mothers and their doctors had no warning of any abnormalities or changes in fetal health leading up to these deaths. Yoel Fink and colleagues are looking for a better way to monitor fetal health and provide proactive treatment.

    Fink is building on years of research on acoustic fabrics to design an affordable shirt for mothers that would monitor and communicate important details of fetal health. His team’s original research drew inspiration from the function of the eardrum, designing a fiber that could be woven into other fabrics to create a kind of fabric microphone.

    “Given the sensitivity of the acoustic fabrics in sensing these nanometer-scale vibrations, could a mother’s clothing transcend its conventional role and become a health monitor, picking up on the acoustic signals and subsequent vibrations that arise from her unborn baby’s heartbeat and motion?” Fink says. “Could a simple and affordable worn fabric allow an expecting mom to sleep better, knowing that her fetus is being listened to continuously?”

    The proposed maternity shirt could measure fetal heart and breathing rate, and might be able to give an indication of the fetal body position, he says. In the final stages of development, he and his colleagues hope to develop machine learning approaches that would identify abnormal fetal heart rate and motion and deliver real-time alerts.

    A basalt house in Iceland

    In the land of volcanoes, Skylar Tibbits wants to build a case-study home almost entirely from the basalt rock that makes up the Icelandic landscape.

Architects are increasingly interested in building using one natural material — creating a monomaterial structure — that can be easily recycled. At the moment, the building industry accounts for 40 percent of carbon emissions worldwide and relies on many materials and structures, from metal to plastics to concrete, that can’t be easily disassembled or reused.

    The proposed basalt house in Iceland, a project co-led by J. Jih, associate professor of the practice in the Department of Architecture, is “an architecture that would be fully composed of the surrounding earth, that melts back into that surrounding earth at the end of its lifespan, and that can be recycled infinitely,” Tibbits explains.

    Basalt, the most common rock form in the Earth’s crust, can be spun into fibers for insulation and rebar. Basalt fiber performs as well as glass and carbon fibers at a lower cost in some applications, although it is not widely used in architecture. In cast form, it can make corrosion- and heat-resistant plumbing, cladding and flooring.

“A monomaterial architecture is both a simple and radical proposal that unfortunately falls outside of traditional funding avenues,” says Tibbits. “The Bose grant is the perfect and perhaps the only option for our research, which we see as a uniquely achievable moonshot with transformative potential for the entire built environment.”

  • in

    Featured video: Moooving the needle on methane

    Methane traps much more heat per pound than carbon dioxide, making it a powerful contributor to climate change. “In fact, methane emission removal is the fastest way that we can ensure immediate results for reduced global warming,” says Audrey Parker, a graduate student in the Department of Civil and Environmental Engineering.

    Parker and other researchers in the Methane Emission Removal Project are developing a catalyst that can convert methane to carbon dioxide. They are working to set up systems that would reduce methane in the air at dairy farms, which are major emitters of the gas. Overall, agricultural practices and waste generation are responsible for about 28 percent of the world’s methane emissions.

    “If we do our job really well, within the next five years, we will be able to reduce the operating temperature of this catalyst in a way that is net beneficial to the climate and potentially even economically incentivized for the farmer and for society,” says Desirée Plata, an associate professor of civil and environmental engineering who leads the Methane Emission Removal Project.

Video by Melanie Gonick/MIT News | 4 minutes, 35 seconds

  • in

    Using deep learning to image the Earth’s planetary boundary layer

    Although the troposphere is often thought of as the closest layer of the atmosphere to the Earth’s surface, the planetary boundary layer (PBL) — the lowest layer of the troposphere — is actually the part that most significantly influences weather near the surface. In the 2018 planetary science decadal survey, the PBL was raised as an important scientific issue that has the potential to enhance storm forecasting and improve climate projections.  

    “The PBL is where the surface interacts with the atmosphere, including exchanges of moisture and heat that help lead to severe weather and a changing climate,” says Adam Milstein, a technical staff member in Lincoln Laboratory’s Applied Space Systems Group. “The PBL is also where humans live, and the turbulent movement of aerosols throughout the PBL is important for air quality that influences human health.” 

    Although vital for studying weather and climate, important features of the PBL, such as its height, are difficult to resolve with current technology. In the past four years, Lincoln Laboratory staff have been studying the PBL, focusing on two different tasks: using machine learning to make 3D-scanned profiles of the atmosphere, and resolving the vertical structure of the atmosphere more clearly in order to better predict droughts.  

This PBL-focused research effort builds on more than a decade of related work on fast, operational neural network algorithms developed by Lincoln Laboratory for NASA missions. These missions include the Time-Resolved Observations of Precipitation structure and storm Intensity with a Constellation of Smallsats (TROPICS) mission as well as Aqua, a satellite that collects data about Earth’s water cycle and observes variables such as ocean temperature, precipitation, and water vapor in the atmosphere. These algorithms retrieve temperature and humidity from the satellite instrument data and have been shown to significantly improve the accuracy and usable global coverage of the observations over previous approaches. For TROPICS, the algorithms help retrieve data that are used to characterize a storm’s rapidly evolving structures in near-real time, and for Aqua, they have helped improve forecasting models, drought monitoring, and fire prediction. 

    These operational algorithms for TROPICS and Aqua are based on classic “shallow” neural networks to maximize speed and simplicity, creating a one-dimensional vertical profile for each spectral measurement collected by the instrument over each location. While this approach has improved observations of the atmosphere down to the surface overall, including the PBL, laboratory staff determined that newer “deep” learning techniques that treat the atmosphere over a region of interest as a three-dimensional image are needed to improve PBL details further.
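The contrast the team describes — a “shallow” network retrieving one vertical profile per spectral measurement versus a “deep” network that treats the atmosphere over a region as a 3D image — can be sketched in terms of input and output shapes. The sketch below is purely illustrative: the dimensions, layer sizes, and untrained random weights are invented for the example and do not reflect the laboratory’s actual models.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (assumed for illustration, not the real instruments).
n_channels = 12    # spectral channels measured at each ground location
n_levels = 20      # vertical levels in a retrieved profile
H = W = 8          # spatial grid of observations

# Simulated radiances: one spectral vector per location.
radiances = rng.normal(size=(H, W, n_channels))

# "Shallow" retrieval: a small per-location network; every column is
# retrieved independently from its own spectrum alone.
w1 = rng.normal(size=(n_channels, 32))
w2 = rng.normal(size=(32, n_levels))
profiles_1d = np.tanh(radiances @ w1) @ w2   # (8, 8, 20), no spatial context

# "Deep" sketch: each retrieved profile also sees a 3x3 spatial
# neighborhood of spectra, i.e. the atmosphere as a 3D image.
patch = 3
k = rng.normal(size=(patch * patch * n_channels, 32))
w_out = rng.normal(size=(32, n_levels))

pad = patch // 2
padded = np.pad(radiances, ((pad, pad), (pad, pad), (0, 0)), mode="edge")
profiles_3d = np.empty((H, W, n_levels))
for i in range(H):
    for j in range(W):
        neighborhood = padded[i:i + patch, j:j + patch].ravel()
        profiles_3d[i, j] = np.tanh(neighborhood @ k) @ w_out

print(profiles_1d.shape, profiles_3d.shape)  # both (8, 8, 20)
```

Both approaches emit the same kind of output — a vertical profile per location — but the deep variant’s input includes neighboring spectra, which is what lets it exploit horizontal structure such as the imagery-like texture of the PBL.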

    “We hypothesized that deep learning and artificial intelligence (AI) techniques could improve on current approaches by incorporating a better statistical representation of 3D temperature and humidity imagery of the atmosphere into the solutions,” Milstein says. “But it took a while to figure out how to create the best dataset — a mix of real and simulated data; we needed to prepare to train these techniques.”

    The team collaborated with Joseph Santanello of the NASA Goddard Space Flight Center and William Blackwell, also of the Applied Space Systems Group, in a recent NASA-funded effort showing that these retrieval algorithms can improve PBL detail, including more accurate determination of the PBL height than the previous state of the art. 

    While improved knowledge of the PBL is broadly useful for increasing understanding of climate and weather, one key application is prediction of droughts. According to a Global Drought Snapshot report released last year, droughts are a pressing planetary issue that the global community needs to address. Lack of humidity near the surface, specifically at the level of the PBL, is the leading indicator of drought. While previous studies using remote-sensing techniques have examined the humidity of soil to determine drought risk, studying the atmosphere can help predict when droughts will happen.  

In an effort funded by Lincoln Laboratory’s Climate Change Initiative, Milstein, along with laboratory staff member Michael Pieper, is working with scientists at NASA’s Jet Propulsion Laboratory (JPL) to use neural network techniques to improve drought prediction over the continental United States. While the work builds on existing operational work JPL has done incorporating (in part) the laboratory’s operational “shallow” neural network approach for Aqua, the team believes that this work and the PBL-focused deep learning research can be combined to further improve the accuracy of drought prediction. 

    “Lincoln Laboratory has been working with NASA for more than a decade on neural network algorithms for estimating temperature and humidity in the atmosphere from space-borne infrared and microwave instruments, including those on the Aqua spacecraft,” Milstein says. “Over that time, we have learned a lot about this problem by working with the science community, including learning about what scientific challenges remain. Our long experience working on this type of remote sensing with NASA scientists, as well as our experience with using neural network techniques, gave us a unique perspective.”

    According to Milstein, the next step for this project is to compare the deep learning results to datasets from the National Oceanic and Atmospheric Administration, NASA, and the Department of Energy collected directly in the PBL using radiosondes, a type of instrument flown on a weather balloon. “These direct measurements can be considered a kind of ‘ground truth’ to quantify the accuracy of the techniques we have developed,” Milstein says.

This improved neural network approach holds promise to demonstrate drought prediction that can exceed the capabilities of existing indicators, Milstein says, and to be a tool that scientists can rely on for decades to come.

  • in

    New major crosses disciplines to address climate change

    Lauren Aguilar knew she wanted to study energy systems at MIT, but before Course 1-12 (Climate System Science and Engineering) became a new undergraduate major, she didn’t see an obvious path to study the systems aspects of energy, policy, and climate associated with the energy transition.

Aguilar was drawn to the new major, jointly launched by the departments of Civil and Environmental Engineering (CEE) and Earth, Atmospheric and Planetary Sciences (EAPS) in 2023, because it let her take engineering systems classes while building her knowledge of climate.

    “Having climate knowledge enriches my understanding of how to build reliable and resilient energy systems for climate change mitigation. Understanding upon what scale we can forecast and predict climate change is crucial to build the appropriate level of energy infrastructure,” says Aguilar.

    The interdisciplinary structure of the 1-12 major has students engaging with and learning from professors in different disciplines across the Institute. The blended major was designed to provide a foundational understanding of the Earth system and engineering principles — as well as an understanding of human and institutional behavior as it relates to the climate challenge. Students learn the fundamental sciences through subjects like an atmospheric chemistry class focused on the global carbon cycle or a physics class on low-carbon energy systems. The major also covers topics in data science and machine learning as they relate to forecasting climate risks and building resilience, in addition to policy, economics, and environmental justice studies.

    Junior Ananda Figueiredo was one of the first students to declare the 1-12 major. Her decision to change majors stemmed from a motivation to improve people’s lives, especially when it comes to equality. “I like to look at things from a systems perspective, and climate change is such a complicated issue connected to many different pieces of our society,” says Figueiredo.

    A multifaceted field of study

    The 1-12 major prepares students with the necessary foundational expertise across disciplines to confront climate change. Andrew Babbin, an academic advisor in the new degree program and the Cecil and Ida Green Career Development Associate Professor in EAPS, says the new major harnesses rigorous training encompassing science, engineering, and policy to design and execute a way forward for society.

    Within its first year, Course 1-12 has attracted students with a diverse set of interests, ranging from machine learning for sustainability to nature-based solutions for carbon management to developing the next renewable energy technology and integrating it into the power system.

    Academic advisor Michael Howland, the Esther and Harold E. Edgerton Assistant Professor of Civil and Environmental Engineering, says the best part of this degree is the students, and the enthusiasm and optimism they bring to the climate challenge.

    “We have students seeking to impact policy and students double-majoring in computer science. For this generation, climate change is a challenge for today, not for the future. Their actions inside and outside the classroom speak to the urgency of the challenge and the promise that we can solve it,” Howland says.

    The degree program also leaves plenty of space for students to develop and follow their interests. Sophomore Katherine Kempff began this spring semester as a 1-12 major interested in sustainability and renewable energy. Kempff was worried she wouldn’t be able to finish 1-12 once she made the switch to a different set of classes, but Howland assured her there would be no problems, based on the structure of 1-12.

    “I really like how flexible 1-12 is. There’s a lot of classes that satisfy the requirements, and you are not pigeonholed. I feel like I’m going to be able to do what I’m interested in, rather than just following a set path of a major,” says Kempff.

Kempff is leveraging the skills she developed this semester and exploring different career interests. She is interviewing for sustainability and energy-sector internships in Boston and at MIT this summer, and is particularly interested in helping MIT meet its new sustainability goals.

    Engineering a sustainable future

The new major dovetails with MIT’s commitment to addressing climate change and its steps to prioritize and enhance climate education. As the Institute continues making strides to accelerate solutions, students can play a leading role in changing the future.

    “Climate awareness is critical to all MIT students, most of whom will face the consequences of the projection models for the end of the century,” says Babbin. “One-12 will be a focal point of the climate education mission to train the brightest and most creative students to engineer a better world and understand the complex science necessary to design and verify any solutions they invent.”

    Justin Cole, who transferred to MIT in January from the University of Colorado, served in the U.S. Air Force for nine years. Over the course of his service, he had a front row seat to the changing climate. From helping with the wildfire cleanup in Black Forest, Colorado — after the state’s most destructive fire at the time — to witnessing two category 5 typhoons in Japan in 2018, Cole’s experiences of these natural disasters impressed upon him that climate security was a prerequisite to international security. 

    Cole was recently accepted into the MIT Energy and Climate Club Launchpad initiative where he will work to solve real-world climate and energy problems with professionals in industry.

    “All of the dots are connecting so far in my classes, and all the hopes that I have for studying the climate crisis and the solutions to it at MIT are coming true,” says Cole.

As the field grows, so does demand for scientists and engineers who combine deep knowledge of environmental and climate systems with expertise in methods for climate change mitigation.

“Climate science must be coupled with climate solutions. As we experience worsening climate change, the environmental system will increasingly behave in new ways that we haven’t seen in the past,” says Howland. “Solutions to climate change must go beyond good engineering of small-scale components. We need to ensure that our system-scale solutions are maximally effective in reducing climate change, but are also resilient to climate change. And there is no time to waste,” he says.

  • in

    Q&A: Claire Walsh on how J-PAL’s King Climate Action Initiative tackles the twin climate and poverty crises

    The King Climate Action Initiative (K-CAI) is the flagship climate change program of the Abdul Latif Jameel Poverty Action Lab (J-PAL), which innovates, tests, and scales solutions at the nexus of climate change and poverty alleviation, together with policy partners worldwide.

    Claire Walsh is the associate director of policy at J-PAL Global at MIT. She is also the project director of K-CAI. Here, Walsh talks about the work of K-CAI since its launch in 2020, and describes the ways its projects are making a difference. This is part of an ongoing series exploring how the MIT School of Humanities, Arts, and Social Sciences is addressing the climate crisis.

    Q: According to the King Climate Action Initiative (K-CAI), any attempt to address poverty effectively must also simultaneously address climate change. Why is that?

    A: Climate change will disproportionately harm people in poverty, particularly in low- and middle-income countries, because they tend to live in places that are more exposed to climate risk. These are nations in sub-Saharan Africa and South and Southeast Asia where low-income communities rely heavily on agriculture for their livelihoods, so extreme weather — heat, droughts, and flooding — can be devastating for people’s jobs and food security. In fact, the World Bank estimates that up to 130 million more people may be pushed into poverty by climate change by 2030.

    This is unjust because these countries have historically emitted the least; their people didn’t cause the climate crisis. At the same time, they are trying to improve their economies and improve people’s welfare, so their energy demands are increasing, and they are emitting more. But they don’t have the same resources as wealthy nations for mitigation or adaptation, and many developing countries understandably don’t feel eager to put solving a problem they didn’t create at the top of their priority list. This makes finding paths forward to cutting emissions on a global scale politically challenging.

    For these reasons, the problems of enhancing the well-being of people experiencing poverty, addressing inequality, and reducing pollution and greenhouse gases are inextricably linked.

    Q: So how does K-CAI tackle this hybrid challenge?

    A: Our initiative is pretty unique. We are a competitive, policy-based research and development fund that focuses on innovating, testing, and scaling solutions. We support researchers from MIT and other universities, and their collaborators, who are actually implementing programs, whether NGOs [nongovernmental organizations], government, or the private sector. We fund pilots of small-scale ideas in a real-world setting to determine if they hold promise, followed by larger randomized, controlled trials of promising solutions in climate change mitigation, adaptation, pollution reduction, and energy access. Our goal is to determine, through rigorous research, if these solutions are actually working — for example, in cutting emissions or protecting forests or helping vulnerable communities adapt to climate change. And finally, we offer path-to-scale grants which enable governments and NGOs to expand access to programs that have been tested and have strong evidence of impact.

    We think this model is really powerful. Since we launched in 2020, we have built a portfolio of over 30 randomized evaluations and 13 scaling projects in more than 35 countries. And to date, these projects have informed the scale ups of evidence-based climate policies that have reached over 15 million people.

    Q: It seems like K-CAI is advancing a kind of policy science, demanding proof of a program’s capacity to deliver results at each stage. 

    A: This is one of the factors that drew me to J-PAL back in 2012. I majored in anthropology and studied abroad in Uganda. From those experiences I became very passionate about pursuing a career focused on poverty reduction. To me, it is unfair that in a world full of so much wealth and so much opportunity there exists so much extreme poverty. I wanted to dedicate my career to that, but I’m also a very detail-oriented nerd who really cares about whether a program that claims to be doing something for people is accomplishing what it claims.

    It’s been really rewarding to see demand from governments and NGOs for evidence-informed policymaking grow over my 12 years at J-PAL. This policy science approach holds exciting promise to help transform public policy and climate policy in the coming decades.  

    Q: Can you point to K-CAI-funded projects that meet this high bar and are now making a significant impact?

    A: Several examples jump to mind. In the state of Gujarat, India, pollution regulators are trying to cut particulate matter air pollution, which is devastating to human health. The region is home to many major industries whose emissions negatively affect most of the state’s 70 million residents.

    We partnered with state pollution regulators — kind of a regional EPA [Environmental Protection Agency] — to test an emissions trading scheme that is used widely in the U.S. and Europe but not in low- and middle-income countries. The government monitors pollution levels using technology installed at factories that sends data in real time, so the regulator knows exactly what their emissions look like. The regulator sets a cap on the overall level of pollution, allocates permits to pollute, and industries can trade emissions permits.
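The logic of such a scheme — a fixed cap, tradable permits, and abatement migrating to whoever can cut pollution most cheaply — can be seen in a toy calculation. All numbers below are invented for illustration and have nothing to do with the Gujarat program.

```python
# Toy cap-and-trade illustration: three hypothetical firms, each emitting
# 100 units unabated, with different costs per unit of pollution abated.
firms = {"A": 1.0, "B": 2.0, "C": 4.0}   # marginal abatement cost per unit
baseline = {f: 100 for f in firms}
capacity = {f: 40 for f in firms}        # max units each firm can feasibly abate
cap = 210                                # regulator's total emissions cap

required = sum(baseline.values()) - cap  # 90 units must be abated overall

# Uniform mandate (no trading): every firm abates the same amount.
uniform = required / len(firms)          # 30 units each
uniform_cost = sum(cost * uniform for cost in firms.values())

# Trading outcome: cheap abaters sell permits, so abatement is done
# by the lowest-cost firms first, up to each firm's capacity.
efficient_cost, remaining = 0.0, required
for name, cost in sorted(firms.items(), key=lambda kv: kv[1]):
    abated = min(remaining, capacity[name])
    efficient_cost += cost * abated
    remaining -= abated

# Same cap is met either way; trading just reallocates who abates.
print(uniform_cost, efficient_cost)  # 210.0 160.0
```

The cap fixes total pollution in both cases; what trading changes is the cost of compliance, which is consistent with the finding that firms’ costs fell on average.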

    In 2019, researchers in the J-PAL network conducted the world’s first randomized, controlled trial of this emissions trading scheme and found that it cut pollution by 20 to 30 percent — a surprising reduction. It also reduced firms’ costs, on average, because the costs of compliance went down. The state government was eager to scale up the pilot, and in the past two years, two other cities, including Ahmedabad, the biggest city in the state, have adopted the concept.

    We are also supporting a project in Niger, whose economy is hugely dependent on rain-fed agriculture but with climate change is experiencing rapid desertification. Researchers in the J-PAL network have been testing training farmers in a simple, inexpensive rainwater harvesting technique, where farmers dig a half-moon-shaped hole called a demi-lune right before the rainy season. This demi-lune feeds crops that are grown directly on top of it, and helps return land that resembled flat desert to arable production.

    Researchers found that training farmers in this simple technology increased adoption from 4 percent to 94 percent and that demi-lunes increased agricultural output and revenue for farmers from the first year. K-CAI is funding a path-to-scale grant so local implementers can teach this technique to over 8,000 farmers and build a more cost-effective program model. If this takes hold, the team will work with local partners to scale the training to other relevant regions of the country and potentially other countries in the Sahel.

    One final example that we are really proud of, because we first funded it as a pilot and now it’s in the path to scale phase: We supported a team of researchers working with partners in Bangladesh trying to reduce carbon emissions and other pollution from brick manufacturing, an industry that generates 17 percent of the country’s carbon emissions. The scale of manufacturing is so great that at some times of year, Dhaka (the capital of Bangladesh) looks like Mordor.

    Workers form these bricks and stack hundreds of thousands of them, which they then fire by burning coal. A team of local researchers and collaborators from our J-PAL network found that you can reduce the amount of coal needed for the kilns by making some low-cost changes to the manufacturing process, including stacking the bricks in a way that increases airflow in the kiln and feeding the coal fires more frequently in smaller rather than larger batches.

    In the randomized, controlled trial K-CAI supported, researchers found that this cut carbon and pollution emissions significantly, and now the government has invited the team to train 1,000 brick manufacturers in Dhaka in these techniques.

Q: These are all fascinating and powerful instances of implementing ideas that address a range of problems in different parts of the world. But can K-CAI go big enough and fast enough to take a real bite out of the twin poverty and climate crises?

    A: We’re not trying to find silver bullets. We are trying to build a large playbook of real solutions that work to solve specific problems in specific contexts. As you build those up in the hundreds, you have a deep bench of effective approaches to solve problems that can add up in a meaningful way. And because J-PAL works with governments and NGOs that have the capacity to take the research into action, since 2003, over 600 million people around the world have been reached by policies and programs that are informed by evidence that J-PAL-affiliated researchers produced. While global challenges seem daunting, J-PAL has shown that in 20 years we can achieve a great deal, and there is huge potential for future impact.

    But unfortunately, globally, there is an underinvestment in policy innovation to combat climate change that may generate quicker, lower-cost returns at a large scale — especially in policies that determine which technologies get adopted or commercialized. For example, a lot of the huge fall in prices of renewable energy was enabled by early European government investments in solar and wind, and then continuing support for innovation in renewable energy.

    That’s why I think social sciences have so much to offer in the fight against climate change and poverty; we are working where technology meets policy and where technology meets real people, which often determines their success or failure. The world should be investing in policy, economic, and social innovation just as much as it is investing in technological innovation.

    Q: Do you need to be an optimist in your job?

A: I am half-optimist, half-pragmatist. I have no control over the climate change outcome for the world. And regardless of whether we can successfully avoid most of the potential damages of climate change, when I look back, I’m going to ask myself, “Did I fight or not?” The only choice I have is whether or not I fought, and I want to be a fighter.