More stories

  • Study: Heavy snowfall and rain may contribute to some earthquakes

    When scientists look for an earthquake’s cause, their search often starts underground. As centuries of seismic studies have made clear, it’s the collision of tectonic plates and the movement of subsurface faults and fissures that primarily trigger a temblor. But MIT scientists have now found that certain weather events may also play a role in setting off some quakes.

    In a study appearing today in Science Advances, the researchers report that episodes of heavy snowfall and rain likely contributed to a swarm of earthquakes over the past several years in northern Japan. The study is the first to show that climate conditions could initiate some quakes.

    “We see that snowfall and other environmental loading at the surface impacts the stress state underground, and the timing of intense precipitation events is well-correlated with the start of this earthquake swarm,” says study author William Frank, an assistant professor in MIT’s Department of Earth, Atmospheric and Planetary Sciences (EAPS). “So, climate obviously has an impact on the response of the solid earth, and part of that response is earthquakes.”

    The new study focuses on a series of ongoing earthquakes in Japan’s Noto Peninsula. The team discovered that seismic activity in the region is surprisingly synchronized with certain changes in underground pressure, and that those changes are influenced by seasonal patterns of snowfall and precipitation. The scientists suspect that this new connection between quakes and climate may not be unique to Japan and could play a role in shaking up other parts of the world.

    Looking to the future, they predict that the climate’s influence on earthquakes could be more pronounced with global warming.

    “If we’re going into a climate that’s changing, with more extreme precipitation events, and we expect a redistribution of water in the atmosphere, oceans, and continents, that will change how the Earth’s crust is loaded,” Frank adds. “That will have an impact for sure, and it’s a link we could further explore.”

    The study’s lead author is former MIT research associate Qing-Yu Wang (now at Grenoble Alpes University), and also includes EAPS postdoc Xin Cui, Yang Lu of the University of Vienna, Takashi Hirose of Tohoku University, and Kazushige Obara of the University of Tokyo.

    Seismic speed

    Since late 2020, hundreds of small earthquakes have shaken up Japan’s Noto Peninsula — a finger of land that curves north from the country’s main island into the Sea of Japan. Unlike a typical earthquake sequence, which begins as a main shock that gives way to a series of aftershocks before dying out, Noto’s seismic activity is an “earthquake swarm” — a pattern of multiple, ongoing quakes with no obvious main shock, or seismic trigger.

    The MIT team, along with their colleagues in Japan, aimed to spot any patterns in the swarm that would explain the persistent quakes. They started by looking through the Japanese Meteorological Agency’s catalog of earthquakes that provides data on seismic activity throughout the country over time.
    They focused on quakes in the Noto Peninsula over the last 11 years, during which the region has experienced episodic earthquake activity, including the most recent swarm.

    With seismic data from the catalog, the team counted the number of seismic events that occurred in the region over time. They found that prior to late 2020 the timing of quakes appeared sporadic and unrelated; from late 2020 onward, earthquakes grew more intense, clustered in time, and correlated with one another, signaling the start of the swarm.

    The scientists then looked to a second dataset of seismic measurements taken by monitoring stations over the same 11-year period. Each station continuously records any displacement, or local shaking that occurs. The shaking from one station to another can give scientists an idea of how fast a seismic wave travels between stations. This “seismic velocity” is related to the structure of the Earth through which the seismic wave is traveling. Wang used the station measurements to calculate the seismic velocity between every station in and around Noto over the last 11 years.

    The researchers generated an evolving picture of seismic velocity beneath the Noto Peninsula and observed a surprising pattern: In 2020, around when the earthquake swarm is thought to have begun, changes in seismic velocity appeared to be synchronized with the seasons.

    “We then had to explain why we were observing this seasonal variation,” Frank says.

    Snow pressure

    The team wondered whether environmental changes from season to season could influence the underlying structure of the Earth in a way that would set off an earthquake swarm. Specifically, they looked at how seasonal precipitation would affect the underground “pore fluid pressure” — the amount of pressure that fluids in the Earth’s cracks and fissures exert within the bedrock.

    “When it rains or snows, that adds weight, which increases pore pressure, which allows seismic waves to travel through slower,” Frank explains. “When all that weight is removed, through evaporation or runoff, all of a sudden, that pore pressure decreases and seismic waves are faster.”

    Wang and Cui developed a hydromechanical model of the Noto Peninsula to simulate the underlying pore pressure over the last 11 years in response to seasonal changes in precipitation. They fed into the model meteorological data from this same period, including measurements of daily snow, rainfall, and sea-level changes. From their model, they were able to track changes in excess pore pressure beneath the Noto Peninsula, before and during the earthquake swarm. They then compared this timeline of evolving pore pressure with their evolving picture of seismic velocity.

    “We had seismic velocity observations, and we had the model of excess pore pressure, and when we overlapped them, we saw they just fit extremely well,” Frank says.

    In particular, they found that when they included snowfall data, and especially extreme snowfall events, the fit between the model and observations was stronger than if they only considered rainfall and other events. In other words, the ongoing earthquake swarm that Noto residents have been experiencing can be explained in part by seasonal precipitation, and particularly by heavy snowfall events.

    “We can see that the timing of these earthquakes lines up extremely well with multiple times where we see intense snowfall,” Frank says. “It’s well-correlated with earthquake activity. And we think there’s a physical link between the two.”

    The researchers suspect that heavy snowfall and similar extreme precipitation could play a role in earthquakes elsewhere, though they emphasize that the primary trigger will always originate underground.

    “When we first want to understand how earthquakes work, we look to plate tectonics, because that is and will always be the number one reason why an earthquake happens,” Frank says. “But, what are the other things that could affect when and how an earthquake happens? That’s when you start to go to second-order controlling factors, and the climate is obviously one of those.”

    This research was supported, in part, by the National Science Foundation.
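
    To make the comparison described above concrete, here is a minimal, hypothetical sketch (not the authors' code) of how a modeled excess-pore-pressure time series could be compared against observed changes in seismic velocity; the array names and synthetic data are illustrative assumptions.

    ```python
    # Hypothetical sketch: correlating a modeled pore-pressure time series with
    # observed seismic-velocity changes (dv/v). Synthetic data stands in for the
    # study's model output and station measurements.
    import numpy as np

    rng = np.random.default_rng(0)
    days = np.arange(4018)                       # roughly 11 years of daily samples
    season = np.sin(2 * np.pi * days / 365.25)   # stand-in for seasonal surface loading

    pore_pressure = season + 0.1 * rng.standard_normal(days.size)      # model output
    dv_over_v = -0.8 * season + 0.1 * rng.standard_normal(days.size)   # observations

    # A strongly negative correlation is consistent with the mechanism described
    # above: higher pore pressure, slower seismic waves.
    r = np.corrcoef(pore_pressure, dv_over_v)[0, 1]
    print(f"correlation between modeled pore pressure and dv/v: {r:+.2f}")

    # A simple lag scan shows whether velocity changes trail the loading signal.
    lags = np.arange(-90, 91)
    lagged_r = [np.corrcoef(pore_pressure[max(0, -l):days.size - max(0, l)],
                            dv_over_v[max(0, l):days.size - max(0, -l)])[0, 1]
                for l in lags]
    print("lag with strongest correlation (days):", lags[int(np.argmax(np.abs(lagged_r)))])
    ```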

  • Two MIT PhD students awarded J-WAFS fellowships for their research on water

    Since 2014, the Abdul Latif Jameel Water and Food Systems Lab (J-WAFS) has advanced interdisciplinary research aimed at solving the world’s most pressing water and food security challenges to meet human needs. In 2017, J-WAFS established the Rasikbhai L. Meswani Water Solutions Fellowship and the J-WAFS Graduate Student Fellowship. These fellowships provide support to outstanding MIT graduate students who are pursuing research that has the potential to improve water and food systems around the world.

    Recently, J-WAFS awarded the 2024-25 fellowships to Jonathan Bessette and Akash Ball, two MIT PhD students dedicated to addressing water scarcity by enhancing desalination and purification processes. This work is especially relevant as the world’s freshwater supply has been steadily shrinking due to the effects of climate change. In fact, one-third of the global population lacks access to safe drinking water. Bessette and Ball are focused on designing innovative solutions to enhance the resilience and sustainability of global water systems. To support their endeavors, J-WAFS will provide each recipient with funding for one academic semester for continued research and related activities.

    “This year, we received many strong fellowship applications,” says J-WAFS executive director Renee J. Robins. “Bessette and Ball both stood out, even in a very competitive pool of candidates. The award of the J-WAFS fellowships to these two students underscores our confidence in their potential to bring transformative solutions to global water challenges.”

    2024-25 Rasikbhai L. Meswani Fellowship for Water Solutions

    The Rasikbhai L. Meswani Fellowship for Water Solutions is a doctoral fellowship for students pursuing research related to water and water supply at MIT. The fellowship is made possible by Elina and Nikhil Meswani and family.

    Jonathan Bessette is a doctoral student in the Global Engineering and Research (GEAR) Center within the Department of Mechanical Engineering at MIT, advised by Professor Amos Winter. His research is focused on water treatment systems for the developing world, mainly desalination, or the process in which salts are removed from water. Currently, Bessette is working on designing and constructing a low-cost, deployable, community-scale desalination system for humanitarian crises.

    In arid and semi-arid regions, groundwater often serves as the sole water source, despite its common salinity issues. Many remote and developing areas lack reliable centralized power and water systems, making brackish groundwater desalination a vital, sustainable solution for global water scarcity. “An overlooked need for desalination is inland groundwater aquifers, rather than in coastal areas,” says Bessette. “This is because much of the population lives far enough from a coast that seawater desalination could never reach them. My work involves designing low-cost, sustainable, renewable-powered desalination technologies for highly constrained situations, such as drinking water for remote communities,” he adds.

    To achieve this goal, Bessette developed a batteryless, renewable electrodialysis desalination system. The technology is energy-efficient, conserves water, and is particularly suited for challenging environments, as it is decentralized and sustainable. The system offers significant advantages over the conventional reverse osmosis method, especially in terms of reduced energy consumption for treating brackish water.
    Highlighting Bessette’s capacity for engineering insight, his advisor noted the “simple and elegant solution” that Bessette and a staff engineer, Shane Pratt, devised that negated the need for the system to have large batteries. Bessette is now focusing on simplifying the system’s architecture to make it more reliable and cost-effective for deployment in remote areas.

    Growing up in upstate New York, Bessette completed a bachelor’s degree at the State University of New York at Buffalo. As an undergrad, he taught middle and high school students in low-income areas of Buffalo about engineering and sustainability. However, he cited his junior-year travel to India and his experience there measuring water contaminants in rural sites as cementing his dedication to a career addressing food, water, and sanitation challenges. In addition to his doctoral research, his commitment to these goals is further evidenced by another project he is pursuing, funded by a J-WAFS India grant, that uses low-cost, remote sensors to better understand water fetching practices. Bessette is conducting this work with fellow MIT student Gokul Sampath in order to help families in rural India gain access to safe drinking water.

    2024-25 J-WAFS Graduate Student Fellowship for Water and Food Solutions

    The J-WAFS Graduate Student Fellowship is supported by the J-WAFS Research Affiliate Program, which offers companies the opportunity to engage with MIT on water and food research. Current fellowship support was provided by two J-WAFS Research Affiliates: Xylem, a leading U.S.-based provider of water treatment and infrastructure solutions, and GoAigua, a Spanish company at the forefront of digital transformation in the water industry through innovative solutions.

    Akash Ball is a doctoral candidate in the Department of Chemical Engineering, advised by Professor Heather Kulik. His research focuses on the computational discovery of novel functional materials for energy-efficient ion separation membranes with high selectivity. Advanced membranes like these are increasingly needed for applications such as water desalination, battery recycling, and removal of heavy metals from industrial wastewater. “Climate change, water pollution, and scarce freshwater reserves cause severe water distress for about 4 billion people annually, with 2 billion in India and China’s semiarid regions,” Ball notes. “One potential solution to this global water predicament is the desalination of seawater, since seawater accounts for 97 percent of all water on Earth.”

    Although several commercial reverse osmosis membranes are currently available, these membranes suffer from several problems, including slow water permeation, the permeability-selectivity trade-off, and high fabrication costs. Metal-organic frameworks (MOFs) are porous crystalline materials that are promising candidates for highly selective ion separation with fast water transport due to their high surface area, the presence of different pore windows, and the tunability of their chemical functionality.

    In the Kulik lab, Ball is developing a systematic understanding of how MOF chemistry and pore geometry affect water transport and ion rejection rates. By the end of his PhD, Ball plans to identify existing, best-performing MOFs with unparalleled water uptake using machine learning models, propose novel hypothetical MOFs tailored to specific ion separations from water, and discover experimental design rules that enable the synthesis of next-generation membranes.
    Ball’s advisor praised the creativity he brings to his research, and his leadership skills that benefit her whole lab. Before coming to MIT, Ball obtained a master’s degree in chemical engineering from the Indian Institute of Technology (IIT) Bombay and a bachelor’s degree in chemical engineering from Jadavpur University in India. During a research internship at IIT Bombay in 2018, he worked on developing a technology for in situ arsenic detection in water. Like Bessette, he noted the impact of this prior research experience on his interest in global water challenges, along with his personal experience growing up in an area in India where access to safe drinking water was not guaranteed.

  • HPI-MIT design research collaboration creates powerful teams

    The recent ransomware attack on ChangeHealthcare, which severed the network connecting health care providers, pharmacies, and hospitals with health insurance companies, demonstrates just how disruptive supply chain attacks can be. In this case, it hindered the ability of those providing medical services to submit insurance claims and receive payments.

    This sort of attack and other forms of data theft are becoming increasingly common and often target large, multinational corporations through the small and mid-sized vendors in their corporate supply chains, enabling breaks in these enormous systems of interwoven companies.

    Cybersecurity researchers at MIT and the Hasso Plattner Institute (HPI) in Potsdam, Germany, are focused on the different organizational security cultures that exist within large corporations and their vendors, because it’s that difference that creates vulnerabilities, often due to the lack of emphasis on cybersecurity by the senior leadership in these small to medium-sized enterprises (SMEs).

    Keri Pearlson, executive director of Cybersecurity at MIT Sloan (CAMS); Jillian Kwong, a research scientist at CAMS; and Christian Doerr, a professor of cybersecurity and enterprise security at HPI, are co-principal investigators (PIs) on the research project, “Culture and the Supply Chain: Transmitting Shared Values, Attitudes and Beliefs across Cybersecurity Supply Chains.”

    Their project was selected in the 2023 inaugural round of grants from the HPI-MIT Designing for Sustainability program, a multiyear partnership funded by HPI and administered by the MIT Morningside Academy for Design (MAD). The program awards about 10 grants annually of up to $200,000 each to multidisciplinary teams with divergent backgrounds in computer science, artificial intelligence, machine learning, engineering, design, architecture, the natural sciences, humanities, and business and management. The 2024 Call for Applications is open through June 3.

    Designing for Sustainability grants support scientific research that promotes the United Nations’ Sustainable Development Goals (SDGs) on topics involving sustainable design, innovation, and digital technologies, with teams made up of PIs from both institutions. The PIs on these projects, who have common interests but different strengths, create more powerful teams by working together.

    Transmitting shared values, attitudes, and beliefs to improve cybersecurity across supply chains

    The MIT and HPI cybersecurity researchers say that most ransomware attacks aren’t reported. Smaller companies hit with ransomware attacks just shut down, because they can’t afford the payment to retrieve their data. This makes it difficult to know just how many attacks and data breaches occur. “As more data and processes move online and into the cloud, it becomes even more important to focus on securing supply chains,” Kwong says. “Investing in cybersecurity allows information to be exchanged freely while keeping data safe. Without it, any progress towards sustainability is stalled.”

    One of the first large data breaches in the United States to be widely publicized provides a clear example of how an SME’s cybersecurity can leave a multinational corporation vulnerable to attack. In 2013, hackers entered the Target Corporation’s own network by obtaining the credentials of a small vendor in its supply chain: a Pennsylvania HVAC company.
    Through that breach, thieves were able to install malware that stole the financial and personal information of 110 million Target customers, which they sold to card shops on the black market.

    To prevent such attacks, SME vendors in a large corporation’s supply chain are required to agree to follow certain security measures, but the SMEs usually don’t have the expertise or training to make good on these cybersecurity promises, leaving their own systems, and therefore any connected to them, vulnerable to attack.

    “Right now, organizations are connected economically, but not aligned in terms of organizational culture, values, beliefs, and practices around cybersecurity,” explains Kwong. “Basically, the big companies are realizing the smaller ones are not able to implement all the cybersecurity requirements. We have seen some larger companies address this by reducing requirements or making the process shorter. However, this doesn’t mean companies are more secure; it just lowers the bar for the smaller suppliers to clear it.”

    Pearlson emphasizes the importance of board members and senior management taking responsibility for cybersecurity in order to change the culture at SMEs, rather than pushing that down to a single department, IT office, or in some cases, one IT employee.

    The research team is using case studies based on interviews, field studies, focus groups, and direct observation of people in their natural work environments to learn how companies engage with vendors, and the specific ways cybersecurity is implemented, or not, in everyday operations. The goal is to create a shared culture around cybersecurity that can be adopted correctly by all vendors in a supply chain.

    This approach is in line with the goals of the Charter of Trust Initiative, a partnership of large, multinational corporations formed to establish a better means of implementing cybersecurity in the supply chain network. The HPI-MIT team worked with companies from the Charter of Trust and others last year to understand the impacts of cybersecurity regulation on SME participation in supply chains and develop a conceptual framework to implement changes for stabilizing supply chains.

    Cybersecurity is a prerequisite needed to achieve any of the United Nations’ SDGs, explains Kwong. Without secure supply chains, access to key resources and institutions can be abruptly cut off. This could include food, clean water and sanitation, renewable energy, financial systems, health care, education, and resilient infrastructure. Securing supply chains helps enable progress on all SDGs, and the HPI-MIT project specifically supports SMEs, which are a pillar of the U.S. and European economies.

    Personalizing product designs while minimizing material waste

    In a vastly different Designing for Sustainability joint research project that employs AI with engineering, “Personalizing Product Designs While Minimizing Material Waste” will use AI design software to lay out multiple parts of a pattern on a sheet of plywood, acrylic, or other material, so that they can be laser cut to create new products in real time without wasting material.

    Stefanie Mueller, the TIBCO Career Development Associate Professor in the MIT Department of Electrical Engineering and Computer Science and a member of the Computer Science and Artificial Intelligence Laboratory, and Patrick Baudisch, a professor of computer science and chair of the Human Computer Interaction Lab at HPI, are co-PIs on the project.
    The two have worked together for years; Baudisch was Mueller’s PhD research advisor at HPI.

    Baudisch’s lab developed an online design teaching system called Kyub that lets students design 3D objects in pieces that are laser cut from sheets of wood and assembled to become chairs, speaker boxes, radio-controlled aircraft, or even functional musical instruments. For instance, each leg of a chair would consist of four identical vertical pieces attached at the edges to create a hollow-centered column, four of which will provide stability to the chair, even though the material is very lightweight.

    “By designing and constructing such furniture, students learn not only design, but also structural engineering,” Baudisch says. “Similarly, by designing and constructing musical instruments, they learn about structural engineering, as well as resonance, types of musical tuning, etc.”

    Mueller was at HPI when Baudisch developed the Kyub software, allowing her to observe “how they were developing and making all the design decisions,” she says. “They built a really neat piece for people to quickly design these types of 3D objects.” However, using Kyub for material-efficient design is not fast; in order to fabricate a model, the software has to break the 3D models down into 2D parts and lay these out on sheets of material. This takes time, and makes it difficult to see the impact of design decisions on material use in real time.

    Mueller’s lab at MIT developed software based on a layout algorithm that uses AI to lay out pieces on sheets of material in real time. This allows the AI to explore multiple potential layouts while the user is still editing, and thus provide ongoing feedback. “As the user develops their design, Fabricaide decides good placements of parts onto the user’s available materials, provides warnings if the user does not have enough material for a design, and makes suggestions for how the user can resolve insufficient material cases,” according to the project website.

    The joint MIT-HPI project integrates Mueller’s AI software with Baudisch’s Kyub software and adds machine learning to train the AI to offer better design suggestions that save material while adhering to the user’s design intent.

    “The project is all about minimizing the waste on these materials sheets,” Mueller says. She already envisions the next step in this AI design process: determining how to integrate the laws of physics into the AI’s knowledge base to ensure the structural integrity and stability of objects it designs.

    AI-powered startup design for the Anthropocene: Providing guidance for novel enterprises

    Through her work with the teams of MITdesignX and its international programs, Svafa Grönfeldt, faculty director of MITdesignX and professor of the practice in MIT MAD, has helped scores of people in startup companies use the tools and methods of design to ensure that the solution a startup proposes actually fits the problem it seeks to solve. This is often called the problem-solution fit.

    Grönfeldt and MIT postdoc Norhan Bayomi are now extending this work to incorporate AI into the process, in collaboration with MIT Professor John Fernández and graduate student Tyler Kim. The HPI team includes Professor Gerard de Melo; HPI School of Entrepreneurship Director Frank Pawlitschek; and doctoral student Michael Mansfeld.

    “The startup ecosystem is characterized by uncertainty and volatility compounded by growing uncertainties in climate and planetary systems,” Grönfeldt says.
    “Therefore, there is an urgent need for a robust model that can objectively predict startup success and guide design for the Anthropocene.”

    While startup-success forecasting is gaining popularity, it currently focuses on aiding venture capitalists in selecting companies to fund, rather than guiding the startups in the design of their products, services, and business plans.

    “The coupling of climate and environmental priorities with startup agendas requires deeper analytics for effective enterprise design,” Grönfeldt says. The project aims to explore whether AI-augmented decision-support systems can enhance startup-success forecasting.

    “We’re trying to develop a machine learning approach that will give a forecasting of probability of success based on a number of parameters, including the type of business model proposed, how the team came together, the team members’ backgrounds and skill sets, the market and industry sector they’re working in and the problem-solution fit,” says Bayomi, who works with Fernández in the MIT Environmental Solutions Initiative. The two are co-founders of the startup Lamarr.AI, which employs robotics and AI to help reduce the carbon dioxide impact of the built environment.

    The team is studying “how company founders make decisions across four key areas, starting from the opportunity recognition, how they are selecting the team members, how they are selecting the business model, identifying the most automatic strategy, all the way through the product market fit to gain an understanding of the key governing parameters in each of these areas,” explains Bayomi.

    The team is “also developing a large language model that will guide the selection of the business model by using large datasets from different companies in Germany and the U.S. We train the model based on the specific industry sector, such as a technology solution or a data solution, to find what would be the most suitable business model that would increase the success probability of a company,” she says.

    The project falls under several of the United Nations’ Sustainable Development Goals, including economic growth, innovation and infrastructure, sustainable cities and communities, and climate action.

    Furthering the goals of the HPI-MIT Joint Research Program

    These three diverse projects all advance the mission of the HPI-MIT collaboration. MIT MAD aims to use design to transform learning, catalyze innovation, and empower society by inspiring people from all disciplines to interweave design into problem-solving. HPI uses digital engineering concentrated on the development and research of user-oriented innovations for all areas of life.

    Interdisciplinary teams with members from both institutions are encouraged to develop and submit proposals for ambitious, sustainable projects that use design strategically to generate measurable, impactful solutions to the world’s problems.

  • An AI dataset carves new paths to tornado detection

    The return of spring in the Northern Hemisphere touches off tornado season. A tornado’s twisting funnel of dust and debris seems an unmistakable sight. But that sight can be obscured to radar, the tool of meteorologists. It’s hard to know exactly when a tornado has formed, or even why.

    A new dataset could hold answers. It contains radar returns from thousands of tornadoes that have hit the United States in the past 10 years. Storms that spawned tornadoes are flanked by other severe storms, some with nearly identical conditions, that never did. MIT Lincoln Laboratory researchers who curated the dataset, called TorNet, have now released it open source. They hope to enable breakthroughs in detecting one of nature’s most mysterious and violent phenomena.

    “A lot of progress is driven by easily available, benchmark datasets. We hope TorNet will lay a foundation for machine learning algorithms to both detect and predict tornadoes,” says Mark Veillette, the project’s co-principal investigator with James Kurdzo. Both researchers work in the Air Traffic Control Systems Group. 

    Along with the dataset, the team is releasing models trained on it. The models show promise for machine learning’s ability to spot a twister. Building on this work could open new frontiers for forecasters, helping them provide more accurate warnings that might save lives. 

    Swirling uncertainty

    About 1,200 tornadoes occur in the United States every year, causing millions to billions of dollars in economic damage and claiming 71 lives on average. Last year, one unusually long-lasting tornado killed 17 people and injured at least 165 others along a 59-mile path in Mississippi.  

    Yet tornadoes are notoriously difficult to forecast because scientists don’t have a clear picture of why they form. “We can see two storms that look identical, and one will produce a tornado and one won’t. We don’t fully understand it,” Kurdzo says.

    A tornado’s basic ingredients are thunderstorms with instability caused by rapidly rising warm air and wind shear that causes rotation. Weather radar is the primary tool used to monitor these conditions. But tornadoes lie too low to be detected, even when moderately close to the radar. As the radar beam with a given tilt angle travels farther from the antenna, it gets higher above the ground, mostly seeing reflections from rain and hail carried in the “mesocyclone,” the storm’s broad, rotating updraft. A mesocyclone doesn’t always produce a tornado.
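
    The geometry behind this limitation can be illustrated with the standard textbook beam-height approximation, assuming the conventional 4/3 effective-Earth-radius refraction model; the antenna height and tilt angle below are illustrative values, not numbers from the article.

    ```python
    # Approximate height of the radar beam centerline above ground at a given
    # range, using the standard 4/3 effective-Earth-radius model.
    import math

    def beam_height_km(range_km, elevation_deg, antenna_height_km=0.01):
        r_eff = (4.0 / 3.0) * 6371.0  # effective Earth radius in km
        theta = math.radians(elevation_deg)
        return (math.sqrt(range_km**2 + r_eff**2 + 2 * range_km * r_eff * math.sin(theta))
                - r_eff + antenna_height_km)

    # Even at a low 0.5-degree tilt, the beam overshoots near-surface rotation
    # as range grows.
    for rng_km in (25, 50, 100, 150):
        print(f"{rng_km:>3} km range -> beam ~{beam_height_km(rng_km, 0.5):.2f} km above ground")
    ```

    By roughly 100 kilometers from the radar, even a 0.5-degree beam sits more than a kilometer above the ground, so near-surface rotation can go unseen while the broader mesocyclone remains visible.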

    With this limited view, forecasters must decide whether or not to issue a tornado warning. They often err on the side of caution. As a result, the rate of false alarms for tornado warnings is more than 70 percent. “That can lead to boy-who-cried-wolf syndrome,” Kurdzo says.  

    In recent years, researchers have turned to machine learning to better detect and predict tornadoes. However, raw datasets and models have not always been accessible to the broader community, stifling progress. TorNet is filling this gap.

    The dataset contains more than 200,000 radar images, 13,587 of which depict tornadoes. The rest of the images are non-tornadic, taken from storms in one of two categories: randomly selected severe storms or false-alarm storms (those that led a forecaster to issue a warning but that didn’t produce a tornado).

    Each sample of a storm or tornado comprises two sets of six radar images. The two sets correspond to different radar sweep angles. The six images portray different radar data products, such as reflectivity (showing precipitation intensity) or radial velocity (indicating if winds are moving toward or away from the radar).
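
    As a rough illustration of that structure, a single sample might be held in memory as a small multidimensional array; the shape, product list, and field names below are assumptions based on the description above, not the dataset's actual file format.

    ```python
    # Hypothetical layout of one TorNet-style sample: two sweep angles, six radar
    # products per sweep, each an image on a fixed grid. Dimensions are illustrative.
    import numpy as np

    N_SWEEPS, N_PRODUCTS, HEIGHT, WIDTH = 2, 6, 120, 240

    sample = {
        # images[sweep, product, row, col]
        "images": np.zeros((N_SWEEPS, N_PRODUCTS, HEIGHT, WIDTH), dtype=np.float32),
        "products": ["reflectivity", "radial_velocity", "spectrum_width",
                     "differential_reflectivity", "correlation_coefficient",
                     "differential_phase"],   # assumed set of dual-polarization products
        "label": 1,                           # 1 = tornadic, 0 = non-tornadic
        "category": "tornadic",               # or "random" / "false_alarm"
    }
    print(sample["images"].shape)  # (2, 6, 120, 240)
    ```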

    A challenge in curating the dataset was first finding tornadoes. Within the corpus of weather radar data, tornadoes are extremely rare events. The team then had to balance those tornado samples with difficult non-tornado samples. If the dataset were too easy, say by comparing tornadoes to snowstorms, an algorithm trained on the data would likely over-classify storms as tornadic.

    “What’s beautiful about a true benchmark dataset is that we’re all working with the same data, with the same level of difficulty, and can compare results,” Veillette says. “It also makes meteorology more accessible to data scientists, and vice versa. It becomes easier for these two parties to work on a common problem.”

    Both researchers represent the progress that can come from cross-collaboration. Veillette is a mathematician and algorithm developer who has long been fascinated by tornadoes. Kurdzo is a meteorologist by training and a signal processing expert. In grad school, he chased tornadoes with custom-built mobile radars, collecting data to analyze in new ways.

    “This dataset also means that a grad student doesn’t have to spend a year or two building a dataset. They can jump right into their research,” Kurdzo says.

    This project was funded by Lincoln Laboratory’s Climate Change Initiative, which aims to leverage the laboratory’s diverse technical strengths to help address climate problems threatening human health and global security.

    Chasing answers with deep learning

    Using the dataset, the researchers developed baseline artificial intelligence (AI) models. They were particularly eager to apply deep learning, a form of machine learning that excels at processing visual data. On its own, deep learning can extract features (key observations that an algorithm uses to make a decision) from images across a dataset. Other machine learning approaches require humans to first manually label features. 

    “We wanted to see if deep learning could rediscover what people normally look for in tornadoes and even identify new things that typically aren’t searched for by forecasters,” Veillette says.

    The results are promising. Their deep learning model performed similarly to or better than all tornado-detecting algorithms known in the literature. The trained algorithm correctly classified 50 percent of weaker EF-1 tornadoes and over 85 percent of tornadoes rated EF-2 or higher, which make up the most devastating and costly occurrences of these storms.
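
    A minimal sketch of how detection rates like those quoted above can be broken out by tornado rating; the labels and predictions here are made up for illustration and are not the study's data.

    ```python
    # Per-rating detection rate (recall): fraction of true tornadoes of a given
    # EF category that the model flags as tornadic.
    import numpy as np

    y_true = np.array([1, 1, 1, 1, 0, 0, 1])    # ground truth: 1 = tornado occurred
    y_pred = np.array([1, 0, 1, 1, 0, 1, 1])    # model output (illustrative)
    ef     = np.array([1, 1, 2, 3, -1, -1, 4])  # EF rating per sample; -1 = non-tornadic

    for name, mask in [("EF-1", ef == 1), ("EF-2+", ef >= 2)]:
        detected = np.logical_and(y_true == 1, y_pred == 1)[mask].sum()
        total = (y_true[mask] == 1).sum()
        print(f"{name} detection rate: {detected / total:.0%}")
    ```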

    They also evaluated two other types of machine-learning models, and one traditional model to compare against. The source code and parameters of all these models are freely available. The models and dataset are also described in a paper submitted to a journal of the American Meteorological Society (AMS). Veillette presented this work at the AMS Annual Meeting in January.

    “The biggest reason for putting our models out there is for the community to improve upon them and do other great things,” Kurdzo says. “The best solution could be a deep learning model, or someone might find that a non-deep learning model is actually better.”

    TorNet could be useful in the weather community for other uses too, such as conducting large-scale case studies on storms. It could also be augmented with other data sources, like satellite imagery or lightning maps. Fusing multiple types of data could improve the accuracy of machine learning models.

    Taking steps toward operations

    On top of detecting tornadoes, Kurdzo hopes that models might help unravel the science of why they form.

    “As scientists, we see all these precursors to tornadoes — an increase in low-level rotation, a hook echo in reflectivity data, specific differential phase (KDP) foot and differential reflectivity (ZDR) arcs. But how do they all go together? And are there physical manifestations we don’t know about?” he asks.

    Teasing out those answers might be possible with explainable AI. Explainable AI refers to methods that allow a model to provide its reasoning, in a format understandable to humans, of why it came to a certain decision. In this case, these explanations might reveal physical processes that happen before tornadoes. This knowledge could help train forecasters, and models, to recognize the signs sooner. 

    “None of this technology is ever meant to replace a forecaster. But perhaps someday it could guide forecasters’ eyes in complex situations, and give a visual warning to an area predicted to have tornadic activity,” Kurdzo says.

    Such assistance could be especially useful as radar technology improves and future networks potentially grow denser. In a next-generation radar network, data is expected to refresh approximately every minute instead of every five minutes, perhaps faster than forecasters can interpret the new information. Because deep learning can process huge amounts of data quickly, it could be well-suited for monitoring radar returns in real time, alongside humans. Tornadoes can form and disappear in minutes.

    But the path to an operational algorithm is a long road, especially in safety-critical situations, Veillette says. “I think the forecaster community is still, understandably, skeptical of machine learning. One way to establish trust and transparency is to have public benchmark datasets like this one. It’s a first step.”

    The next steps, the team hopes, will be taken by researchers across the world who are inspired by the dataset and energized to build their own algorithms. Those algorithms will in turn go into test beds, where they’ll eventually be shown to forecasters, to start a process of transitioning into operations.

    In the end, the path could circle back to trust.

    “We may never get more than a 10- to 15-minute tornado warning using these tools. But if we could lower the false-alarm rate, we could start to make headway with public perception,” Kurdzo says. “People are going to use those warnings to take the action they need to save their lives.”

  • Two MIT teams selected for NSF sustainable materials grants

    Two teams led by MIT researchers were selected in December 2023 by the U.S. National Science Foundation (NSF) Convergence Accelerator, a part of the TIP Directorate, to receive awards of $5 million each over three years, to pursue research aimed at helping to bring cutting-edge new sustainable materials and processes from the lab into practical, full-scale industrial production. The selection was made after 16 teams from around the country were chosen last year for one-year grants to develop detailed plans for further research aimed at solving problems of sustainability and scalability for advanced electronic products.

    Of the two MIT-led teams chosen for this current round of funding, one team, Topological Electric, is led by Mingda Li, an associate professor in the Department of Nuclear Science and Engineering. This team will be finding pathways to scale up sustainable topological materials, which have the potential to revolutionize next-generation microelectronics by showing superior electronic performance, such as dissipationless states or high-frequency response. The other team, led by Anuradha Agarwal, a principal research scientist at MIT’s Materials Research Laboratory, will be focusing on developing new materials, devices, and manufacturing processes for microchips that minimize energy consumption using electronic-photonic integration, and that detect and avoid the toxic or scarce materials used in today’s production methods.

    Scaling the use of topological materials

    Li explains that some materials based on quantum effects have achieved successful transitions from lab curiosities to mass production, such as blue-light LEDs and giant magnetoresistance (GMR) devices used for magnetic data storage. But he says there are a variety of equally promising materials that have yet to make it into real-world applications.

    “What we really wanted to achieve is to bring newer-generation quantum materials into technology and mass production, for the benefit of broader society,” he says. In particular, he says, “topological materials are really promising to do many different things.”

    Topological materials are ones whose electronic properties are fundamentally protected against disturbance. For example, Li points to the fact that just in the last two years, it has been shown that some topological materials are even better electrical conductors than copper, which is typically used for the wires interconnecting electronic components. But unlike the blue-light LEDs or the GMR devices, which have been widely produced and deployed, when it comes to topological materials, “there’s no company, no startup, there’s really no business out there,” adds Tomas Palacios, the Clarence J. Lebel Professor in Electrical Engineering at MIT and co-principal investigator on Li’s team. Part of the reason is that many versions of such materials are studied “with a focus on fundamental exotic physical properties with little or no consideration on the sustainability aspects,” says Liang Fu, an MIT professor of physics and also a co-PI. Their team will be looking for alternative formulations that are more amenable to mass production.

    One possible application of these topological materials is for detecting terahertz radiation, explains Keith Nelson, an MIT professor of chemistry and co-PI. Such extremely high-frequency radiation can carry far more information than conventional radio or microwaves, but at present there are no mature electronic devices available that are scalable at this frequency range. “There’s a whole range of possibilities for topological materials” that could work at these frequencies, he says. In addition, he says, “we hope to demonstrate an entire prototype system like this in a single, very compact solid-state platform.”

    Li says that among the many possible applications of topological devices for microelectronics devices of various kinds, “we don’t know which, exactly, will end up as a product, or will reach real industrial scaleup. That’s why this opportunity from NSF is like a bridge, which is precious, to allow us to dig deeper to unleash the true potential.”

    In addition to Li, Palacios, Fu, and Nelson, the Topological Electric team includes Qiong Ma, assistant professor of physics in Boston College; Farnaz Niroui, assistant professor of electrical engineering and computer science at MIT; Susanne Stemmer, professor of materials at the University of California at Santa Barbara; Judy Cha, professor of materials science and engineering at Cornell University; industrial partners including IBM, Analog Devices, and Raytheon; and professional consultants. “We are taking this opportunity seriously,” Li says. “We really want to see if the topological materials are as good as we show in the lab when being scaled up, and how far we can push to broadly industrialize them.”

    Toward sustainable microchip production and use

    The microchips behind everything from smartphones to medical imaging are associated with a significant percentage of greenhouse gas emissions today, and every year the world produces more than 50 million metric tons of electronic waste, the equivalent of about 5,000 Eiffel Towers. Further, the data centers necessary for complex computations and huge amounts of data transfer — think AI and on-demand video — are growing and will require 10 percent of the world’s electricity by 2030.
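
    A quick arithmetic check of that comparison, assuming the Eiffel Tower's often-cited total mass of roughly 10,000 metric tons (a figure not given in the article):

    ```python
    # 50 million metric tons of annual e-waste, expressed in Eiffel Towers.
    e_waste_tonnes = 50e6
    eiffel_tower_tonnes = 10_000  # assumed approximate mass of the tower
    print(e_waste_tonnes / eiffel_tower_tonnes)  # -> 5000.0
    ```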

    “The current microchip manufacturing supply chain, which includes production, distribution, and use, is neither scalable nor sustainable, and cannot continue. We must innovate our way out of this crisis,” says Agarwal.

    The name of Agarwal’s team, FUTUR-IC, is a reference to the future of the integrated circuits, or chips, through a global alliance for sustainable microchip manufacturing. Says Agarwal, “We bring together stakeholders from industry, academia, and government to co-optimize across three dimensions: technology, ecology, and workforce. These were identified as key interrelated areas by some 140 stakeholders. With FUTUR-IC we aim to cut waste and CO2-equivalent emissions associated with electronics by 50 percent every 10 years.”

    The market for microelectronics in the next decade is predicted to be on the order of a trillion dollars, but most of the manufacturing for the industry occurs only in limited geographical pockets around the world. FUTUR-IC aims to diversify and strengthen the supply chain for manufacturing and packaging of electronics. The alliance has 26 collaborators and is growing. Current external collaborators include the International Electronics Manufacturing Initiative (iNEMI), Tyndall National Institute, SEMI, Hewlett Packard Enterprise, Intel, and the Rochester Institute of Technology.

    Agarwal leads FUTUR-IC in close collaboration with others, including, from MIT, Lionel Kimerling, the Thomas Lord Professor of Materials Science and Engineering; Elsa Olivetti, the Jerry McAfee Professor in Engineering; Randolph Kirchain, principal research scientist in the Materials Research Laboratory; and Greg Norris, director of MIT’s Sustainability and Health Initiative for NetPositive Enterprise (SHINE). All are affiliated with the Materials Research Laboratory. They are joined by Samuel Serna, an MIT visiting professor and assistant professor of physics at Bridgewater State University. Other key personnel include Sajan Saini, education director for the Initiative for Knowledge and Innovation in Manufacturing in MIT’s Department of Materials Science and Engineering; Peter O’Brien, a professor from Tyndall National Institute; and Shekhar Chandrashekhar, CEO of iNEMI.

    “We expect the integration of electronics and photonics to revolutionize microchip manufacturing, enhancing efficiency, reducing energy consumption, and paving the way for unprecedented advancements in computing speed and data-processing capabilities,” says Serna, who is the co-lead on the project’s technology “vector.”

    Common metrics for these efforts are needed, says Norris, co-lead for the ecology vector, adding, “The microchip industry must have transparent and open Life Cycle Assessment (LCA) models and data, which are being developed by FUTUR-IC.” This is especially important given that microelectronics production transcends industries. “Given the scale and scope of microelectronics, it is critical for the industry to lead in the transition to sustainable manufacture and use,” says Kirchain, another co-lead and the co-director of the Concrete Sustainability Hub at MIT. To bring about this cross-fertilization, co-lead Olivetti, also co-director of the MIT Climate and Sustainability Consortium (MCSC), will collaborate with FUTUR-IC to enhance the benefits from microchip recycling, leveraging the learning across industries.

    Saini, the co-lead for the workforce vector, stresses the need for agility. “With a workforce that adapts to a practice of continuous upskilling, we can help increase the robustness of the chip-manufacturing supply chain, and validate a new design for a sustainability curriculum,” he says.

    “We have become accustomed to the benefits forged by the exponential growth of microelectronic technology performance and market size,” says Kimerling, who is also director of MIT’s Materials Research Laboratory and co-director of the MIT Microphotonics Center. “The ecological impact of this growth in terms of materials use, energy consumption and end-of-life disposal has begun to push back against this progress. We believe that concurrently engineered solutions for these three dimensions will build a common learning curve to power the next 40 years of progress in the semiconductor industry.”

    The MIT teams are two of six that received awards addressing sustainable materials for global challenges through phase two of the NSF Convergence Accelerator program. Launched in 2019, the program targets solutions to especially compelling challenges at an accelerated pace by incorporating a multidisciplinary research approach.

  • MIT announces 2024 Bose Grants

    MIT Provost Cynthia Barnhart announced four Professor Amar G. Bose Research Grants to support bold research projects across diverse areas of study, including projects to generate clean hydrogen from deep in the Earth, build an environmentally friendly house of basalt, design maternity clothing that monitors fetal health, and recruit sharks as ocean oxygen monitors.

    This year’s recipients are Iwnetim Abate, assistant professor of materials science and engineering; Andrew Babbin, the Cecil and Ida Green Associate Professor in Earth, Atmospheric and Planetary Sciences; Yoel Fink, professor of materials science and engineering and of electrical engineering and computer science; and Skylar Tibbits, associate professor of design research in the Department of Architecture.

    The program was named for the visionary founder of the Bose Corporation and MIT alumnus Amar G. Bose ’51, SM ’52, ScD ’56. After gaining admission to MIT, Bose became a top math student and a Fulbright Scholarship recipient. He spent 46 years as a professor at MIT, led innovations in sound design, and founded the Bose Corp. in 1964. MIT launched the Bose grant program 11 years ago to provide funding over a three-year period to MIT faculty who propose original, cross-disciplinary, and often risky research projects that would likely not be funded by conventional sources.

    “The promise of the Bose Fellowship is to help bold, daring ideas become realities, an approach that honors Amar Bose’s legacy,” says Barnhart. “Thanks to support from this program, these talented faculty members have the freedom to explore their bold and innovative ideas.”

    Deep and clean hydrogen futures

    A green energy future will depend on harnessing hydrogen as a clean energy source, sequestering polluting carbon dioxide, and mining the minerals essential to building clean energy technologies such as advanced batteries. Iwnetim Abate thinks he has a solution for all three challenges: an innovative hydrogen reactor.

    He plans to build a reactor that will create natural hydrogen from ultramafic mineral rocks in the crust. “The Earth is literally a giant hydrogen factory waiting to be tapped,” Abate explains. “A back-of-the-envelope calculation for the first seven kilometers of the Earth’s crust estimates that there is enough ultramafic rock to produce hydrogen for 250,000 years.”

    The reactor envisioned by Abate injects water to create a reaction that releases hydrogen, while also supporting the injection of climate-altering carbon dioxide into the rock, providing a global carbon capacity of 100 trillion tons. At the same time, the reactor process could provide essential elements such as lithium, nickel, and cobalt — some of the most important raw materials used in advanced batteries and electronics.

    “Ultimately, our goal is to design and develop a scalable reactor for simultaneously tapping into the trifecta from the Earth’s subsurface,” Abate says.

    Sharks as oceanographers

    If we want to understand more about how oxygen levels in the world’s seas are disturbed by human activities and climate change, we should turn to a sensing platform “that has been honed by 400 million years of evolution to perfectly sample the ocean: sharks,” says Andrew Babbin.

    As the planet warms, oceans are projected to contain less dissolved oxygen, with impacts on the productivity of global fisheries, natural carbon sequestration, and the flux of climate-altering greenhouse gasses from the ocean to the air. While scientists know dissolved oxygen is important, it has proved difficult to track over seasons, decades, and underexplored regions both shallow and deep.

    Babbin’s goal is to develop a low-cost sensor for dissolved oxygen that can be integrated with preexisting electronic shark tags used by marine biologists. “This fleet of sharks … will finally enable us to measure the extent of the low-oxygen zones of the ocean, how they change seasonally and with El Niño/La Niña oscillation, and how they expand or contract into the future.”

    The partnership with sharks will also spotlight the importance of these often-maligned animals for global marine and fisheries health, Babbin says. “We hope in pursuing this work marrying microscopic and macroscopic life we will inspire future oceanographers and conservationists, and lead to a better appreciation for the chemistry that underlies global habitability.”

    Maternity wear that monitors fetal health

    There are 2 million stillbirths around the world each year, and in the United States alone, 21,000 families suffer this terrible loss. In many cases, mothers and their doctors had no warning of any abnormalities or changes in fetal health leading up to these deaths. Yoel Fink and colleagues are looking for a better way to monitor fetal health and provide proactive treatment.

    Fink is building on years of research on acoustic fabrics to design an affordable shirt for mothers that would monitor and communicate important details of fetal health. His team’s original research drew inspiration from the function of the eardrum, designing a fiber that could be woven into other fabrics to create a kind of fabric microphone.

    “Given the sensitivity of the acoustic fabrics in sensing these nanometer-scale vibrations, could a mother’s clothing transcend its conventional role and become a health monitor, picking up on the acoustic signals and subsequent vibrations that arise from her unborn baby’s heartbeat and motion?” Fink says. “Could a simple and affordable worn fabric allow an expecting mom to sleep better, knowing that her fetus is being listened to continuously?”

    The proposed maternity shirt could measure fetal heart and breathing rate, and might be able to give an indication of the fetal body position, he says. In the final stages of development, he and his colleagues hope to develop machine learning approaches that would identify abnormal fetal heart rate and motion and deliver real-time alerts.

    A basalt house in Iceland

    In the land of volcanoes, Skylar Tibbits wants to build a case-study home almost entirely from the basalt rock that makes up the Icelandic landscape.

    Architects are increasingly interested in building using one natural material — creating a monomaterial structure — that can be easily recycled. At the moment, the building industry accounts for 40 percent of carbon emissions worldwide and relies on many materials and structures, from metal to plastics to concrete, that can’t be easily disassembled or reused.

    The proposed basalt house in Iceland, a project co-led by J. Jih, associate professor of the practice in the Department of Architecture, is “an architecture that would be fully composed of the surrounding earth, that melts back into that surrounding earth at the end of its lifespan, and that can be recycled infinitely,” Tibbits explains.

    Basalt, the most common rock form in the Earth’s crust, can be spun into fibers for insulation and rebar. Basalt fiber performs as well as glass and carbon fibers at a lower cost in some applications, although it is not widely used in architecture. In cast form, it can make corrosion- and heat-resistant plumbing, cladding and flooring.

    “A monomaterial architecture is both a simple and radical proposal that unfortunately falls outside of traditional funding avenues,” says Tibbits. “The Bose grant is the perfect and perhaps the only option for our research, which we see as a uniquely achievable moonshot with transformative potential for the entire built environment.”

  • How light can vaporize water without the need for heat

    It’s the most fundamental of processes — the evaporation of water from the surfaces of oceans and lakes, the burning off of fog in the morning sun, and the drying of briny ponds that leaves solid salt behind. Evaporation is all around us, and humans have been observing it and making use of it for as long as we have existed.

    And yet, it turns out, we’ve been missing a major part of the picture all along.

    In a series of painstakingly precise experiments, a team of researchers at MIT has demonstrated that heat isn’t alone in causing water to evaporate. Light, striking the water’s surface where air and water meet, can break water molecules away and float them into the air, causing evaporation in the absence of any source of heat.

    The astonishing new discovery could have a wide range of significant implications. It could help explain mysterious measurements over the years of how sunlight affects clouds, and therefore affect calculations of the effects of climate change on cloud cover and precipitation. It could also lead to new ways of designing industrial processes such as solar-powered desalination or drying of materials.

    The findings, and the many different lines of evidence that demonstrate the reality of the phenomenon and the details of how it works, are described today in the journal PNAS, in a paper by Carl Richard Soderberg Professor of Power Engineering Gang Chen, postdocs Guangxin Lv and Yaodong Tu, and graduate student James Zhang.

    The authors say their study suggests that the effect should happen widely in nature — everywhere from clouds to fogs to the surfaces of oceans, soils, and plants — and that it could also lead to new practical applications, including in energy and clean water production. “I think this has a lot of applications,” Chen says. “We’re exploring all these different directions. And of course, it also affects the basic science, like the effects of clouds on climate, because clouds are the most uncertain aspect of climate models.”

    A newfound phenomenon

    The new work builds on research reported last year, which described this new “photomolecular effect” but only under very specialized conditions: on the surface of specially prepared hydrogels soaked with water. In the new study, the researchers demonstrate that the hydrogel is not necessary for the process; it occurs at any water surface exposed to light, whether it’s a flat surface like a body of water or a curved surface like a cloud droplet.

    Because the effect was so unexpected, the team worked to prove its existence with as many different lines of evidence as possible. In this study, they report 14 different kinds of tests and measurements they carried out to establish that water was indeed evaporating — that is, molecules of water were being knocked loose from the water’s surface and wafted into the air — by the light alone, not by heat, which was long assumed to be the only mechanism involved.

    One key indicator, which showed up consistently in four different kinds of experiments under different conditions, was that as the water began to evaporate from a test container under visible light, the temperature of the air measured above the water’s surface dropped and then leveled off, showing that thermal energy was not the driving force behind the effect.

    Other key indicators included the way the evaporation effect varied with the angle of the light, the exact color of the light, and its polarization. None of these variations should occur, because water hardly absorbs light at all at these wavelengths — and yet the researchers observed them.

    The effect is strongest when light hits the water surface at an angle of 45 degrees. It is also strongest with a certain type of polarization, called transverse magnetic polarization. And it peaks in green light — which, oddly, is the color for which water is most transparent and thus interacts the least.

    Chen and his co-researchers have proposed a physical mechanism that can explain the angle and polarization dependence of the effect, showing that the photons of light can impart a net force on water molecules at the water surface that is sufficient to knock them loose from the body of water. But they cannot yet account for the color dependence, which they say will require further study.

    They have named this the photomolecular effect, by analogy with the photoelectric effect that was discovered by Heinrich Hertz in 1887 and finally explained by Albert Einstein in 1905. That effect was one of the first demonstrations that light also has particle characteristics, which had major implications in physics and led to a wide variety of applications, including LEDs. Just as the photoelectric effect liberates electrons from atoms in a material in response to being hit by a photon of light, the photomolecular effect shows that photons can liberate entire molecules from a liquid surface, the researchers say.

    “The finding of evaporation caused by light instead of heat provides new disruptive knowledge of light-water interaction,” says Xiulin Ruan, professor of mechanical engineering at Purdue University, who was not involved in the study. “It could help us gain new understanding of how sunlight interacts with cloud, fog, oceans, and other natural water bodies to affect weather and climate. It has significant potential practical applications such as high-performance water desalination driven by solar energy. This research is among the rare group of truly revolutionary discoveries which are not widely accepted by the community right away but take time, sometimes a long time, to be confirmed.”

    Solving a cloud conundrum

    The finding may solve an 80-year-old mystery in climate science. Measurements of how clouds absorb sunlight have often shown that they are absorbing more sunlight than conventional physics predicts is possible. The additional evaporation caused by this effect could account for the longstanding discrepancy, which has been a subject of dispute since such measurements are difficult to make.

    “Those experiments are based on satellite data and flight data,” Chen explains. “They fly an airplane on top of and below the clouds, and there are also data based on the ocean temperature and radiation balance. And they all conclude that there is more absorption by clouds than theory could calculate. However, due to the complexity of clouds and the difficulties of making such measurements, researchers have been debating whether such discrepancies are real or not. And what we discovered suggests that hey, there’s another mechanism for cloud absorption, which was not accounted for, and this mechanism might explain the discrepancies.”

    Chen says he recently spoke about the phenomenon at an American Physical Society conference, and one physicist there who studies clouds and climate said they had never thought about this possibility, which could affect calculations of the complex effects of clouds on climate. The team conducted experiments using LEDs shining on an artificial cloud chamber, and they observed heating of the fog, which was not supposed to happen since water does not absorb in the visible spectrum. “Such heating can be explained based on the photomolecular effect more easily,” he says.

    Lv says that of the many lines of evidence, “the flat region in the air-side temperature distribution above hot water will be the easiest for people to reproduce.” That temperature profile “is a signature” that demonstrates the effect clearly, he says.

    Zhang adds: “It is quite hard to explain how this kind of flat temperature profile comes about without invoking some other mechanism” beyond the accepted theories of thermal evaporation. “It ties together what a whole lot of people are reporting in their solar desalination devices,” which again show evaporation rates that cannot be explained by the thermal input.

    The effect can be substantial. Under the optimum conditions of color, angle, and polarization, Lv says, “the evaporation rate is four times the thermal limit.”
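
    For a sense of scale, the thermal limit here is the evaporation rate that would result if every bit of incident energy went into water’s latent heat of vaporization. A back-of-the-envelope calculation with round textbook numbers (roughly one sun of flux and the latent heat near room temperature, not figures taken from the paper) shows what a fourfold exceedance implies:

    ```latex
    % Thermal-limit evaporation for ~1,000 W/m^2 of input (illustrative numbers)
    \[
      \dot{m}_{\mathrm{thermal}} = \frac{q''}{h_{fg}}
      \approx \frac{1000~\mathrm{W/m^2}}{2.45 \times 10^{6}~\mathrm{J/kg}}
      \approx 0.41~\mathrm{g\,m^{-2}\,s^{-1}}
      \approx 1.5~\mathrm{kg\,m^{-2}\,h^{-1}}
    \]
    \[
      \dot{m}_{\mathrm{observed}} \approx 4\,\dot{m}_{\mathrm{thermal}}
      \approx 5.9~\mathrm{kg\,m^{-2}\,h^{-1}}
    \]
    ```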

    Already, since publication of the first paper, the team has been approached by companies that hope to harness the effect, Chen says, including for evaporating syrup and drying paper in a paper mill. The likeliest first applications will come in the areas of solar desalination systems or other industrial drying processes, he says. “Drying consumes 20 percent of all industrial energy usage,” he points out.

    Because the effect is so new and unexpected, Chen says, “This phenomenon should be very general, and our experiment is really just the beginning.” The experiments needed to demonstrate and quantify the effect are very time-consuming. “There are many variables, from understanding water itself, to extending to other materials, other liquids and even solids,” he says.

    “The observations in the manuscript point to a new physical mechanism that foundationally alters our thinking on the kinetics of evaporation,” says Shannon Yee, an associate professor of mechanical engineering at Georgia Tech, who was not associated with this work. He adds, “Who would have thought that we are still learning about something as quotidian as water evaporating?”

    “I think this work is very significant scientifically because it presents a new mechanism,” says University of Alberta Distinguished Professor Janet A.W. Elliott, who also was not associated with this work. “It may also turn out to be practically important for technology and our understanding of nature, because evaporation of water is ubiquitous and the effect appears to deliver significantly higher evaporation rates than the known thermal mechanism. …  My overall impression is this work is outstanding. It appears to be carefully done with many precise experiments lending support for one another.”

    The work was partly supported by an MIT Bose Award. More

  • in

    Using deep learning to image the Earth’s planetary boundary layer

    Although the troposphere is often thought of as the closest layer of the atmosphere to the Earth’s surface, the planetary boundary layer (PBL) — the lowest layer of the troposphere — is actually the part that most significantly influences weather near the surface. In the 2018 planetary science decadal survey, the PBL was raised as an important scientific issue whose improved understanding has the potential to enhance storm forecasting and improve climate projections.

    “The PBL is where the surface interacts with the atmosphere, including exchanges of moisture and heat that help lead to severe weather and a changing climate,” says Adam Milstein, a technical staff member in Lincoln Laboratory’s Applied Space Systems Group. “The PBL is also where humans live, and the turbulent movement of aerosols throughout the PBL is important for air quality that influences human health.” 

    Although vital for studying weather and climate, important features of the PBL, such as its height, are difficult to resolve with current technology. In the past four years, Lincoln Laboratory staff have been studying the PBL, focusing on two different tasks: using machine learning to make 3D-scanned profiles of the atmosphere, and resolving the vertical structure of the atmosphere more clearly in order to better predict droughts.  

    This PBL-focused research effort builds on more than a decade of related work on fast, operational neural network algorithms developed by Lincoln Laboratory for NASA missions. These missions include the Time-Resolved Observations of Precipitation structure and storm Intensity with a Constellation of Smallsats (TROPICS) mission as well as Aqua, a satellite that collects data about Earth’s water cycle and observes variables such as ocean temperature, precipitation, and water vapor in the atmosphere. These algorithms retrieve temperature and humidity from the satellite instrument data and have been shown to significantly improve the accuracy and usable global coverage of the observations over previous approaches. For TROPICS, the algorithms help retrieve data that are used to characterize a storm’s rapidly evolving structures in near-real time, and for Aqua, they have helped improve forecasting models, drought monitoring, and fire prediction.

    These operational algorithms for TROPICS and Aqua are based on classic “shallow” neural networks to maximize speed and simplicity, creating a one-dimensional vertical profile for each spectral measurement collected by the instrument over each location. While this approach has improved observations of the atmosphere down to the surface overall, including the PBL, laboratory staff determined that newer “deep” learning techniques that treat the atmosphere over a region of interest as a three-dimensional image are needed to improve PBL details further.
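
    The architectural difference can be sketched roughly as follows; the channel counts, layer sizes, and grid dimensions are illustrative guesses rather than the laboratory’s actual models.

    ```python
    import torch
    import torch.nn as nn

    # Illustrative shapes only: 20 spectral channels per footprint, 50 retrieved
    # vertical levels, and a 16 x 16-footprint region of interest.
    n_channels, n_levels, ny, nx = 20, 50, 16, 16

    # "Shallow" operational-style retrieval: a small network maps the spectral
    # measurements at one location to a single 1-D vertical profile.
    shallow = nn.Sequential(
        nn.Linear(n_channels, 64),
        nn.Tanh(),
        nn.Linear(64, n_levels),
    )
    profile = shallow(torch.randn(1, n_channels))        # -> (1, n_levels)

    # "Deep" retrieval: convolutions ingest the whole region at once, so
    # horizontal context shapes every retrieved profile, and the output is a
    # 3-D temperature/humidity volume rather than independent columns.
    deep = nn.Sequential(
        nn.Conv2d(n_channels, 64, kernel_size=3, padding=1),
        nn.ReLU(),
        nn.Conv2d(64, 64, kernel_size=3, padding=1),
        nn.ReLU(),
        nn.Conv2d(64, n_levels, kernel_size=3, padding=1),
    )
    scene = torch.randn(1, n_channels, ny, nx)           # one region of interest
    volume = deep(scene)                                 # -> (1, n_levels, ny, nx)
    print(profile.shape, volume.shape)
    ```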

    “We hypothesized that deep learning and artificial intelligence (AI) techniques could improve on current approaches by incorporating a better statistical representation of 3D temperature and humidity imagery of the atmosphere into the solutions,” Milstein says. “But it took a while to figure out how to create the best dataset — a mix of real and simulated data; we needed to prepare to train these techniques.”

    The team collaborated with Joseph Santanello of the NASA Goddard Space Flight Center and William Blackwell, also of the Applied Space Systems Group, in a recent NASA-funded effort showing that these retrieval algorithms can improve PBL detail, including more accurate determination of the PBL height than the previous state of the art. 

    While improved knowledge of the PBL is broadly useful for increasing understanding of climate and weather, one key application is prediction of droughts. According to a Global Drought Snapshot report released last year, droughts are a pressing planetary issue that the global community needs to address. Lack of humidity near the surface, specifically at the level of the PBL, is the leading indicator of drought. While previous studies using remote-sensing techniques have examined the humidity of soil to determine drought risk, studying the atmosphere can help predict when droughts will happen.  

    In an effort funded by Lincoln Laboratory’s Climate Change Initiative, Milstein, along with laboratory staff member Michael Pieper, is working with scientists at NASA’s Jet Propulsion Laboratory (JPL) to use neural network techniques to improve drought prediction over the continental United States. While the work builds on existing operational work JPL has done incorporating (in part) the laboratory’s operational “shallow” neural network approach for Aqua, the team believes that this work and the PBL-focused deep learning research can be combined to further improve the accuracy of drought prediction. 

    “Lincoln Laboratory has been working with NASA for more than a decade on neural network algorithms for estimating temperature and humidity in the atmosphere from space-borne infrared and microwave instruments, including those on the Aqua spacecraft,” Milstein says. “Over that time, we have learned a lot about this problem by working with the science community, including learning about what scientific challenges remain. Our long experience working on this type of remote sensing with NASA scientists, as well as our experience with using neural network techniques, gave us a unique perspective.”

    According to Milstein, the next step for this project is to compare the deep learning results to datasets from the National Oceanic and Atmospheric Administration, NASA, and the Department of Energy collected directly in the PBL using radiosondes, a type of instrument flown on a weather balloon. “These direct measurements can be considered a kind of ‘ground truth’ to quantify the accuracy of the techniques we have developed,” Milstein says.
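
    A minimal sketch of that kind of validation, assuming retrieved and radiosonde profiles have already been matched in time and interpolated to common levels (all of the arrays below are synthetic placeholders, not NOAA, NASA, or DOE data):

    ```python
    import numpy as np

    # Synthetic placeholders: 200 matched radiosonde launches, 50 pressure levels.
    n_sondes, n_levels = 200, 50
    sonde_T = 210 + 80 * np.random.rand(n_sondes, n_levels)          # "truth" (K)
    retrieved_T = sonde_T + np.random.normal(0, 1.5, sonde_T.shape)  # retrieval

    # Level-by-level bias and RMSE of the retrieval against the radiosondes.
    err = retrieved_T - sonde_T
    bias = err.mean(axis=0)
    rmse = np.sqrt((err ** 2).mean(axis=0))
    print(f"mean bias: {bias.mean():+.2f} K, mean RMSE: {rmse.mean():.2f} K")
    ```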

    This improved neural network approach holds promise to demonstrate drought prediction that can exceed the capabilities of existing indicators, Milstein says, and to be a tool that scientists can rely on for decades to come. More