More stories

  • Detailed images from space offer clearer picture of drought effects on plants

    “MIT is a place where dreams come true,” says César Terrer, an assistant professor in the Department of Civil and Environmental Engineering. Here at MIT, Terrer says he’s been given the resources needed to explore the ideas he finds most exciting, and at the top of his list is climate science. In particular, he is interested in plant-soil interactions, and how the two can mitigate impacts of climate change. In 2022, Terrer received seed grant funding from the Abdul Latif Jameel Water and Food Systems Lab (J-WAFS) to produce drought monitoring systems for farmers. The project is leveraging a new generation of remote sensing devices to provide high-resolution measurements of plant water stress at regional to global scales.

    Growing up in Granada, Spain, Terrer always had an aptitude and passion for science. He studied environmental science at the University of Murcia, where he interned in the Department of Ecology. Using computational analysis tools, he worked on modeling species distribution in response to human development. Early on in his undergraduate experience, Terrer says he regarded his professors as “superheroes” with a kind of scholarly prowess. He knew he wanted to follow in their footsteps by one day working as a faculty member in academia. Of course, there would be many steps along the way before achieving that dream. 

    Upon completing his undergraduate studies, Terrer set his sights on exciting and adventurous research roles. He thought perhaps he would conduct field work in the Amazon, engaging with native communities. But when the opportunity arose to work in Australia on a state-of-the-art climate change experiment that simulates future levels of carbon dioxide, he headed south to study how plants react to CO2 in a biome of native Australian eucalyptus trees. It was during this experience that Terrer started to take a keen interest in the carbon cycle and the capacity of ecosystems to buffer rising levels of CO2 caused by human activity.

    Around 2014, he delved deeper into the carbon cycle as he began his doctoral studies at Imperial College London. The primary question Terrer sought to answer during his PhD was “will plants be able to absorb predicted future levels of CO2 in the atmosphere?” To answer it, Terrer became an early adopter of artificial intelligence, machine learning, and remote sensing to analyze data from real-life, global climate change experiments. His findings from these “ground truth” values and observations resulted in a paper in the journal Science, in which he argued that climate models most likely overestimate, by a factor of three, how much carbon plants will be able to absorb by the end of the century.

    After postdoctoral positions at Stanford University and the Universitat Autonoma de Barcelona, followed by a prestigious Lawrence Fellowship, Terrer says he had “too many ideas and not enough time to accomplish all those ideas.” He knew it was time to lead his own group. Not long after applying for faculty positions, he landed at MIT. 

    New ways to monitor drought

    Terrer is employing similar methods to those he used during his PhD to analyze data from all over the world for his J-WAFS project. He and postdoc Wenzhe Jiao collect data from remote sensing satellites and field experiments and use machine learning to come up with new ways to monitor drought. Terrer says Jiao is a “remote sensing wizard,” who fuses data from different satellite products to understand the water cycle. With Jiao’s hydrology expertise and Terrer’s knowledge of plants, soil, and the carbon cycle, the duo is a formidable team to tackle this project.

    According to the U.N. World Meteorological Organization, the number and duration of droughts have increased by 29 percent since 2000, as compared to the two previous decades. From the Horn of Africa to the Western United States, drought is devastating vegetation and severely stressing water supplies, compromising food production and driving up food insecurity. Drought monitoring can offer fundamental information on drought location, frequency, and severity, but assessing the impact of drought on vegetation is extremely challenging. This is because plants’ sensitivity to water deficits varies across species and ecosystems.

    Terrer and Jiao are able to obtain a clearer picture of how drought is affecting plants by employing the latest generation of remote sensing observations, which offer images of the planet with incredible spatial and temporal resolution. Satellite products such as Sentinel, Landsat, and Planet can provide daily images from space with such high resolution that individual trees can be discerned. Along with the images and datasets from satellites, the team is using ground-based observations from meteorological data. They are also using the MIT SuperCloud at MIT Lincoln Laboratory to process and analyze all of the data sets. The J-WAFS project is among the first to leverage high-resolution data to quantitatively measure plant drought impacts in the United States, with the hope of expanding to a global assessment in the future.

    Assisting farmers and resource managers 

    Every week, the U.S. Drought Monitor provides a map of drought conditions in the United States. The map is coarse in resolution and serves more as a drought recap or summary; it cannot predict future drought scenarios. The lack of a comprehensive spatiotemporal evaluation of historical and future drought impacts on global vegetation productivity is detrimental to farmers both in the United States and worldwide.

    Terrer and Jiao plan to generate metrics for plant water stress at an unprecedented resolution of 10-30 meters. This means that they will be able to provide drought monitoring maps at the scale of a typical U.S. farm, giving farmers more precise, useful data every one to two days. The team will use the information from the satellites to monitor plant growth and soil moisture, as well as the time lag of plant growth response to soil moisture. In this way, Terrer and Jiao say they will eventually be able to create a kind of “plant water stress forecast” that may be able to predict adverse impacts of drought four weeks in advance. “According to the current soil moisture and lagged response time, we hope to predict plant water stress in the future,” says Jiao. 
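    To make the lagged-response idea concrete, here is a minimal illustrative sketch of how soil moisture observed several weeks ago could be used to predict a plant water-stress index today. The synthetic data, the fixed four-week lag, and the simple linear model are assumptions for illustration only; they are not the team’s actual forecasting method.

```python
# Minimal sketch of a lagged soil-moisture-to-plant-stress forecast.
# All data here are synthetic and the 4-week lag is an assumed value.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
weeks = 200
soil_moisture = np.clip(
    0.3 + 0.1 * np.sin(np.arange(weeks) / 8.0) + 0.05 * rng.standard_normal(weeks),
    0.0, 1.0)

lag = 4  # assumed lag (weeks) between soil moisture and plant response
# Synthetic "observed" stress: low soil moisture now means high stress `lag` weeks later.
stress = 1.0 - soil_moisture[:-lag] + 0.02 * rng.standard_normal(weeks - lag)

# Fit stress at week t + lag against soil moisture at week t.
X = soil_moisture[:-lag].reshape(-1, 1)
model = LinearRegression().fit(X, stress)

# Forecast: today's soil moisture implies plant water stress `lag` weeks ahead.
latest = soil_moisture[-1:].reshape(1, 1)
print(f"predicted water-stress index {lag} weeks ahead: {model.predict(latest)[0]:.2f}")
```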

    The expected outcomes of this project will give farmers, land and water resource managers, and decision-makers more accurate data at the farm-specific level, allowing for better drought preparation, mitigation, and adaptation. “We expect to make our data open-access online, after we finish the project, so that farmers and other stakeholders can use the maps as tools,” says Jiao. 

    Terrer adds that the project “has the potential to help us better understand the future states of climate systems, and also identify the regional hot spots more likely to experience water crises at the national, state, local, and tribal government scales.” He also expects the project will enhance our understanding of global carbon-water-energy cycle responses to drought, with applications in determining climate change impacts on natural ecosystems as a whole.

  • New nanosatellite tests autonomy in space

    In May 2022, a SpaceX Falcon 9 rocket launched the Transporter-5 mission into orbit. The mission contained a collection of micro and nanosatellites from both industry and government, including one from MIT Lincoln Laboratory called the Agile MicroSat (AMS).

    AMS’s primary mission is to test automated maneuvering capabilities in the tumultuous very low-Earth orbit (VLEO) environment, starting at 525 kilometers above the surface and descending from there. VLEO is a challenging location for satellites because the higher air density, coupled with variable space weather, causes increased and unpredictable drag that requires frequent maneuvers to maintain position. Using a commercial off-the-shelf electric-ion propulsion system and custom algorithms, AMS is testing how well it can execute automated navigation and control over an initial mission period of six months.

    “AMS integrates electric propulsion and autonomous navigation and guidance control algorithms that push a lot of the operation of the thruster onto the spacecraft — somewhat like a self-driving car,” says Andrew Stimac, who is the principal investigator for the AMS program and the leader of the laboratory’s Integrated Systems and Concepts Group.

    Stimac sees AMS as a kind of pathfinder mission for the field of small satellite autonomy. Autonomy is essential to support the growing number of small satellite launches for industry and science because it can reduce the cost and labor needed to maintain them, enable missions that call for quick and impromptu responses, and help to avoid collisions in an already-crowded sky.

    AMS is the first-ever test of a nanosatellite with this type of automated maneuvering capability.

    AMS uses an electric propulsion thruster that was selected to meet the size and power constraints of a nanosatellite while providing enough thrust and endurance to enable multiyear missions that operate in VLEO. The flight software, called the Bus Hosted Onboard Software Suite, was designed to autonomously operate the thruster to change the spacecraft’s orbit. Operators on the ground can give AMS a high-level command, such as to descend to and maintain a 300-kilometer orbit, and the software will schedule thruster burns to achieve that command autonomously, using measurements from the onboard GPS receiver as feedback. This experimental software is separate from the bus flight software, which allows AMS to safely test its novel algorithms without endangering the spacecraft.
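    As a rough illustration of what such a high-level command can look like in software, the hypothetical sketch below schedules a corrective burn whenever the GPS-derived altitude drifts outside a tolerance band around the commanded orbit. It is not the Bus Hosted Onboard Software Suite, and every threshold and duration in it is invented.

```python
# Hypothetical altitude-maintenance logic: not the actual AMS flight software.
from dataclasses import dataclass
from typing import Optional

@dataclass
class BurnCommand:
    start_s: float      # seconds from now until ignition
    duration_s: float   # burn duration in seconds
    direction: int      # +1 to raise the orbit, -1 to lower it

def plan_burn(gps_altitude_km: float, target_altitude_km: float,
              deadband_km: float = 2.0) -> Optional[BurnCommand]:
    """Schedule a corrective burn only when the altitude error exceeds the deadband."""
    error_km = gps_altitude_km - target_altitude_km
    if abs(error_km) < deadband_km:
        return None  # within tolerance; no burn needed yet
    # Longer burns for larger errors (simple proportional rule, invented numbers).
    duration_s = min(600.0, 60.0 * abs(error_km))
    return BurnCommand(start_s=0.0, duration_s=duration_s,
                       direction=-1 if error_km > 0 else +1)

# Example: ground commands "maintain 300 km" while GPS currently reads 304 km.
print(plan_burn(gps_altitude_km=304.0, target_altitude_km=300.0))
```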

    “One of the enablers for AMS is the way in which we’ve created this software sandbox onboard the spacecraft,” says Robert Legge, who is another member of the AMS team. “We have our own hosted software that’s running on the primary flight computer, but it’s separate from the critical health and safety avionics software. Basically, you can view this as being a little development environment on the spacecraft where we can test out different algorithms.”

    AMS has two secondary missions called Camera and Beacon. Camera’s mission is to take photos and short video clips of the Earth’s surface while AMS is in different low-Earth orbit positions.

    “One of the things we’re hoping to demonstrate is the ability to respond to current events,” says Rebecca Keenan, who helped to prepare the Camera payload. “We could hear about something that happened, like a fire or flood, and then respond pretty quickly to maneuver the satellite to image it.”

    Keenan and the rest of the AMS team are collaborating with the laboratory’s DisasterSat program, which aims to improve satellite image processing pipelines to help relief agencies respond to disasters more quickly. Small satellites that could schedule operations on-demand, rather than planning them months in advance before launch, could be a great asset to disaster response efforts.

    The other payload, Beacon, is testing new adaptive optics capabilities for tracking fast-moving targets by sending laser light from the moving satellite to a ground station at the laboratory’s Haystack Observatory in Westford, Massachusetts. Enabling precise laser pointing from an agile satellite could aid many different types of space missions, such as communications and tracking space debris. It could also be used for emerging programs such as Breakthrough Starshot, which is developing a satellite that can accelerate to high speeds using a laser-propelled lightsail.

    “As far as we know, this is the first on-orbit artificial guide star that has launched for a dedicated adaptive optics purpose,” says Lulu Liu, who worked on the Beacon payload. “Theoretically, the laser it carries can be maneuvered into position on other spacecraft to support a large number of science missions in different regions of the sky.”

    The team developed Beacon on a strict budget and timeline and hopes that its success will shorten the design and test loop of next-generation laser transmitter systems. “The idea is that we could have a number of these flying in the sky at once, and a ground system can point to one of them and get near-real-time feedback on its performance,” says Liu.

    AMS weighs under 12 kilograms with 6U dimensions (23 x 11 x 36 centimeters). The bus was designed by Blue Canyon Technologies and the thruster was designed by Enpulsion GmbH.

    Legge says that the AMS program was approached as an opportunity for Lincoln Laboratory to showcase its ability to conduct work in the space domain quickly and flexibly. Some major roadblocks to rapid development of new space technology have been long timelines, high costs, and the extremely low risk tolerance associated with traditional space programs. “We wanted to show that we can really do rapid prototyping and testing of space hardware and software on orbit at an affordable cost,” Legge says.

    “AMS shows the value and fast time-to-orbit afforded by teaming with rapid space commercial partners for spacecraft core bus technologies and launch and ground segment operations, while allowing the laboratory to focus on innovative mission concepts, advanced components and payloads, and algorithms and processing software,” says Dan Cousins, who is the program manager for AMS. “The AMS team appreciates the support from the laboratory’s Technology Office for allowing us to showcase an effective operating model for rapid space programs.”

    AMS took its first image on June 1, completed its thruster commissioning in July, and has begun to descend toward its target VLEO position.

  • New materials could enable longer-lasting implantable batteries

    For the last few decades, battery research has largely focused on rechargeable lithium-ion batteries, which are used in everything from electric cars to portable electronics and have improved dramatically in terms of affordability and capacity. But nonrechargeable batteries have seen little improvement during that time, despite their crucial role in many important uses such as implantable medical devices like pacemakers.

    Now, researchers at MIT have come up with a way to improve the energy density of these nonrechargeable, or “primary,” batteries. They say it could enable up to a 50 percent increase in useful lifetime, or a corresponding decrease in size and weight for a given amount of power or energy capacity, while also improving safety, with little or no increase in cost.

    The new findings, which involve substituting the conventionally inactive battery electrolyte with a material that is active for energy delivery, are reported today in the journal Proceedings of the National Academy of Sciences, in a paper by MIT Kavanaugh Postdoctoral Fellow Haining Gao, graduate student Alejandro Sevilla, associate professor of mechanical engineering Betar Gallant, and four others at MIT and Caltech.

    Replacing the battery in a pacemaker or other medical implant requires a surgical procedure, so any increase in the longevity of their batteries could have a significant impact on the patient’s quality of life, Gallant says. Primary batteries are used for such essential applications because they can provide about three times as much energy for a given size and weight as rechargeable batteries.

    That difference in capacity, Gao says, makes primary batteries “critical for applications where charging is not possible or is impractical.” The new materials work at human body temperature, so they would be suitable for medical implants. With further development to make the batteries operate efficiently at cooler temperatures, applications could also include sensors in tracking devices for shipments, for example to ensure that temperature and humidity requirements for food or drug shipments are maintained in transit. They might also be used in remotely operated aerial or underwater vehicles that need to remain ready for deployment over long periods.

    Pacemaker batteries typically last from five to 10 years, and even less if they require high-voltage functions such as defibrillation. Yet for such batteries, Gao says, the technology is considered mature, and “there haven’t been any major innovations in fundamental cell chemistries in the past 40 years.”

    The key to the team’s innovation is a new kind of electrolyte — the material that lies between the two electrical poles of the battery, the cathode and the anode, and allows charge carriers to pass through from one side to the other. Using a new liquid fluorinated compound, the team found that they could combine some of the functions of the cathode and the electrolyte in one compound, called a catholyte. This allows for saving much of the weight of typical primary batteries, Gao says.

    While there are other materials besides this new compound that could theoretically function in a similar catholyte role in a high-capacity battery, Gallant explains, those materials have lower inherent voltages that do not match those of the remainder of the material in a conventional pacemaker battery, a type known as CFx. Because the battery’s overall output can be no higher than that of the lower-voltage electrode, the extra capacity would go to waste because of the voltage mismatch. But with the new material, “one of the key merits of our fluorinated liquids is that their voltage aligns very well with that of CFx,” Gallant says.

    In a conventional CFx battery, the liquid electrolyte is essential because it allows charged particles to pass through from one electrode to the other. But “those electrolytes are actually chemically inactive, so they’re basically dead weight,” Gao says. This means that about 50 percent of the battery’s key components, mainly the electrolyte, are inactive material. But in the new design with the fluorinated catholyte material, the amount of dead weight can be reduced to about 20 percent, she says.
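    As a rough, back-of-the-envelope illustration (not the team’s cell-level analysis), reducing the inactive fraction of the cell mass from about 50 percent to about 20 percent increases the fraction of mass that stores energy by:

```latex
% Idealized mass-fraction argument; real gains also depend on voltage,
% packaging, and how fully the active materials are utilized.
\frac{E_{\text{new}}}{E_{\text{old}}}
  \approx \frac{1 - f_{\text{inactive,new}}}{1 - f_{\text{inactive,old}}}
  = \frac{1 - 0.2}{1 - 0.5}
  = 1.6
```

    That idealized factor of 1.6 is an upper bound, broadly in line with the roughly 50 percent cell-level improvement the researchers project.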

    The new cells also provide safety improvements over other kinds of proposed chemistries that would rely on toxic and corrosive catholyte materials, which the team’s formula avoids, Gallant says. And preliminary tests have demonstrated a stable shelf life over more than a year, an important characteristic for primary batteries, she says.

    So far, the team has not yet experimentally achieved the full 50 percent improvement in energy density predicted by their analysis. They have demonstrated a 20 percent improvement, which in itself would be an important gain for some applications, Gallant says. The design of the cell itself has not yet been fully optimized, but the researchers can project the cell performance based on the performance of the active material itself. “We can see the projected cell-level performance when it’s scaled up can reach around 50 percent higher than the CFx cell,” she says. Achieving that level experimentally is the team’s next goal.

    Sevilla, a doctoral student in the mechanical engineering department, will be focusing on that work in the coming year. “I was brought into this project to try to understand some of the limitations of why we haven’t been able to attain the full energy density possible,” he says. “My role has been trying to fill in the gaps in terms of understanding the underlying reaction.”

    One big advantage of the new material, Gao says, is that it can easily be integrated into existing battery manufacturing processes, as a simple substitution of one material for another. Preliminary discussions with manufacturers confirm this potentially easy substitution, Gao says. The basic starting material, used for other purposes, has already been scaled up for production, she says, and its price is comparable to that of the materials currently used in CFx batteries. The cost of batteries using the new material is likely to be comparable to the existing batteries as well, she says. The team has already applied for a patent on the catholyte, and they expect that the medical applications are likely to be the first to be commercialized, perhaps with a full-scale prototype ready for testing in real devices within about a year.

    Further down the road, other applications could likely take advantage of the new materials as well, such as smart water or gas meters that can be read out remotely, or devices like EZPass transponders, increasing their usable lifetime, the researchers say. Power for drone aircraft or undersea vehicles would require higher power and so may take longer to be developed. Other uses could include batteries for equipment used at remote sites, such as drilling rigs for oil and gas, including devices sent down into the wells to monitor conditions.

    The team also included Gustavo Hobold, Aaron Melemed, and Rui Guo at MIT and Simon Jones at Caltech. The work was supported by MIT Lincoln Laboratory and the Army Research Office.

  • Taking a magnifying glass to data center operations

    When the MIT Lincoln Laboratory Supercomputing Center (LLSC) unveiled its TX-GAIA supercomputer in 2019, it provided the MIT community a powerful new resource for applying artificial intelligence to their research. Anyone at MIT can submit a job to the system, which churns through trillions of operations per second to train models for diverse applications, such as spotting tumors in medical images, discovering new drugs, or modeling climate effects. But with this great power comes the great responsibility of managing and operating it in a sustainable manner — and the team is looking for ways to improve.

    “We have these powerful computational tools that let researchers build intricate models to solve problems, but they can essentially be used as black boxes. What gets lost in there is whether we are actually using the hardware as effectively as we can,” says Siddharth Samsi, a research scientist in the LLSC. 

    To gain insight into this challenge, the LLSC has been collecting detailed data on TX-GAIA usage over the past year. More than a million user jobs later, the team has released the dataset as open source to the computing community.

    Their goal is to empower computer scientists and data center operators to better understand avenues for data center optimization — an important task as processing needs continue to grow. They also see potential for leveraging AI in the data center itself, by using the data to develop models for predicting failure points, optimizing job scheduling, and improving energy efficiency. While cloud providers are actively working on optimizing their data centers, they do not often make their data or models available for the broader high-performance computing (HPC) community to leverage. The release of this dataset and associated code seeks to fill this space.

    “Data centers are changing. We have an explosion of hardware platforms, the types of workloads are evolving, and the types of people who are using data centers is changing,” says Vijay Gadepally, a senior researcher at the LLSC. “Until now, there hasn’t been a great way to analyze the impact to data centers. We see this research and dataset as a big step toward coming up with a principled approach to understanding how these variables interact with each other and then applying AI for insights and improvements.”

    Papers describing the dataset and potential applications have been accepted to a number of venues, including the IEEE International Symposium on High-Performance Computer Architecture, the IEEE International Parallel and Distributed Processing Symposium, the Annual Conference of the North American Chapter of the Association for Computational Linguistics, the IEEE High-Performance and Embedded Computing Conference, and International Conference for High Performance Computing, Networking, Storage and Analysis. 

    Workload classification

    TX-GAIA, which ranks among the world’s TOP500 supercomputers, combines traditional computing hardware (central processing units, or CPUs) with nearly 900 graphics processing unit (GPU) accelerators. These NVIDIA GPUs are specialized for deep learning, the class of AI that has given rise to speech recognition and computer vision.

    The dataset covers CPU, GPU, and memory usage by job; scheduling logs; and physical monitoring data. Compared to similar datasets, such as those from Google and Microsoft, the LLSC dataset offers “labeled data, a variety of known AI workloads, and more detailed time series data compared with prior datasets. To our knowledge, it’s one of the most comprehensive and fine-grained datasets available,” Gadepally says. 

    Notably, the team collected time-series data at an unprecedented level of detail: 100-millisecond intervals on every GPU and 10-second intervals on every CPU, as the machines processed more than 3,000 known deep-learning jobs. One of the first goals is to use this labeled dataset to characterize the workloads that different types of deep-learning jobs place on the system. This process would extract features that reveal differences in how the hardware processes natural language models versus image classification or materials design models, for example.   

    The team has now launched the MIT Datacenter Challenge to mobilize this research. The challenge invites researchers to use AI techniques to identify with 95 percent accuracy the type of job that was run, using their labeled time-series data as ground truth.
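    The sketch below illustrates the shape of that classification task: summarize each job’s GPU-utilization time series with a few features and train a classifier to predict the job type. The column names, file layout, and model choice are hypothetical and are not the actual LLSC dataset schema or a challenge baseline.

```python
# Illustrative workload-classification sketch; dataset schema is hypothetical.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def featurize(trace: np.ndarray) -> np.ndarray:
    """Summarize one job's GPU-utilization trace with simple statistics."""
    return np.array([trace.mean(), trace.std(), trace.max(),
                     np.percentile(trace, 90), np.diff(trace).std()])

# Hypothetical layout: one row per (job_id, timestamp) with a 'gpu_util'
# sample and a 'job_type' label such as 'nlp', 'vision', or 'materials'.
df = pd.read_csv("llsc_gpu_timeseries.csv")  # hypothetical file name
X = np.stack([featurize(g["gpu_util"].to_numpy()) for _, g in df.groupby("job_id")])
y = df.groupby("job_id")["job_type"].first().to_numpy()

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```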

    Such insights could enable data centers to better match a user’s job request with the hardware best suited for it, potentially conserving energy and improving system performance. Classifying workloads could also allow operators to quickly notice discrepancies resulting from hardware failures, inefficient data access patterns, or unauthorized usage.

    Too many choices

    Today, the LLSC offers tools that let users submit their job and select the processors they want to use, “but it’s a lot of guesswork on the part of users,” Samsi says. “Somebody might want to use the latest GPU, but maybe their computation doesn’t actually need it and they could get just as impressive results on CPUs, or lower-powered machines.”

    Professor Devesh Tiwari at Northeastern University is working with the LLSC team to develop techniques that can help users match their workloads to appropriate hardware. Tiwari explains that the emergence of different types of AI accelerators, GPUs, and CPUs has left users suffering from too many choices. Without the right tools to take advantage of this heterogeneity, they are missing out on the benefits: better performance, lower costs, and greater productivity.

    “We are fixing this very capability gap — making users more productive and helping users do science better and faster without worrying about managing heterogeneous hardware,” says Tiwari. “My PhD student, Baolin Li, is building new capabilities and tools to help HPC users leverage heterogeneity near-optimally without user intervention, using techniques grounded in Bayesian optimization and other learning-based optimization methods. But, this is just the beginning. We are looking into ways to introduce heterogeneity in our data centers in a principled approach to help our users achieve the maximum advantage of heterogeneity autonomously and cost-effectively.”

    Workload classification is the first of many problems to be posed through the Datacenter Challenge. Others include developing AI techniques to predict job failures, conserve energy, or create job scheduling approaches that improve data center cooling efficiencies.

    Energy conservation 

    To mobilize research into greener computing, the team is also planning to release an environmental dataset of TX-GAIA operations, containing rack temperature, power consumption, and other relevant data.

    According to the researchers, huge opportunities exist to improve the power efficiency of HPC systems being used for AI processing. As one example, recent work in the LLSC determined that simple hardware tuning, such as limiting the amount of power an individual GPU can draw, could reduce the energy cost of training an AI model by 20 percent, with only modest increases in computing time. “This reduction translates to approximately an entire week’s worth of household energy for a mere three-hour time increase,” Gadepally says.
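    For readers curious what such a power cap looks like in practice, the sketch below sets a per-GPU power limit through NVIDIA’s management library before a training job starts. The 250-watt target is an arbitrary example value; the LLSC’s actual settings and tooling are not described here, and changing power limits typically requires administrative privileges.

```python
# Illustrative GPU power-capping sketch using the nvidia-ml-py ('pynvml') bindings.
# The 250 W target is an arbitrary example value.
import pynvml

TARGET_WATTS = 250

pynvml.nvmlInit()
try:
    for i in range(pynvml.nvmlDeviceGetCount()):
        handle = pynvml.nvmlDeviceGetHandleByIndex(i)
        # Limits are expressed in milliwatts; clamp to the device's allowed range.
        min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
        target_mw = max(min_mw, min(max_mw, TARGET_WATTS * 1000))
        pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)
        print(f"GPU {i}: power limit set to {target_mw / 1000:.0f} W")
finally:
    pynvml.nvmlShutdown()
```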

    They have also been developing techniques to predict model accuracy, so that users can quickly terminate experiments that are unlikely to yield meaningful results, saving energy. The Datacenter Challenge will share relevant data to enable researchers to explore other opportunities to conserve energy.

    The team expects that lessons learned from this research can be applied to the thousands of data centers operated by the U.S. Department of Defense. The U.S. Air Force is a sponsor of this work, which is being conducted under the USAF-MIT AI Accelerator.

    Other collaborators include researchers at MIT Computer Science and Artificial Intelligence Laboratory (CSAIL). Professor Charles Leiserson’s Supertech Research Group is investigating performance-enhancing techniques for parallel computing, and research scientist Neil Thompson is designing studies on ways to nudge data center users toward climate-friendly behavior.

    Samsi presented this work at the inaugural AI for Datacenter Optimization (ADOPT’22) workshop last spring as part of the IEEE International Parallel and Distributed Processing Symposium. The workshop officially introduced their Datacenter Challenge to the HPC community.

    “We hope this research will allow us and others who run supercomputing centers to be more responsive to user needs while also reducing the energy consumption at the center level,” Samsi says.

  • Cracking the case of Arctic sea ice breakup

    Despite its below-freezing temperatures, the Arctic is warming twice as fast as the rest of the planet. As Arctic sea ice melts, fewer bright surfaces are available to reflect sunlight back into space. When fractures open in the ice cover, the water underneath gets exposed. Dark, ice-free water absorbs the sun’s energy, heating the ocean and driving further melting — a vicious cycle. This warming in turn melts glacial ice, contributing to rising sea levels.

    Warming climate and rising sea levels endanger the nearly 40 percent of the U.S. population living in coastal areas, the billions of people who depend on the ocean for food and their livelihoods, and species such as polar bears and Arctic foxes. Reduced ice coverage is also making the once-impassable region more accessible, opening up new shipping lanes and ports. Interest in using these emerging trans-Arctic routes for product transit, extraction of natural resources (e.g., oil and gas), and military activity is turning an area traditionally marked by low tension and cooperation into one of global geopolitical competition.

    As the Arctic opens up, predicting when and where the sea ice will fracture becomes increasingly important in strategic decision-making. However, huge gaps exist in our understanding of the physical processes contributing to ice breakup. Researchers at MIT Lincoln Laboratory seek to help close these gaps by turning a data-sparse environment into a data-rich one. They envision deploying a distributed set of unattended sensors across the Arctic that will persistently detect and geolocate ice fracturing events. Concurrently, the network will measure various environmental conditions, including water temperature and salinity, wind speed and direction, and ocean currents at different depths. By correlating these fracturing events and environmental conditions, they hope to discover meaningful insights about what is causing the sea ice to break up. Such insights could help predict the future state of Arctic sea ice to inform climate modeling, climate change planning, and policy decision-making at the highest levels.

    “We’re trying to study the relationship between ice cracking, climate change, and heat flow in the ocean,” says Andrew March, an assistant leader of Lincoln Laboratory’s Advanced Undersea Systems and Technology Group. “Do cracks in the ice cause warm water to rise and more ice to melt? Do undersea currents and waves cause cracking? Does cracking cause undersea waves? These are the types of questions we aim to investigate.”

    Arctic access

    In March 2022, Ben Evans and Dave Whelihan, both researchers in March’s group, traveled for 16 hours across three flights to Prudhoe Bay, located on the North Slope of Alaska. From there, they boarded a small specialized aircraft and flew another 90 minutes to a three-and-a-half-mile-long sheet of ice floating 160 nautical miles offshore in the Arctic Ocean. In the weeks before their arrival, the U.S. Navy’s Arctic Submarine Laboratory had transformed this inhospitable ice floe into a temporary operating base called Ice Camp Queenfish, named after the first Sturgeon-class submarine to operate under the ice and the fourth to reach the North Pole. The ice camp featured a 2,500-foot-long runway, a command center, sleeping quarters to accommodate up to 60 personnel, a dining tent, and an extremely limited internet connection.

    At Queenfish, for the next four days, Evans and Whelihan joined U.S. Navy, Army, Air Force, Marine Corps, and Coast Guard members, and members of the Royal Canadian Air Force and Navy and United Kingdom Royal Navy, who were participating in Ice Exercise (ICEX) 2022. Over the course of about three weeks, more than 200 personnel stationed at Queenfish, Prudhoe Bay, and aboard two U.S. Navy submarines participated in this biennial exercise. The goals of ICEX 2022 were to assess U.S. operational readiness in the Arctic; increase our country’s experience in the region; advance our understanding of the Arctic environment; and continue building relationships with other services, allies, and partner organizations to ensure a free and peaceful Arctic. The infrastructure provided for ICEX concurrently enables scientists to conduct research in an environment — either in person or by sending their research equipment for exercise organizers to deploy on their behalf — that would be otherwise extremely difficult and expensive to access.

    In the Arctic, windchill temperatures can plummet to as low as 60 degrees Fahrenheit below zero, cold enough to freeze exposed skin within minutes. Winds and ocean currents can carry the entire camp beyond the reach of nearby emergency rescue aircraft, and the ice can crack at any moment. To ensure the safety of participants, a team of Navy meteorological specialists continually monitors the ever-changing conditions. The original camp location for ICEX 2022 had to be evacuated and relocated after a massive crack formed in the ice, delaying Evans’ and Whelihan’s trip. Even the newly selected site had a large crack form behind the camp, and another crack necessitated moving a number of tents.

    “Such cracking events are only going to increase as the climate warms, so it’s more critical now than ever to understand the physical processes behind them,” Whelihan says. “Such an understanding will require building technology that can persist in the environment despite these incredibly harsh conditions. So, it’s a challenge not only from a scientific perspective but also an engineering one.”

    “The weather always gets a vote, dictating what you’re able to do out here,” adds Evans. “The Arctic Submarine Laboratory does a lot of work to construct the camp and make it a safe environment where researchers like us can come to do good science. ICEX is really the only opportunity we have to go onto the sea ice in a place this remote to collect data.”

    A legacy of sea ice experiments

    Though this trip was Whelihan’s and Evans’ first to the Arctic region, staff from the laboratory’s Advanced Undersea Systems and Technology Group have been conducting experiments at ICEX since 2018. However, because of the Arctic’s remote location and extreme conditions, data collection has rarely been continuous over long periods of time or widespread across large areas. The team now hopes to change that by building low-cost, expendable sensing platforms consisting of co-located devices that can be left unattended for automated, persistent, near-real-time monitoring. 

    “The laboratory’s extensive expertise in rapid prototyping, seismo-acoustic signal processing, remote sensing, and oceanography make us a natural fit to build this sensor network,” says Evans.

    In the months leading up to the Arctic trip, the team collected seismometer data at Firepond, part of the laboratory’s Haystack Observatory site in Westford, Massachusetts. Through this local data collection, they aimed to gain a sense of what anthropogenic (human-induced) noise would look like so they could begin to anticipate the kinds of signatures they might see in the Arctic. They also collected ice melting/fracturing data during a thaw cycle and correlated these data with the weather conditions (air temperature, humidity, and pressure). Through this analysis, they detected an increase in seismic signals as the temperature rose above 32 F — an indication that air temperature and ice cracking may be related.

    A sensing network

    At ICEX, the team deployed various commercial off-the-shelf sensors and new sensors developed by the laboratory and University of New Hampshire (UNH) to assess their resiliency in the frigid environment and to collect an initial dataset.

    “One aspect that differentiates these experiments from those of the past is that we concurrently collected seismo-acoustic data and environmental parameters,” says Evans.

    The commercial technologies were seismometers to detect the vibrational energy released when sea ice fractures or collides with other ice floes; a hydrophone (underwater microphone) array to record the acoustic energy created by ice-fracturing events; a sound speed profiler to measure the speed of sound through the water column; and a conductivity, temperature, and depth (CTD) profiler to measure the salinity (related to conductivity), temperature, and pressure (related to depth) throughout the water column. The speed of sound in the ocean primarily depends on these three quantities. 
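    For reference, one widely used empirical approximation (Medwin’s formula) expresses sound speed c in seawater, in meters per second, as a function of temperature T in degrees Celsius, salinity S in parts per thousand, and depth z in meters. This is a textbook relation included here for illustration, not the specific algorithm used by the team’s instruments:

```latex
c \approx 1449.2 + 4.6T - 0.055T^{2} + 0.00029T^{3}
        + (1.34 - 0.010T)(S - 35) + 0.016z
```

    Because the CTD profiler measures exactly these quantities, its data and the sound speed profiler’s direct measurements complement one another.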

    To precisely measure the temperature across the entire water column at one location, they deployed an array of transistor-based temperature sensors developed by the laboratory’s Advanced Materials and Microsystems Group in collaboration with the Advanced Functional Fabrics of America Manufacturing Innovation Institute. The small temperature sensors run along the length of a thread-like polymer fiber embedded with multiple conductors. This fiber platform, which can support a broad range of sensors, can be unspooled hundreds of feet below the water’s surface to concurrently measure temperature or other water properties — the fiber deployed in the Arctic also contained accelerometers to measure depth — at many points in the water column. Traditionally, temperature profiling has required moving a device up and down through the water column.

    The team also deployed a high-frequency echosounder supplied by Anthony Lyons and Larry Mayer, collaborators at UNH’s Center for Coastal and Ocean Mapping. This active sonar uses acoustic energy to detect internal waves, or waves occurring beneath the ocean’s surface.

    “You may think of the ocean as a homogenous body of water, but it’s not,” Evans explains. “Different currents can exist as you go down in depth, much like how you can get different winds when you go up in altitude. The UNH echosounder allows us to see the different currents in the water column, as well as ice roughness when we turn the sensor to look upward.”

    “The reason we care about currents is that we believe they will tell us something about how warmer water from the Atlantic Ocean is coming into contact with sea ice,” adds Whelihan. “Not only is that water melting ice but it also has lower salt content, resulting in oceanic layers and affecting how long ice lasts and where it lasts.”

    Back home, the team has begun analyzing their data. For the seismic data, this analysis involves distinguishing any ice events from various sources of anthropogenic noise, including generators, snowmobiles, footsteps, and aircraft. Similarly, the researchers know their hydrophone array acoustic data are contaminated by energy from a sound source that another research team participating in ICEX placed in the water. Based on their physics, icequakes — the seismic events that occur when ice cracks — have characteristic signatures that can be used to identify them. One approach is to manually find an icequake and use that signature as a guide for finding other icequakes in the dataset.
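    A minimal sketch of that template-matching approach appears below: slide one hand-picked icequake waveform along a longer seismic record and flag segments that correlate strongly with it. The file names, sampling rate, and detection threshold are hypothetical and do not reflect the team’s actual processing pipeline.

```python
# Illustrative icequake template matching via normalized cross-correlation.
# File names, sampling rate, and the 0.7 threshold are hypothetical.
import numpy as np

def normalized_xcorr(record: np.ndarray, template: np.ndarray) -> np.ndarray:
    """Sliding normalized cross-correlation of a template against a longer record."""
    n = len(template)
    t = (template - template.mean()) / (template.std() * n)
    scores = np.empty(len(record) - n + 1)
    for i in range(len(scores)):
        window = record[i:i + n]
        std = window.std()
        scores[i] = 0.0 if std == 0 else float(np.dot(t, (window - window.mean()) / std))
    return scores

fs = 200.0                                    # samples per second (assumed)
record = np.load("seismometer_trace.npy")     # long continuous trace (hypothetical file)
template = np.load("icequake_template.npy")   # one hand-labeled icequake (hypothetical file)

scores = normalized_xcorr(record, template)
onsets = np.flatnonzero(scores > 0.7)         # candidate icequake onsets, in samples
if len(onsets):
    print(f"{len(onsets)} candidate icequakes; first at t = {onsets[0] / fs:.1f} s")
else:
    print("no detections above threshold")
```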

    From their water column profiling sensors, they identified an interesting evolution in the sound speed profile 30 to 40 meters below the ocean surface, related to a mass of colder water moving in later in the day. The group’s physical oceanographer believes this change in the profile is due to water coming up from the Bering Sea, water that initially comes from the Atlantic Ocean. The UNH-supplied echosounder also generated an interesting signal at a similar depth.

    “Our supposition is that this result has something to do with the large sound speed variation we detected, either directly because of reflections off that layer or because of plankton, which tend to rise on top of that layer,” explains Evans.  

    A future predictive capability

    Going forward, the team will continue mining their collected data and use these data to begin building algorithms capable of automatically detecting and localizing — and ultimately predicting — ice events correlated with changes in environmental conditions. To complement their experimental data, they have initiated conversations with organizations that model the physical behavior of sea ice, including the National Oceanic and Atmospheric Administration and the National Ice Center. Merging the laboratory’s expertise in sensor design and signal processing with their expertise in ice physics would provide a more complete understanding of how the Arctic is changing.

    The laboratory team will also start exploring cost-effective engineering approaches for integrating the sensors into packages hardened for deployment in the harsh environment of the Arctic.

    “Until these sensors are truly unattended, the human factor of usability is front and center,” says Whelihan. “Because it’s so cold, equipment can break accidentally. For example, at ICEX 2022, our waterproof enclosure for the seismometers survived, but the enclosure for its power supply, which was made out of a cheaper plastic, shattered in my hand when I went to pick it up.”

    The sensor packages will not only need to withstand the frigid environment but also be able to “phone home” over some sort of satellite data link and sustain their power. The team plans to investigate whether waste heat from processing can keep the instruments warm and how energy could be harvested from the Arctic environment.

    Before the next ICEX scheduled for 2024, they hope to perform preliminary testing of their sensor packages and concepts in Arctic-like environments. While attending ICEX 2022, they engaged with several other attendees — including the U.S. Navy, Arctic Submarine Laboratory, National Ice Center, and University of Alaska Fairbanks (UAF) — and identified cold room experimentation as one area of potential collaboration. Testing can also be performed at outdoor locations a bit closer to home and more easily accessible, such as the Great Lakes in Michigan and a UAF-maintained site in Barrow, Alaska. In the future, the laboratory team may have an opportunity to accompany U.S. Coast Guard personnel on ice-breaking vessels traveling from Alaska to Greenland. The team is also thinking about possible venues for collecting data far removed from human noise sources.

    “Since I’ve told colleagues, friends, and family I was going to the Arctic, I’ve had a lot of interesting conversations about climate change and what we’re doing there and why we’re doing it,” Whelihan says. “People don’t have an intrinsic, automatic understanding of this environment and its impact because it’s so far removed from us. But the Arctic plays a crucial role in helping to keep the global climate in balance, so it’s imperative we understand the processes leading to sea ice fractures.”

    This work is funded through Lincoln Laboratory’s internally administered R&D portfolio on climate.

  • Empowering people to adapt on the frontlines of climate change

    On April 11, MIT announced five multiyear flagship projects in the first-ever Climate Grand Challenges, a new initiative to tackle complex climate problems and deliver breakthrough solutions to the world as quickly as possible. This article is the fifth in a five-part series highlighting the most promising concepts to emerge from the competition and the interdisciplinary research teams behind them.

    In the coastal south of Bangladesh, rice paddies that farmers could once harvest three times a year lie barren. Sea-level rise brings saltwater to the soil, ruining the staple crop. It’s one of many impacts, and inequities, of climate change. Despite producing less than 1 percent of global carbon emissions, Bangladesh is suffering more than most countries. Rising seas, heat waves, flooding, and cyclones threaten 90 million people.

    A platform being developed in a collaboration between MIT and BRAC, a Bangladesh-based global development organization, aims to inform and empower climate-threatened communities to proactively adapt to a changing future. Selected as one of five MIT Climate Grand Challenges flagship projects, the Climate Resilience Early Warning System (CREWSnet) will forecast the local impacts of climate change on people’s lives, homes, and livelihoods. These forecasts will guide BRAC’s development of climate-resiliency programs to help residents prepare for and adapt to life-altering conditions.

    “The communities that CREWSnet will focus on have done little to contribute to the problem of climate change in the first place. However, because of socioeconomic situations, they may be among the most vulnerable. We hope that by providing state-of-the-art projections and sharing them broadly with communities, and working through partners like BRAC, we can help improve the capacity of local communities to adapt to climate change, significantly,” says Elfatih Eltahir, the H.M. King Bhumibol Professor in the Department of Civil and Environmental Engineering.

    Eltahir leads the project with John Aldridge and Deborah Campbell in the Humanitarian Assistance and Disaster Relief Systems Group at Lincoln Laboratory. Additional partners across MIT include the Center for Global Change Science; the Department of Earth, Atmospheric and Planetary Sciences; the Joint Program on the Science and Policy of Global Change; and the Abdul Latif Jameel Poverty Action Lab. 

    Predicting local risks

    CREWSnet’s forecasts rely upon a sophisticated model, developed in Eltahir’s research group over the past 25 years, called the MIT Regional Climate Model. This model zooms in on climate processes at local scales, at a resolution as granular as 6 miles. In Bangladesh’s population-dense cities, a 6-mile area could encompass tens, or even hundreds, of thousands of people. The model takes into account the details of a region’s topography, land use, and coastline to predict changes in local conditions.

    When applying this model over Bangladesh, researchers found that heat waves will get more severe and more frequent over the next 30 years. In particular, wet-bulb temperatures, which indicate how well humans can cool down by sweating, will rise to dangerous levels rarely observed today, particularly in western, inland cities.

    Such hot spots exacerbate other challenges predicted to worsen near Bangladesh’s coast. Rising sea levels and powerful cyclones are eroding and flooding coastal communities, causing saltwater to surge into land and freshwater. This salinity intrusion is detrimental to human health, ruins drinking water supplies, and harms crops, livestock, and aquatic life that farmers and fishermen depend on for food and income.

    CREWSnet will fuse climate science with forecasting tools that predict the social and economic impacts to villages and cities. These forecasts — such as how often a crop season may fail, or how far floodwaters will reach — can steer decision-making.

    “What people need to know, whether they’re a governor or head of a household, is ‘What is going to happen in my area, and what decisions should I make for the people I’m responsible for?’ Our role is to integrate this science and technology together into a decision support system,” says Aldridge, whose group at Lincoln Laboratory specializes in this area. Most recently, they transitioned a hurricane-evacuation planning system to the U.S. government. “We know that making decisions based on climate change requires a deep level of trust. That’s why having a powerful partner like BRAC is so important,” he says.

    Testing interventions

    Established 50 years ago, just after Bangladesh’s independence, BRAC works in every district of the nation to provide social services that help people rise from extreme poverty. Today, it is one of the world’s largest nongovernmental organizations, serving 110 million people across 11 countries in Asia and Africa, but its success is cultivated locally.

    “BRAC is thrilled to partner with leading researchers at MIT to increase climate resilience in Bangladesh and provide a model that can be scaled around the globe,” says Donella Rapier, president and CEO of BRAC USA. “Locally led climate adaptation solutions that are developed in partnership with communities are urgently needed, particularly in the most vulnerable regions that are on the frontlines of climate change.”

    CREWSnet will help BRAC identify communities most vulnerable to forecasted impacts. In these areas, they will share knowledge and innovate or bolster programs to improve households’ capacity to adapt.

    Many climate initiatives are already underway. One program equips homes to filter and store rainwater, as salinity intrusion makes safe drinking water hard to access. Another program is building resilient housing, able to withstand 120-mile-per-hour winds, that can double as local shelters during cyclones and flooding. Other services are helping farmers switch to different livestock or crops better suited for wetter or saltier conditions (e.g., ducks instead of chickens, or salt-tolerant rice), providing interest-free loans to enable this change.

    But adapting in place will not always be possible, for example in areas predicted to be submerged or unbearably hot by midcentury. “Bangladesh is working on identifying and developing climate-resilient cities and towns across the country, as closer-by alternative destinations as compared to moving to Dhaka, the overcrowded capital of Bangladesh,” says Campbell. “CREWSnet can help identify regions better suited for migration, and climate-resilient adaptation strategies for those regions.” At the same time, BRAC’s Climate Bridge Fund is helping to prepare cities for climate-induced migration, building up infrastructure and financial services for people who have been displaced.

    Evaluating impact

    While CREWSnet’s goal is to enable action, it cannot by itself measure the impact of those actions. The Abdul Latif Jameel Poverty Action Lab (J-PAL), a development economics program in the MIT School of Humanities, Arts, and Social Sciences, will help evaluate the effectiveness of the climate-adaptation programs.

    “We conduct randomized controlled trials, similar to medical trials, that help us understand if a program improved people’s lives,” says Claire Walsh, the project director of the King Climate Action Initiative at J-PAL. “Once CREWSnet helps BRAC implement adaptation programs, we will generate scientific evidence on their impacts, so that BRAC and CREWSnet can make a case to funders and governments to expand effective programs.”

    The team aspires to bring CREWSnet to other nations disproportionately impacted by climate change. “Our vision is to have this be a globally extensible capability,” says Campbell. CREWSnet’s name evokes another early-warning decision-support system, FEWSnet, which helped organizations address famine in eastern Africa in the 1980s. Today it is a pillar of food-security planning around the world.

    CREWSnet hopes for a similar impact in climate change planning. Its selection as an MIT Climate Grand Challenges flagship project will inject the project with more funding and resources, momentum that will also help BRAC’s fundraising. The team plans to deploy CREWSnet to southwestern Bangladesh within five years.

    “The communities that we are aspiring to reach with CREWSnet are deeply aware that their lives are changing — they have been looking climate change in the eye for many years. They are incredibly resilient, creative, and talented,” says Ashley Toombs, the external affairs director for BRAC USA. “As a team, we are excited to bring this system to Bangladesh. And what we learn together, we will apply at potentially even larger scales.”

  • MIT announces five flagship projects in first-ever Climate Grand Challenges competition

    MIT today announced the five flagship projects selected in its first-ever Climate Grand Challenges competition. These multiyear projects will define a dynamic research agenda focused on unraveling some of the toughest unsolved climate problems and bringing high-impact, science-based solutions to the world on an accelerated basis.

    Representing the most promising concepts to emerge from the two-year competition, the five flagship projects will receive additional funding and resources from MIT and others to develop their ideas and swiftly transform them into practical solutions at scale.

    “Climate Grand Challenges represents a whole-of-MIT drive to develop game-changing advances to confront the escalating climate crisis, in time to make a difference,” says MIT President L. Rafael Reif. “We are inspired by the creativity and boldness of the flagship ideas and by their potential to make a significant contribution to the global climate response. But given the planet-wide scale of the challenge, success depends on partnership. We are eager to work with visionary leaders in every sector to accelerate this impact-oriented research, implement serious solutions at scale, and inspire others to join us in confronting this urgent challenge for humankind.”

    Brief descriptions of the five Climate Grand Challenges flagship projects are provided below.

    Bringing Computation to the Climate Challenge

    This project leverages advances in artificial intelligence, machine learning, and data sciences to improve the accuracy of climate models and make them more useful to a variety of stakeholders — from communities to industry. The team is developing a digital twin of the Earth that harnesses more data than ever before to reduce and quantify uncertainties in climate projections.

    Research leads: Raffaele Ferrari, the Cecil and Ida Green Professor of Oceanography in the Department of Earth, Atmospheric and Planetary Sciences, and director of the Program in Atmospheres, Oceans, and Climate; and Noelle Eckley Selin, director of the Technology and Policy Program and professor with a joint appointment in the Institute for Data, Systems, and Society and the Department of Earth, Atmospheric and Planetary Sciences

    Center for Electrification and Decarbonization of Industry

    This project seeks to reinvent and electrify the processes and materials behind hard-to-decarbonize industries like steel, cement, ammonia, and ethylene production. A new innovation hub will perform targeted fundamental research and engineering with urgency, pushing the technological envelope on electricity-driven chemical transformations.

    Research leads: Yet-Ming Chiang, the Kyocera Professor of Materials Science and Engineering, and Bilge Yıldız, the Breene M. Kerr Professor in the Department of Nuclear Science and Engineering and professor in the Department of Materials Science and Engineering

    Preparing for a new world of weather and climate extremes

    This project addresses key gaps in knowledge about intensifying extreme events such as floods, hurricanes, and heat waves, and quantifies their long-term risk in a changing climate. The team is developing a scalable climate-change adaptation toolkit to help vulnerable communities and low-carbon energy providers prepare for these extreme weather events.

    Research leads: Kerry Emanuel, the Cecil and Ida Green Professor of Atmospheric Science in the Department of Earth, Atmospheric and Planetary Sciences and co-director of the MIT Lorenz Center; Miho Mazereeuw, associate professor of architecture and urbanism in the Department of Architecture and director of the Urban Risk Lab; and Paul O’Gorman, professor in the Program in Atmospheres, Oceans, and Climate in the Department of Earth, Atmospheric and Planetary Sciences

    The Climate Resilience Early Warning System

    The CREWSnet project seeks to reinvent climate change adaptation with a novel forecasting system that empowers underserved communities to interpret local climate risk, proactively plan for their futures incorporating resilience strategies, and minimize losses. CREWSnet will initially be demonstrated in southwestern Bangladesh, serving as a model for similarly threatened regions around the world.

    Research leads: John Aldridge, assistant leader of the Humanitarian Assistance and Disaster Relief Systems Group at MIT Lincoln Laboratory, and Elfatih Eltahir, the H.M. King Bhumibol Professor of Hydrology and Climate in the Department of Civil and Environmental Engineering

    Revolutionizing agriculture with low-emissions, resilient crops

    This project works to revolutionize the agricultural sector with climate-resilient crops and fertilizers that have the ability to dramatically reduce greenhouse gas emissions from food production.

    Research lead: Christopher Voigt, the Daniel I.C. Wang Professor in the Department of Biological Engineering

    “As one of the world’s leading institutions of research and innovation, it is incumbent upon MIT to draw on our depth of knowledge, ingenuity, and ambition to tackle the hard climate problems now confronting the world,” says Richard Lester, MIT associate provost for international activities. “Together with collaborators across industry, finance, community, and government, the Climate Grand Challenges teams are looking to develop and implement high-impact, path-breaking climate solutions rapidly and at a grand scale.”

    The initial call for ideas in 2020 yielded nearly 100 letters of interest from almost 400 faculty members and senior researchers, representing 90 percent of MIT departments. After an extensive evaluation, 27 finalist teams received a total of $2.7 million to develop comprehensive research and innovation plans. The projects address four broad research themes: using data and science to forecast climate-related risk; building equity and fairness into climate solutions; removing, managing, and storing greenhouse gases; and decarbonizing complex industries and processes.

    To select the winning projects, panels of international experts reviewed the finalists’ research plans. The panels represented relevant scientific and technical domains as well as expertise in processes and policies for innovation and scalability.

    “In response to climate change, the world really needs to do two things quickly: deploy the solutions we already have much more widely, and develop new solutions that are urgently needed to tackle this intensifying threat,” says Maria Zuber, MIT vice president for research. “These five flagship projects exemplify MIT’s strong determination to bring its knowledge and expertise to bear in generating new ideas and solutions that will help solve the climate problem.”

    “The Climate Grand Challenges flagship projects set a new standard for inclusive climate solutions that can be adapted and implemented across the globe,” says MIT Chancellor Melissa Nobles. “This competition propels the entire MIT research community — faculty, students, postdocs, and staff — to act with urgency around a worsening climate crisis, and I look forward to seeing the difference these projects can make.”

    “MIT’s efforts on climate research amid the climate crisis were a primary reason that I chose to attend MIT, and remain a reason that I view the Institute favorably. MIT has a clear opportunity to be a thought leader in the climate space in our own MIT way, which is why CGC fits in so well,” says senior Megan Xu, who served on the Climate Grand Challenges student committee and is studying ways to make the food system more sustainable.

    The Climate Grand Challenges competition is a key initiative of “Fast Forward: MIT’s Climate Action Plan for the Decade,” which the Institute published in May 2021. Fast Forward outlines MIT’s comprehensive plan for helping the world address the climate crisis. It consists of five broad areas of action: sparking innovation, educating future generations, informing and leveraging government action, reducing MIT’s own climate impact, and uniting and coordinating all of MIT’s climate efforts.

  • in

    Q&A: Climate Grand Challenges finalists on using data and science to forecast climate-related risk

    Note: This is the final article in a four-part interview series featuring the work of the 27 MIT Climate Grand Challenges finalist teams, which received a total of $2.7 million in startup funding to advance their projects. This month, the Institute will name a subset of the finalists as multiyear flagship projects.

    Advances in computation, artificial intelligence, robotics, and data science are enabling a new generation of observational tools and scientific modeling with the potential to produce timely, reliable, and quantitative analysis of future climate risks at a local scale. These projections can increase the accuracy and efficacy of early warning systems, improve emergency planning, and provide actionable information for climate mitigation and adaptation efforts, as human actions continue to change planetary conditions.

    In conversations prepared for MIT News, faculty from four Climate Grand Challenges teams with projects in the competition’s “Using data and science to forecast climate-related risk” category describe the promising new technologies that can help scientists understand the Earth’s climate system on a finer scale than ever before. (The other Climate Grand Challenges research themes include building equity and fairness into climate solutions; removing, managing, and storing greenhouse gases; and decarbonizing complex industries and processes.) The following responses have been edited for length and clarity.

    An observational system that can initiate a climate risk forecasting revolution

    Despite recent technological advances and massive volumes of data, climate forecasts remain highly uncertain. Gaps in observational capabilities create substantial challenges to predicting extreme weather events and establishing effective mitigation and adaptation strategies. R. John Hansman, the T. Wilson Professor of Aeronautics and Astronautics and director of the MIT International Center for Air Transportation, discusses the Stratospheric Airborne Climate Observatory System (SACOS) being developed together with Brent Minchew, the Cecil and Ida Green Career Development Professor in the Department of Earth, Atmospheric and Planetary Sciences (EAPS), and a team that includes researchers from MIT Lincoln Laboratory and Harvard University.

    Q: How does SACOS reduce uncertainty in climate risk forecasting?

    A: There is a critical need for higher spatial and temporal resolution observations of the climate system than are currently available through remote (satellite or airborne) and surface (in-situ) sensing. We are developing an ensemble of high-endurance, solar-powered aircraft with instrument systems capable of performing months-long climate observing missions that satellites or conventional aircraft cannot fulfill. Summer months are ideal for SACOS operations, as many key climate phenomena are active and short night periods reduce the battery mass, vehicle size, and technical risks. These observations hold the potential to inform models and improve predictions, allowing emergency planners, policymakers, and the rest of society to better prepare for the changes to come.
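
    The night-length argument can be made concrete with a simple energy-balance estimate: the battery only has to carry the aircraft through the hours without sunlight, so shorter nights mean less stored energy and therefore less mass. The sketch below is a minimal illustration using assumed numbers for cruise power, battery specific energy, and depth of discharge; none of these are SACOS design values.

```python
# Minimal energy-balance sketch for a solar-powered, high-endurance aircraft.
# All numbers are illustrative assumptions, not SACOS design values.

def battery_mass_kg(cruise_power_w, night_hours, specific_energy_wh_per_kg,
                    depth_of_discharge=0.8):
    """Battery mass needed to supply cruise power through one night."""
    energy_needed_wh = cruise_power_w * night_hours
    usable_wh_per_kg = specific_energy_wh_per_kg * depth_of_discharge
    return energy_needed_wh / usable_wh_per_kg

cruise_power_w = 1500      # assumed average electrical power in level flight (W)
specific_energy = 350      # assumed cell-level specific energy (Wh/kg)

for night_hours in (4, 8, 12):   # short summer night vs. longer nights
    mass = battery_mass_kg(cruise_power_w, night_hours, specific_energy)
    print(f"{night_hours:>2} h of darkness -> ~{mass:.0f} kg of battery")
```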

    Q: Describe the types of observing missions where SACOS could provide critical improvements.

    A: The demise of the Antarctic Ice Sheet, which is leading to rising sea levels around the world and threatening the displacement of millions of people, is one example. Current sea level forecasts struggle to account for giant fissures that create massive icebergs and cause the Antarctic Ice Sheet to flow more rapidly into the ocean. SACOS can track these fissures to accurately forecast ice slippage and give impacted populations enough time to prepare or evacuate. Elsewhere, widespread droughts cause rampant wildfires and water shortages. SACOS has the ability to monitor soil moisture and humidity in critically dry regions to identify where and when wildfires and droughts are imminent. SACOS also offers the most effective method to measure, track, and predict local ozone depletion over North America driven by increasingly severe summer thunderstorms.

    Quantifying and managing the risks of sea-level rise

    Prevailing estimates of sea-level rise range from approximately 20 centimeters to 2 meters by the end of the century, with the associated costs on the order of trillions of dollars. The instability of certain portions of the world’s ice sheets creates vast uncertainties, complicating how the world prepares for and responds to these potential changes. EAPS Professor Brent Minchew is leading another Climate Grand Challenges finalist team working on an integrated, multidisciplinary effort to improve the scientific understanding of sea-level rise and provide actionable information and tools to manage the risks it poses.

    Q: What have been the most significant challenges to understanding the potential rates of sea-level rise?

    A: West Antarctica is one of the most remote, inaccessible, and hostile places on Earth — to people and equipment. Thus, opportunities to observe the collapse of the West Antarctic Ice Sheet, which contains enough ice to raise global sea levels by about 3 meters, are limited, and current observations are coarsely resolved. It is essential that we understand how the floating edges of the ice sheet, called ice shelves, fracture and collapse, because they provide critical forces that govern the rate of ice mass loss and can stabilize the West Antarctic Ice Sheet.

    Q: How will your project advance what is currently known about sea-level rise?

    A: We aim to advance global-scale projections of sea-level rise through novel observational technologies and computational models of ice sheet change and to link those predictions to region- to neighborhood-scale estimates of costs and adaptation strategies. To do this, we propose two novel instruments: a first-of-its-kind drone that can fly for months at a time over Antarctica making continuous observations of critical areas and an airdropped seismometer and GPS bundle that can be deployed to vulnerable and hard-to-reach areas of the ice sheet. This technology will provide greater data quality and density and will observe the ice sheet at frequencies that are currently inaccessible — elements that are essential for understanding the physics governing the evolution of the ice sheet and sea-level rise.

    Changing flood risk for coastal communities in the developing world

    Globally, more than 600 million people live in low-elevation coastal areas that face an increasing risk of flooding from sea-level rise. This includes two-thirds of cities with populations of more than 5 million and regions that conduct the vast majority of global trade. Dara Entekhabi, the Bacardi and Stockholm Water Foundations Professor in the Department of Civil and Environmental Engineering and professor in the Department of Earth, Atmospheric and Planetary Sciences, outlines an interdisciplinary partnership that leverages data and technology to guide short-term responses and chart long-term adaptation pathways, working with Miho Mazereeuw, associate professor of architecture and urbanism and director of the Urban Risk Lab in the School of Architecture and Planning, and Danielle Wood, assistant professor in the Program in Media Arts and Sciences and the Department of Aeronautics and Astronautics.

    Q: What is the key problem this program seeks to address?

    A: The accumulated heating of the Earth system due to fossil fuel burning is largely absorbed by the oceans, and the stored heat expands the ocean volume, leading to an increased base height for tides. When the high tides inundate a city, the condition is referred to as “sunny day” flooding, but the saline waters corrode infrastructure and wreak havoc on daily routines. The danger ahead for many coastal cities in the developing world is the combination of increasingly frequent high-tide intrusions and heavy precipitation storm events.

    Q: How will your proposed solutions impact flood risk management?

    A: We are producing detailed risk maps for coastal cities in developing countries using newly available, very high-resolution remote-sensing data from space-borne instruments, as well as historical tide records and regional storm characteristics. Using these datasets, we aim to produce street-by-street risk maps that provide local decision-makers and stakeholders with a way to estimate present and future flood risks. With models of future tides and probabilistic precipitation events, we can forecast inundation from individual flood events, decadal changes under various climate-change and sea-level rise projections, and changes in the likelihood of sunny-day flooding. Working closely with local partners, we will develop toolkits to explore short-term emergency response, as well as long-term mitigation and adaptation techniques, in six pilot locations in South and Southeast Asia, Africa, and South America.
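
    As a rough illustration of how tide statistics, storm-driven surcharge, and sea-level rise scenarios can be combined into a street-level flood probability, the Monte Carlo sketch below draws daily water levels from assumed distributions and checks them against an assumed street elevation. Every distribution, probability, and elevation here is invented for illustration; this is not the team’s risk model or data.

```python
import numpy as np

rng = np.random.default_rng(0)

def annual_flood_probability(street_elev_m, mean_high_tide_m, tide_sigma_m,
                             slr_m, n_years=2000):
    """Monte Carlo estimate of the chance a street floods at least once in a year."""
    days = 365
    flooded_years = 0
    for _ in range(n_years):
        # Daily high-tide level, shifted upward by the sea-level rise scenario.
        high_tide = rng.normal(mean_high_tide_m + slr_m, tide_sigma_m, days)
        # Rain-driven surcharge: most days none, occasionally a large event.
        rain_surge = rng.choice([0.0, 0.3, 0.6], size=days, p=[0.95, 0.04, 0.01])
        if np.any(high_tide + rain_surge > street_elev_m):
            flooded_years += 1
    return flooded_years / n_years

for slr in (0.0, 0.3, 0.6):   # sea-level rise scenarios (m)
    p = annual_flood_probability(street_elev_m=2.2, mean_high_tide_m=1.2,
                                 tide_sigma_m=0.15, slr_m=slr)
    print(f"SLR {slr:.1f} m -> probability of flooding in a given year ~ {p:.2f}")
```

    The same structure extends naturally from a single point to a grid of street elevations, which is what turns individual estimates into a risk map.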

    Ocean vital signs

    On average, every person on Earth generates fossil fuel emissions equivalent to an 8-pound bag of carbon, every day. Much of this is absorbed by the ocean, but there is wide variability in the estimates of oceanic absorption, which translates into differences of trillions of dollars in the required cost of mitigation. In the Department of Earth, Atmospheric and Planetary Sciences, Christopher Hill, a principal research engineer specializing in Earth and planetary computational science, works with Ryan Woosley, a principal research scientist focusing on the carbon cycle and ocean acidification. Hill explains that they hope to use artificial intelligence and machine learning to help resolve this uncertainty.
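
    That per-person figure is consistent with rough global numbers: on the order of 37 billion tonnes of fossil CO2 per year, of which about 12/44 by mass is carbon, spread across roughly 8 billion people. The back-of-the-envelope check below uses these approximate round numbers, not project data.

```python
# Back-of-the-envelope check of the "8-pound bag of carbon per person per day" figure.
# Global totals are approximate round numbers, not project data.
global_co2_kg_per_year = 37.0e12   # ~37 Gt of fossil CO2 per year (approximate)
carbon_fraction = 12.0 / 44.0      # mass fraction of carbon in CO2
population = 8.0e9                 # ~8 billion people
kg_per_lb = 0.4536

kg_carbon_per_person_per_day = (global_co2_kg_per_year * carbon_fraction
                                / population / 365)
print(f"~{kg_carbon_per_person_per_day:.1f} kg C per person per day "
      f"(~{kg_carbon_per_person_per_day / kg_per_lb:.1f} lb)")
# Roughly 3.5 kg, i.e. close to 8 lb, consistent with the figure above.
```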

    Q: What is the current state of knowledge on air-sea interactions?

    A: Obtaining specific, accurate field measurements of critical physical, chemical, and biological exchanges between the ocean and the atmosphere has historically entailed expensive science missions with large ship-based infrastructure that leave gaps in real-time data about significant ocean climate processes. Recent advances in highly scalable in-situ autonomous observing and navigation, combined with innovations in airborne platforms, remote sensing, and machine learning, have the potential to transform data gathering, provide more accurate information, and address fundamental scientific questions around air-sea interaction.
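
    One way to see why denser surface observations matter: air-sea CO2 flux is commonly estimated with a bulk formula, F = k * K0 * (pCO2_ocean - pCO2_air), where k is a wind-speed-dependent gas transfer velocity and K0 the gas solubility, so uncertainty in the sparsely sampled pCO2 difference propagates directly into the flux estimate. The sketch below uses a widely used quadratic wind-speed scaling for k (Wanninkhof-type) with illustrative values; it is not the team’s method or data.

```python
def gas_transfer_velocity_cm_per_hr(u10_m_per_s, schmidt_number=660.0):
    """Quadratic wind-speed parameterization of the gas transfer velocity
    (coefficient 0.251 cm/hr per (m/s)^2, normalized to a Schmidt number of 660)."""
    return 0.251 * u10_m_per_s**2 * (schmidt_number / 660.0) ** -0.5

def co2_flux_mol_m2_yr(u10_m_per_s, delta_pco2_uatm, solubility_mol_m3_uatm=35e-6):
    """Bulk air-sea CO2 flux, F = k * K0 * (pCO2_ocean - pCO2_air).
    Negative values mean the ocean is taking up CO2."""
    k_m_per_yr = gas_transfer_velocity_cm_per_hr(u10_m_per_s) * 0.01 * 24 * 365
    return k_m_per_yr * solubility_mol_m3_uatm * delta_pco2_uatm

u10 = 7.0  # assumed wind speed at 10 m (m/s)
for dpco2 in (-30.0, -40.0, -50.0):   # uatm; ocean undersaturated relative to air
    flux = co2_flux_mol_m2_yr(u10, dpco2)
    print(f"delta pCO2 = {dpco2:6.1f} uatm -> flux ~ {flux:+.2f} mol m^-2 yr^-1")
```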

    Q: How will your approach accelerate real-time, autonomous surface ocean observing from an experimental research endeavor to a permanent and impactful solution?

    A: Our project seeks to demonstrate how a scalable surface ocean observing network can be launched and operated, and to illustrate how this can reduce uncertainties in estimates of air-sea carbon dioxide exchange. With an initial high-impact goal of substantially reducing the vast uncertainties that plague our understanding of ocean uptake of carbon dioxide, we will gather critical measurements for improving extended weather and climate forecast models and reducing climate impact uncertainty. The results have the potential to more accurately identify trillions of dollars’ worth of economic activity.