More stories

  • Cracking the case of Arctic sea ice breakup

    Despite its below-freezing temperatures, the Arctic is warming twice as fast as the rest of the planet. As Arctic sea ice melts, fewer bright surfaces are available to reflect sunlight back into space. When fractures open in the ice cover, the water underneath gets exposed. Dark, ice-free water absorbs the sun’s energy, heating the ocean and driving further melting — a vicious cycle. This warming in turn melts glacial ice, contributing to rising sea levels.

    Warming climate and rising sea levels endanger the nearly 40 percent of the U.S. population living in coastal areas, the billions of people who depend on the ocean for food and their livelihoods, and species such as polar bears and Arctic foxes. Reduced ice coverage is also making the once-impassable region more accessible, opening up new shipping lanes and ports. Interest in using these emerging trans-Arctic routes for product transit, extraction of natural resources (e.g., oil and gas), and military activity is turning an area traditionally marked by low tension and cooperation into one of global geopolitical competition.

    As the Arctic opens up, predicting when and where the sea ice will fracture becomes increasingly important in strategic decision-making. However, huge gaps exist in our understanding of the physical processes contributing to ice breakup. Researchers at MIT Lincoln Laboratory seek to help close these gaps by turning a data-sparse environment into a data-rich one. They envision deploying a distributed set of unattended sensors across the Arctic that will persistently detect and geolocate ice fracturing events. Concurrently, the network will measure various environmental conditions, including water temperature and salinity, wind speed and direction, and ocean currents at different depths. By correlating these fracturing events and environmental conditions, they hope to discover meaningful insights about what is causing the sea ice to break up. Such insights could help predict the future state of Arctic sea ice to inform climate modeling, climate change planning, and policy decision-making at the highest levels.

    “We’re trying to study the relationship between ice cracking, climate change, and heat flow in the ocean,” says Andrew March, an assistant leader of Lincoln Laboratory’s Advanced Undersea Systems and Technology Group. “Do cracks in the ice cause warm water to rise and more ice to melt? Do undersea currents and waves cause cracking? Does cracking cause undersea waves? These are the types of questions we aim to investigate.”

    Arctic access

    In March 2022, Ben Evans and Dave Whelihan, both researchers in March’s group, traveled for 16 hours across three flights to Prudhoe Bay, located on the North Slope of Alaska. From there, they boarded a small specialized aircraft and flew another 90 minutes to a three-and-a-half-mile-long sheet of ice floating 160 nautical miles offshore in the Arctic Ocean. In the weeks before their arrival, the U.S. Navy’s Arctic Submarine Laboratory had transformed this inhospitable ice floe into a temporary operating base called Ice Camp Queenfish, named after the first Sturgeon-class submarine to operate under the ice and the fourth to reach the North Pole. The ice camp featured a 2,500-foot-long runway, a command center, sleeping quarters to accommodate up to 60 personnel, a dining tent, and an extremely limited internet connection.

    At Queenfish, for the next four days, Evans and Whelihan joined U.S. Navy, Army, Air Force, Marine Corps, and Coast Guard members, and members of the Royal Canadian Air Force and Navy and United Kingdom Royal Navy, who were participating in Ice Exercise (ICEX) 2022. Over the course of about three weeks, more than 200 personnel stationed at Queenfish, Prudhoe Bay, and aboard two U.S. Navy submarines participated in this biennial exercise. The goals of ICEX 2022 were to assess U.S. operational readiness in the Arctic; increase our country’s experience in the region; advance our understanding of the Arctic environment; and continue building relationships with other services, allies, and partner organizations to ensure a free and peaceful Arctic. The infrastructure provided for ICEX concurrently enables scientists to conduct research in an environment — either in person or by sending their research equipment for exercise organizers to deploy on their behalf — that would be otherwise extremely difficult and expensive to access.

    In the Arctic, windchill temperatures can plummet to as low as 60 degrees Fahrenheit below zero, cold enough to freeze exposed skin within minutes. Winds and ocean currents can cause the entire camp to drift beyond the reach of nearby emergency rescue aircraft, and the ice can crack at any moment. To ensure the safety of participants, a team of Navy meteorological specialists continually monitors the ever-changing conditions. The original camp location for ICEX 2022 had to be evacuated and relocated after a massive crack formed in the ice, delaying Evans’ and Whelihan’s trip. Even at the newly selected site, a large crack formed behind the camp, and another necessitated moving a number of tents.

    “Such cracking events are only going to increase as the climate warms, so it’s more critical now than ever to understand the physical processes behind them,” Whelihan says. “Such an understanding will require building technology that can persist in the environment despite these incredibly harsh conditions. So, it’s a challenge not only from a scientific perspective but also an engineering one.”

    “The weather always gets a vote, dictating what you’re able to do out here,” adds Evans. “The Arctic Submarine Laboratory does a lot of work to construct the camp and make it a safe environment where researchers like us can come to do good science. ICEX is really the only opportunity we have to go onto the sea ice in a place this remote to collect data.”

    A legacy of sea ice experiments

    Though this trip was Whelihan’s and Evans’ first to the Arctic region, staff from the laboratory’s Advanced Undersea Systems and Technology Group have been conducting experiments at ICEX since 2018. However, because of the Arctic’s remote location and extreme conditions, data collection has rarely been continuous over long periods of time or widespread across large areas. The team now hopes to change that by building low-cost, expendable sensing platforms consisting of co-located devices that can be left unattended for automated, persistent, near-real-time monitoring. 

    “The laboratory’s extensive expertise in rapid prototyping, seismo-acoustic signal processing, remote sensing, and oceanography make us a natural fit to build this sensor network,” says Evans.

    In the months leading up to the Arctic trip, the team collected seismometer data at Firepond, part of the laboratory’s Haystack Observatory site in Westford, Massachusetts. Through this local data collection, they aimed to gain a sense of what anthropogenic (human-induced) noise would look like so they could begin to anticipate the kinds of signatures they might see in the Arctic. They also collected ice melting/fracturing data during a thaw cycle and correlated these data with the weather conditions (air temperature, humidity, and pressure). Through this analysis, they detected an increase in seismic signals as the temperature rose above 32 degrees Fahrenheit — an indication that air temperature and ice cracking may be related.
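
    A minimal sketch of what this kind of correlation analysis can look like, assuming hourly event counts and weather logs in CSV files; the file and column names here are hypothetical placeholders, not the laboratory's actual pipeline.

    ```python
    # Sketch: correlate hourly seismic event counts with air temperature.
    # File names and column names are hypothetical.
    import pandas as pd

    # Hourly event detections, e.g. columns: time, event_count
    events = pd.read_csv("firepond_events.csv", parse_dates=["time"])
    # Hourly weather log, e.g. columns: time, air_temp_f, humidity, pressure
    weather = pd.read_csv("firepond_weather.csv", parse_dates=["time"])

    merged = events.merge(weather, on="time")

    # Compare cracking activity below and above freezing (32 degrees F)
    above = merged.loc[merged["air_temp_f"] > 32, "event_count"].mean()
    below = merged.loc[merged["air_temp_f"] <= 32, "event_count"].mean()
    print(f"mean events/hour: {below:.1f} below freezing, {above:.1f} above")

    # Simple linear correlation between temperature and event rate
    print("correlation:", merged["air_temp_f"].corr(merged["event_count"]))
    ```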

    A sensing network

    At ICEX, the team deployed various commercial off-the-shelf sensors and new sensors developed by the laboratory and University of New Hampshire (UNH) to assess their resiliency in the frigid environment and to collect an initial dataset.

    “One aspect that differentiates these experiments from those of the past is that we concurrently collected seismo-acoustic data and environmental parameters,” says Evans.

    The commercial technologies were seismometers to detect the vibrational energy released when sea ice fractures or collides with other ice floes; a hydrophone (underwater microphone) array to record the acoustic energy created by ice-fracturing events; a sound speed profiler to measure the speed of sound through the water column; and a conductivity, temperature, and depth (CTD) profiler to measure the salinity (related to conductivity), temperature, and pressure (related to depth) throughout the water column. The speed of sound in the ocean primarily depends on these three quantities. 
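
    To make that dependence concrete, the sketch below computes sound speed from temperature, salinity, and depth using the Mackenzie (1981) equation, one widely used empirical fit; the article does not say which formula, if any, the team applied.

    ```python
    # Sound speed in seawater via the Mackenzie (1981) empirical equation.
    def sound_speed_mackenzie(T, S, D):
        """Sound speed in seawater (m/s).

        T: temperature (deg C), S: salinity (parts per thousand), D: depth (m).
        Valid roughly for T in [-2, 30], S in [25, 40], D up to 8,000 m.
        """
        return (1448.96 + 4.591 * T - 5.304e-2 * T**2 + 2.374e-4 * T**3
                + 1.340 * (S - 35) + 1.630e-2 * D + 1.675e-7 * D**2
                - 1.025e-2 * T * (S - 35) - 7.139e-13 * T * D**3)

    # Near-freezing Arctic surface water vs. warmer, saltier water at depth
    print(f"{sound_speed_mackenzie(-1.8, 32.0, 5.0):.1f} m/s")   # ~1436.5
    print(f"{sound_speed_mackenzie(0.5, 34.5, 200.0):.1f} m/s")  # ~1453.8
    ```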

    To precisely measure the temperature across the entire water column at one location, they deployed an array of transistor-based temperature sensors developed by the laboratory’s Advanced Materials and Microsystems Group in collaboration with the Advanced Functional Fabrics of America Manufacturing Innovation Institute. The small temperature sensors run along the length of a thread-like polymer fiber embedded with multiple conductors. This fiber platform, which can support a broad range of sensors, can be unspooled hundreds of feet below the water’s surface to concurrently measure temperature or other water properties — the fiber deployed in the Arctic also contained accelerometers to measure depth — at many points in the water column. Traditionally, temperature profiling has required moving a device up and down through the water column.

    The team also deployed a high-frequency echosounder supplied by Anthony Lyons and Larry Mayer, collaborators at UNH’s Center for Coastal and Ocean Mapping. This active sonar uses acoustic energy to detect internal waves, or waves occurring beneath the ocean’s surface.

    “You may think of the ocean as a homogenous body of water, but it’s not,” Evans explains. “Different currents can exist as you go down in depth, much like how you can get different winds when you go up in altitude. The UNH echosounder allows us to see the different currents in the water column, as well as ice roughness when we turn the sensor to look upward.”

    “The reason we care about currents is that we believe they will tell us something about how warmer water from the Atlantic Ocean is coming into contact with sea ice,” adds Whelihan. “Not only is that water melting ice but it also has lower salt content, resulting in oceanic layers and affecting how long ice lasts and where it lasts.”

    Back home, the team has begun analyzing their data. For the seismic data, this analysis involves distinguishing any ice events from various sources of anthropogenic noise, including generators, snowmobiles, footsteps, and aircraft. Similarly, the researchers know their hydrophone array acoustic data are contaminated by energy from a sound source that another research team participating in ICEX placed in the water. Based on their physics, icequakes — the seismic events that occur when ice cracks — have characteristic signatures that can be used to identify them. One approach is to manually find an icequake and use that signature as a guide for finding other icequakes in the dataset.
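
    That template approach is essentially matched filtering. Below is a generic sketch, assuming the continuous record and a hand-picked icequake waveform are available as NumPy arrays; the threshold is illustrative, and this is not the laboratory's actual detector.

    ```python
    # Generic template matching: slide a reference icequake waveform along the
    # record and flag windows whose normalized cross-correlation is high.
    import numpy as np

    def match_template(trace, template, threshold=0.7):
        """Return sample indices where the template correlates strongly."""
        n = len(template)
        t = (template - template.mean()) / template.std()
        hits = []
        for i in range(len(trace) - n):
            w = trace[i:i + n]
            if w.std() == 0:
                continue
            cc = np.dot((w - w.mean()) / w.std(), t) / n  # normalized correlation
            if cc > threshold:
                hits.append(i)
        return hits

    # Tiny synthetic demo: bury the template in noise and recover it
    rng = np.random.default_rng(0)
    template = np.sin(np.linspace(0, 6 * np.pi, 200)) * np.hanning(200)
    trace = rng.normal(0, 0.2, 5000)
    trace[1200:1400] += template
    print(match_template(trace, template))  # indices near 1200
    ```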

    From their water column profiling sensors, they identified an interesting evolution in the sound speed profile 30 to 40 meters below the ocean surface, related to a mass of colder water moving in later in the day. The group’s physical oceanographer believes this change in the profile is due to water coming up from the Bering Sea, water that initially comes from the Atlantic Ocean. The UNH-supplied echosounder also generated an interesting signal at a similar depth.

    “Our supposition is that this result has something to do with the large sound speed variation we detected, either directly because of reflections off that layer or because of plankton, which tend to rise on top of that layer,” explains Evans.  

    A future predictive capability

    Going forward, the team will continue mining their collected data and use these data to begin building algorithms capable of automatically detecting and localizing — and ultimately predicting — ice events correlated with changes in environmental conditions. To complement their experimental data, they have initiated conversations with organizations that model the physical behavior of sea ice, including the National Oceanic and Atmospheric Administration and the National Ice Center. Merging the laboratory’s expertise in sensor design and signal processing with their expertise in ice physics would provide a more complete understanding of how the Arctic is changing.

    The laboratory team will also start exploring cost-effective engineering approaches for integrating the sensors into packages hardened for deployment in the harsh environment of the Arctic.

    “Until these sensors are truly unattended, the human factor of usability is front and center,” says Whelihan. “Because it’s so cold, equipment can break accidentally. For example, at ICEX 2022, our waterproof enclosure for the seismometers survived, but the enclosure for its power supply, which was made out of a cheaper plastic, shattered in my hand when I went to pick it up.”

    The sensor packages will not only need to withstand the frigid environment but also be able to “phone home” over some sort of satellite data link and sustain their power. The team plans to investigate whether waste heat from processing can keep the instruments warm and how energy could be harvested from the Arctic environment.

    Before the next ICEX scheduled for 2024, they hope to perform preliminary testing of their sensor packages and concepts in Arctic-like environments. While attending ICEX 2022, they engaged with several other attendees — including the U.S. Navy, Arctic Submarine Laboratory, National Ice Center, and University of Alaska Fairbanks (UAF) — and identified cold room experimentation as one area of potential collaboration. Testing can also be performed at outdoor locations a bit closer to home and more easily accessible, such as the Great Lakes in Michigan and a UAF-maintained site in Barrow, Alaska. In the future, the laboratory team may have an opportunity to accompany U.S. Coast Guard personnel on ice-breaking vessels traveling from Alaska to Greenland. The team is also thinking about possible venues for collecting data far removed from human noise sources.

    “Since I’ve told colleagues, friends, and family I was going to the Arctic, I’ve had a lot of interesting conversations about climate change and what we’re doing there and why we’re doing it,” Whelihan says. “People don’t have an intrinsic, automatic understanding of this environment and its impact because it’s so far removed from us. But the Arctic plays a crucial role in helping to keep the global climate in balance, so it’s imperative we understand the processes leading to sea ice fractures.”

    This work is funded through Lincoln Laboratory’s internally administered R&D portfolio on climate.

  • Engineers use artificial intelligence to capture the complexity of breaking waves

    Waves break once they swell to a critical height, before cresting and crashing into a spray of droplets and bubbles. These waves can be as large as a surfer’s point break and as small as a gentle ripple rolling to shore. For decades, the dynamics of how and when a wave breaks have been too complex to predict.

    Now, MIT engineers have found a new way to model how waves break. The team used machine learning along with data from wave-tank experiments to tweak equations that have traditionally been used to predict wave behavior. Engineers typically rely on such equations to help them design resilient offshore platforms and structures. But until now, the equations have not been able to capture the complexity of breaking waves.

    The updated model made more accurate predictions of how and when waves break, the researchers found. For instance, the model estimated a wave’s steepness just before breaking, and its energy and frequency after breaking, more accurately than the conventional wave equations.

    Their results, published today in the journal Nature Communications, will help scientists understand how a breaking wave affects the water around it. Knowing precisely how these waves interact can help hone the design of offshore structures. It can also improve predictions for how the ocean interacts with the atmosphere. Having better estimates of how waves break can help scientists predict, for instance, how much carbon dioxide and other atmospheric gases the ocean can absorb.

    “Wave breaking is what puts air into the ocean,” says study author Themis Sapsis, an associate professor of mechanical and ocean engineering and an affiliate of the Institute for Data, Systems, and Society at MIT. “It may sound like a detail, but if you multiply its effect over the area of the entire ocean, wave breaking starts becoming fundamentally important to climate prediction.”

    The study’s co-authors include lead author and MIT postdoc Debbie Eeltink, Hubert Branger and Christopher Luneau of Aix-Marseille University, Amin Chabchoub of Kyoto University, Jerome Kasparian of the University of Geneva, and T.S. van den Bremer of Delft University of Technology.

    Learning tank

    To predict the dynamics of a breaking wave, scientists typically take one of two approaches: They either attempt to precisely simulate the wave at the scale of individual molecules of water and air, or they run experiments to try to characterize waves with actual measurements. The first approach is computationally expensive and difficult to simulate even over a small area; the second requires a huge amount of time to run enough experiments to yield statistically significant results.

    The MIT team instead borrowed pieces from both approaches to develop a more efficient and accurate model using machine learning. The researchers started with a set of equations that is considered the standard description of wave behavior. They aimed to improve the model by “training” it on data of breaking waves from actual experiments.

    “We had a simple model that doesn’t capture wave breaking, and then we had the truth, meaning experiments that involve wave breaking,” Eeltink explains. “Then we wanted to use machine learning to learn the difference between the two.”

    The researchers obtained wave breaking data by running experiments in a 40-meter-long tank. The tank was fitted at one end with a paddle which the team used to initiate each wave. The team set the paddle to produce a breaking wave in the middle of the tank. Gauges along the length of the tank measured the water’s height as waves propagated down the tank.

    “It takes a lot of time to run these experiments,” Eeltink says. “Between each experiment you have to wait for the water to completely calm down before you launch the next experiment, otherwise they influence each other.”

    Safe harbor

    In all, the team ran about 250 experiments, the data from which they used to train a type of machine-learning algorithm known as a neural network. Specifically, the algorithm is trained to compare the real waves in experiments with the predicted waves in the simple model, and based on any differences between the two, the algorithm tunes the model to fit reality.
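
    In schematic terms, the network learns a residual correction on top of the simple model's output. The sketch below shows the general pattern; the layer sizes, feature dimension, and training loop are illustrative and not taken from the paper's code.

    ```python
    # Residual-learning sketch: a small network predicts the gap between the
    # simple wave model and measurements, so corrected = model + residual.
    import torch
    import torch.nn as nn

    class ResidualCorrector(nn.Module):
        def __init__(self, n_features=16):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(n_features, 64), nn.Tanh(),
                nn.Linear(64, 64), nn.Tanh(),
                nn.Linear(64, n_features),
            )

        def forward(self, model_state):
            # Corrected prediction = simple-model output + learned residual
            return model_state + self.net(model_state)

    def train(corrector, simple_model_states, measured_states, epochs=500):
        opt = torch.optim.Adam(corrector.parameters(), lr=1e-3)
        loss_fn = nn.MSELoss()
        for _ in range(epochs):
            opt.zero_grad()
            loss = loss_fn(corrector(simple_model_states), measured_states)
            loss.backward()
            opt.step()
        return corrector

    # Example with random stand-in data (16 wave-state features per sample)
    states = torch.randn(256, 16)                # simple-model predictions
    truth = states + 0.1 * torch.tanh(states)    # pretend measurements
    model = train(ResidualCorrector(), states, truth)
    ```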

    After training the algorithm on their experimental data, the team introduced the model to entirely new data — in this case, measurements from two independent experiments, each run at separate wave tanks with different dimensions. In these tests, they found the updated model made more accurate predictions than the simple, untrained model, for instance making better estimates of a breaking wave’s steepness.

    The new model also captured an essential property of breaking waves known as the “downshift,” in which the frequency of a wave is shifted to a lower value. The speed of a wave depends on its frequency. For ocean waves, lower frequencies move faster than higher frequencies. Therefore, after the downshift, the wave will move faster. The new model predicts the change in frequency, before and after each breaking wave, which could be especially relevant in preparing for coastal storms.
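
    The frequency-speed link follows from the deep-water dispersion relation, under which the phase speed is c = g/(2πf), so a downshift to lower frequency means a faster wave. A quick illustration with made-up numbers:

    ```python
    # Deep-water phase speed before and after a frequency downshift.
    import math

    g = 9.81  # gravitational acceleration, m/s^2

    def phase_speed(f):
        """Deep-water phase speed (m/s) for wave frequency f (Hz): c = g/(2*pi*f)."""
        return g / (2 * math.pi * f)

    before, after = 0.10, 0.09   # Hz; a ~10 percent downshift (illustrative)
    print(f"before: {phase_speed(before):.1f} m/s")  # ~15.6 m/s
    print(f"after:  {phase_speed(after):.1f} m/s")   # ~17.3 m/s
    ```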

    “When you want to forecast when high waves of a swell would reach a harbor, and you want to leave the harbor before those waves arrive, then if you get the wave frequency wrong, then the speed at which the waves are approaching is wrong,” Eeltink says.

    The team’s updated wave model is in the form of an open-source code that others could potentially use, for instance in climate simulations of the ocean’s potential to absorb carbon dioxide and other atmospheric gases. The code can also be worked into simulated tests of offshore platforms and coastal structures.

    “The number one purpose of this model is to predict what a wave will do,” Sapsis says. “If you don’t model wave breaking right, it would have tremendous implications for how structures behave. With this, you could simulate waves to help design structures better, more efficiently, and without huge safety factors.”

    This research is supported, in part, by the Swiss National Science Foundation, and by the U.S. Office of Naval Research.

  • Ocean vital signs

    Without the ocean, the climate crisis would be even worse than it is. Each year, the ocean absorbs billions of tons of carbon from the atmosphere, preventing warming that greenhouse gas would otherwise cause. Scientists estimate about 25 to 30 percent of all carbon released into the atmosphere by both human and natural sources is absorbed by the ocean.

    “But there’s a lot of uncertainty in that number,” says Ryan Woosley, a marine chemist and a principal research scientist in the Department of Earth, Atmospheric and Planetary Sciences (EAPS) at MIT. Different parts of the ocean take in different amounts of carbon depending on many factors, such as the season and the amount of mixing from storms. Current models of the carbon cycle don’t adequately capture this variation.

    To close the gap, Woosley and a team of other MIT scientists developed a research proposal for the MIT Climate Grand Challenges competition — an Institute-wide campaign to catalyze and fund innovative research addressing the climate crisis. The team’s proposal, “Ocean Vital Signs,” involves sending a fleet of sailing drones to cruise the oceans taking detailed measurements of how much carbon the ocean is really absorbing. Those data would be used to improve the precision of global carbon cycle models and improve researchers’ ability to verify emissions reductions claimed by countries.

    “If we start to enact mitigation strategies — either through removing CO2 from the atmosphere or reducing emissions — we need to know where CO2 is going in order to know how effective they are,” says Woosley. Without more precise models there’s no way to confirm whether observed carbon reductions were thanks to policy and people, or thanks to the ocean.

    “So that’s the trillion-dollar question,” says Woosley. “If countries are spending all this money to reduce emissions, is it enough to matter?”

    In February, the team’s Climate Grand Challenges proposal was named one of 27 finalists out of the almost 100 entries submitted. From among this list of finalists, MIT will announce in April the selection of five flagship projects to receive further funding and support.

    Woosley is leading the team along with Christopher Hill, a principal research engineer in EAPS. The team includes physical and chemical oceanographers, marine microbiologists, biogeochemists, and experts in computational modeling from across the department, in addition to collaborators from the Media Lab and the departments of Mathematics, Aeronautics and Astronautics, and Electrical Engineering and Computer Science.

    Today, data on the flux of carbon dioxide between the air and the oceans are collected in a piecemeal way. Research ships intermittently cruise out to gather data. Some commercial ships are also fitted with sensors. But these present a limited view of the entire ocean, and include biases. For instance, commercial ships usually avoid storms, which can increase the turnover of water exposed to the atmosphere and cause a substantial increase in the amount of carbon absorbed by the ocean.

    “It’s very difficult for us to get to it and measure that,” says Woosley. “But these drones can.”

    If funded, the team’s project would begin by deploying a few drones in a small area to test the technology. The wind-powered drones — made by a California-based company called Saildrone — would autonomously navigate through an area, collecting data on air-sea carbon dioxide flux continuously with solar-powered sensors. This would then scale up to more than 5,000 drone-days’ worth of observations, spread over five years, and in all five ocean basins.

    Those data would be used to feed neural networks to create more precise maps of how much carbon is absorbed by the oceans, shrinking the uncertainties involved in the models. These models would continue to be verified and improved by new data. “The better the models are, the more we can rely on them,” says Woosley. “But we will always need measurements to verify the models.”

    Improved carbon cycle models are relevant beyond climate warming as well. “CO2 is involved in so much of how the world works,” says Woosley. “We’re made of carbon, and all the other organisms and ecosystems are as well. What does the perturbation to the carbon cycle do to these ecosystems?”

    One of the best understood impacts is ocean acidification. Carbon absorbed by the ocean reacts to form an acid. A more acidic ocean can have dire impacts on marine organisms like coral and oysters, whose calcium carbonate shells and skeletons can dissolve in the lower pH. Since the Industrial Revolution, the ocean has become about 30 percent more acidic on average.
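
    The “30 percent more acidic” figure refers to the concentration of hydrogen ions, which translates into a deceptively small-looking shift on the logarithmic pH scale. A quick check of the arithmetic:

    ```python
    # "30 percent more acidic" means 30 percent more hydrogen ions; on the
    # logarithmic pH scale that is only about a tenth of a unit.
    import math

    ratio = 1.30                       # 30 percent more H+ ions
    delta_pH = -math.log10(ratio)      # resulting change in pH
    print(round(delta_pH, 2))          # about -0.11, i.e. pH ~8.2 -> ~8.1
    ```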

    “So while it’s great for us that the oceans have been taking up the CO2, it’s not great for the oceans,” says Woosley. “Knowing how this uptake affects the health of the ocean is important as well.”

  • MIT entrepreneurs think globally, act locally

    Born and raised amid the natural beauty of the Dominican Republic, Andrés Bisonó León feels a deep motivation to help solve a problem that has been threatening the Caribbean island nation’s tourism industry, its economy, and its people.

    As Bisonó León discussed with his long-time friend and mentor, the Walter M. May and A. Hazel May Professor of Mechanical Engineering (MechE) Alexander Slocum Sr., ugly mats of toxic sargassum seaweed have been encroaching on the Dominican Republic’s pristine beaches and other beaches in the Caribbean region, and public and private organizations have fought a losing battle using expensive, environmentally damaging methods to clean it up. Slocum, who was on the U.S. Department of Energy’s Deepwater Horizon team, has extensive experience with systems that operate in the ocean.

    “In the last 10 years,” says Bisonó León, now an MBA candidate in the MIT Sloan School of Management, “sargassum, a toxic seaweed invasion, has cost the Caribbean as much as $120 million a year in cleanup and has meant a 30 to 35 percent tourism reduction, affecting not only the tourism industry, but also the environment, marine life, local economies, and human health.”

    One of Bisonó León’s discussions with Slocum took place within earshot of MechE alumnus Luke Gray ’18, SM ’20, who had worked with Slocum on other projects and was at the time about to begin his master’s program.

    “Professor Slocum and Andrés happened to be discussing the sargassum problem in Andrés’ home country,” Gray says. “A week later I was on a plane to the DR to collect sargassum samples and survey the problem in Punta Cana. When I returned, my master’s program was underway, and I already had my thesis project!”

    Gray also had started a working partnership with Bisonó León, which both say proceeded seamlessly right from the first moment.

    “I feel that Luke right away understood the magnitude of the problem and the value we could create in the Dominican Republic and across the Caribbean by teaming up,” Bisonó León says.

    Both Bisonó León and Gray also say they felt a responsibility to work toward helping the global environment.

    “All of my major projects up until now have involved machines for climate restoration and/or adaptation,” says Gray.

    The technologies Bisonó León and Gray arrived at after 18 months of R&D were designed to provide solutions both locally and globally.

    Their Littoral Collection Module (LCM) skims sargassum seaweed off the surface of the water with nets that can be mounted on any boat. The device sits across the boat, with two large hoops holding the nets open, one on each side. As the boat travels forward, it cuts through the seaweed, which flows to the sides of the vessel and through the hoops into the nets. Effective at sweeping the seaweed from the water, the device can be employed by anyone with a boat, including local fishermen whose livelihoods have been disrupted by the seaweed’s damaging effect on tourism and the local economy.

    The sargassum can then be towed out to sea, where Bisonó León’s and Gray’s second technology can come into play. By pumping the seaweed into very deep water, where it then sinks to the bottom of the ocean, the carbon in the seaweed can be sequestered. Other methods for disposing of the seaweed generally involve putting it into landfills, where it emits greenhouse gases such as methane and carbon dioxide as it breaks down. Although some seaweed can be put to other uses, including as fertilizer, sargassum has been found to contain hard-to-remove toxic substances such as arsenic and heavy metals.

    In spring 2020, Bisonó León and Gray formed a company, SOS (Sargassum Ocean Sequestration) Carbon.

    Bisonó León says he comes from a long line of entrepreneurs with a deep commitment to social impact. His family has been involved in several different industries; his grandfather and great-uncles opened the first cigar factory in the Dominican Republic in 1903.

    Gray says internships with startup companies and the undergraduate projects he did with Slocum developed his interest in entrepreneurship, and his involvement with the sargassum problem only reinforced that inclination. During his master’s program, he says he became “obsessed” with finding a solution.

    “Professor Slocum let me think extremely big, and so it was almost inevitable that the distillation of our two years of work would continue in some form, and starting a company happened to be the right path. My master’s experience of taking an essentially untouched problem like sargassum and then one year later designing, building, and sending 15,000 pounds of custom equipment to test for three months on a Dominican Navy ship made me realize I had discovered a recipe I could repeat — and machine design had become my core competency,” Gray says.

    During the initial research and development of their technologies, Bisonó León and Gray raised $258,000 from 20 different organizations. Between June and December 2021, they succeeded in removing 3.5 million pounds of sargassum and secured contracts with Grupo Puntacana, which operates several tourist resorts, and with other hotels such as Club Med in Punta Cana. The company subcontracts with the association of fishermen in Punta Cana, employing 15 fishermen who operate LCMs and training 35 others to join as the operation expands.

    Their success so far demonstrates “’mens et manus’ at work,” says Slocum, referring to MIT’s motto, which is Latin for “mind and hand.” “Geeks hear about a very real problem that affects very real people who have no other option for their livelihoods, and they respond by inventing a solution so elegant that it can be readily deployed by those most hurt by the problem to address the problem.

    “The team was always focused on the numbers, from physics to finance, and did not let hype or doubts deter their determination to rationally solve this huge problem.”

    Slocum says he could predict Bisonó León and Gray would work well together “because they started out as good, smart people with complementary skills whose hearts and minds were in the right place.”

    “We are working on having a global impact to reduce millions of tons of CO2 per year,” says Bisonó León. “With training from Sloan and cross-disciplinary collaborative spirit, we will be able to further expand environmental and social impact platforms much needed in the Caribbean to be able to drive real change regionally and globally.”

    “I hope SOS Carbon can serve as a model and inspire similar entrepreneurial efforts,” Gray says.

  • Advancing public understanding of sea-level rise

    Museum exhibits can be a unique way to communicate science concepts and information. Recently, MIT faculty have served as sounding boards for curators at the Museum of Science, Boston, a close neighbor of the MIT campus.

    In January, Professor Emerita Paola Malanotte-Rizzoli and Cecil and Ida Green Professor Raffaele Ferrari of the Department of Earth, Atmospheric and Planetary Sciences (EAPS) visited the museum to view the newly opened pilot exhibit, “Resilient Venice: Adapting to Climate Change.”

    When Malanotte-Rizzoli was asked to contribute her expertise on the efforts in Venice, Italy, to mitigate flood damage, she was more than willing to offer her knowledge. “I love Venice. It is fun to tell people all of the challenges which you see the lagoon has … how much must be done to preserve, not only the city, but the environment, the islands and buildings,” she says.

    The installation is the second Museum of Science exhibit to be developed in recent years in consultation with EAPS scientists. In December 2020, “Arctic Adventure: Exploring with Technology” opened with the help of Cecil and Ida Green Career Development Professor Brent Minchew, who lent his expertise in geophysics and glaciology to the project. But for Malanotte-Rizzoli, the new exhibit hits a little closer to home.

    “My house is there,” Malanotte-Rizzoli excitedly pointed out on the exhibit’s aerial view of Venice, which includes a view above St. Mark’s Square and some of the surrounding city.

    “Resilient Venice” focuses on Malanotte-Rizzoli’s hometown, a city known for flooding. Built on a group of islands in the Venetian Lagoon, Venice has always experienced flooding, but climate change has brought unprecedented tide levels, causing billions of dollars in damage and even two deaths in the flood of 2019.

    The dark exhibit hall is lined with immersive images created by Iconem, a startup whose mission is digital preservation of endangered World Heritage Sites. The firm took detailed 3D scans and images of Venice to put together the displays and video.

    The video in which Malanotte-Rizzoli pointed out her home shows the potential sea level rise by 2100 if action isn’t taken. It shows the entrance to St. Mark’s Basilica completely submerged in water; she compares it to the disaster movie “The Day After Tomorrow.”

    The MOSE system

    Between critiques of the choice of music (“that’s not very Venice-inspired,” joked Ferrari, who is also Italian) and bits of conversation exchanged in Italian, the two scientists do what scientists do: discuss technicalities.

    Ferrari pointed to a model of a gate system and asked Malanotte-Rizzoli if the hydraulic jump seen in the model is present in the MOSE system; she confirmed it is not.

    This is the part of the exhibit that Malanotte-Rizzoli was consulted on. One of the plans Venice has implemented to address the flooding is the MOSE system — short for Modulo Sperimentale Elettromeccanico, or the Experimental Electromechanical Module. The MOSE is a system of flood barriers designed to protect the city from extremely high tides. Construction began in 2003, and its first successful operation happened on Oct. 3, 2020, when it prevented a tide 53 inches above normal from flooding the city.

    The barriers are made of a series of gates, each 66 to 98 feet long and 66 feet wide, which sit in chambers built into the sea floor when not in use to allow boats and wildlife to travel between the ocean and lagoon. The gates are filled with water to keep them submerged; when activated, air is pumped into them, pushing out the water and allowing them to rise. The entire process takes 30 minutes to complete, and half that time to return to the sea floor.
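
    The lift itself is simple Archimedes’ principle: every cubic meter of water purged from a gate and replaced with air adds roughly a tonne of net buoyant force. A back-of-the-envelope illustration (the purged volume here is hypothetical; actual gate volumes are not given):

    ```python
    # Buoyant lift gained when air displaces water inside a gate.
    rho_seawater = 1025.0   # kg/m^3
    g = 9.81                # m/s^2

    def lift_per_m3():
        """Net buoyant force gained per cubic meter of water displaced by air (N)."""
        return rho_seawater * g   # ~10 kN, about one tonne-force

    air_volume = 100.0  # m^3 of water purged (hypothetical)
    print(f"{lift_per_m3() * air_volume / 1000:.0f} kN of added lift")
    ```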

    The top of the gates in the MOSE come out of the water completely and are individually controlled so that sections can remain open to allow ships to pass through. In the model, the gate remains partially submerged, and as the high-velocity water passes over it into an area of low velocity, it creates a small rise of water before it falls over the edge of the barrier, creating a hydraulic jump.

    But Malanotte-Rizzoli joked that only scientists will care about that; otherwise, the model does a good job demonstrating how the MOSE gates rise and fall.

    The MOSE system is only one of many measures taken to mitigate the rising water levels in Venice and to protect the lagoon and the surrounding area, and this is an important point for Malanotte-Rizzoli, who worked on the project from 1995 to 2013.

    “It is not the MOSE or,” she emphasized. “It is the MOSE and.” Other complementary plans have been implemented to reduce harm to both economic sectors, such as shipping and tourism, as well as the wildlife that live in the lagoons.

    Beyond barriers

    There’s more to protecting Venice than navigating flooded streets — it’s not just “putting on rainboots,” as Malanotte-Rizzoli put it.

    “It’s destroying the walls,” she said, pointing out the corrosive effects of water on a model building, which emphasizes the damage to architecture caused by the unusually high flood levels. “People don’t think about this.” The exhibit also emphasizes the economic toll on businesses by having visitors raise and take down a flood barrier for a gelato shop as the water levels rise and fall.

    Malanotte-Rizzoli gave the exhibit her seal of approval, but the Venice section is only a small portion of what the finished exhibit will look like. The current plan involves expanding it to include a few other World Heritage Sites.

    “How do we make people care about a site that they haven’t been to?” asked Julia Tate, the project manager of touring exhibits and exhibit production at the museum. She said that it’s easy to start with a city like Venice, since it’s a popular tourist destination. But it becomes trickier to get people to care about a site they may never have visited, such as Easter Island, which is just as much at risk. The plan is to incorporate a few more sites before turning it into a traveling exhibit that will end by asking visitors to think about climate change in their own towns.

    “We want them to think about solutions and how to do better,” said Tate. Hope is the alternative message: It’s not too late to act.

    Malanotte-Rizzoli thinks it’s important for Bostonians to see their own city in Venice, as Boston is also at risk from sea level rise. Boston’s rich history reminds Malanotte-Rizzoli of her hometown and is one of the reasons she was willing to emigrate; it also makes the need for preservation all the more important.

    “Those things that cannot be replaced, they must be respected in the process of preservation,” she said. “Modern things and engineering can be done even in a city which is so fragile, so delicate.”

  • Predator interactions chiefly determine where Prochlorococcus thrive

    Prochlorococcus are the smallest and most abundant photosynthesizing organisms on the planet. A single Prochlorococcus cell is dwarfed by a human red blood cell, yet globally the microbes number in the octillions and are responsible for a large fraction of the world’s oxygen production as they turn sunlight into energy.

    Prochlorococcus can be found in the ocean’s warm surface waters, and their population drops off dramatically in regions closer to the poles. Scientists have assumed that, as with many marine species, Prochlorococcus’ range is set by temperature: The colder the waters, the less likely the microbes are to live there.

    But MIT scientists have found that where the microbe lives is not determined primarily by temperature. While Prochlorococcus populations do drop off in colder waters, it’s a relationship with a shared predator, and not temperature, that sets the microbe’s range. These findings, published today in the Proceedings of the National Academy of Sciences, could help scientists predict how the microbes’ populations will shift with climate change.

    “People assume that if the ocean warms up, Prochlorococcus will move poleward. And that may be true, but not for the reason they’re predicting,” says study co-author Stephanie Dutkiewicz, senior research scientist in MIT’s Department of Earth, Atmospheric and Planetary Sciences (EAPS). “So, temperature is a bit of a red herring.”

    Dutkiewicz’s co-authors on the study are lead author and EAPS Research Scientist Christopher Follett, EAPS Professor Mick Follows, François Ribalet and Virginia Armbrust of the University of Washington, and Emily Zakem and David Caron of the University of Southern California.

    Temperature’s collapse

    While temperature is thought to set the range of Prochlorococcus and other phytoplankton in the ocean, Follett, Dutkiewicz, and their colleagues noticed a curious dissonance in the data.

    The team examined observations from several research cruises that sailed through the northeast Pacific Ocean in 2003, 2016, and 2017. Each vessel traversed different latitudes, sampling waters continuously and measuring concentrations of various species of bacteria and phytoplankton, including Prochlorococcus. 

    The MIT team used the publicly archived cruise data to map out the locations where Prochlorococcus noticeably decreased or collapsed, along with each location’s ocean temperature. Surprisingly, they found that Prochlorococcus’ collapse occurred in regions of widely varying temperatures, ranging from around 13 to 18 degrees Celsius. Curiously, the upper end of this range has been shown in lab experiments to be suitable conditions for Prochlorococcus to grow and thrive.

    “Temperature itself was not able to explain where we saw these drop-offs,” Follett says.

    Follett was also working out an alternate idea related to Prochlorococcus and nutrient supply. As a byproduct of its photosynthesis, the microbe produces carbohydrate — an essential nutrient for heterotrophic bacteria, which are single-celled organisms that do not photosynthesize but live off the organic matter produced by phytoplankton.

    “Somewhere along the way, I wondered, what would happen if this food source Prochlorococcus was producing increased? What if we took that knob and spun it?” Follett says.

    In other words, how would the balance of Prochlorococcus and bacteria shift if the bacteria’s food increased as a result of, say, an increase in other carbohydrate-producing phytoplankton? The team also wondered: If the bacteria in question were about the same size as Prochlorococcus, the two would likely share a common grazer, or predator. How would the grazer’s population also shift with a change in carbohydrate supply?

    “Then we went to the whiteboard and started writing down equations and solving them for various cases, and realized that as soon as you reach an environment where other species add carbohydrates to the mix, bacteria and grazers grow up and annihilate Prochlorococcus,” Dutkiewicz says.
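
    The logic of those whiteboard equations can be reproduced with a toy steady-state calculation, constructed here for illustration rather than taken from the paper: the grazer density that bacteria can sustain rises with the external carbohydrate supply, and once it exceeds the level Prochlorococcus can tolerate, Prochlorococcus is excluded.

    ```python
    # Toy model (all parameters hypothetical): at steady state, grazing
    # mortality balances growth. If extra carbohydrate lets bacteria support
    # a grazer density above what Prochlorococcus can survive, it is excluded.
    def steady_state_grazer_from_bacteria(s, P=0.4):
        F = 0.5 * P + s                 # bacteria's food: P's exudate + external s
        growth = 1.2 * F / (F + 1.0)    # saturating bacterial growth rate
        return growth / 1.5             # grazer density balancing that growth

    Z_crit = 1.0 / 1.5                  # max grazer density P can tolerate

    for s in (0.1, 10.0):
        Z_star = steady_state_grazer_from_bacteria(s)
        verdict = "persists" if Z_star < Z_crit else "excluded"
        print(f"s={s}: Z*={Z_star:.2f}, Prochlorococcus {verdict}")
    ```

    This is the classic signature of apparent competition: two prey that never compete directly for resources can still exclude one another through a predator they share.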

    Nutrient shift

    To test this idea, the researchers employed simulations of ocean circulation and marine ecosystem interactions. The team ran the MITgcm, a general circulation model that simulates, in this case, the ocean currents and regions of upwelling waters around the world. They overlaid a biogeochemistry model that simulates how nutrients are redistributed in the ocean. To all of this, they linked a complex ecosystem model that simulates the interactions between many different species of bacteria and phytoplankton, including Prochlorococcus.

    When they ran the simulations without incorporating a representation of bacteria, they found that Prochlorococcus persisted all the way to the poles, contrary to theory and observations. When they added in the equations outlining the relationship between the microbe, bacteria, and a shared predator, Prochlorococcus’ range shifted away from the poles, matching the observations of the original research cruises.

    In particular, the team observed that Prochlorococcus thrived in waters with very low nutrient levels, and where it is the dominant source of food for bacteria. These waters also happen to be warm, and Prochlorococcus and bacteria live in balance, along with their shared predator. But in more nutrient-rich environments, such as polar regions, where cold water and nutrients are upwelled from the deep ocean, many more species of phytoplankton can thrive. Bacteria can then feast and grow on more food sources, and in turn feed and grow more of their shared predator. Prochlorococcus, unable to keep up, is quickly decimated.

    The results show that a relationship with a shared predator, and not temperature, sets Prochlorococcus’ range. Incorporating this mechanism into models will be crucial in predicting how the microbe — and possibly other marine species — will shift with climate change.

    “Prochlorococcus is a big harbinger of changes in the global ocean,” Dutkiewicz says. “If its range expands, that’s a canary — a sign that things have changed in the ocean by a great deal.”

    “There are reasons to believe its range will expand with a warming world,” Follett adds. “But we have to understand the physical mechanisms that set these ranges. And predictions just based on temperature will not be correct.”

  • Scientists build new atlas of ocean’s oxygen-starved waters

    Life is teeming nearly everywhere in the oceans, except in certain pockets where oxygen naturally plummets and waters become unlivable for most aerobic organisms. These desolate pools are “oxygen-deficient zones,” or ODZs. And though they make up less than 1 percent of the ocean’s total volume, they are a significant source of nitrous oxide, a potent greenhouse gas. Their boundaries can also limit the extent of fisheries and marine ecosystems.

    Now MIT scientists have generated the most detailed, three-dimensional “atlas” of the largest ODZs in the world. The new atlas provides high-resolution maps of the two major, oxygen-starved bodies of water in the tropical Pacific. These maps reveal the volume, extent, and varying depths of each ODZ, along with fine-scale features, such as ribbons of oxygenated water that intrude into otherwise depleted zones.

    The team used a new method to process over 40 years’ worth of ocean data, comprising nearly 15 million measurements taken by many research cruises and autonomous robots deployed across the tropical Pacific. The researchers compiled and then analyzed this vast and fine-grained data to generate maps of oxygen-deficient zones at various depths, similar to the many slices of a three-dimensional scan.

    From these maps, the researchers estimated the total volume of the two major ODZs in the tropical Pacific, more precisely than previous efforts. The first zone, which stretches out from the coast of South America, measures about 600,000 cubic kilometers — roughly the volume of water that would fill 240 billion Olympic-sized pools. The second zone, off the coast of Central America, is roughly three times larger.
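
    The Olympic-pool comparison is easy to verify, taking a standard Olympic pool as 2,500 cubic meters:

    ```python
    # Convert the ODZ volume to Olympic-sized pools (2,500 m^3 each).
    odz_km3 = 600_000
    pool_m3 = 2_500
    pools = odz_km3 * 1e9 / pool_m3   # 1 km^3 = 1e9 m^3
    print(f"{pools:.2e}")             # 2.40e+11, i.e. 240 billion pools
    ```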

    The atlas serves as a reference for where ODZs lie today. The team hopes scientists can add to this atlas with continued measurements, to better track changes in these zones and predict how they may shift as the climate warms.

    “It’s broadly expected that the oceans will lose oxygen as the climate gets warmer. But the situation is more complicated in the tropics where there are large oxygen-deficient zones,” says Jarek Kwiecinski ’21, who developed the atlas along with Andrew Babbin, the Cecil and Ida Green Career Development Professor in MIT’s Department of Earth, Atmospheric and Planetary Sciences. “It’s important to create a detailed map of these zones so we have a point of comparison for future change.”

    The team’s study appears today in the journal Global Biogeochemical Cycles.

    Airing out artifacts

    Oxygen-deficient zones are large, persistent regions of the ocean that occur naturally, as a consequence of marine microbes gobbling up sinking phytoplankton along with all the available oxygen in the surroundings. These zones happen to lie in regions that miss passing ocean currents, which would normally replenish regions with oxygenated water. As a result, ODZs are locations of relatively permanent, oxygen-depleted waters, and can exist at mid-ocean depths of roughly 35 to 1,000 meters below the surface. For some perspective, the oceans on average run about 4,000 meters deep.

    Over the last 40 years, research cruises have explored these regions by dropping bottles down to various depths and hauling up seawater that scientists then measure for oxygen.

    “But there are a lot of artifacts that come from a bottle measurement when you’re trying to measure truly zero oxygen,” Babbin says. “All the plastic that we deploy at depth is full of oxygen that can leach out into the sample. When all is said and done, that artificial oxygen inflates the ocean’s true value.”

    Rather than rely on measurements from bottle samples, the team looked at data from sensors attached to the outside of the bottles or integrated with robotic platforms that can change their buoyancy to measure water at different depths. These sensors measure a variety of signals, including changes in electrical currents or the intensity of light emitted by a photosensitive dye to estimate the amount of oxygen dissolved in water. In contrast to seawater samples that represent a single discrete depth, the sensors record signals continuously as they descend through the water column.

    Scientists have attempted to use these sensor data to estimate the true value of oxygen concentrations in ODZs, but have found it incredibly tricky to convert these signals accurately, particularly at concentrations approaching zero.

    “We took a very different approach, using measurements not to look at their true value, but rather how that value changes within the water column,” Kwiecinski says. “That way we can identify anoxic waters, regardless of what a specific sensor says.”

    Bottoming out

    The team reasoned that, if sensors showed a constant, unchanging value of oxygen in a continuous, vertical section of the ocean, regardless of the true value, then it would likely be a sign that oxygen had bottomed out, and that the section was part of an oxygen-deficient zone.

    The researchers brought together nearly 15 million sensor measurements collected over 40 years by various research cruises and robotic floats, and mapped the regions where oxygen did not change with depth.
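
    In schematic form, the mapping criterion amounts to scanning each vertical profile for contiguous stretches where the reading stops changing with depth. The sketch below is illustrative; the flatness tolerance and minimum span are assumptions, not values from the study.

    ```python
    # Flag depth spans where an oxygen sensor's reading stops changing,
    # regardless of the sensor's absolute calibration.
    import numpy as np

    def find_anoxic_span(depth, oxygen_signal, tol=1e-3, min_samples=20):
        """Return (top, bottom) depth of the longest flat stretch, or None."""
        flat = np.abs(np.diff(oxygen_signal)) < tol
        best, start, n = None, None, 0
        for i, f in enumerate(flat):
            if f:
                start = i if start is None else start
                n += 1
                if n >= min_samples and (best is None or n > best[2]):
                    best = (depth[start], depth[i + 1], n)
            else:
                start, n = None, 0
        return None if best is None else (best[0], best[1])

    # Demo: a synthetic profile that bottoms out between 150 and 600 m
    depth = np.linspace(0, 1000, 500)
    signal = (np.clip((150 - depth) * 0.01, 0, None)
              + np.clip((depth - 600) * 0.005, 0, None))
    print(find_anoxic_span(depth, signal))   # approximately (150, 600)
    ```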

    “We can now see how the distribution of anoxic water in the Pacific changes in three dimensions,” Babbin says. 

    The team mapped the boundaries, volume, and shape of two major ODZs in the tropical Pacific, one in the Northern Hemisphere, and the other in the Southern Hemisphere. They were also able to see fine details within each zone. For instance, oxygen-depleted waters are “thicker,” or more concentrated towards the middle, and appear to thin out toward the edges of each zone.

    “We could also see gaps, where it looks like big bites were taken out of anoxic waters at shallow depths,” Babbin says. “There’s some mechanism bringing oxygen into this region, making it oxygenated compared to the water around it.”

    Such observations of the tropical Pacific’s oxygen-deficient zones are more detailed than what’s been measured to date.

    “How the borders of these ODZs are shaped, and how far they extend, could not be previously resolved,” Babbin says. “Now we have a better idea of how these two zones compare in terms of areal extent and depth.”

    “This gives you a sketch of what could be happening,” Kwiecinski says. “There’s a lot more one can do with this data compilation to understand how the ocean’s oxygen supply is controlled.”

    This research is supported, in part, by the Simons Foundation.

  • Climate modeling confirms historical records showing rise in hurricane activity

    When forecasting how storms may change in the future, it helps to know something about their past. Judging from historical records dating back to the 1850s, hurricanes in the North Atlantic have become more frequent over the last 150 years.

    However, scientists have questioned whether this upward trend is a reflection of reality, or simply an artifact of lopsided record-keeping. If 19th-century storm trackers had access to 21st-century technology, would they have recorded more storms? This inherent uncertainty has kept scientists from relying on storm records, and the patterns within them, for clues to how climate influences storms.

    A new MIT study published today in Nature Communications has used climate modeling, rather than storm records, to reconstruct the history of hurricanes and tropical cyclones around the world. The study finds that North Atlantic hurricanes have indeed increased in frequency over the last 150 years, similar to what historical records have shown.

    In particular, major hurricanes, and hurricanes in general, are more frequent today than in the past. And those that make landfall appear to have grown more powerful, carrying more destructive potential.

    Curiously, while the North Atlantic has seen an overall increase in storm activity, the same trend was not observed in the rest of the world. The study found that the frequency of tropical cyclones globally has not changed significantly in the last 150 years.

    “The evidence does point, as the original historical record did, to long-term increases in North Atlantic hurricane activity, but no significant changes in global hurricane activity,” says study author Kerry Emanuel, the Cecil and Ida Green Professor of Atmospheric Science in MIT’s Department of Earth, Atmospheric, and Planetary Sciences. “It certainly will change the interpretation of climate’s effects on hurricanes — that it’s really the regionality of the climate, and that something happened to the North Atlantic that’s different from the rest of the globe. It may have been caused by global warming, which is not necessarily globally uniform.”

    Chance encounters

    The most comprehensive record of tropical cyclones is compiled in a database known as the International Best Track Archive for Climate Stewardship (IBTrACS). This historical record includes modern measurements from satellites and aircraft that date back to the 1940s. The database’s older records are based on reports from ships and islands that happened to be in a storm’s path. These earlier records date back to 1851, and overall the database shows an increase in North Atlantic storm activity over the last 150 years.

    “Nobody disagrees that that’s what the historical record shows,” Emanuel says. “On the other hand, most sensible people don’t really trust the historical record that far back in time.”

    Recently, scientists have used a statistical approach to identify storms that the historical record may have missed. To do so, they consulted all the digitally reconstructed shipping routes in the Atlantic over the last 150 years and mapped these routes over modern-day hurricane tracks. They then estimated the chance that a ship would encounter or entirely miss a hurricane’s presence. This analysis found a significant number of early storms were likely missed in the historical record. Accounting for these missed storms, they concluded that there was a chance that storm activity had not changed over the last 150 years.
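
    The essence of that correction is a detection-probability adjustment: if only a fraction of storms in a given era would have crossed a digitized shipping route, the recorded count understates true activity by roughly that factor. With purely illustrative numbers:

    ```python
    # Bias correction sketch: recorded storms divided by the estimated chance
    # that a storm would have been observed at all. Numbers are made up.
    recorded_storms = 6          # storms per year in an early-era record
    p_detect = 0.65              # chance a storm crossed a digitized ship route
    true_estimate = recorded_storms / p_detect
    print(f"bias-corrected estimate: {true_estimate:.1f} storms/year")
    ```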

    But Emanuel points out that hurricane paths in the 19th century may have looked different from today’s tracks. What’s more, the scientists may have missed key shipping routes in their analysis, as older routes have not yet been digitized.

    “All we know is, if there had been a change (in storm activity), it would not have been detectable, using digitized ship records,” Emanuel says. “So I thought, there’s an opportunity to do better, by not using historical data at all.”

    Seeding storms

    Instead, he estimated past hurricane activity using dynamical downscaling — a technique that his group developed and has applied over the last 15 years to study climate’s effect on hurricanes. The technique starts with a coarse global climate simulation and embeds within this model a finer-resolution model that simulates features as small as hurricanes. The combined models are then fed with real-world measurements of atmospheric and ocean conditions. Emanuel then scatters the realistic simulation with hurricane “seeds” and runs the simulation forward in time to see which seeds bloom into full-blown storms.

    For the new study, Emanuel embedded a hurricane model into a climate “reanalysis” — a type of climate model that combines observations from the past with climate simulations to generate accurate reconstructions of past weather patterns and climate conditions. He used a particular subset of climate reanalyses that only accounts for observations collected from the surface — for instance from ships, which have recorded weather conditions and sea surface temperatures consistently since the 1850s, as opposed to from satellites, which only began systematic monitoring in the 1970s.

    “We chose to use this approach to avoid any artificial trends brought about by the introduction of progressively different observations,” Emanuel explains.

    He ran an embedded hurricane model on three different climate reanalyses, simulating tropical cyclones around the world over the past 150 years. Across all three models, he observed “unequivocal increases” in North Atlantic hurricane activity.

    “There’s been this quite large increase in activity in the Atlantic since the mid-19th century, which I didn’t expect to see,” Emanuel says.

    Within this overall rise in storm activity, he also observed a “hurricane drought” — a period during the 1970s and 80s when the number of yearly hurricanes momentarily dropped. This pause in storm activity can also be seen in historical records, and Emanuel’s group proposes a cause: sulfate aerosols, which were byproducts of fossil fuel combustion, likely set off a cascade of climate effects that cooled the North Atlantic and temporarily suppressed hurricane formation.

    “The general trend over the last 150 years was increasing storm activity, interrupted by this hurricane drought,” Emanuel notes. “And at this point, we’re more confident of why there was a hurricane drought than why there is an ongoing, long-term increase in activity that began in the 19th century. That is still a mystery, and it bears on the question of how global warming might affect future Atlantic hurricanes.”

    This research was supported, in part, by the National Science Foundation.