More stories

    Engineers use artificial intelligence to capture the complexity of breaking waves

    Waves break once they swell to a critical height, before cresting and crashing into a spray of droplets and bubbles. These waves can be as large as a surfer’s point break and as small as a gentle ripple rolling to shore. For decades, the dynamics of how and when a wave breaks have been too complex to predict.

    Now, MIT engineers have found a new way to model how waves break. The team used machine learning along with data from wave-tank experiments to tweak equations that have traditionally been used to predict wave behavior. Engineers typically rely on such equations to help them design resilient offshore platforms and structures. But until now, the equations have not been able to capture the complexity of breaking waves.

    The updated model made more accurate predictions of how and when waves break, the researchers found. For instance, the model estimated a wave’s steepness just before breaking, and its energy and frequency after breaking, more accurately than the conventional wave equations.

    Their results, published today in the journal Nature Communications, will help scientists understand how a breaking wave affects the water around it. Knowing precisely how these waves interact can help hone the design of offshore structures. It can also improve predictions for how the ocean interacts with the atmosphere. Having better estimates of how waves break can help scientists predict, for instance, how much carbon dioxide and other atmospheric gases the ocean can absorb.

    “Wave breaking is what puts air into the ocean,” says study author Themis Sapsis, an associate professor of mechanical and ocean engineering and an affiliate of the Institute for Data, Systems, and Society at MIT. “It may sound like a detail, but if you multiply its effect over the area of the entire ocean, wave breaking starts becoming fundamentally important to climate prediction.”

    The study’s co-authors include lead author and MIT postdoc Debbie Eeltink, Hubert Branger and Christopher Luneau of Aix-Marseille University, Amin Chabchoub of Kyoto University, Jerome Kasparian of the University of Geneva, and T.S. van den Bremer of Delft University of Technology.

    Learning tank

To predict the dynamics of a breaking wave, scientists typically take one of two approaches: They either attempt to precisely simulate the wave at the scale of individual molecules of water and air, or they run experiments to try to characterize waves with actual measurements. The first approach is computationally expensive and difficult even over a small area; the second requires a huge amount of time to run enough experiments to yield statistically significant results.

The MIT team instead borrowed pieces from both approaches to develop a more efficient and accurate model using machine learning. The researchers started with a set of equations considered the standard description of wave behavior, then aimed to improve it by “training” the model on data of breaking waves from actual experiments.

    “We had a simple model that doesn’t capture wave breaking, and then we had the truth, meaning experiments that involve wave breaking,” Eeltink explains. “Then we wanted to use machine learning to learn the difference between the two.”
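The setup Eeltink describes, keeping the cheap model and learning only its error, is often called discrepancy (or residual) learning, and can be sketched in a few lines. Everything below is illustrative: the data is synthetic, and the tiny hand-rolled network is a stand-in for whatever architecture and wave variables the team actually used.

```python
import numpy as np

# Illustrative discrepancy learning on synthetic data: a crude "simple
# model" predicts a wave property, and a small neural network is trained
# on the gap between that prediction and the "experimental" truth.
rng = np.random.default_rng(0)

x = rng.uniform(0.0, 1.0, size=(200, 1))      # e.g. an initial wave parameter
simple_model = 0.8 * x                        # baseline equations (stand-in)
truth = 0.8 * x + 0.3 * np.sin(3.0 * x)       # "experiments" with extra physics
residual = truth - simple_model               # what the network must learn

# One-hidden-layer network trained by plain gradient descent
W1 = rng.normal(0.0, 1.0, (1, 16))
b1 = np.zeros(16)
W2 = rng.normal(0.0, 1.0, (16, 1))
b2 = np.zeros(1)

lr = 0.05
for _ in range(3000):
    h = np.tanh(x @ W1 + b1)
    err = (h @ W2 + b2) - residual
    grad_W2 = h.T @ err / len(x)
    grad_b2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1.0 - h ** 2)
    grad_W1 = x.T @ dh / len(x)
    grad_b1 = dh.mean(axis=0)
    W2 -= lr * grad_W2
    b2 -= lr * grad_b2
    W1 -= lr * grad_W1
    b1 -= lr * grad_b1

corrected = simple_model + np.tanh(x @ W1 + b1) @ W2 + b2
base_err = float(np.mean((simple_model - truth) ** 2))
corr_err = float(np.mean((corrected - truth) ** 2))
print(base_err, corr_err)  # the corrected model should fit far better
```

The same recipe carries over to the real setting: the physics-based solver stays fixed, and only the correction term is fit to experimental measurements.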

    The researchers obtained wave breaking data by running experiments in a 40-meter-long tank. The tank was fitted at one end with a paddle which the team used to initiate each wave. The team set the paddle to produce a breaking wave in the middle of the tank. Gauges along the length of the tank measured the water’s height as waves propagated down the tank.

    “It takes a lot of time to run these experiments,” Eeltink says. “Between each experiment you have to wait for the water to completely calm down before you launch the next experiment, otherwise they influence each other.”

    Safe harbor

    In all, the team ran about 250 experiments, the data from which they used to train a type of machine-learning algorithm known as a neural network. Specifically, the algorithm is trained to compare the real waves in experiments with the predicted waves in the simple model, and based on any differences between the two, the algorithm tunes the model to fit reality.

    After training the algorithm on their experimental data, the team introduced the model to entirely new data — in this case, measurements from two independent experiments, each run at separate wave tanks with different dimensions. In these tests, they found the updated model made more accurate predictions than the simple, untrained model, for instance making better estimates of a breaking wave’s steepness.

    The new model also captured an essential property of breaking waves known as the “downshift,” in which the frequency of a wave is shifted to a lower value. The speed of a wave depends on its frequency. For ocean waves, lower frequencies move faster than higher frequencies. Therefore, after the downshift, the wave will move faster. The new model predicts the change in frequency, before and after each breaking wave, which could be especially relevant in preparing for coastal storms.
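The frequency-to-speed link invoked here is the standard deep-water dispersion relation, under which phase speed is c = g / (2πf), so a downshifted, lower-frequency wave travels faster. A quick illustration with made-up before and after frequencies (the study's actual values are not quoted in the article):

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def phase_speed(f_hz):
    """Deep-water phase speed, c = g / (2 * pi * f)."""
    return G / (2.0 * math.pi * f_hz)

# Hypothetical pre- and post-breaking frequencies (Hz), illustration only
f_before, f_after = 0.50, 0.45  # the downshift lowers the frequency
print(round(phase_speed(f_before), 2))  # 3.12 m/s
print(round(phase_speed(f_after), 2))   # 3.47 m/s: lower frequency, faster wave
```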

    “When you want to forecast when high waves of a swell would reach a harbor, and you want to leave the harbor before those waves arrive, then if you get the wave frequency wrong, then the speed at which the waves are approaching is wrong,” Eeltink says.

The team’s updated wave model is available as open-source code that others could use, for instance in climate simulations of the ocean’s potential to absorb carbon dioxide and other atmospheric gases. The code can also be worked into simulated tests of offshore platforms and coastal structures.

    “The number one purpose of this model is to predict what a wave will do,” Sapsis says. “If you don’t model wave breaking right, it would have tremendous implications for how structures behave. With this, you could simulate waves to help design structures better, more efficiently, and without huge safety factors.”

This research is supported, in part, by the Swiss National Science Foundation, and by the U.S. Office of Naval Research.

    Ocean vital signs

    Without the ocean, the climate crisis would be even worse than it is. Each year, the ocean absorbs billions of tons of carbon from the atmosphere, preventing warming that greenhouse gas would otherwise cause. Scientists estimate about 25 to 30 percent of all carbon released into the atmosphere by both human and natural sources is absorbed by the ocean.

    “But there’s a lot of uncertainty in that number,” says Ryan Woosley, a marine chemist and a principal research scientist in the Department of Earth, Atmospheric and Planetary Sciences (EAPS) at MIT. Different parts of the ocean take in different amounts of carbon depending on many factors, such as the season and the amount of mixing from storms. Current models of the carbon cycle don’t adequately capture this variation.

    To close the gap, Woosley and a team of other MIT scientists developed a research proposal for the MIT Climate Grand Challenges competition — an Institute-wide campaign to catalyze and fund innovative research addressing the climate crisis. The team’s proposal, “Ocean Vital Signs,” involves sending a fleet of sailing drones to cruise the oceans taking detailed measurements of how much carbon the ocean is really absorbing. Those data would be used to improve the precision of global carbon cycle models and improve researchers’ ability to verify emissions reductions claimed by countries.

“If we start to enact mitigation strategies — either through removing CO2 from the atmosphere or reducing emissions — we need to know where CO2 is going in order to know how effective they are,” says Woosley. Without more precise models, there’s no way to confirm whether observed carbon reductions were thanks to policy and people, or thanks to the ocean.

    “So that’s the trillion-dollar question,” says Woosley. “If countries are spending all this money to reduce emissions, is it enough to matter?”

    In February, the team’s Climate Grand Challenges proposal was named one of 27 finalists out of the almost 100 entries submitted. From among this list of finalists, MIT will announce in April the selection of five flagship projects to receive further funding and support.

    Woosley is leading the team along with Christopher Hill, a principal research engineer in EAPS. The team includes physical and chemical oceanographers, marine microbiologists, biogeochemists, and experts in computational modeling from across the department, in addition to collaborators from the Media Lab and the departments of Mathematics, Aeronautics and Astronautics, and Electrical Engineering and Computer Science.

    Today, data on the flux of carbon dioxide between the air and the oceans are collected in a piecemeal way. Research ships intermittently cruise out to gather data. Some commercial ships are also fitted with sensors. But these present a limited view of the entire ocean, and include biases. For instance, commercial ships usually avoid storms, which can increase the turnover of water exposed to the atmosphere and cause a substantial increase in the amount of carbon absorbed by the ocean.

    “It’s very difficult for us to get to it and measure that,” says Woosley. “But these drones can.”

    If funded, the team’s project would begin by deploying a few drones in a small area to test the technology. The wind-powered drones — made by a California-based company called Saildrone — would autonomously navigate through an area, collecting data on air-sea carbon dioxide flux continuously with solar-powered sensors. This would then scale up to more than 5,000 drone-days’ worth of observations, spread over five years, and in all five ocean basins.

    Those data would be used to feed neural networks to create more precise maps of how much carbon is absorbed by the oceans, shrinking the uncertainties involved in the models. These models would continue to be verified and improved by new data. “The better the models are, the more we can rely on them,” says Woosley. “But we will always need measurements to verify the models.”

    Improved carbon cycle models are relevant beyond climate warming as well. “CO2 is involved in so much of how the world works,” says Woosley. “We’re made of carbon, and all the other organisms and ecosystems are as well. What does the perturbation to the carbon cycle do to these ecosystems?”

    One of the best understood impacts is ocean acidification. Carbon absorbed by the ocean reacts to form an acid. A more acidic ocean can have dire impacts on marine organisms like coral and oysters, whose calcium carbonate shells and skeletons can dissolve in the lower pH. Since the Industrial Revolution, the ocean has become about 30 percent more acidic on average.

“So while it’s great for us that the oceans have been taking up the CO2, it’s not great for the oceans,” says Woosley. “Knowing how this uptake affects the health of the ocean is important as well.”

    MIT entrepreneurs think globally, act locally

    Born and raised amid the natural beauty of the Dominican Republic, Andrés Bisonó León feels a deep motivation to help solve a problem that has been threatening the Caribbean island nation’s tourism industry, its economy, and its people.

As Bisonó León discussed with his longtime friend and mentor, Alexander Slocum Sr., the Walter M. May and A. Hazel May Professor of Mechanical Engineering (MechE), ugly mats of toxic sargassum seaweed have been encroaching on pristine beaches in the Dominican Republic and across the Caribbean, and public and private organizations have fought a losing battle to clean them up with expensive, environmentally damaging methods. Slocum, who was on the U.S. Department of Energy’s Deepwater Horizon team, has extensive experience with systems that operate in the ocean.

    “In the last 10 years,” says Bisonó León, now an MBA candidate in the MIT Sloan School of Management, “sargassum, a toxic seaweed invasion, has cost the Caribbean as much as $120 million a year in cleanup and has meant a 30 to 35 percent tourism reduction, affecting not only the tourism industry, but also the environment, marine life, local economies, and human health.”

One of Bisonó León’s discussions with Slocum took place within earshot of MechE alumnus Luke Gray ’18, SM ’20, who had worked with Slocum on other projects and was at the time about to begin his master’s program.

    “Professor Slocum and Andrés happened to be discussing the sargassum problem in Andrés’ home country,” Gray says. “A week later I was on a plane to the DR to collect sargassum samples and survey the problem in Punta Cana. When I returned, my master’s program was underway, and I already had my thesis project!”

    Gray also had started a working partnership with Bisonó León, which both say proceeded seamlessly right from the first moment.

    “I feel that Luke right away understood the magnitude of the problem and the value we could create in the Dominican Republic and across the Caribbean by teaming up,” Bisonó León says.

    Both Bisonó León and Gray also say they felt a responsibility to work toward helping the global environment.

    “All of my major projects up until now have involved machines for climate restoration and/or adaptation,” says Gray.

    The technologies Bisonó León and Gray arrived at after 18 months of R&D were designed to provide solutions both locally and globally.

    Their Littoral Collection Module (LCM) skims sargassum seaweed off the surface of the water with nets that can be mounted on any boat. The device sits across the boat, with two large hoops holding the nets open, one on each side. As the boat travels forward, it cuts through the seaweed, which flows to the sides of the vessel and through the hoops into the nets. Effective at sweeping the seaweed from the water, the device can be employed by anyone with a boat, including local fishermen whose livelihoods have been disrupted by the seaweed’s damaging effect on tourism and the local economy.

    The sargassum can then be towed out to sea, where Bisonó León’s and Gray’s second technology can come into play. By pumping the seaweed into very deep water, where it then sinks to the bottom of the ocean, the carbon in the seaweed can be sequestered. Other methods for disposing of the seaweed generally involve putting it into landfills, where it emits greenhouse gases such as methane and carbon dioxide as it breaks down. Although some seaweed can be put to other uses, including as fertilizer, sargassum has been found to contain hard-to-remove toxic substances such as arsenic and heavy metals.

    In spring 2020, Bisonó León and Gray formed a company, SOS (Sargassum Ocean Sequestration) Carbon.

Bisonó León says he comes from a long line of entrepreneurs with a strong commitment to social impact. His family has been involved in several industries; his grandfather and great-uncles opened the first cigar factory in the Dominican Republic in 1903.

    Gray says internships with startup companies and the undergraduate projects he did with Slocum developed his interest in entrepreneurship, and his involvement with the sargassum problem only reinforced that inclination. During his master’s program, he says he became “obsessed” with finding a solution.

    “Professor Slocum let me think extremely big, and so it was almost inevitable that the distillation of our two years of work would continue in some form, and starting a company happened to be the right path. My master’s experience of taking an essentially untouched problem like sargassum and then one year later designing, building, and sending 15,000 pounds of custom equipment to test for three months on a Dominican Navy ship made me realize I had discovered a recipe I could repeat — and machine design had become my core competency,” Gray says.

    During the initial research and development of their technologies, Bisonó León and Gray raised $258,000 from 20 different organizations. Between June and December 2021, they succeeded in removing 3.5 million pounds of sargassum and secured contracts with Grupo Puntacana, which operates several tourist resorts, and with other hotels such as Club Med in Punta Cana. The company subcontracts with the association of fishermen in Punta Cana, employing 15 fishermen who operate LCMs and training 35 others to join as the operation expands.

Their success so far demonstrates “‘mens et manus’ at work,” says Slocum, referring to MIT’s motto, which is Latin for “mind and hand.” “Geeks hear about a very real problem that affects very real people who have no other option for their livelihoods, and they respond by inventing a solution so elegant that it can be readily deployed by those most hurt by the problem to address the problem.

    “The team was always focused on the numbers, from physics to finance, and did not let hype or doubts deter their determination to rationally solve this huge problem.”

    Slocum says he could predict Bisonó León and Gray would work well together “because they started out as good, smart people with complementary skills whose hearts and minds were in the right place.”

    “We are working on having a global impact to reduce millions of tons of CO2 per year,” says Bisonó León. “With training from Sloan and cross-disciplinary collaborative spirit, we will be able to further expand environmental and social impact platforms much needed in the Caribbean to be able to drive real change regionally and globally.”

“I hope SOS Carbon can serve as a model and inspire similar entrepreneurial efforts,” Gray says.

    Advancing public understanding of sea-level rise

    Museum exhibits can be a unique way to communicate science concepts and information. Recently, MIT faculty have served as sounding boards for curators at the Museum of Science, Boston, a close neighbor of the MIT campus.

In January, Professor Emerita Paola Malanotte-Rizzoli and Cecil and Ida Green Professor Raffaele Ferrari of the Department of Earth, Atmospheric and Planetary Sciences (EAPS) visited the museum to view the newly opened pilot exhibit, “Resilient Venice: Adapting to Climate Change.”

    When Malanotte-Rizzoli was asked to contribute her expertise on the efforts in Venice, Italy, to mitigate flood damage, she was more than willing to offer her knowledge. “I love Venice. It is fun to tell people all of the challenges which you see the lagoon has … how much must be done to preserve, not only the city, but the environment, the islands and buildings,” she says.

    The installation is the second Museum of Science exhibit to be developed in recent years in consultation with EAPS scientists. In December 2020, “Arctic Adventure: Exploring with Technology” opened with the help of Cecil and Ida Green Career Development Professor Brent Minchew, who lent his expertise in geophysics and glaciology to the project. But for Malanotte-Rizzoli, the new exhibit hits a little closer to home.

    “My house is there,” Malanotte-Rizzoli excitedly pointed out on the exhibit’s aerial view of Venice, which includes a view above St. Mark’s Square and some of the surrounding city.

“Resilient Venice” focuses on Malanotte-Rizzoli’s hometown, a city known for flooding. Built on a group of islands in the Venetian Lagoon, Venice has always experienced flooding, but climate change has brought unprecedented tide levels, causing billions of dollars in damage and even two deaths in the flood of 2019.

    The dark exhibit hall is lined with immersive images created by Iconem, a startup whose mission is digital preservation of endangered World Heritage Sites. The firm took detailed 3D scans and images of Venice to put together the displays and video.

    The video on which Malanotte-Rizzoli pointed to her home shows the potential sea level rise by 2100 if action isn’t taken. It shows the entrance to St. Mark’s Basilica completely submerged in water; she compares it to the disaster movie “The Day After Tomorrow.”

    The MOSE system

    Between critiques of the choice of music (“that’s not very Venice-inspired,” joked Ferrari, who is also Italian) and bits of conversation exchanged in Italian, the two scientists do what scientists do: discuss technicalities.

    Ferrari pointed to a model of a gate system and asked Malanotte-Rizzoli if the hydraulic jump seen in the model is present in the MOSE system; she confirmed it is not.

    This is the part of the exhibit that Malanotte-Rizzoli was consulted on. One of the plans Venice has implemented to address the flooding is the MOSE system — short for Modulo Sperimentale Elettromeccanico, or the Experimental Electromechanical Module. The MOSE is a system of flood barriers designed to protect the city from extremely high tides. Construction began in 2003, and its first successful operation happened on Oct. 3, 2020, when it prevented a tide 53 inches above normal from flooding the city.

The barriers are made of a series of gates, each 66 to 98 feet long and 66 feet wide, which, when not in use, sit in chambers built into the sea floor, allowing boats and wildlife to pass between the ocean and the lagoon. The gates are filled with water to keep them submerged; when activated, air is pumped into them, pushing out the water and allowing them to rise. Raising the gates takes 30 minutes, and lowering them takes half that time.
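The raise-by-buoyancy mechanism comes down to a force balance: a submerged gate displaces its full volume, so swapping water ballast for pumped-in air flips the net force from downward to upward. A toy sketch of that balance, where every number is invented for illustration (the real gates' masses and volumes are not given in the article):

```python
# Toy force balance for a water-ballasted flood gate; numbers are invented.
RHO_SEAWATER = 1025.0  # kg/m^3
G = 9.81               # m/s^2

GATE_VOLUME = 2000.0   # m^3, hypothetical displaced volume
GATE_MASS = 300e3      # kg, hypothetical dry mass of the structure

def net_upward_force(air_fraction):
    """Net force (N) on a submerged gate whose internal volume is
    `air_fraction` air and the rest seawater ballast."""
    ballast_mass = (1.0 - air_fraction) * GATE_VOLUME * RHO_SEAWATER
    weight = (GATE_MASS + ballast_mass) * G
    buoyancy = RHO_SEAWATER * GATE_VOLUME * G  # displaces its full volume
    return buoyancy - weight

print(net_upward_force(0.0))  # negative: ballasted gate stays in its chamber
print(net_upward_force(1.0))  # positive: air-filled gate rises
```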

    The top of the gates in the MOSE come out of the water completely and are individually controlled so that sections can remain open to allow ships to pass through. In the model, the gate remains partially submerged, and as the high-velocity water passes over it into an area of low velocity, it creates a small rise of water before it falls over the edge of the barrier, creating a hydraulic jump.

    But Malanotte-Rizzoli joked that only scientists will care about that; otherwise, the model does a good job demonstrating how the MOSE gates rise and fall.

    The MOSE system is only one of many plans taken to mitigate the rising water levels in Venice and to protect the lagoon and the surrounding area, and this is an important point for Malanotte-Rizzoli, who worked on the project from 1995 to 2013.

    “It is not the MOSE or,” she emphasized. “It is the MOSE and.” Other complementary plans have been implemented to reduce harm to both economic sectors, such as shipping and tourism, as well as the wildlife that live in the lagoons.

    Beyond barriers

    There’s more to protecting Venice than navigating flooded streets — it’s not just “putting on rainboots,” as Malanotte-Rizzoli put it.

“It’s destroying the walls,” she said, pointing out the corrosive effects of water on a model building, which emphasizes the damage that unusually high flood levels cause to the city’s architecture. “People don’t think about this.” The exhibit also highlights the economic cost to businesses by having visitors set up and take down a flood barrier for a gelato shop as the water level rises and falls.

    Malanotte-Rizzoli gave the exhibit her seal of approval, but the Venice section is only a small portion of what the finished exhibit will look like. The current plan involves expanding it to include a few other World Heritage Sites.

“How do we make people care about a site that they haven’t been to?” asked Julia Tate, the project manager of touring exhibits and exhibit production at the museum. She said it’s easy to start with a city like Venice, since it’s a popular tourist destination, but trickier to get people to care about a place most visitors haven’t seen, such as Easter Island, which is just as much at risk. The plan is to incorporate a few more sites before turning the installation into a traveling exhibit that will end by asking visitors to think about climate change in their own towns.

    “We want them to think about solutions and how to do better,” said Tate. Hope is the alternative message: It’s not too late to act.

Malanotte-Rizzoli thinks it’s important for Bostonians to see their own city in Venice, as Boston is also at risk from sea level rise. Boston’s history reminds her of her hometown and is one of the reasons she was willing to emigrate; it also makes the city’s need for preservation all the more pressing.

“Those things that cannot be replaced, they must be respected in the process of preservation,” she said. “Modern things and engineering can be done even in a city which is so fragile, so delicate.”

    Predator interactions chiefly determine where Prochlorococcus thrive

    Prochlorococcus are the smallest and most abundant photosynthesizing organisms on the planet. A single Prochlorococcus cell is dwarfed by a human red blood cell, yet globally the microbes number in the octillions and are responsible for a large fraction of the world’s oxygen production as they turn sunlight into energy.

    Prochlorococcus can be found in the ocean’s warm surface waters, and their population drops off dramatically in regions closer to the poles. Scientists have assumed that, as with many marine species, Prochlorococcus’ range is set by temperature: The colder the waters, the less likely the microbes are to live there.

    But MIT scientists have found that where the microbe lives is not determined primarily by temperature. While Prochlorococcus populations do drop off in colder waters, it’s a relationship with a shared predator, and not temperature, that sets the microbe’s range. These findings, published today in the Proceedings of the National Academy of Sciences, could help scientists predict how the microbes’ populations will shift with climate change.

    “People assume that if the ocean warms up, Prochlorococcus will move poleward. And that may be true, but not for the reason they’re predicting,” says study co-author Stephanie Dutkiewicz, senior research scientist in MIT’s Department of Earth, Atmospheric and Planetary Sciences (EAPS). “So, temperature is a bit of a red herring.”

Dutkiewicz’s co-authors on the study are lead author and EAPS Research Scientist Christopher Follett, EAPS Professor Mick Follows, François Ribalet and Virginia Armbrust of the University of Washington, and Emily Zakem and David Caron of the University of Southern California.

    Temperature’s collapse

While temperature is thought to set the range of Prochlorococcus and other phytoplankton in the ocean, Follett, Dutkiewicz, and their colleagues noticed a curious dissonance in the data.

    The team examined observations from several research cruises that sailed through the northeast Pacific Ocean in 2003, 2016, and 2017. Each vessel traversed different latitudes, sampling waters continuously and measuring concentrations of various species of bacteria and phytoplankton, including Prochlorococcus. 

    The MIT team used the publicly archived cruise data to map out the locations where Prochlorococcus noticeably decreased or collapsed, along with each location’s ocean temperature. Surprisingly, they found that Prochlorococcus’ collapse occurred in regions of widely varying temperatures, ranging from around 13 to 18 degrees Celsius. Curiously, the upper end of this range has been shown in lab experiments to be suitable conditions for Prochlorococcus to grow and thrive.

    “Temperature itself was not able to explain where we saw these drop-offs,” Follett says.

    Follett was also working out an alternate idea related to Prochlorococcus and nutrient supply. As a byproduct of its photosynthesis, the microbe produces carbohydrate — an essential nutrient for heterotrophic bacteria, which are single-celled organisms that do not photosynthesize but live off the organic matter produced by phytoplankton.

    “Somewhere along the way, I wondered, what would happen if this food source Prochlorococcus was producing increased? What if we took that knob and spun it?” Follett says.

    In other words, how would the balance of Prochlorococcus and bacteria shift if the bacteria’s food increased as a result of, say, an increase in other carbohydrate-producing phytoplankton? The team also wondered: If the bacteria in question were about the same size as Prochlorococcus, the two would likely share a common grazer, or predator. How would the grazer’s population also shift with a change in carbohydrate supply?

    “Then we went to the whiteboard and started writing down equations and solving them for various cases, and realized that as soon as you reach an environment where other species add carbohydrates to the mix, bacteria and grazers grow up and annihilate Prochlorococcus,” Dutkiewicz says.
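A deliberately stripped-down version of that whiteboard algebra shows the mechanism, often called apparent competition. If the shared grazer settles at whatever level balances Prochlorococcus' growth rate, then the bacteria's total food supply at equilibrium is pinned to a fixed value, and carbohydrate added by other phytoplankton directly displaces the share Prochlorococcus can supply. The parameters below are arbitrary; this is an illustration of the argument, not the study's actual model:

```python
# Toy "apparent competition" bookkeeping; parameters are arbitrary.
# At coexistence, the grazer sits at the level that offsets
# Prochlorococcus growth, so the bacteria need a fixed total food flux
# F_CRIT to hold their own against that same grazing pressure:
#     c * P_eq + S = F_CRIT   =>   P_eq = max(0, (F_CRIT - S) / c)
F_CRIT = 1.0   # total carbohydrate flux bacteria need at equilibrium
C_EXUDE = 0.5  # carbohydrate produced per unit of Prochlorococcus

def prochlorococcus_equilibrium(S):
    """Equilibrium Prochlorococcus as outside carbohydrate supply S rises."""
    return max(0.0, (F_CRIT - S) / C_EXUDE)

for S in (0.0, 0.5, 1.0, 1.5):  # progressively richer waters
    print(S, prochlorococcus_equilibrium(S))
# Once S reaches F_CRIT, Prochlorococcus is excluded entirely
```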

    Nutrient shift

    To test this idea, the researchers employed simulations of ocean circulation and marine ecosystem interactions. The team ran the MITgcm, a general circulation model that simulates, in this case, the ocean currents and regions of upwelling waters around the world. They overlaid a biogeochemistry model that simulates how nutrients are redistributed in the ocean. To all of this, they linked a complex ecosystem model that simulates the interactions between many different species of bacteria and phytoplankton, including Prochlorococcus.

    When they ran the simulations without incorporating a representation of bacteria, they found that Prochlorococcus persisted all the way to the poles, contrary to theory and observations. When they added in the equations outlining the relationship between the microbe, bacteria, and a shared predator, Prochlorococcus’ range shifted away from the poles, matching the observations of the original research cruises.

In particular, the team observed that Prochlorococcus thrived in waters with very low nutrient levels, where it is the dominant source of food for bacteria. These waters also happen to be warm, and Prochlorococcus and bacteria live in balance, along with their shared predator. But in more nutrient-rich environments, such as polar regions, where cold water and nutrients are upwelled from the deep ocean, many more species of phytoplankton can thrive. Bacteria can then feast and grow on more food sources, in turn feeding and growing more of their shared predator. Prochlorococcus, unable to keep up, is quickly decimated.

    The results show that a relationship with a shared predator, and not temperature, sets Prochlorococcus’ range. Incorporating this mechanism into models will be crucial in predicting how the microbe — and possibly other marine species — will shift with climate change.

    “Prochlorococcus is a big harbinger of changes in the global ocean,” Dutkiewicz says. “If its range expands, that’s a canary — a sign that things have changed in the ocean by a great deal.”

“There are reasons to believe its range will expand with a warming world,” Follett adds. “But we have to understand the physical mechanisms that set these ranges. And predictions just based on temperature will not be correct.”

    Scientists build new atlas of ocean’s oxygen-starved waters

    Life is teeming nearly everywhere in the oceans, except in certain pockets where oxygen naturally plummets and waters become unlivable for most aerobic organisms. These desolate pools are “oxygen-deficient zones,” or ODZs. And though they make up less than 1 percent of the ocean’s total volume, they are a significant source of nitrous oxide, a potent greenhouse gas. Their boundaries can also limit the extent of fisheries and marine ecosystems.

    Now MIT scientists have generated the most detailed, three-dimensional “atlas” of the largest ODZs in the world. The new atlas provides high-resolution maps of the two major, oxygen-starved bodies of water in the tropical Pacific. These maps reveal the volume, extent, and varying depths of each ODZ, along with fine-scale features, such as ribbons of oxygenated water that intrude into otherwise depleted zones.

    The team used a new method to process over 40 years’ worth of ocean data, comprising nearly 15 million measurements taken by many research cruises and autonomous robots deployed across the tropical Pacific. The researchers compiled and then analyzed this vast and fine-grained dataset to generate maps of oxygen-deficient zones at various depths, similar to the many slices of a three-dimensional scan.

    From these maps, the researchers estimated the total volume of the two major ODZs in the tropical Pacific, more precisely than previous efforts. The first zone, which stretches out from the coast of South America, measures about 600,000 cubic kilometers — roughly the volume of water that would fill 240 billion Olympic-sized pools. The second zone, off the coast of Central America, is roughly three times larger.
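The pool comparison above checks out arithmetically. A quick sanity check, assuming a standard Olympic pool of 50 m × 25 m × 2 m (2,500 cubic meters):

```python
# Verify the article's comparison: 600,000 km^3 vs. Olympic-sized pools.
odz_volume_km3 = 600_000              # first ODZ, off South America
pool_volume_m3 = 50 * 25 * 2          # 2,500 m^3 per standard Olympic pool
odz_volume_m3 = odz_volume_km3 * 1e9  # 1 km^3 = 1e9 m^3

pools = odz_volume_m3 / pool_volume_m3
print(f"{pools:.2e} pools")           # -> 2.40e+11, i.e. 240 billion
```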

    The atlas serves as a reference for where ODZs lie today. The team hopes scientists can add to this atlas with continued measurements, to better track changes in these zones and predict how they may shift as the climate warms.

    “It’s broadly expected that the oceans will lose oxygen as the climate gets warmer. But the situation is more complicated in the tropics where there are large oxygen-deficient zones,” says Jarek Kwiecinski ’21, who developed the atlas along with Andrew Babbin, the Cecil and Ida Green Career Development Professor in MIT’s Department of Earth, Atmospheric and Planetary Sciences. “It’s important to create a detailed map of these zones so we have a point of comparison for future change.”

    The team’s study appears today in the journal Global Biogeochemical Cycles.

    Airing out artifacts

    Oxygen-deficient zones are large, persistent regions of the ocean that occur naturally, as a consequence of marine microbes gobbling up sinking phytoplankton along with all the available oxygen in the surroundings. These zones happen to lie in regions that miss passing ocean currents, which would normally replenish them with oxygenated water. As a result, ODZs are locations of relatively permanent, oxygen-depleted waters, and can exist at mid-ocean depths between roughly 35 and 1,000 meters below the surface. For some perspective, the oceans on average run about 4,000 meters deep.

    Over the last 40 years, research cruises have explored these regions by dropping bottles down to various depths and hauling up seawater that scientists then measure for oxygen.

    “But there are a lot of artifacts that come from a bottle measurement when you’re trying to measure truly zero oxygen,” Babbin says. “All the plastic that we deploy at depth is full of oxygen that can leach out into the sample. When all is said and done, that artificial oxygen inflates the ocean’s true value.”

    Rather than rely on measurements from bottle samples, the team looked at data from sensors attached to the outside of the bottles or integrated with robotic platforms that can change their buoyancy to measure water at different depths. These sensors measure a variety of signals, including changes in electrical currents or the intensity of light emitted by a photosensitive dye to estimate the amount of oxygen dissolved in water. In contrast to seawater samples that represent a single discrete depth, the sensors record signals continuously as they descend through the water column.

    Scientists have attempted to use these sensor data to estimate the true value of oxygen concentrations in ODZs, but have found it incredibly tricky to convert these signals accurately, particularly at concentrations approaching zero.

    “We took a very different approach, using measurements not to look at their true value, but rather how that value changes within the water column,” Kwiecinski says. “That way we can identify anoxic waters, regardless of what a specific sensor says.”

    Bottoming out

    The team reasoned that, if sensors showed a constant, unchanging value of oxygen in a continuous, vertical section of the ocean, regardless of the true value, then it would likely be a sign that oxygen had bottomed out, and that the section was part of an oxygen-deficient zone.

    The researchers brought together nearly 15 million sensor measurements collected over 40 years by various research cruises and robotic floats, and mapped the regions where oxygen did not change with depth.
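The detection rule described above — flag a vertical stretch where the sensor reading stops changing, whatever its absolute value — can be sketched in a few lines. This is a simplified illustration, not the team's actual pipeline; the tolerance and minimum span are invented parameters:

```python
def find_constant_span(depths, readings, tol=0.01, min_span=50.0):
    """Return (top, bottom) of the first depth interval over which the
    sensor reading is effectively constant, or None if there is none.

    depths   : increasing depths in meters
    readings : raw sensor values (units don't matter -- only changes do)
    tol      : max variation still counted as 'unchanging' (assumed value)
    min_span : minimum vertical extent in meters to flag (assumed value)
    """
    start = 0
    for i in range(1, len(readings)):
        if abs(readings[i] - readings[start]) > tol:
            # Reading changed: was the flat stretch long enough to flag?
            if depths[i - 1] - depths[start] >= min_span:
                return depths[start], depths[i - 1]
            start = i
    if depths[-1] - depths[start] >= min_span:
        return depths[start], depths[-1]
    return None

# Toy profile: the signal falls, bottoms out from 150 m to 400 m, recovers.
depths   = [0, 50, 100, 150, 200, 250, 300, 350, 400, 450, 500]
readings = [8.0, 5.0, 1.0, 0.21, 0.21, 0.21, 0.21, 0.21, 0.21, 1.5, 3.0]
print(find_constant_span(depths, readings))  # -> (150, 400)
```

The key property, matching Kwiecinski's description, is that the function never interprets the absolute reading: a miscalibrated sensor stuck at 0.21 instead of 0.0 flags the same anoxic span.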

    “We can now see how the distribution of anoxic water in the Pacific changes in three dimensions,” Babbin says. 

    The team mapped the boundaries, volume, and shape of two major ODZs in the tropical Pacific, one in the Northern Hemisphere, and the other in the Southern Hemisphere. They were also able to see fine details within each zone. For instance, oxygen-depleted waters are “thicker,” or more concentrated towards the middle, and appear to thin out toward the edges of each zone.

    “We could also see gaps, where it looks like big bites were taken out of anoxic waters at shallow depths,” Babbin says. “There’s some mechanism bringing oxygen into this region, making it oxygenated compared to the water around it.”

    Such observations of the tropical Pacific’s oxygen-deficient zones are more detailed than what’s been measured to date.

    “How the borders of these ODZs are shaped, and how far they extend, could not be previously resolved,” Babbin says. “Now we have a better idea of how these two zones compare in terms of areal extent and depth.”

    “This gives you a sketch of what could be happening,” Kwiecinski says. “There’s a lot more one can do with this data compilation to understand how the ocean’s oxygen supply is controlled.”

    This research is supported, in part, by the Simons Foundation.


    Climate modeling confirms historical records showing rise in hurricane activity

    When forecasting how storms may change in the future, it helps to know something about their past. Judging from historical records dating back to the 1850s, hurricanes in the North Atlantic have become more frequent over the last 150 years.

    However, scientists have questioned whether this upward trend is a reflection of reality, or simply an artifact of lopsided record-keeping. If 19th-century storm trackers had access to 21st-century technology, would they have recorded more storms? This inherent uncertainty has kept scientists from relying on storm records, and the patterns within them, for clues to how climate influences storms.

    A new MIT study published today in Nature Communications has used climate modeling, rather than storm records, to reconstruct the history of hurricanes and tropical cyclones around the world. The study finds that North Atlantic hurricanes have indeed increased in frequency over the last 150 years, similar to what historical records have shown.

    In particular, major hurricanes, and hurricanes in general, are more frequent today than in the past. And those that make landfall appear to have grown more powerful, carrying more destructive potential.

    Curiously, while the North Atlantic has seen an overall increase in storm activity, the same trend was not observed in the rest of the world. The study found that the frequency of tropical cyclones globally has not changed significantly in the last 150 years.

    “The evidence does point, as the original historical record did, to long-term increases in North Atlantic hurricane activity, but no significant changes in global hurricane activity,” says study author Kerry Emanuel, the Cecil and Ida Green Professor of Atmospheric Science in MIT’s Department of Earth, Atmospheric, and Planetary Sciences. “It certainly will change the interpretation of climate’s effects on hurricanes — that it’s really the regionality of the climate, and that something happened to the North Atlantic that’s different from the rest of the globe. It may have been caused by global warming, which is not necessarily globally uniform.”

    Chance encounters

    The most comprehensive record of tropical cyclones is compiled in a database known as the International Best Track Archive for Climate Stewardship (IBTrACS). This historical record includes modern measurements from satellites and aircraft that date back to the 1940s. The database’s older records are based on reports from ships and islands that happened to be in a storm’s path. These earlier records date back to 1851, and overall the database shows an increase in North Atlantic storm activity over the last 150 years.

    “Nobody disagrees that that’s what the historical record shows,” Emanuel says. “On the other hand, most sensible people don’t really trust the historical record that far back in time.”

    Recently, scientists have used a statistical approach to identify storms that the historical record may have missed. To do so, they consulted all the digitally reconstructed shipping routes in the Atlantic over the last 150 years and mapped these routes over modern-day hurricane tracks. They then estimated the chance that a ship would encounter or entirely miss a hurricane’s presence. This analysis found a significant number of early storms were likely missed in the historical record. Accounting for these missed storms, they concluded that there was a chance that storm activity had not changed over the last 150 years.
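The core of that statistical correction is a detection probability: the chance that at least one ship track passed close enough to a storm to record it. A toy Monte Carlo version, with invented numbers that stand in for the real digitized shipping routes:

```python
import random

def detection_probability(n_ships, hit_prob_per_ship, trials=100_000, seed=1):
    """Estimate the chance that at least one of n_ships crossing the basin
    encounters a given storm. hit_prob_per_ship is the (invented) chance
    that a single ship's route intersects the storm's track."""
    rng = random.Random(seed)
    detected = 0
    for _ in range(trials):
        if any(rng.random() < hit_prob_per_ship for _ in range(n_ships)):
            detected += 1
    return detected / trials

# With 10 ships, each 20% likely to cross a storm's path, the analytic
# answer is 1 - 0.8**10, roughly 0.89; the simulation should agree.
p = detection_probability(n_ships=10, hit_prob_per_ship=0.2)
print(p)
```

Dividing the observed storm count by such a probability yields a corrected estimate of how many storms actually occurred — which is why the result is so sensitive to the assumptions Emanuel questions below, namely whether the storm tracks and shipping routes used in the calculation are representative.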

    But Emanuel points out that hurricane paths in the 19th century may have looked different from today’s tracks. What’s more, the scientists may have missed key shipping routes in their analysis, as older routes have not yet been digitized.

    “All we know is, if there had been a change (in storm activity), it would not have been detectable, using digitized ship records,” Emanuel says. “So I thought, there’s an opportunity to do better, by not using historical data at all.”

    Seeding storms

    Instead, he estimated past hurricane activity using dynamical downscaling — a technique that his group developed and has applied over the last 15 years to study climate’s effect on hurricanes. The technique starts with a coarse global climate simulation and embeds within this model a finer-resolution model that simulates features as small as hurricanes. The combined models are then fed with real-world measurements of atmospheric and ocean conditions. Emanuel then scatters the realistic simulation with hurricane “seeds” and runs the simulation forward in time to see which seeds bloom into full-blown storms.

    For the new study, Emanuel embedded a hurricane model into a climate “reanalysis” — a type of climate model that combines observations from the past with climate simulations to generate accurate reconstructions of past weather patterns and climate conditions. He used a particular subset of climate reanalyses that only accounts for observations collected from the surface — for instance from ships, which have recorded weather conditions and sea surface temperatures consistently since the 1850s, as opposed to from satellites, which only began systematic monitoring in the 1970s.

    “We chose to use this approach to avoid any artificial trends brought about by the introduction of progressively different observations,” Emanuel explains.

    He ran an embedded hurricane model on three different climate reanalyses, simulating tropical cyclones around the world over the past 150 years. Across all three models, he observed “unequivocal increases” in North Atlantic hurricane activity.

    “There’s been this quite large increase in activity in the Atlantic since the mid-19th century, which I didn’t expect to see,” Emanuel says.

    Within this overall rise in storm activity, he also observed a “hurricane drought” — a period during the 1970s and 80s when the number of yearly hurricanes momentarily dropped. This pause in storm activity can also be seen in historical records, and Emanuel’s group proposes a cause: sulfate aerosols, which were byproducts of fossil fuel combustion, likely set off a cascade of climate effects that cooled the North Atlantic and temporarily suppressed hurricane formation.

    “The general trend over the last 150 years was increasing storm activity, interrupted by this hurricane drought,” Emanuel notes. “And at this point, we’re more confident of why there was a hurricane drought than why there is an ongoing, long-term increase in activity that began in the 19th century. That is still a mystery, and it bears on the question of how global warming might affect future Atlantic hurricanes.”

    This research was supported, in part, by the National Science Foundation.


    How marsh grass protects shorelines

    Marsh plants, which are ubiquitous along the world’s shorelines, can play a major role in mitigating the damage to coastlines as sea levels rise and storm surges increase. Now, a new MIT study provides greater detail about how these protective benefits work under real-world conditions shaped by waves and currents.

    The study combined laboratory experiments using simulated plants in a large wave tank along with mathematical modeling. It appears in the journal Physical Review Fluids, in a paper by former MIT visiting doctoral student Xiaoxia Zhang, now a postdoc at Dalian University of Technology, and professor of civil and environmental engineering Heidi Nepf.

    It’s already clear that coastal marsh plants provide significant protection from surges and devastating storms. For example, it has been estimated that the damage caused by Hurricane Sandy was reduced by $625 million thanks to the damping of wave energy provided by extensive areas of marsh along the affected coasts. But the new MIT analysis incorporates details of plant morphology, such as the number and spacing of flexible leaves versus stiffer stems, and the complex interactions of currents and waves that may be coming from different directions.

    This level of detail could enable coastal restoration planners to determine the area of marsh needed to mitigate expected amounts of storm surge or sea-level rise, and to decide which types of plants to introduce to maximize protection.

    “When you go to a marsh, you often will see that the plants are arranged in zones,” says Nepf, who is the Donald and Martha Harleman Professor of Civil and Environmental Engineering. “Along the edge, you tend to have plants that are more flexible, because they are using their flexibility to reduce the wave forces they feel. In the next zone, the plants are a little more rigid and have a bit more leaves.”

    As the zones progress, the plants become stiffer, leafier, and more effective at absorbing wave energy thanks to their greater leaf area. The new modeling done in this research, which incorporated work with simulated plants in the 24-meter-long wave tank at MIT’s Parsons Lab, can enable coastal planners to take these kinds of details into account when planning protection, mitigation, or restoration projects.

    “If you put the stiffest plants at the edge, they might not survive, because they’re feeling very high wave forces. By describing why Mother Nature organizes plants in this way, we can hopefully design a more sustainable restoration,” Nepf says.

    Once established, the marsh plants provide a positive feedback cycle that helps to not only stabilize but also build up these delicate coastal lands, Zhang says. “After a few years, the marsh grasses start to trap and hold the sediment, and the elevation gets higher and higher, which might keep up with sea level rise,” she says.


    Awareness of the protective effects of marshland has been growing, Nepf says. For example, the Netherlands has been restoring lost marshland outside the dikes that surround much of the nation’s agricultural land, finding that the marsh can protect the dikes from erosion; the marsh and dikes work together much more effectively than the dikes alone at preventing flooding.

    But most such efforts so far have been largely empirical, trial-and-error plans, Nepf says. Now, they could take advantage of this modeling to know just how much marshland with what types of plants would be needed to provide the desired level of protection.

    It also provides a more quantitative way to estimate the value provided by marshes, she says. “It could allow you to more accurately say, ‘40 meters of marsh will reduce waves this much and therefore will reduce overtopping of your levee by this much.’ Someone could use that to say, ‘I’m going to save this much money over the next 10 years if I reduce flooding by maintaining this marsh.’ It might help generate some political motivation for restoration efforts.”
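A common first-order way to express the kind of statement Nepf describes is exponential decay of wave height with distance into the vegetation, H(x) = H₀·e^(−kx), where the damping coefficient k bundles plant stiffness, leaf area, and stem density. This is a generic textbook model, not the paper's formulation, and the coefficient below is an invented placeholder:

```python
import math

def wave_height_after(h0, distance, k=0.02):
    """Exponential wave-height damping through vegetation: H = H0 * e^(-k*x).
    k (per meter) is a made-up placeholder, not a value from the study."""
    return h0 * math.exp(-k * distance)

h0 = 1.0  # incident wave height, meters
for x in (0, 20, 40):
    print(f"{x:3d} m of marsh -> {wave_height_after(h0, x):.2f} m")
# At k = 0.02, 40 m of marsh leaves exp(-0.8), roughly 45% of the
# incident wave height.
```

Calibrating k for a given marsh — from plant morphology, flexibility, and the current conditions — is exactly the gap the new model addresses.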

    Nepf herself is already trying to get some of these findings included in coastal planning processes. She serves on a practitioner panel led by Chris Esposito of the Water Institute of the Gulf, which serves the storm-battered Louisiana coastline. “We’d like to get this work into the coastal simulations that are used for large-scale restoration and coastal planning,” she says.

    “Understanding the wave damping process in real vegetation wetlands is of critical value, as it is needed in the assessment of the coastal defense value of these wetlands,” says Zhan Hu, an associate professor of marine sciences at Sun Yat-Sen University, who was not associated with this work. “The challenge, however, lies in the quantitative representation of the wave damping process, in which many factors are at play, such as plant flexibility, morphology, and coexisting currents.”

    The new study, Hu says, “neatly combines experimental findings and analytical modeling to reveal the impact of each factor in the wave damping process. … Overall, this work is a solid step forward toward a more accurate assessment of wave damping capacity of real coastal wetlands, which is needed for science-based design and management of nature-based coastal protection.”

    The work was partly supported by the National Science Foundation and the China Scholarship Council.