More stories

  • Solar-powered system offers a route to inexpensive desalination

    An estimated two-thirds of humanity is affected by shortages of water, and many such areas in the developing world also face a lack of dependable electricity. Widespread research efforts have thus focused on ways to desalinate seawater or brackish water using just solar heat. However, many such efforts have run into problems with fouling of equipment caused by salt buildup, which often adds complexity and expense.

    Now, a team of researchers at MIT and in China has come up with a solution to the problem of salt accumulation — and in the process developed a desalination system that is both more efficient and less expensive than previous solar desalination methods. The process could also be used to treat contaminated wastewater or to generate steam for sterilizing medical instruments, all without requiring any power source other than sunlight itself.

    The findings are described today in the journal Nature Communications, in a paper by MIT graduate student Lenan Zhang, postdoc Xiangyu Li, professor of mechanical engineering Evelyn Wang, and four others.

    “There have been a lot of demonstrations of really high-performing, salt-rejecting, solar-based evaporation designs of various devices,” Wang says. “The challenge has been the salt fouling issue, that people haven’t really addressed. So, we see these very attractive performance numbers, but they’re often limited because of longevity. Over time, things will foul.”

    Many attempts at solar desalination systems rely on some kind of wick to draw the saline water through the device, but these wicks are vulnerable to salt accumulation and relatively difficult to clean. The team focused on developing a wick-free system instead. The result is a layered system, with dark material at the top to absorb the sun’s heat, then a thin layer of water above a perforated layer of material, sitting atop a deep reservoir of salty water, such as a tank or a pond. After careful calculations and experiments, the researchers determined the optimal size for the holes drilled through the perforated material, which in their tests was made of polyurethane. At 2.5 millimeters across, these holes can be easily made using commonly available waterjets.

    The holes are large enough to allow for a natural convective circulation between the warmer upper layer of water and the colder reservoir below. That circulation naturally draws the salt from the thin layer above down into the much larger body of water below, where it becomes well-diluted and no longer a problem. “It allows us to achieve high performance and yet also prevent this salt accumulation,” says Wang, who is the Ford Professor of Engineering and head of the Department of Mechanical Engineering.

    Li says that the advantages of this system are “both the high performance and the reliable operation, especially under extreme conditions, where we can actually work with near-saturation saline water. And that means it’s also very useful for wastewater treatment.”

    He adds that much work on such solar-powered desalination has focused on novel materials. “But in our case, we use really low-cost, almost household materials.” The key was analyzing and understanding the convective flow that drives this entirely passive system, he says. “People say you always need new materials, expensive ones, or complicated structures or wicking structures to do that. And this is, I believe, the first one that does this without wicking structures.”

    This new approach “provides a promising and efficient path for desalination of high salinity solutions, and could be a game changer in solar water desalination,” says Hadi Ghasemi, a professor of chemical and biomolecular engineering at the University of Houston, who was not associated with this work. “Further work is required for assessment of this concept in large settings and in long runs,” he adds.

    Just as hot air rises and cold air falls, Zhang explains, natural convection drives the desalination process in this device. In the confined water layer near the top, “the evaporation happens at the very top interface. Because of the salt, the density of water at the very top interface is higher, and the bottom water has lower density. So, this is an original driving force for this natural convection because the higher density at the top drives the salty liquid to go down.” The water evaporated from the top of the system can then be collected on a condensing surface, providing pure fresh water.
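    The density contrast Zhang describes can be put in rough numbers with a linearized equation of state for seawater. This is a back-of-envelope sketch using textbook coefficients, not values from the paper:

```python
# Rough estimate of the density excess that drives the downwelling of
# salt-enriched surface water. Linearized equation of state:
#   rho ≈ rho0 * (1 + beta * (S - S0))
# All coefficients here are textbook approximations, not paper values.
rho0 = 1025.0   # kg/m^3, reference seawater density at salinity S0
beta = 7.6e-4   # 1/(g/kg), haline contraction coefficient (approximate)
S0 = 35.0       # g/kg, typical seawater salinity
S_top = 45.0    # g/kg, hypothetical salt-enriched surface layer

rho_top = rho0 * (1 + beta * (S_top - S0))
delta_rho = rho_top - rho0
print(f"Density excess at the surface: {delta_rho:.2f} kg/m^3")
```

    Even a modest salinity increase yields a density excess of several kilograms per cubic meter, more than enough to drive convective exchange through the holes.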

    The rejection of salt to the water below could also cause heat to be lost in the process, so preventing that required careful engineering, including making the perforated layer out of highly insulating material to keep the heat concentrated above. The solar heating at the top is accomplished through a simple layer of black paint.

    This gif shows fluid flow visualized by food dye. The left-side shows the slow transport of colored de-ionized water from the top to the bottom bulk water. The right-side shows the fast transport of colored saline water from the top to the bottom bulk water driven by the natural convection effect.

    So far, the team has proven the concept using small benchtop devices, so the next step will be starting to scale up to devices that could have practical applications. Based on their calculations, a system with just 1 square meter (about a square yard) of collecting area should be sufficient to provide a family’s daily needs for drinking water, they say. Zhang says they calculated that the necessary materials for a 1-square-meter device would cost only about $4.
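    The one-square-meter claim is consistent with a quick energy balance. The insolation and efficiency figures below are illustrative assumptions, not numbers from the paper:

```python
# Back-of-envelope daily freshwater yield from 1 m^2 of collector.
# Insolation and thermal efficiency are illustrative assumptions,
# not values from the paper.
insolation_kwh = 5.0   # kWh per m^2 per day, a sunny-climate average
efficiency = 0.80      # fraction of solar input that goes into evaporation
latent_heat = 2.45e6   # J/kg, latent heat of vaporization of water

energy_j = insolation_kwh * 3.6e6 * efficiency  # J/day driving evaporation
yield_liters = energy_j / latent_heat           # kg of water ≈ liters per day
print(f"Estimated yield: {yield_liters:.1f} L/day")
```

    Roughly six liters per day is on the order of a family’s drinking-water needs, consistent with the team’s estimate.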

    Their test apparatus operated for a week with no signs of any salt accumulation, Li says. And the device is remarkably stable. “Even if we apply some extreme perturbation, like waves on the seawater or the lake,” where such a device could be installed as a floating platform, “it can return to its original equilibrium position very fast,” he says.

    The necessary work to translate this lab-scale proof of concept into workable commercial devices, and to improve the overall water production rate, should be possible within a few years, Zhang says. The first applications are likely to be providing safe water in remote off-grid locations, or for disaster relief after hurricanes, earthquakes, or other disruptions of normal water supplies.

    Zhang adds that “if we can concentrate the sunlight a little bit, we could use this passive device to generate high-temperature steam to do medical sterilization” for off-grid rural areas.

    “I think a real opportunity is the developing world,” Wang says. “I think that is where there’s most probable impact near-term, because of the simplicity of the design.” But, she adds, “if we really want to get it out there, we also need to work with the end users, to really be able to adopt the way we design it so that they’re willing to use it.”

    “This is a new strategy toward solving the salt accumulation problem in solar evaporation,” says Peng Wang, a professor at King Abdullah University of Science and Technology in Saudi Arabia, who was not associated with this research. “This elegant design will inspire new innovations in the design of advanced solar evaporators. The strategy is very promising due to its high energy efficiency, operation durability, and low cost, which contributes to low-cost and passive water desalination to produce fresh water from various source water with high salinity, e.g., seawater, brine, or brackish groundwater.”

    The team also included Yang Zhong, Arny Leroy, and Lin Zhao at MIT, and Zhenyuan Xu at Shanghai Jiao Tong University in China. The work was supported by the Singapore-MIT Alliance for Research and Technology, the U.S.-Egypt Science and Technology Joint Fund, and used facilities supported by the National Science Foundation.

  • Reducing food waste to increase access to affordable foods

    About a third of the world’s food supply never gets eaten. That means the water, labor, energy, and fertilizer that went into growing, processing, and distributing the food is wasted.

    On the other end of the supply chain are cash-strapped consumers, who have been further distressed in recent years by factors like the Covid-19 pandemic and inflation.

    Spoiler Alert, a company founded by two MIT alumni, is helping companies bridge the gap between food waste and food insecurity with a platform connecting major food and beverage brands with discount grocers, retailers, and nonprofits. The platform helps brands discount or donate excess and short-dated inventory days, weeks, and months before it expires.

    “There is a tremendous amount of underutilized data that exists in the manufacturing and distribution space that results in good food going to waste,” says Ricky Ashenfelter MBA ’15, who co-founded the company with Emily Malina MBA ’15.

    Spoiler Alert helps brands manage distressed inventory data, create offers for potential buyers, and review and accept bids. The platform is designed to work with companies’ existing inventory and fulfillment systems, using automation and pricing intelligence to further streamline sales.

    “At a high level, we’re a waste-prevention software built for sales and supply-chain teams,” Ashenfelter says. “You can think of it as a private [business-to-business] eBay of sorts.”

    Spoiler Alert is working with global companies like Nestle, Kraft Heinz, and Danone, as well as discount grocers like the United Grocery Outlet and Misfits Market. Those brands are already using the platform to reduce food waste and get more food on people’s tables.

    “Project Drawdown [a nonprofit working on climate solutions] has identified food waste as the number one priority to address the global climate crisis, so these types of corporate initiatives can be really powerful from an environmental standpoint,” Ashenfelter says, noting the nonprofit estimates food waste accounts for 8 percent of global greenhouse gas emissions. “Contrast that with growing levels of food insecurity and folks not being able to access affordable nutrition, and you start to see how tackling supply-chain inefficiency can have a dramatic impact from both an environmental and a social lens. That’s what motivates us.”

    Untapped data for change

    Ashenfelter came to MIT’s Sloan School of Management after several years in sustainability software and management consulting within the retail and consumer products industries.

    “I was really attracted to transitioning into something much more entrepreneurial, and to leverage not only Sloan’s focus on entrepreneurship, but also the broader MIT ecosystem’s focus on technology, entrepreneurship, clean tech innovation, and other themes along that front,” he says.

    Ashenfelter met Malina at one of Sloan’s admitted students events in 2013, and the founders soon set out to use data to decrease food waste.

    “For us, the idea was clear: How do we better leverage data to manage excess and short-dated inventory?” Ashenfelter says. “How we go about that has evolved over the last six years, but it’s all rooted in solving an enormous climate problem, solving a major food insecurity problem, and from a capitalistic standpoint, helping businesses cut costs and generate revenue from otherwise wasted products.”

    The founders spent many hours in the Martin Trust Center for MIT Entrepreneurship with support from the Sloan Sustainability Initiative, and used Spoiler Alert as a case study in nearly every class they took, thinking through product development, sales, marketing, pricing, and more through their coursework.

    “We brought our idea into just about every action learning class that we could at Sloan and MIT,” Ashenfelter says.

    They also participated in the MIT $100K Entrepreneurship Competition and received support from the Venture Mentoring Service and the IDEAS Global Challenge program.

    Upon graduation, the founders initially began building a platform to facilitate donations of excess inventory, but soon learned big companies’ processes for discounting that inventory were also highly manual. Today, more than 90 percent of Spoiler Alert’s transaction volume is discounted, with the remainder donated.

    Different teams within an organization can upload excess inventory reports to Spoiler Alert’s system, eliminating the need to manually aggregate datasets and prepare what the industry refers to as “blowout lists” to sell. Spoiler Alert uses machine-learning-based tools to help both parties with pricing and negotiations to close deals more quickly.

    “Companies are taking pretty manual and slow approaches to deciding [what to do with excess inventory],” Ashenfelter says. “And when you have slow decision-making, you’re losing days or even weeks of shelf life on that product. That can be the difference between selling product versus donating, and donating versus dumping.”
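    The time-versus-value tradeoff Ashenfelter describes can be sketched as a toy decision rule; the thresholds and discount tiers below are entirely hypothetical and are not Spoiler Alert’s actual pricing logic:

```python
from datetime import date

def suggest_action(expiry, today, list_price):
    """Toy decision rule: discounts steepen as shelf life shrinks.
    Thresholds and discount tiers are hypothetical, not Spoiler Alert's."""
    days_left = (expiry - today).days
    if days_left > 60:
        return "sell", round(list_price * 0.90, 2)  # light discount
    if days_left > 21:
        return "sell", round(list_price * 0.60, 2)  # deep discount
    if days_left > 5:
        return "donate", 0.0  # too close to expiry to sell, still safe to eat
    return "dispose", 0.0     # past the window for sale or donation

# A lot 92 days from expiry still sells at a light discount.
print(suggest_action(date(2022, 6, 1), date(2022, 3, 1), 10.0))  # → ('sell', 9.0)
```

    The point of the sketch is that every day of delay pushes inventory down the ladder, from a profitable sale toward donation or disposal.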

    Once a deal has been made, Spoiler Alert automatically generates the forms and workflows needed by fulfillment teams to get the product out the door. The relationships companies build on the platform are also a major driver for cutting down waste.

    “We’re providing suppliers with the ability to control where their discounted and donated product ends up,” Ashenfelter says. “That’s really powerful because it allows these CPG brands to ensure that this product is, in many cases, getting to affordable nutrition outlets in underserved communities.”

    Ashenfelter says the majority of inventory goes to regional and national discount grocers, supplemented by extensive purchasing from local and nonprofit grocery chains.

    “Everything we do is oriented around helping sell as much product as possible to a reputable set of buyers at the most fair, equitable prices possible,” Ashenfelter says.

    Scaling for impact

    The pandemic has disrupted many aspects of food supply chains. But Ashenfelter says it has also accelerated the adoption of digital solutions that can better manage such volatility.

    When Campbell began using Spoiler Alert’s system in 2019, for instance, it achieved a 36 percent increase in discount sales and a 27 percent increase in donations over the first five months.

    Ashenfelter says the results have proven that companies’ sustainability targets can go hand in hand with initiatives that boost their bottom lines. In fact, because Spoiler Alert focuses so much on the untapped revenue associated with food waste, many customers don’t even realize Spoiler Alert is a sustainability company until after they’ve signed on.

    “What’s neat about this program is that it becomes an incredibly powerful case study internally for how sustainability and operational outcomes aren’t in conflict and can drive both business results as well as overall environmental impact,” Ashenfelter says.

    Going forward, Spoiler Alert will continue building out algorithmic solutions that could further cut down on waste internationally and across a wider array of products.

    “At every step in our process, we’re collecting a tremendous amount of data in terms of what is and isn’t selling, at what price point, to which buyers, out of which geographies, and with how much remaining shelf life,” Ashenfelter explains. “We are only starting to scratch the surface in terms of bringing our recommendations engine to life for our suppliers and buyers. Ultimately our goal is to power the waste-free economy, and rooted in that is making better decisions faster, in collaboration with a growing ecosystem of supply chain partners, and with as little manual intervention as possible.”

  • Scientists build new atlas of ocean’s oxygen-starved waters

    Life is teeming nearly everywhere in the oceans, except in certain pockets where oxygen naturally plummets and waters become unlivable for most aerobic organisms. These desolate pools are “oxygen-deficient zones,” or ODZs. And though they make up less than 1 percent of the ocean’s total volume, they are a significant source of nitrous oxide, a potent greenhouse gas. Their boundaries can also limit the extent of fisheries and marine ecosystems.

    Now MIT scientists have generated the most detailed, three-dimensional “atlas” of the largest ODZs in the world. The new atlas provides high-resolution maps of the two major, oxygen-starved bodies of water in the tropical Pacific. These maps reveal the volume, extent, and varying depths of each ODZ, along with fine-scale features, such as ribbons of oxygenated water that intrude into otherwise depleted zones.

    The team used a new method to process over 40 years’ worth of ocean data, comprising nearly 15 million measurements taken by many research cruises and autonomous robots deployed across the tropical Pacific. The researchers compiled and then analyzed this vast and fine-grained data to generate maps of oxygen-deficient zones at various depths, similar to the many slices of a three-dimensional scan.

    From these maps, the researchers estimated the total volume of the two major ODZs in the tropical Pacific, more precisely than previous efforts. The first zone, which stretches out from the coast of South America, measures about 600,000 cubic kilometers — roughly the volume of water that would fill 240 billion Olympic-sized pools. The second zone, off the coast of Central America, is roughly three times larger.
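    The Olympic-pool comparison is easy to verify with a unit conversion (assuming the standard 2,500-cubic-meter pool):

```python
# Sanity-check the Olympic-pool comparison for the eastern Pacific ODZ.
odz_volume_km3 = 600_000    # cubic kilometers, from the study
pool_m3 = 2_500             # m^3: a 50 m x 25 m x 2 m Olympic pool
pools = odz_volume_km3 * 1e9 / pool_m3   # 1 km^3 = 1e9 m^3
print(f"{pools:.1e} pools")  # → 2.4e+11, i.e. 240 billion
```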

    The atlas serves as a reference for where ODZs lie today. The team hopes scientists can add to this atlas with continued measurements, to better track changes in these zones and predict how they may shift as the climate warms.

    “It’s broadly expected that the oceans will lose oxygen as the climate gets warmer. But the situation is more complicated in the tropics where there are large oxygen-deficient zones,” says Jarek Kwiecinski ’21, who developed the atlas along with Andrew Babbin, the Cecil and Ida Green Career Development Professor in MIT’s Department of Earth, Atmospheric and Planetary Sciences. “It’s important to create a detailed map of these zones so we have a point of comparison for future change.”

    The team’s study appears today in the journal Global Biogeochemical Cycles.

    Airing out artifacts

    Oxygen-deficient zones are large, persistent regions of the ocean that occur naturally, as a consequence of marine microbes gobbling up sinking phytoplankton along with all the available oxygen in the surroundings. These zones happen to lie in regions that miss passing ocean currents, which would normally replenish them with oxygenated water. As a result, ODZs are locations of relatively permanent, oxygen-depleted waters, and can exist at mid-ocean depths of roughly 35 to 1,000 meters below the surface. For some perspective, the oceans on average run about 4,000 meters deep.

    Over the last 40 years, research cruises have explored these regions by dropping bottles down to various depths and hauling up seawater that scientists then measure for oxygen.

    “But there are a lot of artifacts that come from a bottle measurement when you’re trying to measure truly zero oxygen,” Babbin says. “All the plastic that we deploy at depth is full of oxygen that can leach out into the sample. When all is said and done, that artificial oxygen inflates the ocean’s true value.”

    Rather than rely on measurements from bottle samples, the team looked at data from sensors attached to the outside of the bottles or integrated with robotic platforms that can change their buoyancy to measure water at different depths. These sensors measure a variety of signals, including changes in electrical currents or the intensity of light emitted by a photosensitive dye to estimate the amount of oxygen dissolved in water. In contrast to seawater samples that represent a single discrete depth, the sensors record signals continuously as they descend through the water column.

    Scientists have attempted to use these sensor data to estimate the true value of oxygen concentrations in ODZs, but have found it incredibly tricky to convert these signals accurately, particularly at concentrations approaching zero.

    “We took a very different approach, using measurements not to look at their true value, but rather how that value changes within the water column,” Kwiecinski says. “That way we can identify anoxic waters, regardless of what a specific sensor says.”

    Bottoming out

    The team reasoned that, if sensors showed a constant, unchanging value of oxygen in a continuous, vertical section of the ocean, regardless of the true value, then it would likely be a sign that oxygen had bottomed out, and that the section was part of an oxygen-deficient zone.

    The researchers brought together nearly 15 million sensor measurements collected over 40 years by various research cruises and robotic floats, and mapped the regions where oxygen did not change with depth.
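    The core detection idea — flag depth ranges where a sensor’s reading stops changing, regardless of its absolute value — can be sketched as follows. The tolerance, minimum run length, and synthetic profile are invented for illustration; the study’s actual criteria may differ:

```python
import numpy as np

def find_anoxic_span(depth, signal, tol=1e-3, min_points=5):
    """Return (top, bottom) depths of the longest stretch where the sensor
    signal is effectively constant with depth, or None if there is none.
    Tolerance and minimum run length are illustrative, not the study's."""
    flat = np.abs(np.diff(signal)) < tol
    runs, start = [], None
    for i, f in enumerate(np.append(flat, False)):  # sentinel closes any open run
        if f and start is None:
            start = i
        elif not f and start is not None:
            runs.append((start, i))  # signal[start..i] is effectively constant
            start = None
    runs = [r for r in runs if r[1] - r[0] + 1 >= min_points]
    if not runs:
        return None
    s, e = max(runs, key=lambda r: r[1] - r[0])
    return depth[s], depth[e]

# Synthetic profile: oxygen falls with depth, flatlines from 200-600 m
# (the putative ODZ), then recovers below.
depth = np.arange(0, 1000, 20)
signal = np.where(depth < 200, 5 - depth / 45.0,
                  np.where(depth <= 600, 0.05, (depth - 600) / 100.0))
span = find_anoxic_span(depth, signal)
print(span)
```

    Because the method keys on the vertical gradient rather than the reported concentration, a miscalibrated sensor still yields the same detected span.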

    “We can now see how the distribution of anoxic water in the Pacific changes in three dimensions,” Babbin says. 

    The team mapped the boundaries, volume, and shape of two major ODZs in the tropical Pacific, one in the Northern Hemisphere, and the other in the Southern Hemisphere. They were also able to see fine details within each zone. For instance, oxygen-depleted waters are “thicker,” or more concentrated towards the middle, and appear to thin out toward the edges of each zone.

    “We could also see gaps, where it looks like big bites were taken out of anoxic waters at shallow depths,” Babbin says. “There’s some mechanism bringing oxygen into this region, making it oxygenated compared to the water around it.”

    Such observations of the tropical Pacific’s oxygen-deficient zones are more detailed than what’s been measured to date.

    “How the borders of these ODZs are shaped, and how far they extend, could not be previously resolved,” Babbin says. “Now we have a better idea of how these two zones compare in terms of areal extent and depth.”

    “This gives you a sketch of what could be happening,” Kwiecinski says. “There’s a lot more one can do with this data compilation to understand how the ocean’s oxygen supply is controlled.”

    This research is supported, in part, by the Simons Foundation.

  • J-WAFS launches Food and Climate Systems Transformation Alliance

    Food systems around the world are increasingly at risk from the impacts of climate change. At the same time, these systems, which include all activities from food production to consumption and food waste, are responsible for about one-third of the human-caused greenhouse gas emissions warming the planet. 

    To drive research-based innovation that will make food systems more resilient and sustainable, MIT’s Abdul Latif Jameel Water and Food Systems Lab (J-WAFS) announced the launch of a new initiative at an event during the UN Climate Change Conference in Glasgow, Scotland, last week. The initiative, called the Food and Climate Systems Transformation (FACT) Alliance, will better connect researchers to farmers, food businesses, policymakers, and other food systems stakeholders around the world. 

    “Time is not on our side,” says Greg Sixt, the director of the FACT Alliance and research manager for food and climate systems at J-WAFS. “To date, the research community hasn’t delivered actionable solutions quickly enough or in the policy-relevant form needed if time-critical changes are to be made to our food systems. The FACT Alliance aims to change this.”

    Why, in fact, do our food systems need transformation?

    At COP26 (which stands for “conference of the parties” to the UN Framework Convention on Climate Change, being held for the 26th time this year), a number of countries have pledged to end deforestation, reduce methane emissions, and cease public financing of coal power. In his keynote address at the FACT Alliance event, Professor Pete Smith of the University of Aberdeen, an alliance member institution, noted that food and agriculture also need to be addressed because “there’s an interaction between climate change and the food system.” 

    The UN Intergovernmental Panel on Climate Change warns that a two-degree Celsius increase in average global temperature over preindustrial levels could trigger a worldwide food crisis, and emissions from food systems alone could push us past the two-degree mark even if energy-related emissions could be zeroed out. 

    Smith said dramatic and rapid transformations are needed to deliver safe, nutritious food for all, with reduced environmental impact and increased resilience to climate change. With a global network of leading research institutions and collaborating stakeholder organizations, the FACT Alliance aims to facilitate new, solutions-oriented research for addressing the most challenging aspects of food systems in the era of climate change. 

    How the FACT Alliance works

    Central to the work of the FACT Alliance is the development of new methodologies for aligning data across scales and food systems components, improving data access, integrating research across the diverse disciplines that address aspects of food systems, making stakeholders partners in the research process, and assessing impact in the context of complex and interconnected food and climate systems. 

    The FACT Alliance will conduct what’s known as “convergence research,” which meets complex problems with approaches that embody deep integration across disciplines. This kind of research calls for close association with the stakeholders who both make decisions and are directly affected by how food systems work, be they farmers, extension services (i.e., agricultural advisories), policymakers, international aid organizations, consumers, or others. By inviting stakeholders and collaborators to be part of the research process, the FACT Alliance allows for engagement at the scale, geography, and scope that is most relevant to the needs of each, integrating global and local teams to achieve better outcomes. 

    “Doing research in isolation of all the stakeholders and in isolation of the goals that we want to achieve will not deliver the transformation that we need,” said Smith. “The problem is too big for us to solve in isolation, and we need broad alliances to tackle the issue, and that’s why we developed the FACT Alliance.” 

    Members and collaborators

    Led by MIT’s J-WAFS, the FACT Alliance is currently made up of 16 core members and an associated network of collaborating stakeholder organizations. 

    “As the central convener of MIT research on food systems, J-WAFS catalyzes collaboration across disciplines,” says Maria Zuber, vice president for research at MIT. “Now, by bringing together a world-class group of research institutions and stakeholders from key sectors, the FACT Alliance aims to advance research that will help alleviate climate impacts on food systems and mitigate food system impacts on climate.”

    J-WAFS co-hosted the COP26 event “Bridging the Science-Policy Gap for Impactful, Demand-Driven Food Systems Innovation” with Columbia University, the American University of Beirut, and the CGIAR research program Climate Change, Agriculture and Food Security (CCAFS). The event featured a panel discussion with several FACT Alliance members and the UK Foreign, Commonwealth and Development Office (FCDO).

  • Q&A: Options for the Diablo Canyon nuclear plant

    The Diablo Canyon nuclear plant in California, the only one still operating in the state, is set to close in 2025. A team of researchers at MIT’s Center for Advanced Nuclear Energy Systems, Abdul Latif Jameel Water and Food Systems Lab, and Center for Energy and Environmental Policy Research; Stanford’s Precourt Energy Institute; and energy analysis firm LucidCatalyst LLC have analyzed the potential benefits the plant could provide if its operation were extended to 2030 or 2045.

    They found that this nuclear plant could simultaneously help to stabilize the state’s electric grid, provide desalinated water to supplement the state’s chronic water shortages, and provide carbon-free hydrogen fuel for transportation. MIT News asked report co-authors Jacopo Buongiorno, the TEPCO Professor of Nuclear Science and Engineering, and John Lienhard, the Jameel Professor of Water and Food, to discuss the group’s findings.

    Q: Your report suggests co-locating a major desalination plant alongside the existing Diablo Canyon power plant. What would be the potential benefits from operating a desalination plant in conjunction with the power plant?

    Lienhard: The cost of desalinated water produced at Diablo Canyon would be lower than for a stand-alone plant because the cost of electricity would be significantly lower and you could take advantage of the existing infrastructure for the intake of seawater and the outfall of brine. Electricity would be cheaper because the location takes advantage of Diablo Canyon’s unique capability to provide low cost, zero-carbon baseload power.

    Depending on the scale at which the desalination plant is built, you could make a very significant impact on the water shortfalls of state and federal projects in the area. In fact, one of the numbers that came out of this study was that an intermediate-sized desalination plant there would produce more fresh water than the highest estimate of the net yield from the proposed Delta Conveyance Project on the Sacramento River. You could get that amount of water at Diablo Canyon for an investment cost less than half as large, and without the associated impacts that would come with the Delta Conveyance Project.

    And the technology envisioned for desalination here, reverse osmosis, is available off the shelf. You can buy this equipment today. In fact, it’s already in use in California and thousands of other places around the world.

    Q: You discuss in the report three potential products from the Diablo Canyon plant: desalinated water, power for the grid, and clean hydrogen. How well can the plant accommodate all of those efforts, and are there advantages to combining them as opposed to doing any one of them separately?

    Buongiorno: California, like many other regions in the world, is facing multiple challenges as it seeks to reduce carbon emissions on a grand scale. First, the wide deployment of intermittent energy sources such as solar and wind creates a great deal of variability on the grid that can be balanced by dispatchable firm power generators like Diablo. So, the first mission for Diablo is to continue to provide reliable, clean electricity to the grid.

    The second challenge is the prolonged drought and water scarcity for the state in general. And one way to address that is water desalination co-located with the nuclear plant at the Diablo site, as John explained.

    The third challenge is related to decarbonizing the transportation sector. A possible approach is replacing conventional cars and trucks with vehicles powered by fuel cells that consume hydrogen. Hydrogen has to be produced from a primary energy source. Nuclear power, through a process called electrolysis, can do that quite efficiently and in a manner that is carbon-free.
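    For a sense of scale, hydrogen output from electrolysis can be estimated with typical published electrolyzer figures; the diverted power and energy intensity below are assumptions, not numbers from the MIT report:

```python
# Rough hydrogen yield if part of the plant's output ran an electrolyzer.
# The diverted power and electrolyzer energy intensity are typical
# published figures, not numbers from the MIT report.
power_mw = 200          # MW hypothetically diverted to hydrogen production
kwh_per_kg_h2 = 55.0    # kWh per kg of H2, typical commercial electrolyzer
hours_per_day = 24

h2_kg_per_day = power_mw * 1000 * hours_per_day / kwh_per_kg_h2
print(f"≈ {h2_kg_per_day / 1000:.0f} metric tons of H2 per day")
```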

    Our economic analysis took into account the expected revenue from selling these multiple products — electricity for the grid, hydrogen for the transportation sector, water for farmers or other local users — as well as the costs associated with deploying the new facilities needed to produce desalinated water and hydrogen. We found that, if Diablo’s operating license was extended until 2035, it would cut carbon emissions by an average of 7 million metric tons a year — a more than 11 percent reduction from 2017 levels — and save ratepayers $2.6 billion in power system costs.

    Further delaying the retirement of Diablo to 2045 would spare 90,000 acres of land that would need to be dedicated to renewable energy production to replace the facility’s capacity, and it would save ratepayers up to $21 billion in power system costs.

    Finally, if Diablo was operated as a polygeneration facility that provides electricity, desalinated water, and hydrogen simultaneously, its value, quantified in terms of dollars per unit electricity generated, could increase by 50 percent.

    Lienhard: Most of the desalination scenarios that we considered did not consume the full electrical output of that plant, meaning that under most scenarios you would continue to make electricity and do something with it, beyond just desalination. I think it’s also important to remember that this power plant produces 15 percent of California’s carbon-free electricity today and is responsible for 8 percent of the state’s total electrical production. In other words, Diablo Canyon is a very large factor in California’s decarbonization. When or if this plant goes offline, the near-term outcome is likely to be increased reliance on natural gas to produce electricity, meaning a rise in California’s carbon emissions.

    Q: This plant in particular has been highly controversial since its inception. What’s your assessment of the plant’s safety beyond its scheduled shutdown, and how do you see this report as contributing to the decision-making about that shutdown?

    Buongiorno: The Diablo Canyon Nuclear Power Plant has a very strong safety record. The potential safety concern for Diablo is related to its proximity to several fault lines. Being located in California, the plant was designed to withstand large earthquakes to begin with. Following the Fukushima accident in 2011, the Nuclear Regulatory Commission reviewed the plant’s ability to withstand external events (e.g., earthquakes, tsunamis, floods, tornadoes, wildfires, hurricanes) of exceptionally rare and severe magnitude. After nine years of assessment, the NRC concluded that “existing seismic capacity or effective flood protection [at Diablo Canyon] will address the unbounded reevaluated hazards.” That is, Diablo was designed and built to withstand even the rarest and strongest earthquakes that are physically possible at this site.

    As an additional level of protection, the plant has been retrofitted with special equipment and procedures meant to ensure reliable cooling of the reactor core and spent fuel pool under a hypothetical scenario in which all design-basis safety systems have been disabled by a severe external event.

    Lienhard: As for the potential impact of this report, PG&E [the California utility] has already made the decision to shut down the plant, and we and others hope that decision will be revisited and reversed. We believe that this report gives the relevant stakeholders and policymakers a lot of information about options and value associated with keeping the plant running, and about how California could benefit from clean water and clean power generated at Diablo Canyon. It’s not up to us to make the decision, of course — that is a decision that must be made by the people of California. All we can do is provide information.

    Q: What are the biggest challenges or obstacles to seeing these ideas implemented?

    Lienhard: California has very strict environmental protection regulations, and it’s good that they do. One of the areas of great concern to California is the health of the ocean and protection of the coastal ecosystem. As a result, very strict rules are in place about the intake and outfall of both power plants and desalination plants, to protect marine life. Our analysis suggests that this combined plant can be implemented within the parameters prescribed by the California Ocean Plan and that it can meet the regulatory requirements.

    We believe that deeper analysis would be needed before you could proceed. You would need to do site studies and really get out into the water and look in detail at what’s there. But the preliminary analysis is positive. A second challenge is that the discourse in California around nuclear power has generally not been very supportive, and similarly some groups in California oppose desalination. We expect that both of those points of view would be part of the conversation about whether or not to proceed with this project.

    Q: How particular is this analysis to the specifics of this location? Are there aspects of it that apply to other nuclear plants, domestically or globally?

    Lienhard: Hundreds of nuclear plants around the world are situated along the coast, and many are in water-stressed regions. Although our analysis focused on Diablo Canyon, we believe that the general findings are applicable to many other seaside nuclear plants, so that this approach and these conclusions could potentially be applied at hundreds of sites worldwide.

  •

    Scientists project increased risk to water supplies in South Africa this century

    In 2018, Cape Town, South Africa’s second most populous city, came very close to running out of water as the multi-year “Day Zero” drought depleted its reservoirs. Since then, researchers from Stanford University have determined that climate change made this extreme drought five to six times more likely, and warned that many more Day Zero events could occur in regions with similar climates in the future. A better understanding of likely surface air temperature and precipitation trends in South Africa and other dry, populated areas around the world in the coming decades could empower decision-makers to pursue science-based climate mitigation and adaptation measures designed to reduce the risk of future Day Zero events.

    Toward that end, researchers at the MIT Joint Program on the Science and Policy of Global Change, International Food Policy Research Institute, and CGIAR have produced modeled projections of 21st-century changes in seasonal surface air temperature and precipitation for South Africa that systematically and comprehensively account for uncertainties in how Earth and socioeconomic systems behave and co-evolve. Presented in a study in the journal Climatic Change, these projections show how temperature and precipitation over three sub-national regions — western, central, and eastern South Africa — are likely to change under a wide range of global climate mitigation policy scenarios.

    In a business-as-usual global climate policy scenario in which no emissions or climate targets are set or met, the projections show that for all three regions, there’s a greater-than 50 percent likelihood that mid-century temperatures will increase threefold over the current climate’s range of variability. But the risk of these mid-century temperature increases is effectively eliminated through more aggressive climate targets.

    The business-as-usual projections indicate that the risk of decreased precipitation levels in western and central South Africa is three to four times higher than the risk of increased precipitation levels. Under a global climate mitigation policy designed to cap global warming at 1.5 degrees Celsius by 2100, the risk of precipitation changes within South Africa toward the end of the century (2065-74) is similar to the risk during the 2030s in the business-as-usual scenario.

    Rising risks of substantially reduced precipitation levels throughout this century under a business-as-usual scenario suggest increased reliance and stress on the widespread water-efficiency measures established in the aftermath of the Day Zero drought. But a 1.5 C global climate mitigation policy would delay these risks by 30 years, giving South Africa ample lead time to prepare for and adapt to them.

    “Our analysis provides risk-based evidence on the benefits of climate mitigation policies as well as unavoidable climate impacts that will need to be addressed through adaptive measures,” says MIT Joint Program Deputy Director C. Adam Schlosser, the lead author of the study. “Global action to limit human-induced warming could give South Africa enough time to secure sufficient water supplies to sustain its population. Otherwise, anticipated climate shifts by the middle of the next decade may well make Day-Zero situations more common.”

    This study is part of an ongoing effort to assess the risks that climate change poses for South Africa’s agricultural, economic, energy and infrastructure sectors.

  •

    Rover images confirm Jezero crater is an ancient Martian lake

    The first scientific analysis of images taken by NASA’s Perseverance rover has now confirmed that Mars’ Jezero crater — which today is a dry, wind-eroded depression — was once a quiet lake, fed steadily by a small river some 3.7 billion years ago.

    The images also reveal evidence that the crater endured flash floods. This flooding was energetic enough to sweep up large boulders from tens of miles upstream and deposit them into the lakebed, where the massive rocks lie today.

    The new analysis, published today in the journal Science, is based on images of the outcropping rocks inside the crater on its western side. Satellites had previously shown that this outcrop, seen from above, resembled river deltas on Earth, where layers of sediment are deposited in the shape of a fan as the river feeds into a lake.

    Perseverance’s new images, taken from inside the crater, confirm that this outcrop was indeed a river delta. Based on the sedimentary layers in the outcrop, it appears that the river delta fed into a lake that was calm for much of its existence, until a dramatic shift in climate triggered episodic flooding at or toward the end of the lake’s history.

    “If you look at these images, you’re basically staring at this epic desert landscape. It’s the most forlorn place you could ever visit,” says Benjamin Weiss, professor of planetary sciences in MIT’s Department of Earth, Atmospheric and Planetary Sciences and a member of the analysis team. “There’s not a drop of water anywhere, and yet, here we have evidence of a very different past. Something very profound happened in the planet’s history.”

    As the rover explores the crater, scientists hope to uncover more clues to its climatic evolution. Now that they have confirmed the crater was once a lake environment, they believe its sediments could hold traces of ancient aqueous life. In its mission going forward, Perseverance will look for locations to collect and preserve sediments. These samples will eventually be returned to Earth, where scientists can probe them for Martian biosignatures.

    “We now have the opportunity to look for fossils,” says team member Tanja Bosak, associate professor of geobiology at MIT. “It will take some time to get to the rocks that we really hope to sample for signs of life. So, it’s a marathon, with a lot of potential.”

    Tilted beds

    On Feb. 18, 2021, the Perseverance rover landed on the floor of Jezero crater, a little more than a mile away from its western fan-shaped outcrop. In the first three months, the vehicle remained stationary as NASA engineers performed remote checks of the rover’s many instruments.

    During this time, two of Perseverance’s cameras, Mastcam-Z and the SuperCam Remote Micro-Imager (RMI), captured images of their surroundings, including long-distance photos of the outcrop’s edge and a formation known as Kodiak butte, a smaller outcrop that planetary geologists surmise may have once been connected to the main fan-shaped outcrop but has since partially eroded.

    Once the rover downlinked images to Earth, NASA’s Perseverance science team processed and combined the images, and were able to observe distinct beds of sediment along Kodiak butte in surprisingly high resolution. The researchers measured each layer’s thickness, slope, and lateral extent, finding that the sediment must have been deposited by flowing water into a lake, rather than by wind, sheet-like floods, or other geologic processes.

    The rover also captured similar tilted sediment beds along the main outcrop. These images, together with those of Kodiak, confirm that the fan-shaped formation was indeed an ancient delta and that this delta fed into an ancient Martian lake.

    “Without driving anywhere, the rover was able to solve one of the big unknowns, which was that this crater was once a lake,” Weiss says. “Until we actually landed there and confirmed it was a lake, it was always a question.”

    Boulder flow

    When the researchers took a closer look at images of the main outcrop, they noticed large boulders and cobbles embedded in the youngest, topmost layers of the delta. Some boulders measured as much as 1 meter across and were estimated to weigh up to several tons. These massive rocks, the team concluded, must have come from outside the crater, and were likely part of bedrock located on the crater rim or 40 or more miles upstream.

    Judging from their current location and dimensions, the team says the boulders were carried downstream and into the lakebed by a flash flood that flowed at up to 9 meters per second and moved up to 3,000 cubic meters of water per second.
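    As a rough back-of-the-envelope check, the two figures quoted above together imply the size of the flooding channel, via the simple continuity relation Q = v × A (discharge equals velocity times cross-sectional area). The arithmetic below is purely illustrative and is not part of the team’s analysis:

    ```python
    # Illustrative check using the peak-flow figures quoted above.
    # Assumes the simple continuity relation Q = v * A; not from the study itself.
    peak_velocity = 9.0       # m/s, peak flow speed
    peak_discharge = 3000.0   # m^3/s, peak discharge

    # Implied cross-sectional area of the flooding channel at peak flow:
    cross_section = peak_discharge / peak_velocity  # m^2
    print(f"Implied channel cross-section: {cross_section:.0f} m^2")
    ```

    A cross-section of roughly 330 square meters would correspond, for example, to a channel a few tens of meters wide and on the order of 10 meters deep at peak flow.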

    “You need energetic flood conditions to carry rocks that big and heavy,” Weiss says. “It’s a special thing that may be indicative of a fundamental change in the local hydrology or perhaps the regional climate on Mars.”

    Because the huge rocks lie in the upper layers of the delta, they represent the most recently deposited material. The boulders sit atop layers of older, much finer sediment. This stratification, the researchers say, indicates that for much of its existence, the ancient lake was filled by a gently flowing river. Fine sediments — and possibly organic material — drifted down the river, and settled into a gradual, sloping delta.

    However, the crater later experienced sudden flash floods that deposited large boulders onto the delta. Eventually the lake dried up, and over billions of years wind eroded the landscape, leaving the crater we see today.

    The cause of this climate turnaround is unknown, although Weiss says the delta’s boulders may hold some answers.

    “The most surprising thing that’s come out of these images is the potential opportunity to catch the time when this crater transitioned from an Earth-like habitable environment, to this desolate wasteland we see now,” he says. “These boulder beds may be records of this transition, and we haven’t seen this in other places on Mars.”

    This research was supported, in part, by NASA.

  •

    New “risk triage” platform pinpoints compounding threats to US infrastructure

    Over a 36-hour period in August, Hurricane Henri delivered record rainfall in New York City, where an aging storm-sewer system was not built to handle the deluge, resulting in street flooding. Meanwhile, an ongoing drought in California continued to overburden aquifers and extend statewide water restrictions. As climate change amplifies the frequency and intensity of extreme events in the United States and around the world, and the populations and economies they threaten grow and change, there is a critical need to make infrastructure more resilient. But how can this be done in a timely, cost-effective way?

    An emerging discipline called multi-sector dynamics (MSD) offers a promising solution. MSD homes in on compounding risks and potential tipping points across interconnected natural and human systems. Tipping points occur when these systems can no longer sustain multiple, co-evolving stresses, such as extreme events, population growth, land degradation, drinkable water shortages, air pollution, aging infrastructure, and increased human demands. MSD researchers use observations and computer models to identify key precursory indicators of such tipping points, providing decision-makers with critical information that can be applied to mitigate risks and boost resilience in infrastructure and managed resources.

    At MIT, the Joint Program on the Science and Policy of Global Change has since 2018 been developing MSD expertise and modeling tools and using them to explore compounding risks and potential tipping points in selected regions of the United States. In a two-hour webinar on Sept. 15, MIT Joint Program researchers presented an overview of the program’s MSD research tool set and its applications.  

    MSD and the risk triage platform

    “Multi-sector dynamics explores interactions and interdependencies among human and natural systems, and how these systems may adapt, interact, and co-evolve in response to short-term shocks and long-term influences and stresses,” says MIT Joint Program Deputy Director C. Adam Schlosser, noting that such analysis can reveal and quantify potential risks that would likely evade detection in siloed investigations. “These systems can experience cascading effects or failures after crossing tipping points. The real question is not just where these tipping points are in each system, but how they manifest and interact across all systems.”

    To address that question, the program’s MSD researchers have developed the MIT Socio-Environmental Triage (MST) platform, now publicly available for the first time. Focused on the continental United States, the first version of the platform analyzes present-day risks related to water, land, climate, the economy, energy, demographics, health, and infrastructure, and where these compound to create risk hot spots. It’s essentially a screening-level visualization tool that allows users to examine risks, identify hot spots when combining risks, and make decisions about how to deploy more in-depth analysis to solve complex problems at regional and local levels. For example, MST can identify hot spots for combined flood and poverty risks in the lower Mississippi River basin, and thereby alert decision-makers as to where more concentrated flood-control resources are needed.

    Successive versions of the platform will incorporate projections based on the MIT Joint Program’s Integrated Global System Modeling (IGSM) framework of how different systems and stressors may co-evolve into the future and thereby change the risk landscape. This enhanced capability could help uncover cost-effective pathways for mitigating and adapting to a wide range of environmental and economic risks.  

    MSD applications

    Five webinar presentations explored how MIT Joint Program researchers are applying the program’s risk triage platform and other MSD modeling tools to identify potential tipping points and risks in five key domains: water quality, land use, economics and energy, health, and infrastructure. 

    Joint Program Principal Research Scientist Xiang Gao described her efforts to apply a high-resolution U.S. water-quality model to calculate a location-specific, water-quality index over more than 2,000 river basins in the country. By accounting for interactions among climate, agriculture, and socioeconomic systems, various water-quality measures can be obtained ranging from nitrate and phosphate levels to phytoplankton concentrations. This modeling approach advances a unique capability to identify potential water-quality risk hot spots for freshwater resources.

    Joint Program Research Scientist Angelo Gurgel discussed his MSD-based analysis of how climate change, population growth, changing diets, crop-yield improvements and other forces that drive land-use change at the global level may ultimately impact how land is used in the United States. Drawing upon national observational data and the IGSM framework, the analysis shows that while current U.S. land-use trends are projected to persist or intensify between now and 2050, there is no evidence of any concerning tipping points arising throughout this period.  

    MIT Joint Program Research Scientist Jennifer Morris presented several examples of how the risk triage platform can be used to combine existing U.S. datasets and the IGSM framework to assess energy and economic risks at the regional level. For example, by aggregating separate data streams on fossil-fuel employment and poverty, one can target selected counties for clean energy job training programs as the nation moves toward a low-carbon future. 

    “Our modeling and risk triage frameworks can provide pictures of current and projected future economic and energy landscapes,” says Morris. “They can also highlight interactions among different human, built, and natural systems, including compounding risks that occur in the same location.”  

    MIT Joint Program research affiliate Sebastian Eastham, a research scientist at the MIT Laboratory for Aviation and the Environment, described an MSD approach to the study of air pollution and public health. Linking the IGSM with an atmospheric chemistry model, Eastham ultimately aims to better understand where the greatest health risks are in the United States and how they may compound throughout this century under different policy scenarios. Using the risk triage tool to combine current risk metrics for air quality and poverty in a selected county based on current population and air-quality data, he showed how one can rapidly identify cardiovascular and other air-pollution-induced disease risk hot spots.

    Finally, MIT Joint Program research affiliate Alyssa McCluskey, a lecturer at the University of Colorado at Boulder, showed how the risk triage tool can be used to pinpoint potential risks to roadways, waterways, and power distribution lines from flooding, extreme temperatures, population growth, and other stressors. In addition, McCluskey described how transportation and energy infrastructure development and expansion can threaten critical wildlife habitats.

    Enabling comprehensive, location-specific analyses of risks and hot spots within and among multiple domains, the Joint Program’s MSD modeling tools can be used to inform policymaking and investment from the municipal to the global level.

    “MSD takes on the challenge of linking human, natural, and infrastructure systems in order to inform risk analysis and decision-making,” says Schlosser. “Through our risk triage platform and other MSD models, we plan to assess important interactions and tipping points, and to provide foresight that supports action toward a sustainable, resilient, and prosperous world.”

    This research is funded by the U.S. Department of Energy’s Office of Science as an ongoing project.