More stories

  • Climate modeling confirms historical records showing rise in hurricane activity

    When forecasting how storms may change in the future, it helps to know something about their past. Judging from historical records dating back to the 1850s, hurricanes in the North Atlantic have become more frequent over the last 150 years.

    However, scientists have questioned whether this upward trend is a reflection of reality, or simply an artifact of lopsided record-keeping. If 19th-century storm trackers had access to 21st-century technology, would they have recorded more storms? This inherent uncertainty has kept scientists from relying on storm records, and the patterns within them, for clues to how climate influences storms.

    A new MIT study published today in Nature Communications has used climate modeling, rather than storm records, to reconstruct the history of hurricanes and tropical cyclones around the world. The study finds that North Atlantic hurricanes have indeed increased in frequency over the last 150 years, similar to what historical records have shown.

    In particular, major hurricanes, and hurricanes in general, are more frequent today than in the past. And those that make landfall appear to have grown more powerful, carrying more destructive potential.

    Curiously, while the North Atlantic has seen an overall increase in storm activity, the same trend was not observed in the rest of the world. The study found that the frequency of tropical cyclones globally has not changed significantly in the last 150 years.

    “The evidence does point, as the original historical record did, to long-term increases in North Atlantic hurricane activity, but no significant changes in global hurricane activity,” says study author Kerry Emanuel, the Cecil and Ida Green Professor of Atmospheric Science in MIT’s Department of Earth, Atmospheric, and Planetary Sciences. “It certainly will change the interpretation of climate’s effects on hurricanes — that it’s really the regionality of the climate, and that something happened to the North Atlantic that’s different from the rest of the globe. It may have been caused by global warming, which is not necessarily globally uniform.”

    Chance encounters

    The most comprehensive record of tropical cyclones is compiled in a database known as the International Best Track Archive for Climate Stewardship (IBTrACS). This historical record includes modern measurements from satellites and aircraft that date back to the 1940s. The database’s older records are based on reports from ships and islands that happened to be in a storm’s path. These earlier records date back to 1851, and overall the database shows an increase in North Atlantic storm activity over the last 150 years.

    “Nobody disagrees that that’s what the historical record shows,” Emanuel says. “On the other hand, most sensible people don’t really trust the historical record that far back in time.”

    Recently, scientists have used a statistical approach to identify storms that the historical record may have missed. To do so, they consulted all the digitally reconstructed shipping routes in the Atlantic over the last 150 years and mapped these routes over modern-day hurricane tracks. They then estimated the chance that a ship would have encountered a hurricane or missed it entirely. This analysis found that a significant number of early storms were likely absent from the historical record. Accounting for these missed storms, the scientists concluded that storm activity may not have changed over the last 150 years.
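
    The logic of that statistical correction can be sketched in code. The toy model below is illustrative only (the ship positions, the random-walk storm track, and the one-degree detection rule are all invented for the example), but it shows the basic move: overlay digitized ship positions on storm tracks and estimate the chance that a given storm would have been observed at all.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Stand-in for digitized 19th-century ship positions: (lon, lat)
    # cells on a coarse grid. Real studies use historical shipping
    # databases mapped against modern hurricane track archives.
    ship_points = set(map(tuple, rng.integers(0, 60, size=(5000, 2))))

    def detection_probability(track, ships, radius=1):
        """Fraction of a storm track passing within `radius` grid
        cells of any recorded ship position."""
        hits = 0
        for lon, lat in track:
            hits += any((lon + dx, lat + dy) in ships
                        for dx in range(-radius, radius + 1)
                        for dy in range(-radius, radius + 1))
        return hits / len(track)

    # A crude storm track modeled as a random walk across the grid.
    steps = rng.integers(-1, 2, size=(50, 2))
    track = [tuple(pt) for pt in (np.cumsum(steps, axis=0) + 30).tolist()]
    print(f"detection chance: {detection_probability(track, ship_points):.2f}")
    ```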

    But Emanuel points out that hurricane paths in the 19th century may have looked different from today’s tracks. What’s more, the scientists may have missed key shipping routes in their analysis, as older routes have not yet been digitized.

    “All we know is, if there had been a change (in storm activity), it would not have been detectable, using digitized ship records,” Emanuel says. “So I thought, there’s an opportunity to do better, by not using historical data at all.”

    Seeding storms

    Instead, he estimated past hurricane activity using dynamical downscaling — a technique that his group developed and has applied over the last 15 years to study climate’s effect on hurricanes. The technique starts with a coarse global climate simulation and embeds within this model a finer-resolution model that simulates features as small as hurricanes. The combined models are then fed with real-world measurements of atmospheric and ocean conditions. Emanuel then scatters the realistic simulation with hurricane “seeds” and runs the simulation forward in time to see which seeds bloom into full-blown storms.
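
    In outline, the downscaling step works like a survival filter on random seeds. The sketch below is a deliberately crude caricature of that idea, not Emanuel's actual model: the favorability field and growth rule are invented stand-ins for the reanalysis-driven environment and storm physics, and only seeds whose intensity crosses a threshold count as storms.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def favorability(lat):
        """Toy large-scale environment: a smooth field standing in
        for the climate state that feeds the embedded model."""
        return np.exp(-((lat - 15.0) / 10.0) ** 2)  # peaks in the tropics

    def seed_becomes_storm(lat, steps=100):
        """Step a seed's wind speed forward; growth depends on the
        environment plus noise. True if it reaches storm strength."""
        v = 5.0  # initial wind speed, m/s
        for _ in range(steps):
            v += 2.0 * favorability(lat) - 1.0 + rng.normal(0.0, 0.5)
            if v >= 33.0:   # roughly hurricane-force winds
                return True
            if v <= 0.0:    # seed dissipated
                return False
        return False

    # Scatter seeds across latitudes and count how many "bloom."
    lats = rng.uniform(0.0, 40.0, size=10_000)
    storms = sum(seed_becomes_storm(lat) for lat in lats)
    print(f"{storms} of {lats.size} seeds became storms")
    ```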

    For the new study, Emanuel embedded a hurricane model into a climate “reanalysis” — a type of climate model that combines observations from the past with climate simulations to generate accurate reconstructions of past weather patterns and climate conditions. He used a particular subset of climate reanalyses that only accounts for observations collected from the surface — for instance from ships, which have recorded weather conditions and sea surface temperatures consistently since the 1850s, as opposed to from satellites, which only began systematic monitoring in the 1970s.

    “We chose to use this approach to avoid any artificial trends brought about by the introduction of progressively different observations,” Emanuel explains.

    He ran an embedded hurricane model on three different climate reanalyses, simulating tropical cyclones around the world over the past 150 years. Across all three models, he observed “unequivocal increases” in North Atlantic hurricane activity.

    “There’s been this quite large increase in activity in the Atlantic since the mid-19th century, which I didn’t expect to see,” Emanuel says.

    Within this overall rise in storm activity, he also observed a “hurricane drought” — a period during the 1970s and 80s when the number of yearly hurricanes briefly dropped. This pause in storm activity can also be seen in historical records, and Emanuel’s group proposes a cause: sulfate aerosols, byproducts of fossil fuel combustion, likely set off a cascade of climate effects that cooled the North Atlantic and temporarily suppressed hurricane formation.

    “The general trend over the last 150 years was increasing storm activity, interrupted by this hurricane drought,” Emanuel notes. “And at this point, we’re more confident of why there was a hurricane drought than why there is an ongoing, long-term increase in activity that began in the 19th century. That is still a mystery, and it bears on the question of how global warming might affect future Atlantic hurricanes.”

    This research was supported, in part, by the National Science Foundation.

  • Scientists and musicians tackle climate change together

    Audiences may travel long distances to see their favorite musical acts in concert or to attend large music festivals, which can add to their personal carbon footprint of emissions that are steadily warming the planet. But these same audiences, and the performers they follow, are often quite aware of the dangers of climate change and eager to contribute to ways of curbing those emissions.

    How should the industry reconcile these two perspectives, and how should it harness the enormous influence that musicians have on their fans to help promote action on climate change?

    That was the focus of a wide-ranging discussion on Monday hosted by MIT’s Environmental Solutions Initiative, titled “Artists and scientists together on climate solutions.” The event, which was held live at the Media Lab’s Bartos Theater and streamed online, featured John Fernandez, director of ESI; Dava Newman, director of the Media Lab; Tony McGuinness, a musician with the group Above and Beyond; and Anna Johnson, the sustainability and environment officer at Involved Group, an organization dedicated to embedding sustainability in business operations in the arts and culture fields.

    Fernandez pointed out in opening the discussion that when it comes to influencing people’s attitudes and behavior, changes tend to come about not just through information from a particular field, but rather from a whole culture. “We started thinking about how we might work with artists, how to have scientists and engineers, inventors, and designers working with artists on the challenges that we really need to face,” he said.

    Dealing with the climate change issue, he said, “is not about 2050 or 2100. This is about 2030. This is about this decade. This is about the next two or three years, really shifting that curve” to lowering the world’s greenhouse gas emissions. “It’s not going to be done just with science and engineering,” he added. “It’s got to be done with artists and business and everyone else. It’s not only ‘all of the above’ solutions, it’s ‘all of the above’ people, coming together to solve this problem.”

    Newman, who is also a professor in MIT’s Department of Aeronautics and Astronautics and has served as a NASA deputy administrator, said that while scientists and engineers can produce vast amounts of useful data that clearly demonstrate the dramatic changes the Earth’s climate is undergoing, communicating that information effectively is often a challenge for these specialists. “That data is just the data, but that doesn’t change the hearts and minds,” she said.

    “As scientists, having the data from our satellites, looking down, but also flying airplanes into the atmosphere, … we have the sensors, and then what can we do with it all? … How do we change human behavior? That’s the part I don’t know how to do,” Newman said. “I can have the technology, I can get precision measurements, I can study it, but really at the end of the day, we have to change human behavior, and that is so hard.”

    And that’s where the world of art and music can play a part, she said. “The best way that I know how to do it is with artistic experiences. You can have one moving experience and when you wake up tomorrow, maybe you’re going to do something a little different.” To help generate the compassion and empathy needed to affect behavior positively, she said, “that’s where we turn to the storytellers. We turn to the visionaries.”

    McGuinness, whose electronic music trio has performed for millions of people around the world, said that his own awareness of the urgency of the climate issue came from his passion for scuba diving, and the dramatic changes he has seen over the last two decades. Returning to a coral reef off Palau in the South Pacific that had once been a lush, brightly colored ecosystem, he found that “immediately when you put your face under the water, you’re looking at the surface of the moon. It was a horrible shock to see this.”

    After this and other similar diving experiences, he said, “I just came away shocked and stunned,” and realizing the kinds of underwater experiences he had enjoyed would no longer exist for his children. After reading more on the subject of global warming,  “that really sort of tipped me over the edge. And I was like, this is probably the most important thing for living beings now. And that’s sort of where I’ve remained ever since.”

    While his group Above and Beyond has performed one song specifically related to global warming, he doesn’t expect that to be the most impactful way of using their influence. Rather, they’re trying to lead by example, he said, by paying more attention to everything from the supply chains of the merchandise sold at concerts to the emissions generated by travel to the concerts. They’re also being selective about concert venues and making an effort to find performance spaces that are making a significant effort to curb their emissions.

    “If people start voting with their wallets,” McGuinness said, “and there are companies that are doing better than others and are doing the right thing, maybe it’ll catch on. I guess that’s what we can hope for.”

    Understanding these kinds of issues, involving supply chains, transportation, and facilities associated with the music industry, has been the focus of much of Johnson’s work, through the organization Involved Group, which has entered into a collaboration with MIT through the Environmental Solutions Initiative. “It’s these kinds of novel partnerships that have so much potential to catalyze the change that we need to see at an incredible pace,” she said. Already, her group has worked with MIT on mapping out where emissions occur throughout the various aspects of the music industry.

    At a recent music festival in London, she said, the group interviewed hundreds of participants, including audience members, band members, and the crew. “We explored people’s level of awareness of the issues around climate change and environmental degradation,” she said. “And what was really interesting was that there was clearly a lot of awareness of the issue across those different stakeholders, and what felt like a real, genuine level of concern and also of motivation, to want to deepen their understanding of what their contribution on a personal level really meant.”

    Working together across the boundaries of different disciplines and areas of expertise could be crucial to winning the battle against global warming, Newman said. “That’s usually how breakthroughs work,” she said. “If we’re really looking to have impact, it’s going to be from teams of people who are trained across the disciplines.” She pointed out that 90 percent of MIT students are also musicians: “It does go together!” she said. “I think going forward, we have to create new academia, new opportunities that are truly multidisciplinary.”

  • SMART researchers develop method for early detection of bacterial infection in crops

    Researchers from the Disruptive and Sustainable Technologies for Agricultural Precision (DiSTAP) Interdisciplinary Research Group (IRG) of the Singapore-MIT Alliance for Research and Technology (SMART), MIT’s research enterprise in Singapore, and their local collaborators from Temasek Life Sciences Laboratory (TLL) have developed a rapid Raman spectroscopy-based method for detecting and quantifying early bacterial infection in crops. The Raman spectral biomarkers and diagnostic algorithm enable noninvasive and early diagnosis of bacterial infections in crop plants, which is critical for plant disease management and agricultural productivity.

    Due to the increasing demand for global food supply and security, there is a growing need to improve agricultural production systems and increase crop productivity. Globally, bacterial pathogen infection in crop plants is one of the major contributors to agricultural yield losses. Climate change also adds to the problem by accelerating the spread of plant diseases. Hence, developing methods for rapid and early detection of pathogen-infected crops is important to improve plant disease management and reduce crop loss.

    The breakthrough by SMART and TLL researchers offers a faster and more accurate method to detect bacterial infection in crop plants at an earlier stage, as compared to existing techniques. The new results appear in a paper titled “Rapid detection and quantification of plant innate immunity response using Raman spectroscopy” published in the journal Frontiers in Plant Science.

    “The early detection of pathogen-infected crop plants is a significant step to improve plant disease management,” says Chua Nam Hai, DiSTAP co-lead principal investigator, professor, TLL deputy chair, and co-corresponding author. “It will allow the fast and selective removal of pathogen load and curb the further spread of disease to other neighboring crops.”

    Traditionally, plant disease diagnosis involves a simple visual inspection of plants for disease symptoms and severity. “Visual inspection methods are often ineffective, as disease symptoms usually manifest only at relatively later stages of infection, when the pathogen load is already high and reparative measures are limited. Hence, new methods are required for rapid and early detection of bacterial infection. The idea would be akin to having medical tests to identify human diseases at an early stage, instead of waiting for visual symptoms to show, so that early intervention or treatment can be applied,” says MIT Professor Rajeev Ram, who is a DiSTAP principal investigator and co-corresponding author on the paper.

    While existing techniques, such as current molecular detection methods, can detect bacterial infection in plants, they are often limited in their use. Molecular detection methods largely depend on the availability of pathogen-specific gene sequences or antibodies to identify bacterial infection in crops; the implementation is also time-consuming and nonadaptable for on-site field application due to the high cost and bulky equipment required, making it impractical for use in agricultural farms.

    “At DiSTAP, we have developed a quantitative Raman spectroscopy-based algorithm that can help farmers to identify bacterial infection rapidly. The developed diagnostic algorithm makes use of Raman spectral biomarkers and can be easily implemented in cloud-based computing and prediction platforms. It is more effective than existing techniques as it enables accurate identification and early detection of bacterial infection, both of which are crucial to saving crop plants that would otherwise be destroyed,” explains Gajendra Pratap Singh, scientific director and principal investigator at DiSTAP and co-lead author.
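
    The study's algorithm itself is not reproduced here, but the general shape of such a pipeline can be sketched: reduce each Raman spectrum to intensities at a few biomarker bands, then train a classifier to return an infected-or-healthy call. In the sketch below, the band positions, the synthetic spectra, and the logistic-regression model are all placeholder assumptions, not details from the paper.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)
    shifts = np.arange(400, 1801)        # Raman shift axis, cm^-1
    BANDS = [1004, 1156, 1525]           # hypothetical biomarker bands

    def band_intensities(spectrum, bands, window=5):
        """Mean intensity in a small window around each band."""
        return [spectrum[np.abs(shifts - b) <= window].mean() for b in bands]

    def fake_spectrum(infected):
        """Synthetic spectrum: infection depresses the biomarker
        peaks (a stand-in for whatever the real biomarkers show)."""
        s = rng.normal(1.0, 0.05, shifts.size)
        for b in BANDS:
            s[np.abs(shifts - b) <= 5] += 0.8 if infected else 2.0
        return s

    X = [band_intensities(fake_spectrum(i % 2 == 0), BANDS) for i in range(200)]
    y = [i % 2 == 0 for i in range(200)]  # True = infected

    clf = LogisticRegression().fit(X, y)
    test = band_intensities(fake_spectrum(True), BANDS)
    print("infected" if clf.predict([test])[0] else "healthy")
    ```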

    A portable Raman system can be used on farms and provides farmers with an accurate and simple yes-or-no response when used to test for the presence of bacterial infections in crops. The development of this rapid and noninvasive method could improve plant disease management and have a transformative impact on agricultural farms by efficiently reducing agricultural yield loss and increasing productivity.

    “Using the diagnostic algorithm method, we experimented on several edible plants such as choy sum,” says DiSTAP and TLL principal investigator and co-corresponding author Rajani Sarojam. “The results showed that the Raman spectroscopy-based method can swiftly detect and quantify innate immunity response in plants infected with bacterial pathogens. We believe that this technology will be beneficial for agricultural farms to increase their productivity by reducing their yield loss due to plant diseases.”

    The researchers are currently working on the development of high-throughput, custom-made portable or hand-held Raman spectrometers that will allow Raman spectral analysis to be quickly and easily performed on field-grown crops.

    SMART and TLL researchers discovered the Raman spectral biomarkers and developed the diagnostic algorithm. TLL also confirmed and validated the detection method using mutant plants. The research was carried out by SMART and supported by the National Research Foundation of Singapore under its Campus for Research Excellence And Technological Enterprise (CREATE) program.

    SMART was established by MIT and the NRF in 2007. The first entity in CREATE developed by NRF, SMART serves as an intellectual and innovation hub for research interactions between MIT and Singapore, undertaking cutting-edge research projects in areas of interest to both Singapore and MIT. SMART currently comprises an Innovation Center and five IRGs: Antimicrobial Resistance, Critical Analytics for Manufacturing Personalized-Medicine, DiSTAP, Future Urban Mobility, and Low Energy Electronic Systems. SMART research is funded by the NRF under the CREATE program.

    Led by Professor Michael Strano of MIT and Professor Chua Nam Hai of Temasek Life Sciences Laboratory, the DiSTAP program addresses deep problems in food production in Singapore and the world by developing a suite of impactful and novel analytical, genetic, and biomaterial technologies. The goal is to fundamentally change how plant biosynthetic pathways are discovered, monitored, engineered, and ultimately translated to meet the global demand for food and nutrients. Scientists from MIT, TLL, Nanyang Technological University, and the National University of Singapore are collaboratively developing new tools for the continuous measurement of important plant metabolites and hormones for novel discovery, deeper understanding and control of plant biosynthetic pathways in ways not yet possible, especially in the context of green leafy vegetables; leveraging these new techniques to engineer plants with highly desirable properties for global food security, including high-yield density production, and drought and pathogen resistance; and applying these technologies to improve urban farming.

  • Energy hackers give a glimpse of a postpandemic future

    After going virtual in 2020, the MIT EnergyHack was back on campus last weekend in a brand-new hybrid format that saw teams participate both in person and virtually from across the globe. While the hybrid format presented new challenges to the organizing team, it also allowed for one of the most diverse and inspiring iterations of the event to date.

    “Organizing a hybrid event was a challenging but important goal in 2021 as we slowly come out of the pandemic, but it was great to realize the benefits of the format this year,” says Kailin Graham, a graduate student in MIT’s Technology and Policy Program and one of the EnergyHack communications directors. “Not only were we able to get students back on campus and taking advantage of those important in-person interactions, but preserving a virtual avenue meant that we were still able to hear brilliant ideas from those around the world who might not have had the opportunity to contribute otherwise, and that’s what the EnergyHack is really about.”

    In fact, of the over 300 participants registered for the event, more than a third participated online, and two of the three grand prize winners participated entirely virtually. Teams of students at any degree level from any institution were welcome, and the event saw an incredible range of backgrounds and expertise, from undergraduates to MBAs, put their heads together to create innovative solutions.

    This year’s event was supported by a host of energy partners both in industry and within MIT. The MIT Energy and Climate Club worked with sponsoring organizations Smartflower, ChargePoint, Edison Energy, Line Vision, Chevron, Shell, and Sterlite Power to develop seven problem statements for hackers, with each judged by representatives from their respective organizations. The challenges ranged from envisioning the future of electric vehicle fueling to quantifying the social and environmental benefits of renewable energy projects.

    Hackers had 36 hours to come up with a solution to one challenge, and teams then presented these solutions in a short pitch to a judging panel. Finalists from each challenge progressed to the final judging round to pitch against each other in pursuit of three grand prizes. Team COPrs came in third, receiving $1,000 for their solution to the Line Vision challenge; Crown Joules snagged second place and $1,500 for their approach to the ChargePoint problem; and Feel AMPowered took first place and $2,000 for their innovative solution to the Smartflower challenge.

    In addition to a new format, this year’s EnergyHack also featured a new emphasis on climate change impacts and the energy transition. According to Arina Khotimsky, co-managing director of EnergyHack 2021, “Moving forward after this year’s rebranding of the MIT Energy and Climate Club, we were hoping to carry this aim to EnergyHack. It was incredibly exciting to have ChargePoint and SmartFlower leading as our Sustainability Circle-tier sponsors and bringing their impactful innovations to the conversations at EnergyHack 2021.”

    To the organizing team, whose members range from sophomores to MBAs, this aspect of the event was especially important, and their hope was for the event to inspire a generation of young energy and climate leaders — a hope, according to them, that seems to have been fulfilled.

    “I was floored by the positive feedback we received from hackers, both in-person and virtual, about how much they enjoyed the hackathon,” says Graham. “It’s all thanks to our team of incredibly hardworking organizing directors who made EnergyHack 2021 what it was. It was incredibly rewarding seeing everyone’s impact on the event, and we are looking forward to seeing how it evolves in the future.”

  • Nanograins make for a seismic shift

    In Earth’s crust, tectonic blocks slide and grind past each other like enormous ships loosed from anchor. Earthquakes are generated along these fault zones when enough stress builds for a block to stick, then suddenly slip.

    These slips can be aided by several factors that reduce friction within a fault zone, such as hotter temperatures or pressurized gases that can separate blocks like pucks on an air-hockey table. The decreasing friction enables one tectonic block to accelerate against the other until it runs out of energy. Seismologists have long believed this kind of frictional instability can explain how all crustal earthquakes start. But that might not be the whole story.

    In a study published today in Nature Communications, scientists Hongyu Sun and Matej Pec, from MIT’s Department of Earth, Atmospheric and Planetary Sciences (EAPS), find that ultra-fine-grained crystals within fault zones can behave like low-viscosity fluids. The finding offers an alternative explanation for the instability that leads to crustal earthquakes. It also suggests a link between quakes in the crust and other types of temblors that occur deep in the Earth.

    Nanograins are commonly found in rocks from seismic environments along the smooth surface of “fault mirrors.” These polished, reflective rock faces betray the slipping, sliding forces of past earthquakes. However, it was unclear whether the crystals caused quakes or were merely formed by them.

    To better characterize how these crystals behaved within a fault, the researchers used a planetary ball milling machine to pulverize granite rocks into particles resembling those found in nature. Like a super-powered washing machine filled with ceramic balls, the machine pounded the rock until all its crystals were about 100 nanometers in width, each grain 1/2,000 the size of an average grain of sand.
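
    That size comparison holds up to a quick check, taking a typical sand grain to be about 0.2 mm across:

    $$100\ \text{nm} \times 2000 = 2 \times 10^{5}\ \text{nm} = 0.2\ \text{mm}.$$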

    After packing the nanopowder into postage-stamp-sized cylinders jacketed in gold, the researchers then subjected the material to stresses and heat, creating laboratory miniatures of real fault zones. This process enabled them to isolate the effect of the crystals from the complexity of other factors involved in an actual earthquake.

    The researchers report that the crystals were extremely weak when shearing was initiated — an order of magnitude weaker than more common microcrystals. But the nanocrystals became significantly stronger when the deformation rate was accelerated. Pec, professor of geophysics and the Victor P. Starr Career Development Chair, compares this characteristic, called “rate-strengthening,” to stirring honey in a jar. Stirring the honey slowly is easy, but becomes more difficult the faster you stir.
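
    The contrast can be written down compactly. In the standard rate-and-state description of friction (textbook notation, not taken from the paper), frictional strength varies with slip velocity $v$ roughly as

    $$\mu(v) \approx \mu_0 + (a - b)\,\ln\frac{v}{v_0},$$

    and stick-slip earthquakes can nucleate only where $a - b < 0$, that is, where the fault weakens as it speeds up. A viscous, honey-like layer instead obeys $\tau = \eta\,\dot{\gamma}$: shear resistance grows with strain rate, which is the rate-strengthening behavior the nanocrystals showed.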

    The experiment suggests something similar happens in fault zones. As tectonic blocks accelerate past each other, the crystals gum things up between them like honey stirred in a seismic pot.

    Sun, the study’s lead author and an EAPS graduate student, explains that their finding runs counter to the dominant frictional weakening theory of how earthquakes start. That theory predicts that material on a fault’s surfaces weakens as the fault block accelerates, with friction steadily decreasing. The nanocrystals did just the opposite. However, the crystals’ intrinsic weakness could mean that when enough of them accumulate within a fault, they can give way, causing an earthquake.

    “We don’t totally disagree with the old theorem, but our study really opens new doors to explain the mechanisms of how earthquakes happen in the crust,” Sun says.

    The finding also suggests a previously unrecognized link between earthquakes in the crust and the earthquakes that rumble hundreds of kilometers beneath the surface, where the same tectonic dynamics aren’t at play. That deep, there are no tectonic blocks to grind against each other, and even if there were, the immense pressure would prevent the type of quakes observed in the crust that necessitate some dilatancy and void creation.

    “We know that earthquakes happen all the way down to really big depths where this motion along a frictional fault is basically impossible,” says Pec. “And so clearly, there must be different processes that allow for these earthquakes to happen.”

    Possible mechanisms for these deep-Earth tremors include “phase transitions,” which occur due to atomic rearrangement in minerals and are accompanied by a volume change, and other kinds of metamorphic reactions, such as dehydration of water-bearing minerals, in which the released fluid is pumped through pores and destabilizes a fault. These mechanisms are all characterized by a weak, rate-strengthening layer.

    If weak, rate-strengthening nanocrystals are abundant in the deep Earth, they could present another possible mechanism, says Pec. “Maybe crustal earthquakes are not a completely different beast than the deeper earthquakes. Maybe they have something in common.”

  • J-WAFS launches Food and Climate Systems Transformation Alliance

    Food systems around the world are increasingly at risk from the impacts of climate change. At the same time, these systems, which include all activities from food production to consumption and food waste, are responsible for about one-third of the human-caused greenhouse gas emissions warming the planet. 

    To drive research-based innovation that will make food systems more resilient and sustainable, MIT’s Abdul Latif Jameel Water and Food Systems Lab (J-WAFS) announced the launch of a new initiative at an event during the UN Climate Change Conference in Glasgow, Scotland, last week. The initiative, called the Food and Climate Systems Transformation (FACT) Alliance, will better connect researchers to farmers, food businesses, policymakers, and other food systems stakeholders around the world. 

    “Time is not on our side,” says Greg Sixt, the director of the FACT Alliance and research manager for food and climate systems at J-WAFS. “To date, the research community hasn’t delivered actionable solutions quickly enough or in the policy-relevant form needed if time-critical changes are to be made to our food systems. The FACT Alliance aims to change this.”

    Why, in fact, do our food systems need transformation?

    At COP26 (which stands for “conference of the parties” to the UN Framework Convention on Climate Change, being held for the 26th time this year), a number of countries have pledged to end deforestation, reduce methane emissions, and cease public financing of coal power. In his keynote address at the FACT Alliance event, Professor Pete Smith of the University of Aberdeen, an alliance member institution, noted that food and agriculture also need to be addressed because “there’s an interaction between climate change and the food system.” 

    The UN Intergovernmental Panel on Climate Change warns that a two-degree Celsius increase in average global temperature over preindustrial levels could trigger a worldwide food crisis, and emissions from food systems alone could push us past the two-degree mark even if energy-related emissions could be zeroed out. 

    Smith said dramatic and rapid transformations are needed to deliver safe, nutritious food for all, with reduced environmental impact and increased resilience to climate change. With a global network of leading research institutions and collaborating stakeholder organizations, the FACT Alliance aims to facilitate new, solutions-oriented research for addressing the most challenging aspects of food systems in the era of climate change. 

    How the FACT Alliance works

    Central to the work of the FACT Alliance is the development of new methodologies for aligning data across scales and food systems components, improving data access, integrating research across the diverse disciplines that address aspects of food systems, making stakeholders partners in the research process, and assessing impact in the context of complex and interconnected food and climate systems. 

    The FACT Alliance will conduct what’s known as “convergence research,” which meets complex problems with approaches that embody deep integration across disciplines. This kind of research calls for close association with the stakeholders who both make decisions and are directly affected by how food systems work, be they farmers, extension services (i.e., agricultural advisories), policymakers, international aid organizations, consumers, or others. By inviting stakeholders and collaborators to be part of the research process, the FACT Alliance allows for engagement at the scale, geography, and scope that is most relevant to the needs of each, integrating global and local teams to achieve better outcomes. 

    “Doing research in isolation of all the stakeholders and in isolation of the goals that we want to achieve will not deliver the transformation that we need,” said Smith. “The problem is too big for us to solve in isolation, and we need broad alliances to tackle the issue, and that’s why we developed the FACT Alliance.” 

    Members and collaborators

    Led by MIT’s J-WAFS, the FACT Alliance is currently made up of 16 core members and an associated network of collaborating stakeholder organizations. 

    “As the central convener of MIT research on food systems, J-WAFS catalyzes collaboration across disciplines,” says Maria Zuber, vice president for research at MIT. “Now, by bringing together a world-class group of research institutions and stakeholders from key sectors, the FACT Alliance aims to advance research that will help alleviate climate impacts on food systems and mitigate food system impacts on climate.”

    J-WAFS co-hosted the COP26 event “Bridging the Science-Policy Gap for Impactful, Demand-Driven Food Systems Innovation” with Columbia University, the American University of Beirut, and the CGIAR research program Climate Change, Agriculture and Food Security (CCAFS). The event featured a panel discussion with several FACT Alliance members and the UK Foreign, Commonwealth and Development Office (FCDO).

  • Q&A: Options for the Diablo Canyon nuclear plant

    The Diablo Canyon nuclear plant in California, the only one still operating in the state, is set to close in 2025. A team of researchers at MIT’s Center for Advanced Nuclear Energy Systems, Abdul Latif Jameel Water and Food Systems Lab, and Center for Energy and Environmental Policy Research; Stanford’s Precourt Energy Institute; and energy analysis firm LucidCatalyst LLC have analyzed the potential benefits the plant could provide if its operation were extended to 2030 or 2045.

    They found that this nuclear plant could simultaneously help to stabilize the state’s electric grid, provide desalinated water to supplement the state’s chronic water shortages, and provide carbon-free hydrogen fuel for transportation. MIT News asked report co-authors Jacopo Buongiorno, the TEPCO Professor of Nuclear Science and Engineering, and John Lienhard, the Jameel Professor of Water and Food, to discuss the group’s findings.

    Q: Your report suggests co-locating a major desalination plant alongside the existing Diablo Canyon power plant. What would be the potential benefits from operating a desalination plant in conjunction with the power plant?

    Lienhard: The cost of desalinated water produced at Diablo Canyon would be lower than for a stand-alone plant because the cost of electricity would be significantly lower and you could take advantage of the existing infrastructure for the intake of seawater and the outfall of brine. Electricity would be cheaper because the location takes advantage of Diablo Canyon’s unique capability to provide low cost, zero-carbon baseload power.

    Depending on the scale at which the desalination plant is built, you could make a very significant impact on the water shortfalls of state and federal projects in the area. In fact, one of the numbers that came out of this study was that an intermediate-sized desalination plant there would produce more fresh water than the highest estimate of the net yield from the proposed Delta Conveyance Project on the Sacramento River. You could get that amount of water at Diablo Canyon for an investment cost less than half as large, and without the associated impacts that would come with the Delta Conveyance Project.

    And the technology envisioned for desalination here, reverse osmosis, is available off the shelf. You can buy this equipment today. In fact, it’s already in use in California and thousands of other places around the world.

    Q: You discuss in the report three potential products from the Diablo Canyon plant: desalinated water, power for the grid, and clean hydrogen. How well can the plant accommodate all of those efforts, and are there advantages to combining them as opposed to doing any one of them separately?

    Buongiorno: California, like many other regions in the world, is facing multiple challenges as it seeks to reduce carbon emissions on a grand scale. First, the wide deployment of intermittent energy sources such as solar and wind creates a great deal of variability on the grid that can be balanced by dispatchable firm power generators like Diablo. So, the first mission for Diablo is to continue to provide reliable, clean electricity to the grid.

    The second challenge is the prolonged drought and water scarcity for the state in general. And one way to address that is water desalination co-located with the nuclear plant at the Diablo site, as John explained.

    The third challenge is decarbonizing the transportation sector. A possible approach is replacing conventional cars and trucks with vehicles powered by fuel cells, which consume hydrogen. Hydrogen has to be produced from a primary energy source. Nuclear power, through a process called electrolysis, can do that quite efficiently and in a manner that is carbon-free.
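
    For a rough sense of scale, hydrogen output from electrolysis can be estimated from plant power alone. The numbers below are round assumptions, not figures from the report: roughly 2.2 gigawatts for Diablo Canyon's combined output and about 50 kilowatt-hours of electricity per kilogram of hydrogen for present-day electrolyzers.

    ```python
    # Back-of-envelope hydrogen yield if all output went to electrolysis.
    # All inputs are illustrative assumptions, not values from the report.
    plant_power_mw = 2_200           # approx. Diablo Canyon combined output
    kwh_per_kg_h2 = 50               # typical modern electrolyzer energy use
    capacity_factor = 0.9            # nuclear plants run near-continuously

    kwh_per_day = plant_power_mw * 1_000 * 24 * capacity_factor
    tonnes_per_day = kwh_per_day / kwh_per_kg_h2 / 1_000
    print(f"~{tonnes_per_day:,.0f} tonnes of hydrogen per day")  # ~950
    ```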

    Our economic analysis took into account the expected revenue from selling these multiple products — electricity for the grid, hydrogen for the transportation sector, water for farmers or other local users — as well as the costs associated with deploying the new facilities needed to produce desalinated water and hydrogen. We found that, if Diablo’s operating license was extended until 2035, it would cut carbon emissions by an average of 7 million metric tons a year — a more than 11 percent reduction from 2017 levels — and save ratepayers $2.6 billion in power system costs.

    Further delaying the retirement of Diablo to 2045 would spare 90,000 acres of land that would need to be dedicated to renewable energy production to replace the facility’s capacity, and it would save ratepayers up to $21 billion in power system costs.

    Finally, if Diablo was operated as a polygeneration facility that provides electricity, desalinated water, and hydrogen simultaneously, its value, quantified in terms of dollars per unit electricity generated, could increase by 50 percent.

    Lienhard: Most of the desalination scenarios that we considered did not consume the full electrical output of that plant, meaning that under most scenarios you would continue to make electricity and do something with it, beyond just desalination. I think it’s also important to remember that this power plant produces 15 percent of California’s carbon-free electricity today and is responsible for 8 percent of the state’s total electrical production. In other words, Diablo Canyon is a very large factor in California’s decarbonization. When or if this plant goes offline, the near-term outcome is likely to be increased reliance on natural gas to produce electricity, meaning a rise in California’s carbon emissions.

    Q: This plant in particular has been highly controversial since its inception. What’s your assessment of the plant’s safety beyond its scheduled shutdown, and how do you see this report as contributing to the decision-making about that shutdown?

    Buongiorno: The Diablo Canyon Nuclear Power Plant has a very strong safety record. The potential safety concern for Diablo is related to its proximity to several fault lines. Being located in California, the plant was designed to withstand large earthquakes to begin with. Following the Fukushima accident in 2011, the Nuclear Regulatory Commission reviewed the plant’s ability to withstand external events (e.g., earthquakes, tsunamis, floods, tornadoes, wildfires, hurricanes) of exceptionally rare and severe magnitude. After nine years of assessment the NRC’s conclusion is that “existing seismic capacity or effective flood protection [at Diablo Canyon] will address the unbounded reevaluated hazards.” That is, Diablo was designed and built to withstand even the rarest and strongest earthquakes that are physically possible at this site.

    As an additional level of protection, the plant has been retrofitted with special equipment and procedures meant to ensure reliable cooling of the reactor core and spent fuel pool under a hypothetical scenario in which all design-basis safety systems have been disabled by a severe external event.

    Lienhard: As for the potential impact of this report, PG&E [the California utility] has already made the decision to shut down the plant, and we and others hope that decision will be revisited and reversed. We believe that this report gives the relevant stakeholders and policymakers a lot of information about options and value associated with keeping the plant running, and about how California could benefit from clean water and clean power generated at Diablo Canyon. It’s not up to us to make the decision, of course — that is a decision that must be made by the people of California. All we can do is provide information.

    Q: What are the biggest challenges or obstacles to seeing these ideas implemented?

    Lienhard: California has very strict environmental protection regulations, and it’s good that they do. One of the areas of great concern to California is the health of the ocean and protection of the coastal ecosystem. As a result, very strict rules are in place about the intake and outfall of both power plants and desalination plants, to protect marine life. Our analysis suggests that this combined plant can be implemented within the parameters prescribed by the California Ocean Plan and that it can meet the regulatory requirements.

    We believe that deeper analysis would be needed before you could proceed. You would need to do site studies and really get out into the water and look in detail at what’s there. But the preliminary analysis is positive. A second challenge is that the discourse in California around nuclear power has generally not been very supportive, and similarly some groups in California oppose desalination. We expect that both of those points of view would be part of the conversation about whether or not to proceed with this project.

    Q: How particular is this analysis to the specifics of this location? Are there aspects of it that apply to other nuclear plants, domestically or globally?

    Lienhard: Hundreds of nuclear plants around the world are situated along the coast, and many are in water-stressed regions. Although our analysis focused on Diablo Canyon, we believe that the general findings are applicable to many other seaside nuclear plants, so this approach and these conclusions could potentially be applied at hundreds of sites worldwide.

  • Scientists project increased risk to water supplies in South Africa this century

    In 2018, Cape Town, South Africa’s second most populous city, came very close to running out of water as the multi-year “Day Zero” drought depleted its reservoirs. Researchers from Stanford University have since determined that climate change made this extreme drought five to six times more likely, and warned that many more Day Zero events could occur in regions with similar climates in the future. A better understanding of likely surface air temperature and precipitation trends in South Africa and other dry, populated areas around the world in the coming decades could empower decision-makers to pursue science-based climate mitigation and adaptation measures designed to reduce the risk of future Day Zero events.

    Toward that end, researchers at the MIT Joint Program on the Science and Policy of Global Change, International Food Policy Research Institute, and CGIAR have produced modeled projections of 21st-century changes in seasonal surface air temperature and precipitation for South Africa that systematically and comprehensively account for uncertainties in how Earth and socioeconomic systems behave and co-evolve. Presented in a study in the journal Climatic Change, these projections show how temperature and precipitation over three sub-national regions — western, central, and eastern South Africa — are likely to change under a wide range of global climate mitigation policy scenarios.

    In a business-as-usual global climate policy scenario in which no emissions or climate targets are set or met, the projections show that for all three regions, there’s a greater-than 50 percent likelihood that mid-century temperatures will increase threefold over the current climate’s range of variability. But the risk of these mid-century temperature increases is effectively eliminated through more aggressive climate targets.

    The business-as-usual projections indicate that the risk of decreased precipitation levels in western and central South Africa is three to four times higher than the risk of increased precipitation levels. Under a global climate mitigation policy designed to cap global warming at 1.5 degrees Celsius by 2100, the risk of precipitation changes within South Africa toward the end of the century (2065-74) is similar to the risk during the 2030s in the business-as-usual scenario.

    Rising risks of substantially reduced precipitation levels throughout this century under a business-as-usual scenario suggest increased reliance and stress on the widespread water-efficiency measures established in the aftermath of the Day Zero drought. But a 1.5 C global climate mitigation policy would delay these risks by 30 years, giving South Africa ample lead time to prepare for and adapt to them.

    “Our analysis provides risk-based evidence on the benefits of climate mitigation policies as well as unavoidable climate impacts that will need to be addressed through adaptive measures,” says MIT Joint Program Deputy Director C. Adam Schlosser, the lead author of the study. “Global action to limit human-induced warming could give South Africa enough time to secure sufficient water supplies to sustain its population. Otherwise, anticipated climate shifts by the middle of the next decade may well make Day-Zero situations more common.”

    This study is part of an ongoing effort to assess the risks that climate change poses for South Africa’s agricultural, economic, energy and infrastructure sectors.