More stories

  • Passive cooling system could benefit off-grid locations

    As the world gets warmer, the use of power-hungry air conditioning systems is projected to increase significantly, putting a strain on existing power grids and bypassing many locations with little or no reliable electric power. Now, an innovative system developed at MIT offers a way to use passive cooling to preserve food crops and supplement conventional air conditioners in buildings, with no need for power and only a small need for water.

    The system, which combines radiative cooling, evaporative cooling, and thermal insulation in a slim package that could resemble existing solar panels, can provide up to about 19 degrees Fahrenheit (9.3 degrees Celsius) of cooling from the ambient temperature, enough to permit safe food storage for about 40 percent longer under very humid conditions. It could triple the safe storage time under drier conditions.

    The findings are reported today in the journal Cell Reports Physical Science, in a paper by MIT postdoc Zhengmao Lu, Arny Leroy PhD ’21, professors Jeffrey Grossman and Evelyn Wang, and two others. While more research is needed in order to bring down the cost of one key component of the system, the researchers say that eventually such a system could play a significant role in meeting the cooling needs of many parts of the world where a lack of electricity or water limits the use of conventional cooling systems.

    The system cleverly combines previous standalone cooling designs that each provide limited amounts of cooling power, in order to produce significantly more cooling overall — enough to help reduce food losses from spoilage in parts of the world that are already suffering from limited food supplies. In recognition of that potential, the research team has been partly supported by MIT’s Abdul Latif Jameel Water and Food Systems Lab.

    “This technology combines some of the good features of previous technologies such as evaporative cooling and radiative cooling,” Lu says. By using this combination, he says, “we show that you can achieve significant food life extension, even in areas where you have high humidity,” which limits the capabilities of conventional evaporative or radiative cooling systems.

    In places that do have existing air conditioning systems in buildings, the new system could be used to significantly reduce the load on these systems by sending cool water to the hottest part of the system, the condenser. “By lowering the condenser temperature, you can effectively increase the air conditioner efficiency, so that way you can potentially save energy,” Lu says.

    Other groups have also been pursuing passive cooling technologies, he says, but “by combining those features in a synergistic way, we are now able to achieve high cooling performance, even in high-humidity areas where previous technology generally cannot perform well.”

    The system consists of three layers of material, which together provide cooling as water and heat pass through the device. In practice, the device could resemble a conventional solar panel, but instead of putting out electricity, it would directly provide cooling, for example by acting as the roof of a food storage container. Or, it could be used to send chilled water through pipes to cool parts of an existing air conditioning system and improve its efficiency. The only maintenance required is adding water for the evaporation, but the consumption is so low that this need only be done about once every four days in the hottest, driest areas, and only once a month in wetter areas.

    The top layer is an aerogel, a material consisting mostly of air enclosed in the cavities of a sponge-like structure made of polyethylene. The material is highly insulating but freely allows both water vapor and infrared radiation to pass through. The evaporation of water (rising up from the layer below) provides some of the cooling power, while the infrared radiation, taking advantage of the extreme transparency of Earth’s atmosphere at those wavelengths, radiates some of the heat straight up through the air and into space — unlike air conditioners, which spew hot air into the immediate surrounding environment.
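
As a rough illustration of how the radiative and evaporative channels add up, here is a back-of-envelope estimate; every parameter value below is an assumption chosen for demonstration, not a number from the paper.

```python
# Back-of-envelope estimate of the two passive cooling contributions.
# All parameter values are illustrative assumptions, not from the paper.
SIGMA = 5.67e-8          # Stefan-Boltzmann constant, W/(m^2 K^4)
EMISSIVITY = 0.9         # assumed infrared emissivity of the emitter
T_AMBIENT = 305.0        # K (~32 C ambient)
T_SKY = 285.0            # K, assumed effective sky temperature
H_FG = 2.45e6            # J/kg, latent heat of vaporization of water
EVAP_RATE = 1e-5         # kg/(m^2 s), assumed evaporation rate

# Net infrared radiation to the sky through the atmospheric window
radiative = EMISSIVITY * SIGMA * (T_AMBIENT**4 - T_SKY**4)  # W/m^2
# Latent heat carried away by evaporating water
evaporative = H_FG * EVAP_RATE                              # W/m^2

print(f"radiative   ~ {radiative:.0f} W/m^2")
print(f"evaporative ~ {evaporative:.0f} W/m^2")
print(f"combined    ~ {radiative + evaporative:.0f} W/m^2")
```

With these (made-up) inputs, each channel alone delivers modest cooling power per unit area; the point of the combined architecture is that the contributions stack while the insulating aerogel keeps ambient heat from leaking back in.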

    Below the aerogel is a layer of hydrogel — another sponge-like material, but one whose pore spaces are filled with water rather than air. It’s similar to material currently used commercially for products such as cooling pads or wound dressings. This provides the water source for evaporative cooling, as water vapor forms at its surface and passes up through the aerogel layer and out to the environment.

    Below that, a mirror-like layer reflects any incoming sunlight that reaches it, sending it back up through the device rather than letting it heat the materials, thereby reducing the thermal load. And the top layer of aerogel, being a good insulator, is also highly solar-reflecting, limiting the amount of solar heating of the device, even under strong direct sunlight.

    “The novelty here is really just bringing together the radiative cooling feature, the evaporative cooling feature, and also the thermal insulation feature all together in one architecture,” Lu explains. The system was tested using a small version, just 4 inches across, on the rooftop of a building at MIT, proving its effectiveness even during suboptimal weather conditions, Lu says, and achieving 9.3 degrees Celsius (about 17 degrees Fahrenheit) of cooling.

    “The challenge previously was that evaporative materials often do not deal with solar absorption well,” Lu says. “With these other materials, usually when they’re under the sun, they get heated, so they are unable to get to high cooling power at the ambient temperature.”

    The aerogel material’s properties are a key to the system’s overall efficiency, but that material is at present expensive to produce, as it requires special equipment for critical point drying (CPD) to remove solvents slowly from the delicate porous structure without damaging it. The key characteristic that must be controlled is the size of the pores in the aerogel, which is made by mixing the polyethylene material with solvents, allowing it to set like a bowl of Jell-O, and then extracting the solvents. The research team is currently exploring ways of either making this drying process less expensive, such as by using freeze-drying, or finding alternative materials that can provide the same insulating function at lower cost, such as membranes separated by an air gap.

    While the other materials used in the system are readily available and relatively inexpensive, Lu says, “the aerogel is the only material that’s a product from the lab that requires further development in terms of mass production.” And it’s impossible to predict how long that development might take before this system can be made practical for widespread use, he says.

    The research team included Lenan Zhang of MIT’s Department of Mechanical Engineering and Jatin Patil of the Department of Materials Science and Engineering.

  • J-WAFS awards $150K Solutions grant to Patrick Doyle and team for rapid removal of micropollutants from water

    The Abdul Latif Jameel Water and Food Systems Lab (J-WAFS) has awarded a 2022 J-WAFS Solutions grant to Patrick S. Doyle, the Robert T. Haslam Professor of Chemical Engineering at MIT, for his innovative system to tackle water pollution. Doyle will be working with co-principal investigator Rafael Gomez-Bombarelli, assistant professor of materials processing in the Department of Materials Science and Engineering, as well as PhD students Devashish Gokhale and Tynan Perez. Building on findings from a 2019 J-WAFS seed grant, Doyle and the research team will create cost-effective, industry-scale processes to remove micropollutants from water. Project work will commence this month.

    The J-WAFS Solutions program provides one-year, renewable, commercialization grants to help move MIT technology from the laboratory to market. Grants of up to $150,000 are awarded to researchers with breakthrough technologies and inventions in water or food. Since its launch in 2015, J-WAFS Solutions grants have led to seven spinout companies and helped commercialize two products as open-source technologies. The grant program is supported by Community Jameel.

    A widespread problem 

    Micropollutants are contaminants that occur in low concentrations in the environment, yet continuous exposure and bioaccumulation make them a cause for concern. According to the U.S. Environmental Protection Agency, the plastics derivative bisphenol A (BPA), the “forever chemicals” per- and polyfluoroalkyl substances (PFAS), and heavy metals like lead are common micropollutants known to be found in more than 85 percent of rivers, ponds, and lakes in the United States. Many of these bodies of water are sources of drinking water. Over long periods of time, exposure to micropollutants through drinking water can cause physiological damage in humans, increasing the risk of cancer, developmental disorders, and reproductive failure.

    Since micropollutants occur in low concentrations, it is difficult to detect and monitor their presence, and the chemical diversity of micropollutants makes it difficult to inexpensively remove them from water. Currently, activated carbon is the industry standard for micropollutant elimination, but this method cannot efficiently remove contaminants at parts-per-billion and parts-per-trillion concentrations. There are also strong sustainability concerns associated with activated carbon production, which is energy-intensive and releases large volumes of carbon dioxide.
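
To see why removal at those concentrations is so demanding, consider the absolute quantities involved. The sketch below converts a nanograms-per-liter concentration into a molecule count; the 70 ng/L figure refers to the EPA's 2016 lifetime health advisory level for PFOA and PFOS combined, used here purely as an illustrative benchmark.

```python
# Why parts-per-trillion removal is hard: the absolute masses involved
# are vanishingly small. Illustrative arithmetic only.
AVOGADRO = 6.022e23
PFOA_MOLAR_MASS = 414.07       # g/mol for PFOA (C8HF15O2)

def molecules_per_liter(ng_per_L: float, molar_mass: float) -> float:
    """Molecule count per liter for a given ng/L mass concentration."""
    grams = ng_per_L * 1e-9
    return grams / molar_mass * AVOGADRO

# 70 ng/L: the 2016 EPA lifetime health advisory level (PFOA + PFOS)
n = molecules_per_liter(70, PFOA_MOLAR_MASS)
print(f"{n:.2e} molecules per liter")
# A large molecule count, but only ~1e-10 mol/L: any sorbent must find
# and bind these few moles among ~55 mol/L of water.
```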

    A solution with societal and economic benefits

    Doyle and his team are developing a technology that uses sustainable hydrogel microparticles to remove micropollutants from water. The polymeric hydrogel microparticles contain chemically anchored structures, including micelles and chelating agents, that act like a sponge, absorbing organic micropollutants and heavy metal ions. The microparticles are large enough to separate from water using simple gravitational settling. The system is sustainable because the microparticles can be recycled for continuous use. In testing, the long-lasting, reusable microparticles removed contaminants more quickly than commercial activated carbon. The researchers plan to use machine learning to find optimal microparticle compositions that maximize performance on complex combinations of micropollutants in simulated and real wastewater samples.
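
The claim that the microparticles are large enough to separate by gravitational settling can be sanity-checked with Stokes' law. The particle size and density below are hypothetical placeholders, not values from the project.

```python
# Stokes' law sketch: terminal settling velocity of a small sphere in
# water. Parameter values are hypothetical, for illustration only.
def stokes_velocity(d_m: float, rho_p: float,
                    rho_f: float = 1000.0,   # water density, kg/m^3
                    mu: float = 1.0e-3,      # water viscosity, Pa s
                    g: float = 9.81) -> float:
    """Terminal settling velocity (m/s) in the low-Reynolds-number limit."""
    return g * (rho_p - rho_f) * d_m**2 / (18 * mu)

# a 500-micron hydrogel particle only slightly denser than water
v = stokes_velocity(500e-6, 1050.0)
print(f"settling velocity ~ {v * 1000:.1f} mm/s")
```

Even with a density only 5 percent above water's, a half-millimeter particle settles at millimeters per second, i.e. it clears a settling tank in minutes — whereas a micron-scale sorbent would take hours and typically needs filtration to recover.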

    Economically, the technology is a new offering that has applications in numerous large markets where micropollutant elimination is vital, including municipal and industrial water treatment equipment, as well as household water purification systems. The J-WAFS Solutions grant will allow the team to build and test prototypes of the water treatment system, identify the best use cases and customers, and perform technoeconomic analyses and market research to formulate a preliminary business plan. With J-WAFS commercialization support, the project could eventually lead to a startup company.

    “Emerging micropollutants are a growing threat to drinking water supplies worldwide,” says J-WAFS Director John H. Lienhard, the Abdul Latif Jameel Professor of Water at MIT. “Cost-effective and scalable technologies for micropollutant removal are urgently needed. This project will develop and commercialize a promising new tool for water treatment, with the goal of improving water quality for millions of people.”

  • New J-WAFS-led project combats food insecurity

    Today the Abdul Latif Jameel Water and Food Systems Lab (J-WAFS) at MIT announced a new research project, supported by Community Jameel, to tackle one of the most urgent crises facing the planet: food insecurity. Approximately 276 million people worldwide are severely food insecure, and more than half a million face famine conditions.

    To better understand and analyze food security, this three-year research project will develop a comprehensive index assessing countries’ food security vulnerability, called the Jameel Index for Food Trade and Vulnerability. Global changes spurred by social and economic transitions, energy and environmental policy, regional geopolitics, conflict, and of course climate change can affect food demand and supply. The Jameel Index will measure countries’ dependence on global food trade and imports, and how these regional-scale threats might affect the ability to trade food goods across diverse geographic regions. A main outcome of the research will be a model to project global food demand, supply balance, and bilateral trade under different likely future scenarios, with a focus on climate change. The work will help guide policymakers over the next 25 years, as the global population is expected to grow and the climate crisis is predicted to worsen.
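
As a toy illustration of the kind of quantity an import-vulnerability index might aggregate, the sketch below combines two standard ingredients: the share of supply met by imports, and how concentrated those imports are among trading partners. The formula, weights, and data are hypothetical and are not the project's methodology.

```python
# Hypothetical import-vulnerability score: high when a country imports a
# large share of its food AND those imports come from few partners.
# Formula and numbers are illustrative, not the Jameel Index methodology.
def import_dependence(imports: dict, domestic_supply: float) -> float:
    """Share of total food supply met by imports (0..1)."""
    total_imports = sum(imports.values())
    return total_imports / (total_imports + domestic_supply)

def concentration(imports: dict) -> float:
    """Herfindahl-style concentration of import partners (0..1)."""
    total = sum(imports.values())
    return sum((v / total) ** 2 for v in imports.values())

# hypothetical country importing a staple from three partners
imports = {"A": 40.0, "B": 35.0, "C": 25.0}   # arbitrary units
dep = import_dependence(imports, domestic_supply=100.0)
conc = concentration(imports)
vulnerability = dep * conc
print(f"dependence={dep:.2f} concentration={conc:.3f} score={vulnerability:.4f}")
```

A shock to partner "A" matters more when `conc` is high; diversifying suppliers lowers the score even if total import dependence is unchanged.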

    The work will be the foundational project for the J-WAFS-led Food and Climate Systems Transformation Alliance, or FACT Alliance. Formally launched at the COP26 climate conference last November, the FACT Alliance is a global network of 20 leading research institutions and stakeholder organizations that are driving research and innovation and informing better decision-making for healthy, resilient, equitable, and sustainable food systems in a rapidly changing climate. The initiative is co-directed by Greg Sixt, research manager for climate and food systems at J-WAFS, and Professor Kenneth Strzepek, climate, water, and food specialist at J-WAFS.

    The dire state of our food systems

    The need for this project is evidenced by the hundreds of millions of people around the globe currently experiencing food shortages. While several factors contribute to food insecurity, climate change is one of the most notable. Devastating extreme weather events are increasingly crippling crop and livestock production around the globe. From Southwest Asia to the Arabian Peninsula to the Horn of Africa, communities are migrating in search of food. In the United States, extreme heat and lack of rainfall in the Southwest have drastically lowered Lake Mead’s water levels, restricting water access and drying out farmlands. 

    Social, political, and economic issues also disrupt food systems. The effects of the Covid-19 pandemic, supply chain disruptions, and inflation continue to exacerbate food insecurity. Russia’s invasion of Ukraine is dramatically worsening the situation, disrupting agricultural exports from both Russia and Ukraine — two of the world’s largest producers of wheat, sunflower seed oil, and corn. Other countries like Lebanon, Sri Lanka, and Cuba are confronting food insecurity due to domestic financial crises.

    Few countries are immune to threats to food security from sudden disruptions in food production or trade. When an enormous container ship became lodged in the Suez Canal in March 2021, the vital international trade route was blocked for three months. The resulting delays in international shipping affected food supplies around the world. These situations demonstrate the importance of food trade in achieving food security: a disaster in one part of the world can drastically affect the availability of food in another. This puts into perspective just how interconnected the earth’s food systems are and how vulnerable they remain to external shocks. 

    An index to prepare for the future of food

    Despite the need for more secure food systems, significant knowledge gaps exist when it comes to understanding how different climate scenarios may affect both agricultural productivity and global food supply chains and security. The Global Trade Analysis Project database from Purdue University, and the current IMPACT modeling system from the International Food Policy Research Institute (IFPRI), enable assessments of existing conditions but cannot project or model changes in the future.

    In 2021, Strzepek and Sixt developed an initial Food Import Vulnerability Index (FIVI) as part of a regional assessment of the threat of climate change to food security in the Gulf Cooperation Council states and West Asia. FIVI is also limited in that it can only assess current trade conditions and climate change threats to food production. Additionally, FIVI is a national aggregate index and does not address issues of hunger, poverty, or equity that stem from regional variations within a country.

    “Current models are really good at showing global food trade flows, but we don’t have systems for looking at food trade between individual countries and how different food systems stressors such as climate change and conflict disrupt that trade,” says Greg Sixt of J-WAFS and the FACT Alliance. “This timely index will be a valuable tool for policymakers to understand the vulnerabilities to their food security from different shocks in the countries they import their food from. The project will also illustrate the stakeholder-guided, transdisciplinary approach that is central to the FACT Alliance,” Sixt adds.

    Phase 1 of the project will support a collaboration between four FACT Alliance members: MIT J-WAFS, Ethiopian Institute of Agricultural Research, IFPRI (which is also part of the CGIAR network), and the Martin School at the University of Oxford. An external partner, United Arab Emirates University, will also assist with the project work. This first phase will build on Strzepek and Sixt’s previous work on FIVI by developing a comprehensive Global Food System Modeling Framework that takes into consideration climate and global changes projected out to 2050, and assesses their impacts on domestic production, world market prices, and national balance of payments and bilateral trade. The framework will also utilize a mixed-modeling approach that includes the assessment of bilateral trade and macroeconomic data associated with varying agricultural productivity under the different climate and economic policy scenarios. In this way, consistent and harmonized projections of global food demand and supply balance, and bilateral trade under climate and global change can be achieved. 

    “Just like in the global response to Covid-19, using data and modeling are critical to understanding and tackling vulnerabilities in the global supply of food,” says George Richards, director of Community Jameel. “The Jameel Index for Food Trade and Vulnerability will help inform decision-making to manage shocks and long-term disruptions to food systems, with the aim of ensuring food security for all.”

    On a national level, the researchers will enrich the Jameel Index through country-level food security analyses of regions within countries and across various socioeconomic groups, allowing for a better understanding of specific impacts on key populations. The research will present vulnerability scores for a variety of food security metrics for 126 countries. Case studies of food security and food import vulnerability in Ethiopia and Sudan will help to refine the applicability of the Jameel Index with on-the-ground information. The case studies will use an IFPRI-developed tool called the Rural Investment and Policy Analysis model, which allows for analysis of urban and rural populations and different income groups. Local capacity building and stakeholder engagement will be critical to enable the use of the tools developed by this research for national-level planning in priority countries, and ultimately to inform policy.

    Phase 2 of the project will build on phase 1 and the lessons learned from the Ethiopian and Sudanese case studies. It will entail a number of deeper, country-level analyses to assess the role of food imports on future hunger, poverty, and equity across various regional and socioeconomic groups within the modeled countries. This work will link the geospatial national models with the global analysis. A scholarly paper is expected to be submitted to share findings from this work, and a website will be launched so that interested stakeholders and organizations can learn more.

  • MIT J-WAFS announces 2022 seed grant recipients

    The Abdul Latif Jameel Water and Food Systems Lab (J-WAFS) at MIT has awarded eight MIT principal investigators with 2022 J-WAFS seed grants. The grants support innovative MIT research that has the potential to have significant impact on water- and food-related challenges.

    The only program at MIT that is dedicated to water- and food-related research, J-WAFS has offered seed grant funding to MIT principal investigators and their teams for the past eight years. The grants provide up to $75,000 per year, overhead-free, for two years to support new, early-stage research in areas such as water and food security, safety, supply, and sustainability. Past projects have spanned many diverse disciplines, including engineering, science, technology, and business innovation, as well as social science and economics, architecture, and urban planning. 

    Seven new projects led by eight researchers will be supported this year. With funding going to four different MIT departments, the projects address a range of challenges by employing advanced materials, technology innovations, and new approaches to resource management. The new projects aim to remove harmful chemicals from water sources, develop drought monitoring systems for farmers, improve management of the shellfish industry, optimize water purification materials, and more.

    “Climate change, the pandemic, and most recently the war in Ukraine have exacerbated and put a spotlight on the serious challenges facing global water and food systems,” says J-WAFS director John H. Lienhard. He adds, “The proposals chosen this year have the potential to create measurable, real-world impacts in both the water and food sectors.”  

    The 2022 J-WAFS seed grant researchers and their projects are:

    Gang Chen, the Carl Richard Soderberg Professor of Power Engineering in MIT’s Department of Mechanical Engineering, is using sunlight to desalinate water. The use of solar energy for desalination is not a new idea, particularly solar thermal evaporation methods. However, the solar thermal evaporation process has an overall low efficiency because it relies on breaking hydrogen bonds among individual water molecules, which is very energy-intensive. Chen and his lab recently discovered a photomolecular effect that dramatically lowers the energy required for desalination. 

    The bonds among water molecules inside a water cluster in liquid water are mostly hydrogen bonds. Chen discovered that a photon with energy larger than the bonding energy between a water cluster and the remaining liquid can cleave the cluster off at the water-air interface; the ejected cluster then collides with air molecules and disintegrates into 60 or more individual water molecules. This effect has the potential to significantly boost clean water production via new desalination technology that produces a photomolecular evaporation rate at least ten times that of pure solar thermal evaporation.
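
Some rough arithmetic shows why ejecting clusters changes the energy budget: compare a visible photon's energy with the latent heat needed to vaporize a single water molecule. This is illustrative back-of-envelope math, not the paper's analysis.

```python
# Compare a green photon's energy with the latent heat of vaporization
# per water molecule. Illustrative arithmetic only.
H = 6.626e-34        # Planck constant, J s
C = 2.998e8          # speed of light, m/s
AVOGADRO = 6.022e23
H_FG = 2.45e6        # J/kg, latent heat of vaporization of water
M_WATER = 0.018      # kg/mol

photon_energy = H * C / 520e-9                     # 520 nm photon, J
latent_per_molecule = H_FG * M_WATER / AVOGADRO    # J per molecule

print(f"photon energy:        {photon_energy:.2e} J")
print(f"latent heat/molecule: {latent_per_molecule:.2e} J")
# One photon carries the latent-heat equivalent of only a handful of
# molecules, far short of vaporizing ~60 one by one. Cleaving off a
# liquid cluster avoids paying full latent heat per molecule, which is
# how the photomolecular route can beat the thermal-evaporation budget.
```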

    John E. Fernández is the director of the MIT Environmental Solutions Initiative (ESI) and a professor in the Department of Architecture, and also affiliated with the Department of Urban Studies and Planning. Fernández is working with Scott D. Odell, a postdoc in the ESI, to better understand the impacts of mining and climate change in water-stressed regions of Chile.

    The country of Chile is one of the world’s largest exporters of both agricultural and mineral products; however, little research has been done on climate change effects at the intersection of these two sectors. Fernández and Odell will explore how desalination is being deployed by the mining industry to relieve pressure on continental water supplies in Chile, and with what effect. They will also research how climate change and mining intersect to affect Andean glaciers and agricultural communities dependent upon them. The researchers intend for this work to inform policies to reduce social and environmental harms from mining, desalination, and climate change.

    Ariel L. Furst is the Raymond (1921) and Helen St. Laurent Career Development Professor of Chemical Engineering at MIT. Her 2022 J-WAFS seed grant project seeks to effectively remove dangerous and long-lasting chemicals from water supplies and other environmental areas. 

    Perfluorooctanoic acid (PFOA), a component of Teflon, is a member of a group of chemicals known as per- and polyfluoroalkyl substances (PFAS). These human-made chemicals have been extensively used in consumer products like nonstick cooking pans. Exceptionally high levels of PFOA have been measured in water sources near manufacturing sites, which is problematic as these chemicals do not readily degrade in our bodies or the environment. The majority of humans have detectable levels of PFAS in their blood, which can lead to significant health issues including cancer, liver damage, and thyroid effects, as well as developmental effects in infants. Current remediation methods are limited to inefficient capture and are mostly confined to laboratory settings. Furst’s proposed method utilizes low-energy, scaffolded enzyme materials to move beyond simple capture to degrade these hazardous pollutants.

    Heather J. Kulik is an associate professor in the Department of Chemical Engineering at MIT who is developing novel computational strategies to identify optimal materials for purifying water. Water treatment requires purification by selectively separating small ions from water. However, human-made, scalable materials for water purification and desalination are often not stable in typical operating conditions and lack precision pores for good separation. 

    Metal-organic frameworks (MOFs) are promising materials for water purification because their pores can be tailored to have precise shapes and chemical makeup for selective ion affinity. Yet few MOFs have been assessed for their properties relevant to water purification. Kulik plans to use virtual high-throughput screening accelerated by machine learning models and molecular simulation to accelerate discovery of MOFs. Specifically, Kulik will be looking for MOFs with ultra-stable structures in water that do not break down at certain temperatures. 
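
A minimal sketch of the screening loop described above: score candidates with a cheap surrogate model, then keep only those predicted to be water-stable and selective. All names, scores, and thresholds here are hypothetical.

```python
# Virtual high-throughput screening sketch: filter candidate MOFs on
# surrogate-model predictions. Names and thresholds are hypothetical.
from dataclasses import dataclass

@dataclass
class MOF:
    name: str
    predicted_water_stability: float   # surrogate-model score, 0..1
    predicted_ion_selectivity: float   # surrogate-model score, 0..1

def screen(candidates, stability_min=0.8, selectivity_min=0.6):
    """Return candidates passing both surrogate-model thresholds."""
    return [m for m in candidates
            if m.predicted_water_stability >= stability_min
            and m.predicted_ion_selectivity >= selectivity_min]

candidates = [
    MOF("MOF-A", 0.92, 0.71),
    MOF("MOF-B", 0.55, 0.90),   # predicted unstable in water: rejected
    MOF("MOF-C", 0.85, 0.40),   # predicted poor selectivity: rejected
]
hits = screen(candidates)
print([m.name for m in hits])
```

In practice the cheap surrogate prunes thousands of structures so that expensive molecular simulation is reserved for the short list of survivors; that ordering is the whole point of the approach.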

    Gregory C. Rutledge is the Lammot du Pont Professor of Chemical Engineering at MIT. He is leading a project that will explore how to better separate oils from water. This is an important problem to solve given that industry-generated oil-contaminated water is a major source of pollution to the environment.

    Emulsified oils are particularly challenging to remove from water due to their small droplet sizes and long settling times. Microfiltration is an attractive technology for the removal of emulsified oils, but its major drawback is fouling, or the accumulation of unwanted material on solid surfaces. Rutledge will examine the mechanism of separation behind liquid-infused membranes (LIMs), in which an infused liquid coats the surface and pores of the membrane, preventing fouling. Robustness of the LIM technology for removal of different types of emulsified oils and oil mixtures will be evaluated.

    César Terrer is an assistant professor in the Department of Civil and Environmental Engineering whose J-WAFS project seeks to answer the question: How can satellite images be used to provide a high-resolution drought monitoring system for farmers?

    Drought is recognized as one of the world’s most pressing issues, with direct impacts on vegetation that threaten water resources and food production globally. However, assessing and monitoring the impact of droughts on vegetation is extremely challenging as plants’ sensitivity to lack of water varies across species and ecosystems. Terrer will leverage a new generation of remote sensing satellites to provide high-resolution assessments of plant water stress at regional to global scales. The aim is to provide a plant drought monitoring product with farmland-specific services for water and socioeconomic management.
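
One standard building block for satellite-based vegetation-stress monitoring is the normalized difference vegetation index (NDVI), computed from red and near-infrared reflectance; whether this project uses NDVI specifically is an assumption, and the reflectance values below are illustrative.

```python
# NDVI: a standard satellite vegetation index. Healthy vegetation
# reflects strongly in near-infrared (NIR) and absorbs red light, so
# NDVI drops as canopies become water-stressed. Values are illustrative.
def ndvi(nir: float, red: float) -> float:
    """Normalized difference vegetation index, in [-1, 1]."""
    return (nir - red) / (nir + red)

healthy = ndvi(nir=0.50, red=0.08)    # dense, unstressed canopy
stressed = ndvi(nir=0.30, red=0.20)   # water-stressed canopy
print(f"healthy NDVI ~ {healthy:.2f}, stressed NDVI ~ {stressed:.2f}")
```

Because sensitivity to water deficit varies by species and ecosystem (as noted above), a practical monitoring product would calibrate such indices per region rather than applying one global threshold.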

    Michael Triantafyllou is the Henry L. and Grace Doherty Professor in Ocean Science and Engineering in the Department of Mechanical Engineering. He is developing a web-based system for natural resources management that will deploy geospatial analysis, visualization, and reporting to better manage and facilitate aquaculture data. By providing value to commercial fisheries’ permit holders, who employ significant numbers of people, and to recreational shellfish permit holders, who contribute to local economies, the project has attracted support from the Massachusetts Division of Marine Fisheries as well as a number of local resource management departments.

    Massachusetts shellfish fisheries generated roughly $339 million in 2020, accounting for 17 percent of U.S. East Coast production. Managing such a large industry is a time-consuming process, given that there are thousands of acres of coastal areas grouped within more than 800 classified shellfish growing areas. Extreme climate events present additional challenges. Triantafyllou’s research will help efforts to enforce environmental regulations, support habitat restoration, and prevent shellfish-related food safety issues.

  • From seawater to drinking water, with the push of a button

    MIT researchers have developed a portable desalination unit, weighing less than 10 kilograms, that can remove particles and salts to generate drinking water.

    The suitcase-sized device, which requires less power to operate than a cell phone charger, can also be driven by a small, portable solar panel, which can be purchased online for around $50. It automatically generates drinking water that exceeds World Health Organization quality standards. The technology is packaged into a user-friendly device that runs with the push of one button.

    Unlike other portable desalination units that require water to pass through filters, this device uses electrical power to remove particles from the water. Eliminating the need for replacement filters greatly reduces the long-term maintenance requirements.

    This could enable the unit to be deployed in remote and severely resource-limited areas, such as communities on small islands or aboard seafaring cargo ships. It could also be used to aid refugees fleeing natural disasters or by soldiers carrying out long-term military operations.

    “This is really the culmination of a 10-year journey that I and my group have been on. We worked for years on the physics behind individual desalination processes, but pushing all those advances into a box, building a system, and demonstrating it in the ocean, that was a really meaningful and rewarding experience for me,” says senior author Jongyoon Han, a professor of electrical engineering and computer science and of biological engineering, and a member of the Research Laboratory of Electronics (RLE).

    Joining Han on the paper are first author Junghyo Yoon, a research scientist in RLE; Hyukjin J. Kwon, a former postdoc; SungKu Kang, a postdoc at Northeastern University; and Eric Brack of the U.S. Army Combat Capabilities Development Command (DEVCOM). The research has been published online in Environmental Science and Technology.

    Filter-free technology

    Commercially available portable desalination units typically require high-pressure pumps to push water through filters, which are very difficult to miniaturize without compromising the energy-efficiency of the device, explains Yoon.

    Instead, their unit relies on a technique called ion concentration polarization (ICP), which was pioneered by Han’s group more than 10 years ago. Rather than filtering water, the ICP process applies an electrical field to membranes placed above and below a channel of water. The membranes repel positively or negatively charged particles — including salt molecules, bacteria, and viruses — as they flow past. The charged particles are funneled into a second stream of water that is eventually discharged.

    The process removes both dissolved and suspended solids, allowing clean water to pass through the channel. Since it only requires a low-pressure pump, ICP uses less energy than other techniques.

    But ICP does not always remove all the salts floating in the middle of the channel. So the researchers incorporated a second process, known as electrodialysis, to remove remaining salt ions.

    Yoon and Kang used machine learning to find the ideal combination of ICP and electrodialysis modules. The optimal setup is a two-stage ICP process, with water flowing through six modules in the first stage and then through three in the second, followed by a single electrodialysis step. This configuration minimizes energy usage while keeping the process self-cleaning.
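    As a rough illustration of how salt removal compounds across this staged design, here is a minimal Python sketch. The module counts (six, then three, then one electrodialysis step) come from the article; the per-module removal fractions are invented for illustration and are not figures from the paper.

```python
# Sketch of the staged pipeline described above: a two-stage ICP process
# (six modules, then three) followed by one electrodialysis (ED) step.
# Removal fractions below are illustrative assumptions, not paper figures.

ICP_REMOVAL_PER_MODULE = 0.30   # assumed fraction of salt removed per ICP module
ED_REMOVAL = 0.95               # assumed fraction of remaining salt removed by ED

def desalinate(salinity_g_per_l, stage_modules=(6, 3)):
    """Apply staged ICP modules, then electrodialysis; return final salinity."""
    for n_modules in stage_modules:
        for _ in range(n_modules):
            salinity_g_per_l *= (1 - ICP_REMOVAL_PER_MODULE)
    return salinity_g_per_l * (1 - ED_REMOVAL)

# Seawater holds roughly 35 g/L of dissolved salts.
final = desalinate(35.0)
print(f"final salinity: {final:.4f} g/L")
```

    Under these assumed fractions, each stage cuts the salinity multiplicatively, which is why a modest per-module removal compounds into water well below typical potability limits.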

    “While it is true that some charged particles could be captured on the ion exchange membrane, if they get trapped, we just reverse the polarity of the electric field and the charged particles can be easily removed,” Yoon explains.

    They shrank and stacked the ICP and electrodialysis modules to improve their energy efficiency and enable them to fit inside a portable device. The researchers designed the device for nonexperts, with just one button to launch the automatic desalination and purification process. Once the salinity level and the number of particles decrease below specific thresholds, the device notifies the user that the water is drinkable.
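    The readiness check can be sketched as a simple threshold test. The threshold values below are assumptions for illustration; the article does not give the device's actual setpoints.

```python
# Minimal sketch of the "one button" readiness check described above.
# Both thresholds are illustrative assumptions, not the device's setpoints.

SALINITY_THRESHOLD_G_PER_L = 1.0    # assumed potability limit for dissolved salts
TURBIDITY_THRESHOLD_NTU = 5.0       # assumed limit for suspended particles

def water_is_drinkable(salinity_g_per_l: float, turbidity_ntu: float) -> bool:
    """Notify the user once both readings fall below their thresholds."""
    return (salinity_g_per_l < SALINITY_THRESHOLD_G_PER_L
            and turbidity_ntu < TURBIDITY_THRESHOLD_NTU)

print(water_is_drinkable(0.4, 1.2))   # cleaned output
print(water_is_drinkable(35.0, 1.2))  # raw seawater
```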

    The researchers also created a smartphone app that can control the unit wirelessly and report real-time data on power consumption and water salinity.

    Beach tests

    After running lab experiments using water with different salinity and turbidity (cloudiness) levels, they field-tested the device at Boston’s Carson Beach.

    Yoon and Kwon set the box near the shore and tossed the feed tube into the water. In about half an hour, the device had filled a plastic drinking cup with clear, drinkable water.

    “It was successful even in its first run, which was quite exciting and surprising. But I think the main reason we were successful is the accumulation of all these little advances that we made along the way,” Han says.

    The resulting water exceeded World Health Organization quality guidelines, and the unit reduced the amount of suspended solids by at least a factor of 10. Their prototype generates drinking water at a rate of 0.3 liters per hour and requires only about 20 watt-hours of energy per liter.
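    A quick back-of-envelope check on those prototype figures, treating the per-liter figure as an energy (watt-hours), which is the physically consistent reading:

```python
# Back-of-envelope check on the prototype numbers quoted above:
# 0.3 liters per hour, with roughly 20 watt-hours of energy per liter.

RATE_L_PER_H = 0.3
ENERGY_WH_PER_L = 20.0

hours_per_liter = 1.0 / RATE_L_PER_H          # time to produce one liter
avg_power_w = ENERGY_WH_PER_L * RATE_L_PER_H  # average electrical draw

print(f"{hours_per_liter:.1f} h per liter, ~{avg_power_w:.0f} W average draw")
```

    An average draw of about 6 W is consistent with the device running comfortably off a small battery or solar panel.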

    “Right now, we are pushing our research to scale up that production rate,” Yoon says.

    One of the biggest challenges of designing the portable system was engineering an intuitive device that could be used by anyone, Han says.

    Yoon hopes to make the device more user-friendly and improve its energy efficiency and production rate through a startup he plans to launch to commercialize the technology.

    In the lab, Han wants to apply the lessons he’s learned over the past decade to water-quality issues that go beyond desalination, such as rapidly detecting contaminants in drinking water.

    “This is definitely an exciting project, and I am proud of the progress we have made so far, but there is still a lot of work to do,” he says.

    For example, while “development of portable systems using electro-membrane processes is an original and exciting direction in off-grid, small-scale desalination,” the effects of fouling, especially if the water has high turbidity, could significantly increase maintenance requirements and energy costs, notes Nidal Hilal, professor of engineering and director of the New York University Abu Dhabi Water research center, who was not involved with this research.

    “Another limitation is the use of expensive materials,” he adds. “It would be interesting to see similar systems with low-cost materials in place.”

    The research was funded, in part, by the DEVCOM Soldier Center, the Abdul Latif Jameel Water and Food Systems Lab (J-WAFS), the Experimental AI Postdoc Fellowship Program of Northeastern University, and the Roux AI Institute.


    Structures considered key to gene expression are surprisingly fleeting

    In human chromosomes, DNA is coated by proteins to form an exceedingly long beaded string. This “string” is folded into numerous loops, which are believed to help cells control gene expression and facilitate DNA repair, among other functions. A new study from MIT suggests that these loops are very dynamic and shorter-lived than previously thought.

    In the new study, the researchers were able to monitor the movement of one stretch of the genome in a living cell for about two hours. They saw that this stretch was fully looped for only 3 to 6 percent of the time, with the loop lasting for only about 10 to 30 minutes. The findings suggest that scientists’ current understanding of how loops influence gene expression may need to be revised, the researchers say.

    “Many models in the field have been these pictures of static loops regulating these processes. What our new paper shows is that this picture is not really correct,” says Anders Sejr Hansen, the Underwood-Prescott Career Development Assistant Professor of Biological Engineering at MIT. “We suggest that the functional state of these domains is much more dynamic.”

    Hansen is one of the senior authors of the new study, along with Leonid Mirny, a professor in MIT’s Institute for Medical Engineering and Science and the Department of Physics, and Christoph Zechner, a group leader at the Max Planck Institute of Molecular Cell Biology and Genetics in Dresden, Germany, and the Center for Systems Biology Dresden. MIT postdoc Michele Gabriele, recent Harvard University PhD recipient Hugo Brandão, and MIT graduate student Simon Grosse-Holz are the lead authors of the paper, which appears today in Science.

    Out of the loop

    Using computer simulations and experimental data, scientists including Mirny’s group at MIT have shown that loops in the genome are formed by a process called extrusion, in which a molecular motor promotes the growth of progressively larger loops. The motor stops each time it encounters a “stop sign” on DNA. The motor that extrudes such loops is a protein complex called cohesin, while the DNA-bound protein CTCF serves as the stop sign. These cohesin-mediated loops between CTCF sites were seen in previous experiments.

    However, those experiments only offered a snapshot of a moment in time, with no information on how the loops change over time. In their new study, the researchers developed techniques that allowed them to fluorescently label CTCF DNA sites so they could image the DNA loops over several hours. They also created a new computational method that can infer the looping events from the imaging data.

    “This method was crucial for us to distinguish signal from noise in our experimental data and quantify looping,” Zechner says. “We believe that such approaches will become increasingly important for biology as we continue to push the limits of detection with experiments.”
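    To give a flavor of the inference problem, here is a deliberately crude sketch: classify each imaging frame as looped or unlooped by the distance between the two labeled CTCF sites, then suppress single-frame flickers. The distances and the threshold are invented, and the paper's actual method is a far more sophisticated statistical inference than this simple rule.

```python
# Toy state-calling from a distance trace between two labeled CTCF sites.
# LOOP_THRESHOLD_NM and the trace values are invented for illustration;
# this threshold rule stands in for the paper's statistical inference.

LOOP_THRESHOLD_NM = 150.0  # assumed: sites closer than this count as "looped"

def call_states(distances_nm, threshold=LOOP_THRESHOLD_NM):
    """Label frames looped (True) / unlooped (False), smoothing 1-frame flips."""
    raw = [d < threshold for d in distances_nm]
    smoothed = raw[:]
    # Suppress isolated single-frame state flips (crude noise filter).
    for i in range(1, len(raw) - 1):
        if raw[i - 1] == raw[i + 1] != raw[i]:
            smoothed[i] = raw[i - 1]
    return smoothed

trace = [400, 380, 120, 90, 110, 420, 100, 390, 410]  # nm, synthetic
states = call_states(trace)
print(states, f"looped {sum(states) / len(states):.0%} of frames")
```

    The smoothing step illustrates why a principled method matters: raw thresholding of noisy distance data would chop one looping event into many spurious short ones.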

    The researchers used their method to image a stretch of the genome in mouse embryonic stem cells. “If we put our data in the context of one cell division cycle, which lasts about 12 hours, the fully formed loop only actually exists for about 20 to 45 minutes, or about 3 to 6 percent of the time,” Grosse-Holz says.
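    Those numbers are mutually consistent, as a one-line calculation shows: 3 to 6 percent of a 12-hour cycle is roughly 22 to 43 minutes, matching the quoted "about 20 to 45 minutes."

```python
# Consistency check: 3-6 percent of a 12-hour cell division cycle,
# expressed in minutes.

CYCLE_MIN = 12 * 60  # 12-hour cell cycle in minutes

low = 0.03 * CYCLE_MIN
high = 0.06 * CYCLE_MIN
print(f"fully looped for ~{low:.0f}-{high:.0f} minutes per cycle")
```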

    “If the loop is only present for such a tiny period of the cell cycle and very short-lived, we shouldn’t think of this fully looped state as being the primary regulator of gene expression,” Hansen says. “We think we need new models for how the 3D structure of the genome regulates gene expression, DNA repair, and other functional downstream processes.”

    While fully formed loops were rare, the researchers found that partially extruded loops were present about 92 percent of the time. These smaller loops have been difficult to observe with the previous methods of detecting loops in the genome.

    “In this study, by integrating our experimental data with polymer simulations, we have now been able to quantify the relative extents of the unlooped, partially extruded, and fully looped states,” Brandão says.

    “Since these interactions are very short, but very frequent, the previous methodologies were not able to fully capture their dynamics,” Gabriele adds. “With our new technique, we can start to resolve transitions between fully looped and unlooped states.”


    The researchers hypothesize that these partial loops may play more important roles in gene regulation than fully formed loops. Strands of DNA run along each other as loops begin to form and then fall apart, and these interactions may help regulatory elements such as enhancers and gene promoters find each other.

    “More than 90 percent of the time, there are some transient loops, and presumably what’s important is having those loops that are being perpetually extruded,” Mirny says. “The process of extrusion itself may be more important than the fully looped state that only occurs for a short period of time.”

    More loops to study

    Since most of the other loops in the genome are weaker than the one the researchers studied in this paper, they suspect that many other loops will also prove to be highly transient. They now plan to use their new technique to study some of those other loops, in a variety of cell types.

    “There are about 10,000 of these loops, and we’ve looked at one,” Hansen says. “We have a lot of indirect evidence to suggest that the results would be generalizable, but we haven’t demonstrated that. Using the technology platform we’ve set up, which combines new experimental and computational methods, we can begin to approach other loops in the genome.”

    The researchers also plan to investigate the role of specific loops in disease. Many diseases, including a neurodevelopmental disorder called FOXG1 syndrome, could be linked to faulty loop dynamics. The researchers are now studying how both the normal and mutated form of the FOXG1 gene, as well as the cancer-causing gene MYC, are affected by genome loop formation.

    The research was funded by the National Institutes of Health, the National Science Foundation, the Mathers Foundation, a Pew-Stewart Cancer Research Scholar grant, the Chaires d’excellence Internationale Blaise Pascal, an American-Italian Cancer Foundation research scholarship, and the Max Planck Institute for Molecular Cell Biology and Genetics.


    Q&A: Options for the Diablo Canyon nuclear plant

    The Diablo Canyon nuclear plant in California, the only one still operating in the state, is set to close in 2025. A team of researchers at MIT’s Center for Advanced Nuclear Energy Systems, Abdul Latif Jameel Water and Food Systems Lab, and Center for Energy and Environmental Policy Research; Stanford’s Precourt Energy Institute; and energy analysis firm LucidCatalyst LLC have analyzed the potential benefits the plant could provide if its operation were extended to 2030 or 2045.

    They found that this nuclear plant could simultaneously help to stabilize the state’s electric grid, provide desalinated water to supplement the state’s chronic water shortages, and provide carbon-free hydrogen fuel for transportation. MIT News asked report co-authors Jacopo Buongiorno, the TEPCO Professor of Nuclear Science and Engineering, and John Lienhard, the Jameel Professor of Water and Food, to discuss the group’s findings.

    Q: Your report suggests co-locating a major desalination plant alongside the existing Diablo Canyon power plant. What would be the potential benefits from operating a desalination plant in conjunction with the power plant?

    Lienhard: The cost of desalinated water produced at Diablo Canyon would be lower than for a stand-alone plant because the cost of electricity would be significantly lower and you could take advantage of the existing infrastructure for the intake of seawater and the outfall of brine. Electricity would be cheaper because the location takes advantage of Diablo Canyon’s unique capability to provide low cost, zero-carbon baseload power.

    Depending on the scale at which the desalination plant is built, you could make a very significant impact on the water shortfalls of state and federal projects in the area. In fact, one of the numbers that came out of this study was that an intermediate-sized desalination plant there would produce more fresh water than the highest estimate of the net yield from the proposed Delta Conveyance Project on the Sacramento River. You could get that amount of water at Diablo Canyon for an investment cost less than half as large, and without the associated impacts that would come with the Delta Conveyance Project.

    And the technology envisioned for desalination here, reverse osmosis, is available off the shelf. You can buy this equipment today. In fact, it’s already in use in California and thousands of other places around the world.

    Q: You discuss in the report three potential products from the Diablo Canyon plant: desalinated water, power for the grid, and clean hydrogen. How well can the plant accommodate all of those efforts, and are there advantages to combining them as opposed to doing any one of them separately?

    Buongiorno: California, like many other regions in the world, is facing multiple challenges as it seeks to reduce carbon emissions on a grand scale. First, the wide deployment of intermittent energy sources such as solar and wind creates a great deal of variability on the grid that can be balanced by dispatchable firm power generators like Diablo. So, the first mission for Diablo is to continue to provide reliable, clean electricity to the grid.

    The second challenge is the prolonged drought and water scarcity for the state in general. And one way to address that is water desalination co-located with the nuclear plant at the Diablo site, as John explained.

    The third challenge is decarbonizing the transportation sector. A possible approach is replacing conventional cars and trucks with vehicles powered by fuel cells, which consume hydrogen. Hydrogen has to be produced from a primary energy source, and nuclear power can do that quite efficiently and carbon-free by splitting water through a process called electrolysis.

    Our economic analysis took into account the expected revenue from selling these multiple products — electricity for the grid, hydrogen for the transportation sector, water for farmers or other local users — as well as the costs associated with deploying the new facilities needed to produce desalinated water and hydrogen. We found that, if Diablo’s operating license was extended until 2035, it would cut carbon emissions by an average of 7 million metric tons a year — a more than 11 percent reduction from 2017 levels — and save ratepayers $2.6 billion in power system costs.

    Further delaying the retirement of Diablo to 2045 would spare 90,000 acres of land that would need to be dedicated to renewable energy production to replace the facility’s capacity, and it would save ratepayers up to $21 billion in power system costs.

    Finally, if Diablo was operated as a polygeneration facility that provides electricity, desalinated water, and hydrogen simultaneously, its value, quantified in terms of dollars per unit electricity generated, could increase by 50 percent.
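    As a rough consistency check on the emissions figures quoted above (a 7-million-metric-ton annual cut described as more than 11 percent of 2017 levels), and assuming both figures refer to the same emissions base, the implied 2017 baseline is a bit under 7/0.11 ≈ 64 million metric tons per year:

```python
# Implied 2017 emissions baseline from the figures quoted above.
# Assumption: the cut and the percentage refer to the same emissions base.

cut_mt = 7.0        # million metric tons CO2 avoided per year
fraction = 0.11     # "more than 11 percent"

implied_baseline_mt = cut_mt / fraction
print(f"implied 2017 baseline: <= {implied_baseline_mt:.1f} Mt/yr")
```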

    Lienhard: Most of the desalination scenarios that we considered did not consume the full electrical output of that plant, meaning that under most scenarios you would continue to make electricity and do something with it, beyond just desalination. I think it’s also important to remember that this power plant produces 15 percent of California’s carbon-free electricity today and is responsible for 8 percent of the state’s total electrical production. In other words, Diablo Canyon is a very large factor in California’s decarbonization. When or if this plant goes offline, the near-term outcome is likely to be increased reliance on natural gas to produce electricity, meaning a rise in California’s carbon emissions.
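    The two shares Lienhard cites imply an overall carbon-free fraction for California's grid: if the plant supplies 8 percent of total generation and 15 percent of carbon-free generation, then carbon-free power is about 8/15, or roughly 53 percent, of the total.

```python
# Implied carbon-free share of California's electricity from the two
# shares quoted above.

share_of_total = 0.08         # Diablo Canyon as a share of all generation
share_of_carbon_free = 0.15   # Diablo Canyon as a share of carbon-free generation

carbon_free_fraction = share_of_total / share_of_carbon_free
print(f"implied carbon-free share: {carbon_free_fraction:.0%}")
```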

    Q: This plant in particular has been highly controversial since its inception. What’s your assessment of the plant’s safety beyond its scheduled shutdown, and how do you see this report as contributing to the decision-making about that shutdown?

    Buongiorno: The Diablo Canyon Nuclear Power Plant has a very strong safety record. The potential safety concern for Diablo is related to its proximity to several fault lines. Being located in California, the plant was designed to withstand large earthquakes to begin with. Following the Fukushima accident in 2011, the Nuclear Regulatory Commission reviewed the plant’s ability to withstand external events (e.g., earthquakes, tsunamis, floods, tornadoes, wildfires, hurricanes) of exceptionally rare and severe magnitude. After nine years of assessment the NRC’s conclusion is that “existing seismic capacity or effective flood protection [at Diablo Canyon] will address the unbounded reevaluated hazards.” That is, Diablo was designed and built to withstand even the rarest and strongest earthquakes that are physically possible at this site.

    As an additional level of protection, the plant has been retrofitted with special equipment and procedures meant to ensure reliable cooling of the reactor core and spent fuel pool under a hypothetical scenario in which all design-basis safety systems have been disabled by a severe external event.

    Lienhard: As for the potential impact of this report, PG&E [the California utility] has already made the decision to shut down the plant, and we and others hope that decision will be revisited and reversed. We believe that this report gives the relevant stakeholders and policymakers a lot of information about options and value associated with keeping the plant running, and about how California could benefit from clean water and clean power generated at Diablo Canyon. It’s not up to us to make the decision, of course — that is a decision that must be made by the people of California. All we can do is provide information.

    Q: What are the biggest challenges or obstacles to seeing these ideas implemented?

    Lienhard: California has very strict environmental protection regulations, and it’s good that they do. One of the areas of great concern to California is the health of the ocean and protection of the coastal ecosystem. As a result, very strict rules are in place about the intake and outfall of both power plants and desalination plants, to protect marine life. Our analysis suggests that this combined plant can be implemented within the parameters prescribed by the California Ocean Plan and that it can meet the regulatory requirements.

    We believe that deeper analysis would be needed before you could proceed. You would need to do site studies and really get out into the water and look in detail at what’s there. But the preliminary analysis is positive. A second challenge is that the discourse in California around nuclear power has generally not been very supportive, and similarly some groups in California oppose desalination. We expect that both of those points of view would be part of the conversation about whether or not to proceed with this project.

    Q: How particular is this analysis to the specifics of this location? Are there aspects of it that apply to other nuclear plants, domestically or globally?

    Lienhard: Hundreds of nuclear plants around the world are situated along the coast, and many are in water-stressed regions. Although our analysis focused on Diablo Canyon, we believe that the general findings are applicable to many other seaside nuclear plants, so this approach and these conclusions could potentially be applied at hundreds of sites worldwide.