More stories

  •

    Report: Sustainability in supply chains is still a firm-level priority

    Corporations are actively seeking sustainability advances in their supply chains — but many need to improve the business metrics they use in this area to realize more progress, according to a new report by MIT researchers. During a time of shifting policies globally and continued economic uncertainty, the survey-based report finds 85 percent of companies say they are continuing supply chain sustainability practices at the same level as in recent years, or are increasing those efforts.

    “What we found is strong evidence that sustainability still matters,” says Josué Velázquez Martínez, a research scientist and director of the MIT Sustainable Supply Chain Lab, which helped produce the report. “There are many things that remain to be done to accomplish those goals, but there’s a strong willingness from companies in all parts of the world to do something about sustainability.”

    The new analysis, titled “Sustainability Still Matters,” was released today. It is the sixth annual report on the subject prepared by the MIT Sustainable Supply Chain Lab, which is part of MIT’s Center for Transportation and Logistics. The Council of Supply Chain Management Professionals collaborated on the project as well.

    The report is based on a global survey, with responses from 1,203 professionals in 97 countries. This year, the report analyzes three issues in depth, including regulations and the role they play in corporate approaches to supply chain management. A second core topic is management and mitigation of what industry professionals call “Scope 3” emissions, which are those not from a firm itself but from a firm’s supply chain. And a third issue of focus is the future of freight transportation, which by itself accounts for a substantial portion of supply chain emissions.

    Broadly, the survey finds that for European-based firms, the principal driver of action in this area remains government mandates, such as the Corporate Sustainability Reporting Directive, which requires companies to publish regular reports on their environmental impact and the risks to society involved. In North America, firm leadership and investor priorities are more likely to be decisive factors in shaping a company’s efforts.

    “In Europe the pressure primarily comes more from regulation, but in the U.S. it comes more from investors, or from competitors,” Velázquez Martínez says.

    The survey responses on Scope 3 emissions reveal a number of opportunities for improvement. In business and sustainability terms, Scope 1 greenhouse gas emissions are those a firm produces directly. Scope 2 emissions are those associated with the energy it has purchased. And Scope 3 emissions are those produced across a firm’s value chain, including the supply chain activities involved in producing, transporting, using, and disposing of its products.

    The report reveals that about 40 percent of firms keep close track of Scope 1 and 2 emissions, but far fewer tabulate Scope 3 on equivalent terms. And yet Scope 3 may account for roughly 75 percent of total firm emissions, on aggregate. About 70 percent of firms in the survey say they do not have enough data from suppliers to accurately tabulate the total greenhouse gas and climate impact of their supply chains.

    Certainly it can be hard to calculate total emissions when a supply chain has many layers, including smaller suppliers lacking data capacity. But firms can upgrade their analytics in this area, too. For instance, 50 percent of North American firms are still using spreadsheets to tabulate emissions data, often making rough estimates that correlate emissions to simple economic activity. An alternative is life cycle assessment software that provides more sophisticated estimates of a product’s emissions, from the extraction of its materials to its post-use disposal. By contrast, only 32 percent of European firms are still using spreadsheets rather than life cycle assessment tools.

    “You get what you measure,” Velázquez Martínez says. “If you measure poorly, you’re going to get poor decisions that most likely won’t drive the reductions you’re expecting. So we pay a lot of attention to that particular issue, which is decisive to defining an action plan. Firms pay a lot of attention to metrics in their financials, but in sustainability they’re often using simplistic measurements.”

    When it comes to transportation, meanwhile, the report shows that firms are still grappling with the best ways to reduce emissions. Some see biofuels as the best short-term alternative to fossil fuels; others are investing in electric vehicles; some are waiting for hydrogen-powered vehicles to gain traction. Supply chains, after all, frequently involve long-haul trips. For firms, as for individual consumers, electric vehicles are more practical with a larger infrastructure of charging stations. There are advances on that front, but more work to do as well.

    That said, “Transportation has made a lot of progress in general,” Velázquez Martínez says, noting the increased acceptance of new modes of vehicle power.

    Even as new technologies loom on the horizon, though, supply chain sustainability is not wholly dependent on their introduction. One factor continuing to propel sustainability in supply chains is the incentive companies have to lower costs. In a competitive business environment, spending less on fossil fuels usually means savings. And firms can often find ways to alter their logistics to consume and spend less.

    “Along with new technologies, there is another side of supply chain sustainability that is related to better use of the current infrastructure,” Velázquez Martínez observes. “There is always a need to revise traditional ways of operating to find opportunities for more efficiency.”
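    The Scope 1/2/3 breakdown described above is, at bottom, a bookkeeping exercise. A minimal sketch in Python, using entirely hypothetical line items and tonnages (none of these figures come from the report), of how an inventory might be totaled by scope:

```python
# Hypothetical GHG inventory entries: (activity, scope, tonnes CO2-equivalent).
# Scope 1 = direct emissions, Scope 2 = purchased energy, Scope 3 = value chain.
inventory = [
    ("onsite fuel combustion",   1, 1200.0),
    ("purchased electricity",    2,  800.0),
    ("purchased goods",          3, 3500.0),
    ("upstream transport",       3, 1500.0),
    ("product use and disposal", 3, 1000.0),
]

# Sum emissions within each scope.
totals = {1: 0.0, 2: 0.0, 3: 0.0}
for activity, scope, t_co2e in inventory:
    totals[scope] += t_co2e

# Share of total emissions that sit in the supply chain (Scope 3).
scope3_share = totals[3] / sum(totals.values())
print(totals, round(scope3_share, 2))
```

    In this toy inventory the Scope 3 share comes out to 0.75, echoing the report’s observation that supply chain emissions can dominate a firm’s total footprint.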

  •

    Simpler models can outperform deep learning at climate prediction

    Environmental scientists are increasingly using enormous artificial intelligence models to make predictions about changes in weather and climate, but a new study by MIT researchers shows that bigger models are not always better.

    The team demonstrates that, in certain climate scenarios, much simpler, physics-based models can generate more accurate predictions than state-of-the-art deep-learning models.

    Their analysis also reveals that a benchmarking technique commonly used to evaluate machine-learning techniques for climate predictions can be distorted by natural variations in the data, like fluctuations in weather patterns. This could lead someone to believe a deep-learning model makes more accurate predictions when that is not the case.

    The researchers developed a more robust way of evaluating these techniques, which shows that, while simple models are more accurate when estimating regional surface temperatures, deep-learning approaches can be the best choice for estimating local rainfall.

    They used these results to enhance a simulation tool known as a climate emulator, which can rapidly simulate the effect of human activities on a future climate.

    The researchers see their work as a “cautionary tale” about the risk of deploying large AI models for climate science. While deep-learning models have shown incredible success in domains such as natural language, climate science contains a proven set of physical laws and approximations, and the challenge becomes how to incorporate those into AI models.

    “We are trying to develop models that are going to be useful and relevant for the kinds of things that decision-makers need going forward when making climate policy choices. While it might be attractive to use the latest, big-picture machine-learning model on a climate problem, what this study shows is that stepping back and really thinking about the problem fundamentals is important and useful,” says study senior author Noelle Selin, a professor in the MIT Institute for Data, Systems, and Society (IDSS) and the Department of Earth, Atmospheric and Planetary Sciences (EAPS).

    Selin’s co-authors are lead author Björn Lütjens, a former EAPS postdoc who is now a research scientist at IBM Research; senior author Raffaele Ferrari, the Cecil and Ida Green Professor of Oceanography in EAPS and co-director of the Lorenz Center; and Duncan Watson-Parris, assistant professor at the University of California at San Diego. Selin and Ferrari are also co-principal investigators of the Bringing Computation to the Climate Challenge project, out of which this research emerged. The paper appears today in the Journal of Advances in Modeling Earth Systems.

    Comparing emulators

    Because the Earth’s climate is so complex, running a state-of-the-art climate model to predict how pollution levels will impact environmental factors like temperature can take weeks on the world’s most powerful supercomputers.

    Scientists often create climate emulators, simpler approximations of a state-of-the-art climate model, which are faster and more accessible. A policymaker could use a climate emulator to see how alternative assumptions on greenhouse gas emissions would affect future temperatures, helping them develop regulations.

    But an emulator isn’t very useful if it makes inaccurate predictions about the local impacts of climate change. While deep learning has become increasingly popular for emulation, few studies have explored whether these models perform better than tried-and-true approaches.

    The MIT researchers performed such a study. They compared a traditional technique called linear pattern scaling (LPS) with a deep-learning model, using a common benchmark dataset for evaluating climate emulators.

    Their results showed that LPS outperformed deep-learning models at predicting nearly all the parameters they tested, including temperature and precipitation.

    “Large AI methods are very appealing to scientists, but they rarely solve a completely new problem, so implementing an existing solution first is necessary to find out whether the complex machine-learning approach actually improves upon it,” says Lütjens.

    Some initial results seemed to fly in the face of the researchers’ domain knowledge. The powerful deep-learning model should have been more accurate when making predictions about precipitation, since those data don’t follow a linear pattern.

    They found that the high amount of natural variability in climate model runs can cause the deep-learning model to perform poorly on unpredictable long-term oscillations, like El Niño/La Niña. This skews the benchmarking scores in favor of LPS, which averages out those oscillations.

    Constructing a new evaluation

    From there, the researchers constructed a new evaluation with more data that address natural climate variability. With this new evaluation, the deep-learning model performed slightly better than LPS for local precipitation, but LPS was still more accurate for temperature predictions.

    “It is important to use the modeling tool that is right for the problem, but in order to do that you also have to set up the problem the right way in the first place,” Selin says.

    Based on these results, the researchers incorporated LPS into a climate emulation platform to predict local temperature changes in different emission scenarios.

    “We are not advocating that LPS should always be the goal. It still has limitations. For instance, LPS doesn’t predict variability or extreme weather events,” Ferrari adds.

    Rather, they hope their results emphasize the need to develop better benchmarking techniques, which could provide a fuller picture of which climate emulation technique is best suited for a particular situation.

    “With an improved climate emulation benchmark, we could use more complex machine-learning methods to explore problems that are currently very hard to address, like the impacts of aerosols or estimations of extreme precipitation,” Lütjens says.

    Ultimately, more accurate benchmarking techniques will help ensure policymakers are making decisions based on the best available information.

    The researchers hope others build on their analysis, perhaps by studying additional improvements to climate emulation methods and benchmarks. Such research could explore impact-oriented metrics like drought indicators and wildfire risks, or new variables like regional wind speeds.

    This research is funded, in part, by Schmidt Sciences, LLC, and is part of the MIT Climate Grand Challenges team for “Bringing Computation to the Climate Challenge.”
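    Linear pattern scaling itself is simple enough to sketch in a few lines. The toy example below (synthetic data, not output from the study’s climate models) fits a per-region linear regression of local temperature on global-mean temperature, then scales the fitted patterns to a hypothetical warming level:

```python
import numpy as np

rng = np.random.default_rng(0)
n_years, n_regions = 50, 3

# Synthetic "truth": each region warms as a fixed multiple of global-mean
# temperature (the "pattern"), plus internal variability.
gmt = np.linspace(0.0, 2.0, n_years)            # global-mean warming (deg C)
true_pattern = np.array([0.8, 1.2, 1.5])        # per-region scaling factors
local = np.outer(gmt, true_pattern) + rng.normal(0.0, 0.05, (n_years, n_regions))

# Linear pattern scaling: least-squares fit of local T on global-mean T.
X = np.column_stack([gmt, np.ones(n_years)])    # slope + intercept design matrix
coef, *_ = np.linalg.lstsq(X, local, rcond=None)
patterns, intercepts = coef[0], coef[1]

# Emulate local change under a hypothetical 3 deg C of global warming.
projection = patterns * 3.0 + intercepts
print(np.round(patterns, 2))
```

    The recovered patterns land close to the true scaling factors; the appeal of LPS is exactly this transparency and speed, at the cost of missing nonlinear responses such as the precipitation behavior discussed above.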

  •

    After more than a decade of successes, ESI’s work will spread out across the Institute

    MIT’s Environmental Solutions Initiative (ESI), a pioneering cross-disciplinary body that helped give a major boost to sustainability and solutions to climate change at MIT, will close as a separate entity at the end of June. But that’s far from the end for its wide-ranging work, which will go forward under different auspices. Many of its key functions will become part of MIT’s recently launched Climate Project. John Fernandez, head of ESI for nearly a decade, will return to the School of Architecture and Planning, where some of ESI’s important work will continue as part of a new interdisciplinary lab.

    When the ideas that led to the founding of MIT’s Environmental Solutions Initiative first began to be discussed, its founders recall, there was already a great deal of work happening at MIT relating to climate change and sustainability. As Professor John Sterman of the MIT Sloan School of Management puts it, “there was a lot going on, but it wasn’t integrated. So the whole added up to less than the sum of its parts.”

    ESI was founded in 2014 to help fill that coordinating role, and in the years since it has reached significant milestones in research, education, and communication about sustainable solutions across many areas. Its founding director, Professor Susan Solomon, helmed it for its first year, and then handed the leadership to Fernandez, who has led it since 2015.

    “There wasn’t much of an ecosystem [on sustainability] back then,” Solomon recalls. But with the help of ESI and some other entities, that ecosystem has blossomed. She says that Fernandez “has nurtured some incredible things under ESI,” including work on nature-based climate solutions, as well as other areas such as sustainable mining and the reduction of plastics in the environment.

    Desiree Plata, director of MIT’s Climate and Sustainability Consortium and associate professor of civil and environmental engineering, says that one key achievement of the initiative has been in “communication with the external world, to help take really complex systems and topics and put them in not just plain-speak, but something that’s scientifically rigorous and defensible, for the outside world to consume.”

    In particular, ESI has created three very successful products, which continue under the auspices of the Climate Project. These include the popular TIL Climate Podcast, the Webby Award-winning Climate Portal website, and the online climate primer developed with Professor Kerry Emanuel. “These are some of the most frequented websites at MIT,” Plata says, and “the impact of this work on the global knowledge base cannot be overstated.”

    Fernandez says that ESI has played a significant part in helping to catalyze what has become “a rich institutional landscape of work in sustainability and climate change” at MIT. He emphasizes three major areas where he feels the ESI has been able to have the most impact: engaging the MIT community, initiating and stewarding critical environmental research, and catalyzing efforts to promote sustainability as fundamental to the mission of a research university.

    Engagement of the MIT community, he says, began with two programs: a research seed grant program and the creation of MIT’s undergraduate minor in environment and sustainability, launched in 2017.

    ESI also created a Rapid Response Group, which gave students a chance to work on real-world projects with external partners, including government agencies, community groups, nongovernmental organizations, and businesses. In the process, they often learned why dealing with environmental challenges in the real world takes so much longer than they might have thought, he says, and that a challenge that “seemed fairly straightforward at the outset turned out to be more complex and nuanced than expected.”

    The second major area, initiating and stewarding environmental research, grew into a set of six specific program areas: natural climate solutions, mining, cities and climate change, plastics and the environment, arts and climate, and climate justice.

    These efforts included collaborations with a Nobel Peace Prize laureate, three successive presidential administrations from Colombia, and members of communities affected by climate change, including coal miners, Indigenous groups, various cities, companies, the U.N., many agencies — and the popular musical group Coldplay, which has pledged to work toward climate neutrality for its performances. “It was the role that the ESI played as a host and steward of these research programs that may serve as a key element of our legacy,” Fernandez says.

    The third broad area, he says, “is the idea that the ESI as an entity at MIT would catalyze this movement of a research university toward sustainability as a core priority.” While MIT was founded to be an academic partner to the industrialization of the world, “aren’t we in a different world now? The kind of massive infrastructure planning and investment and construction that needs to happen to decarbonize the energy system is maybe the largest industrialization effort ever undertaken. Even more than in the recent past, the set of priorities driving this have to do with sustainable development.”

    Overall, Fernandez says, “we did everything we could to infuse the Institute in its teaching and research activities with the idea that the world is now in dire need of sustainable solutions.”

    “It’s been a very strong and useful program, both for education and research,” Solomon says. But it is appropriate at this time to distribute its projects to other venues, she adds. “We do now have a major thrust in the Climate Project, and you don’t want to have redundancies and overlaps between the two.”

    Fernandez says “one of the missions of the Climate Project is really acting to coalesce and aggregate lots of work around MIT.” Now, with the Climate Project itself, along with the Climate Policy Center and the Center for Sustainability Science and Strategy, it makes more sense for ESI’s climate-related projects to be integrated into these new entities, and for other projects that are less directly connected to climate to take their places in appropriate departments or labs, he says.

    “We did enough with ESI that we made it possible for these other centers to really flourish,” he says. “And in that sense, we played our role.”

    As of June 1, Fernandez has returned to his role as professor of architecture and urbanism and building technology in the School of Architecture and Planning, where he directs the Urban Metabolism Group. He will also be starting up a new group called Environment ResearchAction (ERA) to continue ESI work in cities, nature, and artificial intelligence.

  •

    Study helps pinpoint areas where microplastics will accumulate

    The accumulation of microplastics in the environment, and within our bodies, is an increasingly worrisome issue. But predicting where these ubiquitous particles will accumulate, and therefore where remediation efforts should be focused, has been difficult because of the many factors that contribute to their dispersal and deposition.

    New research from MIT shows that one key factor in determining where microparticles are likely to build up is the presence of biofilms. These thin, sticky biopolymer layers are shed by microorganisms and can accumulate on surfaces, including along sandy riverbeds or seashores. The study found that, all other conditions being equal, microparticles are less likely to accumulate in sediment infused with biofilms, because if they land there, they are more likely to be resuspended by flowing water and carried away.

    The open-access findings appear in the journal Geophysical Research Letters, in a paper by MIT postdoc Hyoungchul Park and professor of civil and environmental engineering Heidi Nepf. “Microplastics are definitely in the news a lot,” Nepf says, “and we don’t fully understand where the hotspots of accumulation are likely to be. This work gives a little bit of guidance” on some of the factors that can cause these particles, and small particles in general, to accumulate in certain locations.

    Most experiments looking at the ways microparticles are transported and deposited have been conducted over bare sand, Park says. “But in nature, there are a lot of microorganisms, such as bacteria, fungi, and algae, and when they adhere to the stream bed they generate some sticky things.” These substances are known as extracellular polymeric substances, or EPS, and they “can significantly affect the channel bed characteristics,” he says. The new research focused on determining exactly how these substances affect the transport of microparticles, including microplastics.

    The research involved a flow tank with a bottom lined with fine sand, and sometimes with vertical plastic tubes simulating the presence of mangrove roots. In some experiments the bed consisted of pure sand, and in others the sand was mixed with a biological material to simulate the natural biofilms found in many riverbed and seashore environments.

    Water mixed with tiny plastic particles was pumped through the tank for three hours, and then the bed surface was photographed under ultraviolet light that caused the plastic particles to fluoresce, allowing a quantitative measurement of their concentration.

    The results revealed two different phenomena that affected how much of the plastic accumulated on the different surfaces. Immediately around the rods that stood in for above-ground roots, turbulence prevented particle deposition. In addition, as the amount of simulated biofilm in the sediment bed increased, the accumulation of particles decreased.

    Nepf and Park concluded that the biofilms filled up the spaces between the sand grains, leaving less room for the microparticles to fit in. The particles penetrated less deeply between the sand grains and so were more exposed, and as a result they were much more easily resuspended and carried away by the flowing water.

    “These biological films fill the pore spaces between the sediment grains,” Park explains, “and that makes the deposited particles — the particles that land on the bed — more exposed to the forces generated by the flow, which makes it easier for them to be resuspended. What we found was that in a channel with the same flow conditions and the same vegetation and the same sand bed, if one is without EPS and one is with EPS, then the one without EPS has a much higher deposition rate than the one with EPS.”

    Nepf adds: “The biofilm is blocking the plastics from accumulating in the bed because they can’t go deep into the bed. They just stay right on the surface, and then they get picked up and moved elsewhere. So, if I spilled a large amount of microplastic in two rivers, and one had a sandy or gravel bottom, and one was muddier with more biofilm, I would expect more of the microplastics to be retained in the sandy or gravelly river.”

    All of this is complicated by other factors, such as the turbulence of the water or the roughness of the bottom surface, she says. But the work offers a “nice lens” for people who are trying to study the impacts of microplastics in the field. “They’re trying to determine what kinds of habitats these plastics are in, and this gives a framework for how you might categorize those habitats,” she says. “It gives guidance to where you should go to find more plastics versus less.”

    As an example, Park suggests, in mangrove ecosystems microplastics may preferentially accumulate in the outer edges, which tend to be sandy, while the interior zones have sediment with more biofilm. Thus, this work suggests that “the sandy outer regions may be potential hotspots for microplastic accumulation,” he says, making them a priority zone for monitoring and protection.

    “This is a highly relevant finding,” says Isabella Schalko, a research scientist at ETH Zurich, who was not associated with this research. “It suggests that restoration measures such as re-vegetation or promoting biofilm growth could help mitigate microplastic accumulation in aquatic systems. It highlights the powerful role of biological and physical features in shaping particle transport processes.”

    The work was supported by Shell International Exploration and Production through the MIT Energy Initiative.

  •

    Study: Climate change may make it harder to reduce smog in some regions

    Global warming will likely hinder our future ability to control ground-level ozone, a harmful air pollutant that is a primary component of smog, according to a new MIT study.

    The results could help scientists and policymakers develop more effective strategies for improving both air quality and human health. Ground-level ozone causes a host of detrimental health impacts, from asthma to heart disease, and contributes to thousands of premature deaths each year.

    The researchers’ modeling approach reveals that, as the Earth warms due to climate change, ground-level ozone will become less sensitive to reductions in nitrogen oxide emissions in eastern North America and Western Europe. In other words, it will take greater nitrogen oxide emission reductions to get the same air quality benefits.

    However, the study also shows that the opposite would be true in northeast Asia, where cutting emissions would have a greater impact on reducing ground-level ozone in the future.

    The researchers combined a climate model that simulates meteorological factors, such as temperature and wind speeds, with a chemical transport model that estimates the movement and composition of chemicals in the atmosphere. By generating a range of possible future outcomes, the researchers’ ensemble approach better captures inherent climate variability, allowing them to paint a fuller picture than many previous studies.

    “Future air quality planning should consider how climate change affects the chemistry of air pollution. We may need steeper cuts in nitrogen oxide emissions to achieve the same air quality goals,” says Emmie Le Roy, a graduate student in the MIT Department of Earth, Atmospheric and Planetary Sciences (EAPS) and lead author of a paper on this study.

    Her co-authors include Anthony Y.H. Wong, a postdoc in the MIT Center for Sustainability Science and Strategy; Sebastian D. Eastham, principal research scientist in the MIT Center for Sustainability Science and Strategy; Arlene Fiore, the Peter H. Stone and Paola Malanotte Stone Professor of EAPS; and senior author Noelle Selin, a professor in the Institute for Data, Systems, and Society (IDSS) and EAPS. The research appears today in Environmental Science and Technology.

    Controlling ozone

    Ground-level ozone differs from the stratospheric ozone layer that protects the Earth from harmful UV radiation. It is a respiratory irritant that is harmful to the health of humans, animals, and plants.

    Controlling ground-level ozone is particularly challenging because it is a secondary pollutant, formed in the atmosphere by complex reactions involving nitrogen oxides and volatile organic compounds in the presence of sunlight.

    “That is why you tend to have higher ozone days when it is warm and sunny,” Le Roy explains.

    Regulators typically try to reduce ground-level ozone by cutting nitrogen oxide emissions from industrial processes. But it is difficult to predict the effects of those policies because ground-level ozone interacts with nitrogen oxide and volatile organic compounds in nonlinear ways. Depending on the chemical environment, reducing nitrogen oxide emissions could cause ground-level ozone to increase instead.

    “Past research has focused on the role of emissions in forming ozone, but the influence of meteorology is a really important part of Emmie’s work,” Selin says.

    To conduct their study, the researchers combined a global atmospheric chemistry model with a climate model that simulates future meteorology. They used the climate model to generate meteorological inputs for each future year in their study, simulating factors such as likely temperature and wind speeds, in a way that captures the inherent variability of a region’s climate. Then they fed those inputs to the atmospheric chemistry model, which calculates how the chemical composition of the atmosphere would change because of meteorology and emissions.

    The researchers focused on eastern North America, Western Europe, and Northeast China, since those regions have historically high levels of the precursor chemicals that form ozone and well-established monitoring networks to provide data. They chose to model two future scenarios, one with high warming and one with low warming, over a 16-year period between 2080 and 2095. They compared them to a historical scenario covering 2000 to 2015 to see the effects of a 10 percent reduction in nitrogen oxide emissions.

    Capturing climate variability

    “The biggest challenge is that the climate naturally varies from year to year. So, if you want to isolate the effects of climate change, you need to simulate enough years to see past that natural variability,” Le Roy says.

    They could overcome that challenge due to recent advances in atmospheric chemistry modeling and by taking advantage of parallel computing to simulate multiple years at the same time. They simulated five 16-year realizations, resulting in 80 model years for each scenario.

    The researchers found that eastern North America and Western Europe are especially sensitive to increases in nitrogen oxide emissions from the soil, which are natural emissions driven by increases in temperature. Due to that sensitivity, as the Earth warms and more nitrogen oxide from soil enters the atmosphere, reducing nitrogen oxide emissions from human activities will have less of an impact on ground-level ozone.

    “This shows how important it is to improve our representation of the biosphere in these models to better understand how climate change may impact air quality,” Le Roy says.

    On the other hand, since industrial processes in northeast Asia cause more ozone per unit of nitrogen oxide emitted, cutting emissions there would cause greater reductions in ground-level ozone in future warming scenarios.

    “But I wouldn’t say that is a good thing because it means that, overall, there are higher levels of ozone,” Le Roy adds.

    Running detailed meteorology simulations, rather than relying on annual average weather data, gave the researchers a more complete picture of the potential effects on human health. “Average climate isn’t the only thing that matters. One high ozone day, which might be a statistical anomaly, could mean we don’t meet our air quality target and have negative human health impacts that we should care about,” Le Roy says.

    In the future, the researchers want to continue exploring the intersection of meteorology and air quality. They also want to expand their modeling approach to consider other climate change factors with high variability, like wildfires or biomass burning.

    “We’ve shown that it is important for air quality scientists to consider the full range of climate variability, even if it is hard to do in your models, because it really does affect the answer that you get,” says Selin.

    This work is funded, in part, by the MIT Praecis Presidential Fellowship, the J.H. and E.V. Wade Fellowship, and the MIT Martin Family Society of Fellows for Sustainability.
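    The value of simulating multiple realizations can be illustrated with a toy calculation (synthetic numbers, not the study’s model output): each ensemble member shares one forced signal but adds independent year-to-year variability, and averaging across members suppresses that noise:

```python
import numpy as np

rng = np.random.default_rng(1)
n_members, n_years = 5, 16                    # mirrors five 16-year realizations

# One shared forced signal, plus independent internal variability per member.
forced = np.linspace(0.0, 1.5, n_years)
members = forced + rng.normal(0.0, 0.4, (n_members, n_years))

# The ensemble mean tracks the forced signal more closely than any single run;
# noise in the mean shrinks by roughly sqrt(n_members).
ensemble_mean = members.mean(axis=0)
resid_single = float(np.std(members[0] - forced))
resid_mean = float(np.std(ensemble_mean - forced))
print(round(resid_single, 2), round(resid_mean, 2))
```

    With five members the residual noise drops by about a factor of sqrt(5) ≈ 2.2, which is the same logic that lets the 80 model years per scenario separate the climate-change signal from natural variability.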

  •

    How J-WAFS Solutions grants bring research to market

For the Abdul Latif Jameel Water and Food Systems Lab (J-WAFS), 2025 marks a decade of translating groundbreaking research into tangible solutions for global challenges. Few examples illustrate that mission better than NONA Technologies. With support from a J-WAFS Solutions grant, MIT electrical engineering and biological engineering Professor Jongyoon Han and his team developed a portable desalination device that transforms seawater into clean drinking water without filters or high-pressure pumps.

Conventional desalination technologies, like reverse osmosis, are energy-intensive, prone to fouling, and typically deployed at large, centralized plants. In contrast, the device developed in Han’s lab employs ion concentration polarization technology to remove salts and particles from seawater, producing potable water that exceeds World Health Organization standards. It is compact, solar-powered, and operable at the push of a button — making it an ideal solution for off-grid and disaster-stricken areas.

This research laid the foundation for spinning out NONA Technologies, along with co-founders Junghyo Yoon PhD ’21 from Han’s lab and Bruce Crawford MBA ’22, to commercialize the technology and address pressing water-scarcity issues worldwide. “This is really the culmination of a 10-year journey that I and my group have been on,” said Han in an earlier MIT News article. “We worked for years on the physics behind individual desalination processes, but pushing all those advances into a box, building a system, and demonstrating it in the ocean … that was a really meaningful and rewarding experience for me.”

Moving breakthrough research out of the lab and into the world is a well-known challenge. While traditional “seed” grants typically support early-stage research at Technology Readiness Levels (TRL) 1-2, few funding sources exist to help academic teams navigate the next phase of technology development. The J-WAFS Solutions Program is designed to address this critical gap by supporting technologies in the high-risk, early-commercialization phase that is often neglected by traditional research, corporate, and venture funding. By supporting technologies at TRLs 3-5, the program increases the likelihood that promising innovations will survive beyond the university setting, advancing far enough to attract follow-on funding.

Equally important, the program gives academic researchers the time, resources, and flexibility to de-risk their technology, explore customer needs and potential real-world applications, and decide whether and how to pursue commercialization. For faculty-led teams like Han’s, the J-WAFS Solutions Program provided the critical financial runway and entrepreneurial guidance needed to refine the technology, test assumptions about market fit, and lay the foundation for a startup team. While still in the MIT innovation ecosystem, NONA secured over $200,000 in non-dilutive funding through competitions and accelerators, including the prestigious MIT delta v Educational Accelerator. These early wins laid the groundwork for further investment and technical advancement.

Since spinning out of MIT, NONA has made major strides in both technology development and business viability. What started as a device producing just over half a liter of clean drinking water per hour has evolved into a system that delivers 10 times that capacity, at 5 liters per hour. The company raised a $3.5 million seed round to advance its portable desalination device, and entered into a collaboration with the U.S. Army Natick Soldier Systems Center, where it co-developed early prototypes and began generating revenue while validating the technology. Most recently, NONA was awarded two SBIR Phase I grants totaling $575,000, one from the National Science Foundation and another from the National Institute of Environmental Health Sciences.

Now operating out of Greentown Labs in Somerville, Massachusetts, NONA has grown to a dedicated team of five and is preparing to launch its nona5 product later this year, with a waitlist of over 1,000 customers. It is also kicking off its first industrial pilot, a key step toward commercial scale-up. “Starting a business as a postdoc was challenging, especially with limited funding and industry knowledge,” says Yoon, who currently serves as CTO of NONA. “J-WAFS gave me the financial freedom to pursue my venture, and the mentorship pushed me to hit key milestones. Thanks to J-WAFS, I successfully transitioned from an academic researcher to an entrepreneur in the water industry.”

NONA is one of several J-WAFS-funded technologies that have moved from the lab to market, part of a growing portfolio of water and food solutions advancing through MIT’s innovation pipeline. As J-WAFS marks a decade of catalyzing innovation in water and food, NONA exemplifies what is possible when mission-driven research is paired with targeted early-stage support and mentorship.

To learn more or get involved in supporting startups through the J-WAFS Solutions Program, please contact jwafs@mit.edu.


    Study: Burning heavy fuel oil with scrubbers is the best available option for bulk maritime shipping

When the International Maritime Organization (IMO) enacted a mandatory cap on the sulfur content of marine fuels in 2020, with an eye toward reducing harmful environmental and health impacts, it left shipping companies with three main options.

They could burn low-sulfur fossil fuels, like marine gas oil, or install cleaning systems to remove sulfur from the exhaust gas produced by burning heavy fuel oil. Biofuels with lower sulfur content offer a third alternative, though their limited availability makes them less feasible.

While installing exhaust gas cleaning systems, known as scrubbers, is the most feasible and cost-effective option, there has been a great deal of uncertainty among firms, policymakers, and scientists as to how “green” these scrubbers are.

Through a novel lifecycle assessment, researchers from MIT, Georgia Tech, and elsewhere have now found that burning heavy fuel oil with scrubbers in the open ocean can match or surpass using low-sulfur fuels when a wide variety of environmental factors is considered.

The scientists combined data on the production and operation of scrubbers and fuels with emissions measurements taken onboard an oceangoing cargo ship. They found that, when the entire supply chain is considered, burning heavy fuel oil with scrubbers was the least harmful option across nearly all 10 environmental impact factors they studied, such as greenhouse gas emissions, terrestrial acidification, and ozone formation.

“In our collaboration with Oldendorff Carriers to broadly explore reducing the environmental impact of shipping, this study of scrubbers turned out to be an unexpectedly deep and important transitional issue,” says Neil Gershenfeld, an MIT professor, director of the Center for Bits and Atoms (CBA), and senior author of the study.

“Claims about environmental hazards and policies to mitigate them should be backed by science. You need to see the data, be objective, and design studies that take into account the full picture to be able to compare different options from an apples-to-apples perspective,” adds lead author Patricia Stathatou, an assistant professor at Georgia Tech, who began this study as a postdoc in the CBA.

Stathatou is joined on the paper by Michael Triantafyllou, the Henry L. and Grace Doherty Professor in Ocean Science and Engineering at MIT, as well as others at the National Technical University of Athens in Greece and the maritime shipping firm Oldendorff Carriers. The research appears today in Environmental Science and Technology.

Slashing sulfur emissions

Heavy fuel oil, traditionally burned by the bulk carriers that make up about 30 percent of the global maritime fleet, usually has a sulfur content of around 2 to 3 percent. This is far higher than the IMO’s 2020 cap of 0.5 percent in most areas of the ocean and 0.1 percent in areas near population centers or environmentally sensitive regions. Sulfur oxide emissions contribute to air pollution and acid rain, and can damage the human respiratory system.

In 2018, fewer than 1,000 vessels employed scrubbers. After the cap went into place, higher prices of low-sulfur fossil fuels and limited availability of alternative fuels led many firms to install scrubbers so they could keep burning heavy fuel oil. Today, more than 5,800 vessels use scrubbers, the majority of them wet, open-loop scrubbers.

“Scrubbers are a very mature technology. They have traditionally been used for decades in land-based applications like power plants to remove pollutants,” Stathatou says.

A wet, open-loop marine scrubber is a huge, vertical metal tank installed in a ship’s exhaust stack, above the engines. Inside, seawater drawn from the ocean is sprayed through a series of nozzles downward to wash the hot exhaust gases as they exit the engines. The seawater interacts with sulfur dioxide in the exhaust, converting it to sulfates — water-soluble, environmentally benign compounds that occur naturally in seawater. The washwater is released back into the ocean, while the cleaned exhaust escapes to the atmosphere with little to no sulfur dioxide.

But the acidic washwater can contain other combustion byproducts, like heavy metals, so scientists wondered whether scrubbers were comparable, from a holistic environmental point of view, to burning low-sulfur fuels. Several studies had explored the toxicity of washwater and fuel-system pollution, but none painted a full picture. The researchers set out to fill that scientific gap.

A “well-to-wake” analysis

The team conducted a lifecycle assessment using a global environmental database on the production and transport of fossil fuels such as heavy fuel oil, marine gas oil, and very-low-sulfur fuel oil. Considering the entire lifecycle of each fuel is key, since producing low-sulfur fuel requires extra processing steps in the refinery, causing additional emissions of greenhouse gases and particulate matter.

“If we just look at everything that happens before the fuel is bunkered onboard the vessel, heavy fuel oil is significantly more low-impact, environmentally, than low-sulfur fuels,” Stathatou says.

The researchers also collaborated with a scrubber manufacturer to obtain detailed information on all the materials, production processes, and transportation steps involved in marine scrubber fabrication and installation. “If you consider that the scrubber has a lifetime of about 20 years, the environmental impacts of producing the scrubber over its lifetime are negligible compared to producing heavy fuel oil,” she adds.

For the final piece, Stathatou spent a week onboard a bulk carrier vessel in China to measure emissions and gather seawater and washwater samples. The ship burned heavy fuel oil with a scrubber and low-sulfur fuels under similar ocean conditions and engine settings. Collecting these onboard data was the most challenging part of the study. “All the safety gear, combined with the heat and the noise from the engines on a moving ship, was very overwhelming,” she says.

Their results showed that scrubbers reduce sulfur dioxide emissions by 97 percent, putting heavy fuel oil on par with low-sulfur fuels by that measure. The researchers saw similar trends for emissions of other pollutants, like carbon monoxide and nitrous oxide.

In addition, they tested washwater samples for more than 60 chemical parameters, including nitrogen, phosphorus, polycyclic aromatic hydrocarbons, and 23 metals. The concentrations of chemicals regulated by the IMO were far below the organization’s requirements. For unregulated chemicals, the researchers compared the concentrations to the strictest limits for industrial effluents from the U.S. Environmental Protection Agency and the European Union; most were at least an order of magnitude below those limits. And since washwater is diluted thousands of times as it is dispersed by a moving vessel, the concentrations would be even lower in the open ocean.

These findings suggest that using scrubbers with heavy fuel oil can be considered equal to, or more environmentally friendly than, low-sulfur fuels across many of the impact categories the researchers studied.

“This study demonstrates the scientific complexity of the waste stream of scrubbers. Having finally conducted a multiyear, comprehensive, and peer-reviewed study, commonly held fears and assumptions are now put to rest,” says Scott Bergeron, managing director at Oldendorff Carriers and co-author of the study.

“This first-of-its-kind study on a well-to-wake basis provides very valuable input to ongoing discussion at the IMO,” adds Thomas Klenum, executive vice president of innovation and regulatory affairs at the Liberian Registry, emphasizing the need “for regulatory decisions to be made based on scientific studies providing factual data and conclusions.”

Ultimately, the study shows the importance of incorporating lifecycle assessments into future environmental impact reduction policies, Stathatou says. “There is all this discussion about switching to alternative fuels in the future, but how green are these fuels? We must do our due diligence to compare them equally with existing solutions to see the costs and benefits,” she adds.

This study was supported, in part, by Oldendorff Carriers.
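As a rough plausibility check, the figures reported in the article can be combined to estimate the effective sulfur emissions of scrubbed heavy fuel oil against the IMO caps. This is a back-of-the-envelope sketch, not a calculation from the paper; the 2.5 percent fuel sulfur value is simply the midpoint of the 2 to 3 percent range cited above:

```python
# Effective sulfur emissions of heavy fuel oil burned with a 97%-efficient
# scrubber, compared against the IMO 2020 caps cited in the article.
hfo_sulfur = 0.025        # heavy fuel oil: ~2-3% sulfur by mass (midpoint)
scrubber_removal = 0.97   # SO2 reduction measured onboard

effective_sulfur = hfo_sulfur * (1 - scrubber_removal)   # 0.075%

imo_global_cap = 0.005    # 0.5% cap in most ocean areas
imo_eca_cap = 0.001       # 0.1% cap near population centers / sensitive regions

# 0.075% falls below both caps, consistent with the finding that scrubbed
# heavy fuel oil is on par with low-sulfur fuels for sulfur dioxide.
```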


    Technology developed by MIT engineers makes pesticides stick to plant leaves

Reducing the amount of agricultural sprays used by farmers — including fertilizers, pesticides, and herbicides — could cut down on the polluting runoff that ends up in the environment, while reducing farmers’ costs and perhaps even enhancing their productivity. A classic win-win-win.

A team of researchers at MIT and a spinoff company they launched has developed a system to do just that. Their technology adds a thin coating around droplets as they are being sprayed onto a field, greatly reducing their tendency to bounce off leaves and end up wasted on the ground. Instead, the coated droplets stick to the leaves as intended.

The research is described today in the journal Soft Matter, in a paper by recent MIT alumni Vishnu Jayaprakash PhD ’22 and Sreedath Panat PhD ’23, graduate student Simon Rufer, and MIT professor of mechanical engineering Kripa Varanasi.

A recent study found that if farmers didn’t use pesticides, they would lose 78 percent of fruit, 54 percent of vegetable, and 32 percent of cereal production. Despite their importance, a lack of technology for monitoring and optimizing sprays has forced farmers to rely on personal experience and rules of thumb to decide how to apply these chemicals. As a result, the chemicals tend to be over-sprayed, leading to runoff and chemicals ending up in waterways or building up in the soil.

Pesticides take a significant toll on global health and the environment, the researchers point out. A recent study found that 31 percent of agricultural soils around the world are at high risk from pesticide pollution. And agricultural chemicals are a major expense for farmers: In the U.S., farmers spend $16 billion a year on pesticides alone.

Making spraying more efficient is one of the best ways to make food production more sustainable and economical. Agricultural spraying essentially boils down to mixing chemicals into water and spraying the water droplets onto plant leaves, which are often inherently water-repellent.

“Over more than a decade of research in my lab at MIT, we have developed fundamental understandings of spraying and the interaction between droplets and plants — studying when they bounce and all the ways we have to make them stick better and enhance coverage,” Varanasi says.

The team had previously found a way to reduce the amount of sprayed liquid that bounces off the leaves it strikes, which involved using two spray nozzles instead of one and spraying mixtures with opposite electrical charges. But farmers were reluctant to take on the expense and effort of converting their spraying equipment to a two-nozzle system, so the team looked for a simpler alternative.

They discovered they could achieve the same improvement in droplet retention with a single-nozzle system that can be easily adapted to existing sprayers. Instead of giving the pesticide droplets an electric charge, they coat each droplet with a vanishingly thin layer of an oily material.

In their new study, the researchers conducted lab experiments with high-speed cameras. When they sprayed untreated droplets onto a water-repelling (hydrophobic) surface similar to that of many plant leaves, the droplets initially spread out into a pancake-like disk, then rebounded into a ball and bounced away. But when the researchers coated the surface of the droplets with a tiny amount of oil — less than 1 percent of the droplet’s liquid — the droplets spread out and stayed put. The treatment improved the droplets’ “stickiness” by as much as a hundredfold.

“When these droplets are hitting the surface and as they expand, they form this oil ring that essentially pins the droplet to the surface,” Rufer says. The researchers tried a wide variety of conditions, conducting hundreds of experiments “with different impact velocities, different droplet sizes, different angles of inclination, all the things that fully characterize this phenomenon.” Though the oils varied in their effectiveness, all of them worked. “Regardless of the impact velocity and the oils, we saw that the rebound height was significantly lower,” he says.

The effect works with remarkably small amounts of oil. In their initial tests, the researchers used oil equal to 1 percent of the water volume, then tried 0.1 percent, and even 0.01 percent. The improvement in droplet retention persisted at 0.1 percent but began to break down below that. “Basically, this oil film acts as a way to trap that droplet on the surface, because oil is very attracted to the surface and sort of holds the water in place,” Rufer says.

In their initial tests, the researchers used soybean oil for the coating, figuring it would be a familiar material for the farmers they were working with, many of whom grow soybeans. But it turned out that although the farmers produce the beans, the oil was not part of their usual on-farm supply chain. In further tests, the researchers found that several chemicals farmers already use routinely in their spraying, called surfactants and adjuvants, could serve instead, and some of these provided the same benefits in keeping droplets stuck to the leaves.

“That way,” Varanasi says, “we’re not introducing a new chemical or changed chemistries into their field, but they’re using things they’ve known for a long time.”

Varanasi and Jayaprakash formed a company called AgZen to commercialize the system. To prove how much their coating system improves the amount of spray that stays on the plant, they first had to develop a system to monitor spraying in real time.
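The oil fractions reported above can be put in practical terms for a spray tank. The tank size here is a hypothetical example; the fractions and the breakdown threshold come from the article:

```python
tank_volume_l = 100.0                # hypothetical 100 L spray tank
fractions = [0.01, 0.001, 0.0001]    # 1%, 0.1%, and 0.01% oil, as tested

# Oil volume needed per tank at each tested fraction
oil_per_tank_l = {f: tank_volume_l * f for f in fractions}

# Per the study, droplet retention held down to 0.1% oil but broke down below it
effective_fractions = [f for f in fractions if f >= 0.001]
```

At the 0.1 percent level that still worked, a 100-liter tank would need only about 0.1 liter of oil, which is why the coating adds so little material to the spray.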
That system, which they call RealCoverage, has been deployed on farms ranging from a few dozen acres to hundreds of thousands of acres, across many different crop types, and has saved farmers 30 to 50 percent on their pesticide expenditures just by improving the control of existing sprays. The system is being deployed across 920,000 acres of crops in 2025, the company says, including some in California, Texas, the Midwest, France, and Italy. Adding the coating system, via new nozzles, should yield at least another doubling of efficiency, the researchers say.

“You could give back a billion dollars to U.S. growers if you just saved 6 percent of their pesticide budget,” says Jayaprakash, lead author of the research paper and CEO of AgZen. “In the lab we got 300 percent extra product on the plant. So that means we could get orders of magnitude reductions in the amount of pesticides that farmers are spraying.”

Farmers had already been using surfactant and adjuvant chemicals to enhance spraying effectiveness, but they were mixing them into the water solution. For the chemicals to have any effect that way, farmers had to use much more of them, risking burns to the plants. The new coating system reduces the amount of these materials needed while improving their effectiveness.

In field tests conducted by AgZen, “we doubled the amount of product on kale and soybeans just by changing where the adjuvant was,” from mixed in to being a coating, Jayaprakash says. It’s convenient for farmers because “all they’re doing is changing their nozzle. They’re getting all their existing chemicals to work better, and they’re getting more product on the plant.”

And it’s not just for pesticides. “The really cool thing is this is useful for every chemistry that’s going on the leaf, be it an insecticide, a herbicide, a fungicide, or foliar nutrition,” Varanasi says. This year, the company plans to introduce the new spray system on about 30,000 acres of cropland.

Varanasi says that with projected world population growth, “the amount of food production has got to double, and we are limited in so many resources, for example we cannot double the arable land. … This means that every acre we currently farm must become more efficient and able to do more with less.” These improved spraying technologies, for both monitoring sprays and coating droplets, are, he says, “fundamentally changing agriculture.”

AgZen has recently raised $10 million in venture financing to support rapid commercial deployment of these technologies. “The knowledge we are gathering from every leaf, combined with our expertise in interfacial science and fluid mechanics, is giving us unparalleled insights into how chemicals are used and developed — and it’s clear that we can deliver value across the entire agrochemical supply chain,” Varanasi says. “Our mission is to use these technologies to deliver improved outcomes and reduced costs for the ag industry.”
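Jayaprakash’s billion-dollar figure follows directly from the $16 billion annual U.S. pesticide expenditure cited earlier in the article; a quick arithmetic check:

```python
us_pesticide_spend = 16e9   # annual U.S. pesticide spending cited in the article
savings_fraction = 0.06     # "just saved 6 percent of their pesticide budget"

# 6% of $16 billion is roughly $1 billion returned to growers
returned_to_growers = us_pesticide_spend * savings_fraction

# RealCoverage's reported 30-50% savings are well above that 6% threshold
realcoverage_savings = (0.30, 0.50)
```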