More stories

  • Ancient atmospheric oxygen sleuthing with ocean chromium

    Found in jewelry, car parts, pigments, and industrial chemical reactions, the metal chromium and its compounds are often employed for their color, finish, and anti-corrosive and catalytic properties. Currently, geoscientists and paleoceanographers from MIT and the Woods Hole Oceanographic Institution (WHOI) are looking to add another use to that list: as a way to examine chemical shifts in ancient Earth’s oceans and atmosphere that are preserved in the seafloor’s paleorecord. More specifically, they want to reconstruct rising atmospheric oxygen levels, which began around 2.4 billion years ago, and their effect on the seas. Since biology and the environment are intimately intertwined, this information could help illuminate how the Earth’s life and climate evolved.

    While researchers have widely applied chromium as a tool to understand the rock record around this global transition, they’re still working out what different chemical signals mean. This is especially true for evaluating ocean sediments, which could reveal where and when oxygen began penetrating and was being formed in the oceans. However, paleoscientists have largely lacked an understanding of how trace amounts of chromium mechanistically interact and cycle in modern, oxygenated seas, let alone the early oceans — a key component needed for any interpretation — until now.

    Research recently published in the Proceedings of the National Academy of Sciences and led by MIT-Woods Hole Oceanographic Institution Joint Program graduate student Tianyi Huang investigated the trace metal’s promise as a paleoproxy for oxygen. For this, the team tracked how oxygen-sensitive chromium isotopes circulated and how they were chemically oxidized or reduced within an oxygen-deficient patch of water in the tropical Pacific Ocean, an analog for early, anaerobic seas. Their findings help validate chromium tracking as a reliable instrument in the geology toolbox.

    “People have seen that chromium isotopes in the geological records kind of track the atmospheric oxygen levels. But, because you’re using something that is buried in the sediments to interpret what is happening in the atmosphere, there’s a missing link in between, and that is the ocean,” says Huang. Further, “how that chromium cycles might change our interpretations of geological records.”

    “The evolution of oxygen on Earth is only known in a coarse fashion, but it is crucial to the development and survival of complex multicellular life,” says Ed Boyle, professor of ocean geochemistry in MIT’s Department of Earth, Atmospheric and Planetary Sciences (EAPS); MIT-WHOI Joint Program director; and study co-author, along with Simone Moos PhD ’18 of the Elementar Corporation. “In addition, there is ongoing concern about the past few decades of decreasing oxygen levels in the ocean, and we need tools to better understand the ocean’s oxygen dynamics.”

    Bridging a gap

    Billions of years ago, when Earth and its atmosphere were essentially devoid of molecular oxygen (O2), chemical reactions and biological metabolisms would have occurred in a chemically reduced, anaerobic environment. During the Great Oxidation Event, which occurred over the course of millions of years, oxygen levels rose planet-wide, and life transitioned accordingly. Further, the environment largely became an oxidized one that grappled with stress processes like rusting and free radicals.

    Some evidence has shown that chemical reactions involving chromium track this process, through effects on its isotopes, chromium-52 and chromium-53, and their oxidation states, primarily the trivalent, reduced form Cr (III) and the hexavalent, oxidized form Cr (VI). The latter is more likely to be found in oxygenated, surface seawater and is considered a health and environmental hazard. Previous studies have shown that the upper ocean tends to have more of the heavier isotope than the lighter one, suggesting some preferential uptake by marine microorganisms. The problem, Huang notes, is that after chromium enters the oceans from rivers, scientists don’t really know the mechanisms behind these observations and whether the trends are consistent. In today’s oxygen-deficient waters, she says, “chromium could potentially be reduced, and we want to know the isotope signal of that and other chromium processes that might leave an isotope fingerprint.”

    To investigate these phenomena, Huang joined two research cruises to the eastern tropical North Pacific Ocean’s oxygen-deficient zone (ODZ) and gathered vertical profiles of seawater samples down to 3,500 meters from across a transect of sea. Some of these seawater samples were frozen to be analyzed for concentrations of trivalent and hexavalent chromium. After being shipped back to the lab, these samples were thawed and purified. The team analyzed the isotope composition of the Cr (III) samples. They then acidified the Cr (VI) samples to convert them to Cr (III) before performing the same isotope analysis as before. The researchers also measured the total chromium in the samples to be able to account for any chemical transformations or migration within the ODZ. With the addition of data from another cruise, Boyle, Moos, and Huang examined the fraction of each isotope over the depth range, compared to an average partitioning, to see if there was any enrichment in a particular area of the ODZ and which oxidation state it existed in. They charted this against the samples’ oxygen levels and put the results in context of known ocean features to help explain how chromium is cycling.

    A ground truth for chromium cycling

    The oceanographers found a pattern. In surface, oxygenated ocean, hexavalent chromium was consumed, likely by microbial life, and transported deeper, into the ODZ. Around the 200-meter mark, the metal began to accumulate in the seawater, and the lighter isotope, chromium-52, was preferentially reduced. This depth happens to coincide with anaerobic, denitrifying microbes that produce nitrite. Huang says that this could be a sign that nitrogen and chromium cycling are entangled, but that doesn’t rule out other biotic or abiotic mechanisms, like reduction by iron, that could be affecting ocean sediment records.

    Chromium doesn’t linger here forever, though. While data showed that most of it remained in the oxygen-deficient zone, which extends from 90 to 800 meters, for about 20-50 years, a small portion of it attached to sinking particles, sank into the deep ocean where there is more dissolved oxygen, and later oxidized back to hexavalent chromium. Here, it could begin incorporating and interacting with sediments.

    “I think it is exciting that we could determine the chromium [oxidation] species, and from that, we could calculate its isotope fractionation,” says Huang. “Nobody has done that in this way before.”

    Their work, Huang says, helps validate chromium as an indicator of different redox environments. “We’re seeing this signal and it’s not vanishing.” Further, it seems consistent over the seasons. However, the team isn’t convinced yet. They plan to test this in other oxygen-deficient zones around the world to see if a similar chromium signature pops up, as well as investigate the composition of the sinking particles carrying the trivalent chromium and the surface of ocean sediments, in order to get a more complete picture of the ocean’s involvement.

    For now, they advise against drawing firm conclusions about chromium signals in the paleorecord, but are guardedly optimistic about the proxy’s potential. “I think people need to interpret this proxy with more caution,” says Huang. “It might not be purely the atmospheric oxygen that is determining the measurement, but there could be other [biotic or abiotic] processes in the ocean that could alter their paleorecords.”

    This research was supported, in part, by the National Science Foundation.

  • Study reveals uncertainty in how much carbon the ocean absorbs over time

    The ocean’s “biological pump” describes the many marine processes that work to take up carbon dioxide from the atmosphere and transport it deep into the ocean, where it can remain sequestered for centuries. This ocean pump is a powerful regulator of atmospheric carbon dioxide and an essential ingredient in any global climate forecast.

    But a new MIT study points to a significant uncertainty in the way the biological pump is represented in climate models today. Researchers found that the “gold standard” equation used to calculate the pump’s strength has a larger margin of error than previously thought, and that predictions of how much atmospheric carbon the ocean will pump down to various depths could be off by 10 to 15 parts per million.

    Given that the world is currently emitting carbon dioxide into the atmosphere at an annual rate of about 2.5 parts per million, the team estimates that the new uncertainty translates to about a five-year error in climate target projections.
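    The five-year figure follows from simple division; a quick back-of-envelope sketch, using only the rates and uncertainties quoted in the article:

```python
# Back-of-envelope check of the timeline error quoted above.
# Assumptions (from the article): a 10-15 ppm uncertainty in projected
# atmospheric CO2, and emissions currently adding about 2.5 ppm per year.
emission_rate_ppm_per_year = 2.5

for uncertainty_ppm in (10, 15):
    years_of_error = uncertainty_ppm / emission_rate_ppm_per_year
    print(f"{uncertainty_ppm} ppm / {emission_rate_ppm_per_year} ppm/yr "
          f"= {years_of_error:.0f} years of slack in the timeline")
```

    The 10 and 15 ppm bounds bracket roughly four to six years, hence "about a five-year error."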

    “This larger error bar might be critical if we want to stay within 1.5 degrees of warming targeted by the Paris Agreement,” says Jonathan Lauderdale, a research scientist in MIT’s Department of Earth, Atmospheric and Planetary Sciences. “If current models predict we have until 2040 to cut carbon emissions, we’re expanding the uncertainty around that, to say maybe we now have until 2035, which could be quite a big deal.”

    Lauderdale and former MIT graduate student B.B. Cael, now at the National Oceanography Center in Southampton, U.K., have published their study today in the journal Geophysical Research Letters.

    Snow curve

    The marine processes that contribute to the ocean’s biological pump begin with phytoplankton, microscopic organisms that soak up carbon dioxide from the atmosphere as they grow. When they die, phytoplankton collectively sink through the water column as “marine snow,” carrying that carbon with them.

    “These particles rain down like white flaky snow that is all this dead stuff falling out of the surface ocean,” Lauderdale says.

    At various depths the particles are consumed by microbes, which break down the particles’ organic carbon and respire it back into the deep ocean in an inorganic form, a process known as remineralization.

    In the 1980s, researchers collected marine snow at locations and depths throughout the tropical Pacific. From these observations they derived a simple power-law relationship, known as the Martin curve after team member John Martin, to describe the strength of the biological pump and how much carbon the ocean can remineralize and sequester at various depths.
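    The Martin curve is compact enough to sketch. A minimal illustration of its power-law form, assuming a reference depth of 100 meters and the commonly cited fitted exponent of about 0.858; the parameter values here are illustrative, not taken from this study:

```python
# A sketch of the Martin curve's power-law form:
#     F(z) = F(z0) * (z / z0) ** (-b)
# where F is the sinking carbon flux, z0 a reference depth (often 100 m),
# and b a fitted exponent (Martin and colleagues reported b of about 0.858).
# Parameter values here are illustrative, not taken from this study.
def martin_flux(z, f0=1.0, z0=100.0, b=0.858):
    """Fraction of the reference-depth flux f0 that survives to depth z (m)."""
    return f0 * (z / z0) ** (-b)

for depth in (100, 500, 1000, 4000):
    print(f"{depth:5d} m: {martin_flux(depth):.3f} of the 100 m flux")
```

    The steep falloff means most remineralization happens in the upper few hundred meters, with only a small fraction of the surface flux reaching the abyss.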

    “The Martin curve is ubiquitous, and it’s really the gold standard [used in many climate models today],” Lauderdale says.

    But in 2018, Cael and co-author Kelsey Bisson showed that the power law derived to explain the Martin curve was not the only equation that could fit the observations. The power law is a simple mathematical relationship that assumes that particles fall faster with depth. But Cael found that several other mathematical relationships, each based on different mechanisms for how marine snow sinks and is remineralized, could also explain the data.

    For instance, one alternative assumes that particles fall at the same rate no matter the depth, while another assumes that particles with heavy, less-consumable phytoplankton shells fall faster than those without.
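    For intuition, the constant-sinking-rate alternative mentioned above yields an exponential decay with a single length scale rather than a power law. A hedged sketch, with an arbitrary, purely illustrative length scale:

```python
import math

# One alternative mentioned above: if particles sink at a constant speed and
# microbes consume them at a constant rate, the flux decays exponentially
# with a single length scale L. The 300 m length scale here is arbitrary
# and purely illustrative.
def exponential_flux(z, f0=1.0, z0=100.0, length_scale=300.0):
    """Flux remaining at depth z (m) under a constant-sinking-speed model."""
    return f0 * math.exp(-(z - z0) / length_scale)

# Both shapes can be tuned to match observations in the upper ocean yet
# diverge at depth, which is why the data alone cannot pick a winner.
for depth in (100, 500, 1000):
    print(f"{depth:5d} m: {exponential_flux(depth):.3f} of the 100 m flux")
```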

    “He found that you can’t tell which curve is the right one, which is a bit troubling, because each curve has different mechanisms behind it,” Lauderdale says. “In other words, researchers might be using the ‘wrong’ function to predict the strength of the biological pump. These discrepancies could snowball and impact climate projections.”

    A curve, reconsidered

    In the new study, Lauderdale and Cael looked at how much difference it would make to estimates of carbon stored deep in the ocean if they changed the mathematical description of the biological pump.

    They started with the same six alternative equations, or remineralization curves, that Cael had previously studied. The team looked at how climate models’ predictions of atmospheric carbon dioxide would change if they were based on any of the six alternatives, versus the Martin curve’s power law.

    To make the comparison as statistically similar as possible, they first fit each alternative equation to the Martin curve, which describes how much marine snow reaches various depths in the ocean. The researchers entered the data points from the curve into each alternative equation. They then ran each equation through the MITgcm, a general circulation model that simulates, among other processes, the flux of carbon dioxide between the atmosphere and the ocean.

    The team ran the climate model forward in time to see how each alternative equation for the biological pump changed the model’s estimates of carbon dioxide in the atmosphere, compared with the Martin curve’s power law. They found that the amount of carbon that the ocean is able to draw down and sequester from the atmosphere varies widely, depending on which mathematical description for the biological pump they used.

    “The surprising part was that even small changes in the amount of remineralization or marine snow making it to different depths due to the different curves can lead to significant changes in atmospheric carbon dioxide,” Lauderdale says.

    The results suggest that the ocean’s pumping strength, and the processes that govern how fast marine snow falls, are still an open question.  

    “We definitely need to make many more measurements of marine snow to break down the mechanisms behind what’s going on,” Lauderdale adds. “Because probably all these processes are relevant, but we really want to know which are driving carbon sequestration.”

    This research was supported, in part, by the National Science Foundation, the Simons Collaboration on Computational Biogeochemical Modeling of Marine Ecosystems, and the UK National Environmental Research Council.

  • Accounting for firms’ positive impacts on the environment

    Gregory Norris is an expert on quantifying firms’ impacts on the environment over the life cycles of their products and processes. His analyses help decision-makers opt for more sustainable, Earth-friendly outputs.

    He and others in this field of life-cycle assessment (LCA) have largely gone about their work by determining firms’ negative impacts on the environment, or footprints, a term most people are familiar with. But Norris felt something was missing. What about the positive impacts firms can have by, for example, changing behaviors or creating greener manufacturing processes that become available to competitors? Could they be added to the overall LCA tally?

    Introducing handprints, the term Norris coined for those positive impacts and the focus of MIT’s Sustainability and Health Initiative for NetPositive Enterprise (SHINE). SHINE is co-led by Norris and Randolph Kirchain, who both have appointments through MIT’s Materials Research Laboratory (MRL).

    Positive impacts

    “If you ask LCA practitioners what they track to determine a product’s sustainability, 99 out of 100 will talk about footprints, these negative impacts,” Norris says. “We’re about expanding that to include handprints, or positive impacts.”

    Says Kirchain, “we’re trying to make the [LCA] metrics more encompassing so firms are motivated to make positive changes as well.” And that could ultimately “increase the scope of activities that firms engage in for environmental benefits.”

    In a February 2021 paper in the International Journal of Life Cycle Assessment, Norris, Kirchain, and colleagues lay out the methodology for not only estimating handprints but also combining them with footprints. Additional authors of the paper are Jasmina Burek, Elizabeth A. Moore, and Jeremy Gregory, who are also affiliated with the MRL.

    “By giving handprints a defendable methodology, we get closer to the ideal place where everything that counts can be counted,” says Jeff Zeman, principal of TrueNorth Collective, a consulting firm for sustainability. Zeman was not involved in the work.

    As a result, Zeman continues, “designers can see the positive impact of their work show up in an organization’s messaging, as progress toward its sustainability goals, and bridge their work with other good actors to create shared benefits. Handprints have been a powerful influence on me and my team — and continue to be.”

    How it works

    Handprints are measured with the same metrics used for quantifying different footprints. For example, a classic metric for determining a product’s water footprint is the liters of water used to create that product. The same product’s water handprint would be calculated by determining the liters of water saved through a positive change such as instituting a new manufacturing process involving recycled materials. Both footprints and handprints are measured using existing life-cycle inventory databases, software, and calculation methods.
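    Because handprints use the same units as the footprints they offset, the two can be combined into a single net figure. A minimal sketch with hypothetical numbers, not taken from the paper:

```python
# Hypothetical numbers, purely for illustration: because a handprint uses
# the same units as the footprint it offsets, the two combine into one
# net figure.
water_footprint_liters = 500.0   # water consumed making the product
water_handprint_liters = 650.0   # water saved by a process change

net_impact_liters = water_handprint_liters - water_footprint_liters
print(f"Net water impact: {net_impact_liters:+.0f} liters "
      "(positive means net benefit)")
```

    A handprint larger than the corresponding footprint is what the SHINE team calls a net positive impact.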

    The SHINE team has demonstrated the impact of adding handprints to LCA analyses through case studies with several companies. One such study described in the paper involved Interface, a manufacturer of flooring materials. The SHINE team calculated the company’s handprints associated with the use of “recycled” gas to help heat its manufacturing facility. Specifically, Interface captured and burned methane gas from a landfill. That gas would otherwise have been released to the atmosphere, contributing to climate change.

    After calculating both the company’s handprints and footprints, the SHINE team found that Interface had a net positive impact. As the team wrote in their paper, “with the SHINE handprint framework, we can help actors to create handprints greater than, and commensurate with, their footprints.”

    Concludes Norris: “With this paper, we hope that work on sustainability will get stronger by making these tools available to more people.”

    This work was supported by the SHINE consortium.

  • MIT engineers make filters from tree branches to purify drinking water

    The interiors of nonflowering trees such as pine and ginkgo contain sapwood lined with straw-like conduits known as xylem, which draw water up through a tree’s trunk and branches. Xylem conduits are interconnected via thin membranes that act as natural sieves, filtering out bubbles from water and sap.

    MIT engineers have been investigating sapwood’s natural filtering ability, and have previously fabricated simple filters from peeled cross-sections of sapwood branches, demonstrating that the low-tech design effectively filters bacteria.

    Now, the same team has advanced the technology and shown that it works in real-world situations. They have fabricated new xylem filters that can filter out pathogens such as E. coli and rotavirus in lab tests, and have shown that the filter can remove bacteria from contaminated spring, tap, and groundwater. They also developed simple techniques to extend the filters’ shelf-life, enabling the woody disks to purify water after being stored in a dry form for at least two years.

    The researchers took their techniques to India, where they made xylem filters from native trees and tested the filters with local users. Based on their feedback, the team developed a prototype of a simple filtration system, fitted with replaceable xylem filters that purified water at a rate of one liter per hour.

    Their results, published today in Nature Communications, show that xylem filters have potential for use in community settings to remove bacteria and viruses from contaminated drinking water.

    The researchers are exploring options to make xylem filters available at large scale, particularly in areas where contaminated drinking water is a major cause of disease and death. The team has launched an open-source website, with guidelines for designing and fabricating xylem filters from various tree types. The website is intended to support entrepreneurs, organizations, and leaders to introduce the technology to broader communities, and inspire students to perform their own science experiments with xylem filters.

    “Because the raw materials are widely available and the fabrication processes are simple, one could imagine involving communities in procuring, fabricating, and distributing xylem filters,” says Rohit Karnik, professor of mechanical engineering and associate department head for education at MIT. “For places where the only option has been to drink unfiltered water, we expect xylem filters would improve health, and make water drinkable.”

    Karnik’s study co-authors are lead author Krithika Ramchander and Luda Wang of MIT’s Department of Mechanical Engineering, and Megha Hegde, Anish Antony, Kendra Leith, and Amy Smith of MIT D-Lab.

    Clearing the way

    In their prior studies of xylem, Karnik and his colleagues found that the woody material’s natural filtering ability also came with some natural limitations. As the wood dried, the branches’ sieve-like membranes began to stick to the walls, reducing the filter’s permeance, or ability to allow water to flow through. The filters also appeared to “self-block” over time, building up woody matter that clogged the conduits.

    Surprisingly, two simple treatments overcame both limitations. By soaking small cross-sections of sapwood in hot water for an hour, then dipping them in ethanol and letting them dry, Ramchander found that the material retained its permeance, efficiently filtering water without clogging up. Its filtering could also be improved by tailoring a filter’s thickness according to its tree type.

    The researchers sliced and treated small cross-sections of white pine from branches around the MIT campus and showed that the resulting filters maintained a permeance comparable to commercial filters, even after being stored for up to two years, significantly extending the filters’ shelf life.

    The researchers also tested the filters’ ability to remove contaminants such as E. coli and rotavirus — the most common cause of diarrheal disease. The treated filters removed more than 99 percent of both contaminants, a water treatment level that meets the “two-star comprehensive protection” category set by the World Health Organization.
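    Removal percentages like these are commonly expressed as "log reductions" in water-treatment standards; the conversion is a one-line formula:

```python
import math

# Water-treatment performance is often quoted as a "log reduction":
# removing 99 percent of a pathogen leaves 1 percent, i.e. 10**-2,
# which is a 2-log reduction.
def log_reduction(fraction_removed):
    """Convert a removal fraction (e.g. 0.99) to a log10 reduction value."""
    return math.log10(1.0 / (1.0 - fraction_removed))

print(f"99%    removal = {log_reduction(0.99):.1f}-log reduction")
print(f"99.99% removal = {log_reduction(0.9999):.1f}-log reduction")
```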

    “We think these filters can reasonably address bacterial contaminants,” Ramchander says. “But there are chemical contaminants like arsenic and fluoride where we don’t know the effect yet,” she notes.

    Groundwork

    Encouraged by their results in the lab, the researchers moved to field-test their designs in India, a country that has experienced the highest mortality rate due to water-borne disease in the world, and where safe and reliable drinking water is inaccessible to more than 160 million people.

    Over two years, the engineers, including researchers in the MIT D-Lab, worked in mountain and urban regions, facilitated by local NGOs Himmotthan Society, Shramyog, Peoples Science Institute, and Essmart. They fabricated filters from native pine trees and tested them, along with filters made from ginkgo trees in the U.S., with local drinking water sources. These tests confirmed that the filters effectively removed bacteria found in the local water. The researchers also held interviews, focus groups, and design workshops to understand local communities’ current water practices, and challenges and preferences for water treatment solutions. They also gathered feedback on the design.

    “One of the things that scored very high with people was the fact that this filter is a natural material that everyone recognizes,” Hegde says. “We also found that people in low-income households prefer to pay a smaller amount on a daily basis, versus a larger amount less frequently. That was a barrier to using existing filters, because replacement costs were too much.”

    With information from more than 1,000 potential users across India, they designed a prototype of a simple filtration system, fitted with a receptacle at the top that users can fill with water. The water flows down a 1-meter-long tube, through a xylem filter, and out through a valve-controlled spout. The xylem filter can be swapped out either daily or weekly, depending on a household’s needs.

    The team is exploring ways to produce xylem filters at larger scales, with locally available resources and in a way that would encourage people to practice water purification as part of their daily lives — for instance, by providing replacement filters in affordable, pay-as-you-go packets.

    “Xylem filters are made from inexpensive and abundantly available materials, which could be made available at local shops, where people can buy what they need, without requiring an upfront investment as is typical for other water filter cartridges,” Karnik says. “For now, we’ve shown that xylem filters provide performance that’s realistic.”

    This research was supported, in part, by the Abdul Latif Jameel Water and Food Systems Lab (J-WAFS) at MIT and the MIT Tata Center for Technology and Design.

  • Study reveals plunge in lithium-ion battery costs

    The cost of the rechargeable lithium-ion batteries used for phones, laptops, and cars has fallen dramatically over the last three decades, and has been a major driver of the rapid growth of those technologies. But attempting to quantify that cost decline has produced ambiguous and conflicting results that have hampered attempts to project the technology’s future or devise useful policies and research priorities.

    Now, MIT researchers have carried out an exhaustive analysis of the studies that have looked at the decline in the prices of these batteries, which are the dominant rechargeable technology in today’s world. The new study looks back over three decades, analyzing the original underlying datasets and documents whenever possible, to arrive at a clear picture of the technology’s trajectory.

    The researchers found that the cost of these batteries has dropped by 97 percent since they were first commercially introduced in 1991. This rate of improvement is much faster than many analysts had claimed and is comparable to that of solar photovoltaic panels, which some had considered to be an exceptional case. The new findings are reported today in the journal Energy & Environmental Science, in a paper by MIT postdoc Micah Ziegler and Associate Professor Jessika Trancik.
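    A few lines of arithmetic recover the average annual rate implied by the headline number, assuming the roughly 30-year span from commercial introduction in 1991 to publication:

```python
# Rough arithmetic behind the headline number: a 97 percent drop leaves a
# cost fraction of 0.03. Over the roughly 30 years from commercial
# introduction (1991) to this study, the implied average annual decline is:
remaining_fraction = 1.0 - 0.97
years = 2021 - 1991

annual_factor = remaining_fraction ** (1.0 / years)
print(f"Average cost decline: about "
      f"{(1.0 - annual_factor) * 100:.0f}% per year")
```

    That steady compounding of roughly ten percent a year, sustained for three decades, is what places lithium-ion batteries alongside solar photovoltaics.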

    While it’s clear that there have been dramatic cost declines in some clean-energy technologies such as solar and wind, Trancik says, when they started to look into the decline in prices for lithium-ion batteries, “we saw that there was substantial disagreement as to how quickly the costs of these technologies had come down.” Similar disagreements showed up in tracing other important aspects of battery development, such as the ever-improving energy density (energy stored within a given volume) and specific energy (energy stored within a given mass).

    “These trends are so consequential for getting us to where we are right now, and also for thinking about what could happen in the future,” says Trancik, who is an associate professor in MIT’s Institute for Data, Systems and Society. While it was common knowledge that the decline in battery costs was an enabler of the recent growth in sales of electric vehicles, for example, it was unclear just how great that decline had been. Through this detailed analysis, she says, “we were able to confirm that yes, lithium-ion battery technologies have improved in terms of their costs, at rates that are comparable to solar energy technology, and specifically photovoltaic modules, which are often held up as kind of the gold standard in clean energy innovation.”

    It may seem odd that there was such great uncertainty and disagreement about how much lithium-ion battery costs had declined, and what factors accounted for it, but in fact much of the information is in the form of closely held corporate data that is difficult for researchers to access. Most lithium-ion batteries are not sold directly to consumers — you can’t run down to your typical corner drugstore to pick up a replacement battery for your iPhone, your PC, or your electric car. Instead, manufacturers buy lithium-ion batteries and build them into electronics and cars. Large companies like Apple or Tesla buy batteries by the millions, or manufacture them themselves, for prices that are negotiated or internally accounted for but never publicly disclosed.

    In addition to helping to boost the ongoing electrification of transportation, further declines in lithium-ion battery costs could potentially also increase the batteries’ usage in stationary applications as a way of compensating for the intermittent supply of clean energy sources such as solar and wind. Both applications could play a significant role in helping to curb the world’s emissions of climate-altering greenhouse gases. “I can’t overstate the importance of these trends in clean energy innovation for getting us to where we are right now, where it starts to look like we could see rapid electrification of vehicles and we are seeing the rapid growth of renewable energy technologies,” Trancik says. “Of course, there’s so much more to do to address climate change, but this has really been a game changer.”

    The new findings are not just a matter of retracing the history of battery development, but of helping to guide the future, Ziegler points out. Combing through all of the published literature on the subject of the cost reductions in lithium-ion cells, he found “very different measures of the historical improvement. And across a variety of different papers, researchers were using these trends to make suggestions about how to further reduce costs of lithium-ion technologies or when they might meet cost targets.” But because the underlying data varied so much, “the recommendations that the researchers were making could be quite different.” Some studies suggested that lithium-ion batteries would not fall in cost quickly enough for certain applications, while others were much more optimistic. Such differences in data can ultimately have a real impact on the setting of research priorities and government incentives.

    The researchers dug into the original sources of the published data, in some cases finding that certain primary data had been used in multiple studies that were later cited as separate sources, or that the original data sources had been lost along the way. And while most studies have focused only on the cost, Ziegler says it became clear that such a one-dimensional analysis might underestimate how quickly lithium-ion technologies improved; in addition to cost, weight and volume are also key factors for both vehicles and portable electronics. So, the team added a second track to the study, analyzing the improvements in these parameters as well.

    “Lithium-ion batteries were not adopted because they were the least expensive technology at the time,” Ziegler says. “There were less expensive battery technologies available. Lithium-ion technology was adopted because it allows you to put portable electronics into your hand, because it allows you to make power tools that last longer and have more power, and it allows us to build cars” that can provide adequate driving range. “It felt like just looking at dollars per kilowatt-hour was only telling part of the story,” he says.

    That broader analysis helps to define what may be possible in the future, he adds: “We’re saying that lithium-ion technologies might improve more quickly for certain applications than would be projected by just looking at one measure of performance. By looking at multiple measures, you get essentially a clearer picture of the improvement rate, and this suggests that they could maybe improve more rapidly for applications where the restrictions on mass and volume are relaxed.”

    Trancik adds the new study can play an important role in energy-related policymaking. “Published data trends on the few clean technologies that have seen major cost reductions over time, wind, solar, and now lithium-ion batteries, tend to be referenced over and over again, and not only in academic papers but in policy documents and industry reports,” she says. “Many important climate policy conclusions are based on these few trends. For this reason, it is important to get them right. There’s a real need to treat the data with care, and to raise our game overall in dealing with technology data and tracking these trends.”

    “Battery costs determine price parity of electric vehicles with internal combustion engine vehicles,” says Venkat Viswanathan, an associate professor of mechanical engineering at Carnegie Mellon University, who was not associated with this work. “Thus, projecting battery cost declines is probably one of the most critical challenges in ensuring an accurate understanding of adoption of electric vehicles.”

    Viswanathan adds that “the finding that cost declines may occur faster than previously thought will enable broader adoption, increasing volumes, and leading to further cost declines. … The datasets curated, analyzed and released with this paper will have a lasting impact on the community.”

    The work was supported by the Alfred P. Sloan Foundation.


    Transforming lives by providing safe drinking water

    As a child, Susan Murcott ’90 SM ’92 saw firsthand the long-term impact that water- and food-borne illness can have on people.

    At age 16, her maternal grandmother contracted polio, which can be transmitted through direct contact with someone infected with the virus or, occasionally, through contaminated food and water. As a result of the illness, she was forever paralyzed from the waist down. Though Murcott didn’t know it at the time, her decades-long career focusing on clean water access would bring her in close collaboration with countless others around the world whose lives, like her grandmother’s, are impacted by unsafe drinking water.

    Murcott is an MIT environmental engineer, social entrepreneur, and educator who has spent her lifetime collaboratively developing and implementing effective, affordable solutions to provide safe water to the world’s neediest.

    “My core work has been focused on water, sanitation, and hygiene,” Murcott says. “It’s not sexy, it’s not a money maker, and it’s not high-profile news even though there are more childhood deaths each year attributable to water-related diseases than to Covid-19.”

    Globally, 2.2 billion people lack safely managed water and 4.2 billion lack basic sanitation. Polluted water is one of the world’s leading causes of disease and death, particularly for children under the age of 5. Furthermore, women and children bear the disproportionate burden of securing household water, limiting their ability to focus on education, employment, and other opportunities for economic and social advancement.

    “I’ve spent 30 years trying to wake people up to the reality of the importance of safe drinking water, both given my family history and travels around the world,” says Murcott. “I feel like it’s still an invisible problem — invisible, at least, to those of us who are privileged enough to take safe water, sanitation, and hygiene for granted.”

    Throughout her time at MIT — as a student, then a senior lecturer in the Department of Civil and Environmental Engineering, and now as a lecturer at MIT D-Lab and a principal investigator driving water solutions innovations through the Abdul Latif Jameel Water and Food Systems Lab (J-WAFS) — Murcott has addressed these challenges head-on.

    Murcott’s work started with megacities. Alongside her mentor and colleague, the late MIT civil engineering professor Donald Harleman, she helped to develop and promote innovation in low-energy, low-cost wastewater treatment as an engineering consultant to municipalities in megacities worldwide. Plants in Hong Kong, Rio de Janeiro, and Mexico City have adopted their strategies and are now serving approximately 15.2 million users combined, treating the wastewater instead of dumping raw sewage directly into local waterways.

    A major turning point in Murcott’s career came when she was the keynote speaker and sole female engineer at the Second International Women and Water Conference in Kathmandu, Nepal, in 1998. At that time, 75 percent of Nepali women were illiterate, and many had children sick with water-related diseases. The organizers of that conference, educated women from Kathmandu, invited the entire spectrum of women throughout Nepal to attend. This meant that attendees at the conference included many illiterate women, all the way up to the Queen of Nepal.

    Desperate for solutions to their water problems, the women asked Murcott for help. This encounter proved powerful and career-changing, inspiring her to pivot toward designing and implementing simple, affordable household drinking water systems by working together with these women and vulnerable households, in Nepal and beyond.

    Major success came two years later, when her team of MIT graduate students and partners from the Nepal Department of Water Supply and Sewerage detected the first instances of arsenic in drinking water in Nepal. In collaboration with the Nepali nonprofit Environment and Public Health Organization (ENPHO), over 40,000 tests of arsenic in tubewell groundwater were conducted, tracking the extent of water contamination for the first time.

    “Without Susan and her MIT graduate student team, we wouldn’t have identified the extent of arsenic contamination in Nepal and taken action to implement remediation solutions as quickly as we did,” says Roshan Raj Shrestha, now the deputy director of water, sanitation, and hygiene at the Bill and Melinda Gates Foundation.

    Murcott and ENPHO worked together to design, prototype, pilot, and implement the Arsenic Biosand Filter, subsequently manufactured and distributed throughout 17 arsenic-affected districts of Nepal. She and team members won numerous awards for this, reinvesting award funds in arsenic remediation across the country and training Nepali entrepreneurs to build and market the filters. “Her work has impacted hundreds of thousands of lives, preventing disease and death from arsenic contaminated drinking water. We owe Susan a great deal of gratitude,” says Shrestha.

    Not one to stop at these accomplishments, Murcott then worked to bring her engineering knowledge and entrepreneurial spirit to aid in the elimination of waterborne disease in northern Ghana.

    There, she launched the nonprofit Pure Home WATER to produce ceramic pot water filters that could help eliminate guinea worm from the water supply. Jim Niquette, Ghana country director of the Carter Center Guinea Worm Eradication Campaign, credits these filters for eradicating this debilitating disease from Ghana between 2008 and 2010.

    “We went from 242 cases of guinea worm to zero in 18 months. Prior to what occurred in Ghana, no country had achieved a success of this kind so quickly,” Niquette says. “Susan’s dedication to poor people’s health and well-being, combined with the innovative ceramic pot filter technology, was critical to the unprecedented success.”

    Murcott has since inspired others to build factories, with several of the MIT students she has mentored going on to build and/or manage successful factories in Uganda and South Africa. Overall, she has influenced the construction of ceramic pot filter factories in 10 countries. These factories now provide clean water to approximately 5 million users.

    Murcott continues to improve clean water access in Asia through the creation of the “ECC vial,” an affordable, easy-to-use E. coli test kit. The project to refine and scale up distribution and use of the ECC vial received support from the MIT Abdul Latif Jameel Water and Food Systems Lab through the J-WAFS Solutions Program sponsored by Community Jameel. Launched in 2020 in partnership with Nepali social entrepreneurs, this novel technology puts water quality measurement in the hands of users. The aim is to enable millions of people in Nepal and across Asia to directly measure the cleanliness of their water and advocate for safe water solutions in the years ahead.

    Murcott’s impact cannot be measured only by the amount of clean water she has helped provide. Wanting to bring what she saw abroad back to Massachusetts, Murcott was instrumental in the early days of MIT D-Lab, creating its landmark course 11.474 (G) / EC.715 (D-Lab Water, Sanitation, and Hygiene), which she has taught since 2006. Through this and other courses she has had the opportunity to meet and inspire students early in their careers.

    Driven by her own experience in the male-dominated field of civil engineering, Murcott has committed herself to collaboration and mentorship, with a particular focus on mentoring young women interested in STEM. Her mentees have founded NGOs, launched humanitarian-oriented startups, developed large-scale wastewater infrastructure projects, produced research to influence national policy, and more.

    “She has the unique skill of being able to guide and teach her students while also allowing space for their own curiosity, interests, and ideas,” says Kate Cincotta SM ’09, one of Murcott’s graduate students who went on to co-found the water nonprofit Saha Global. “Susan understands that working in the international development space requires both technical skills and practical knowledge that can only be gained from field experience, and connects her students with the opportunities to gain both.”

    This sense of higher purpose is one that Murcott tries to live out through her research and implementation work inspiring the next generation. “It’s very important, in my life experience, to follow your dream and to serve others. Do something because it’s worth doing and because it changes people’s lives and saves lives.”


    Chemists gain new insights into the behavior of water in an influenza virus channel

    In a new study of water dynamics, a team of MIT chemists led by Professor Mei Hong, in collaboration with Associate Professor Adam Willard, has discovered that water in an ion channel is anisotropic, or partially aligned. The researchers’ data, the first of their kind, prove the relation of water dynamics and order to the conduction of protons in an ion channel. The work also provides potential new avenues for the development of antiviral drugs or other treatments.

    Members of the Hong lab conducted sophisticated nuclear magnetic resonance (NMR) experiments to prove the existence of anisotropic water in the influenza virus’s M2 proton channel, while members of the Willard group carried out independent all-atom molecular dynamics simulations to validate and augment the experimental data. Their study, of which Hong was the senior author, was published in Communications Biology, and was co-authored by Martin Gelenter, Venkata Mandala, and Aurelio Dregni of the Hong Lab, and Michiel Niesen and Dina Sharon of the Willard group.

    Channel water and influenza virus

    The influenza B virus protein BM2 is a proton channel that acidifies the virus, helping it to release its genetic material into infected cells. The water in this channel plays a critical role in helping the influenza virus become infectious, because it facilitates the conduction of protons through the channel and across the lipid membrane.

    Previously, Hong’s lab studied how the amino acid histidine shuttles protons from water into the flu virus, but they hadn’t investigated the water molecules themselves in detail. This new study has provided the missing link in a full understanding of the mixed hydrogen-bonded chain between water and histidine inside the M2 channel. To curb the flu virus, the channel would have to be plugged with small molecules — i.e., antiviral drugs — so that the water pathway would be broken.

    In order to align the water-water hydrogen bonds for “proton hopping,” water molecules must be at least partially oriented. However, to experimentally detect the tiny amount of residual alignment of water molecules in a channel, without freezing the sample, is extremely difficult. As a result, the majority of previous studies on the topic were conducted by computational chemists like Willard. Experimental data on this topic were typically restricted to crystal structures obtained at cryogenic temperatures. The Hong lab adopted a relaxation NMR technique that can be employed at the much balmier temperature of around 0 degrees Celsius. At this temperature, the water molecules rotated just slowly enough for the researchers to observe the mobility and residual orientation in the channel for the first time.

    More space, more order

    The evidence yielded by Hong’s NMR experiments indicated that the water molecules in the open state of the BM2 channel are more aligned than they are in the closed state, even though there are many more water molecules in the open state. The researchers detected this residual order by measuring a magnetic property called chemical shift anisotropy for the water protons. The higher water alignment at low pH came as a surprise.

    “This was initially counterintuitive to us,” says Hong. “We know from a lot of previous NMR data that the open channel has more water molecules, so one would think that these water molecules should be more disordered and random in the wider channel. But no, the waters are actually slightly better aligned based on the relaxation NMR data.” Molecular dynamics simulations indicated that this order is induced by the key proton-selective residue, a histidine, which is positively charged at low pH.

    By employing solid-state NMR spectroscopy and molecular dynamics simulations, the researchers also found that water rotated and translated across the channel more rapidly in the low-pH open state than in the high-pH closed state. These results together indicate that the water molecules undergo small-amplitude reorientations to establish the alignment that is necessary for proton hopping.
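Partial alignment of this kind is often summarized with a second-Legendre orientational order parameter, which is zero for randomly oriented molecules and one for perfect alignment along an axis. A minimal sketch, using invented dipole tilt angles rather than the study's data:

```python
import math

def order_parameter(angles_deg):
    """Second-Legendre orientational order parameter
    S = (3 <cos^2 theta> - 1) / 2, where theta is the angle between
    each water dipole and the channel axis. S = 0 for fully random
    3D orientations, S = 1 for perfect alignment along the axis.
    """
    cos2 = [math.cos(math.radians(a)) ** 2 for a in angles_deg]
    mean_cos2 = sum(cos2) / len(cos2)
    return (3 * mean_cos2 - 1) / 2

# Hypothetical dipole tilt angles (degrees), for illustration only:
closed_state = [10, 80, 150, 95, 40, 120]  # loosely oriented
open_state = [15, 25, 35, 20, 30, 10]      # partially aligned

print(order_parameter(closed_state))
print(order_parameter(open_state))
```

A higher S for the open-state angles mirrors the study's finding that the open channel, despite holding more water, is the more ordered of the two states.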

    Inhibiting proton conduction, blocking the virus

    By using molecular dynamics simulations performed by Willard and his group, the researchers were able to observe that the water network has fewer hydrogen-bonding bottlenecks in the open state than in the closed state. Thus, faster dynamics and higher orientational order of water molecules in the open channel establish the water network structure that is necessary for proton hopping and successful infection on the virus’ part.

    When a flu virus enters a cell, it goes into a small compartment called the endosome. The endosome compartment is acidic, which triggers the protein to open its water-permeated pathway and conduct the protons into the virus. Acidic pH has a high concentration of hydrogen ions, which is what the M2 protein conducts. Without the water molecules relaying the protons, the protons will not reach the histidine, a critical amino acid residue. The histidine is the proton-selective residue, and it rotates in order to shuttle the protons carried by the water molecules. The relay chain between the water molecules and the histidine is therefore responsible for proton conduction through the M2 channel. Therefore, the findings indicated in this research could prove relevant to the development of antiviral drugs and other practical applications.


    Study: One enzyme dictates cells’ response to a probable carcinogen

    In the past few years, several medications have been found to be contaminated with NDMA, a probable carcinogen. This chemical, which has also been found at Superfund sites and in some cases has spread to drinking water supplies, causes DNA damage that can lead to cancer.

    MIT researchers have now discovered a mechanism that helps explain whether this damage will lead to cancer in mice: The key is the way cellular DNA repair systems respond. The team found that too little activity of one enzyme necessary for DNA repair leads to much higher cancer rates, while too much activity can produce tissue damage, especially in the liver, which can be fatal.

    Activity levels of this enzyme, called AAG, can vary greatly among different people, and measuring those levels could allow doctors to predict how people might respond to NDMA exposure, says Bevin Engelward, a professor of biological engineering at MIT and the senior author of the study. “It may be that people who are low in this enzyme are more prone to cancer from environmental exposures,” she says.

    MIT postdoc Jennifer Kay is the lead author of the new study, which appears today in Cell Reports.

    Potential hazards

    For several years, Engelward’s lab, in collaboration with the lab of MIT Professor Leona Samson, has been working on a research project, funded by the National Institute of Environmental Health Sciences, to study the effects of exposure to NDMA. This chemical is found in Superfund sites including the contaminated Olin Chemical site in Wilmington, Massachusetts. In the early 2000s, municipal water wells near the site had to be shut down because the groundwater was contaminated with NDMA and other hazardous chemicals.

    More recently, it was discovered that several types of medicine, including Zantac and drugs used to treat type 2 diabetes and high blood pressure, had been contaminated with NDMA. This chemical causes specific types of DNA damage, one of which is a lesion of adenine, one of the bases found in DNA. These lesions are repaired by AAG, which snips out the damaged bases so that other enzymes can cleave the DNA backbone, enabling DNA polymerases to replace them with new ones.

    If AAG activity is very high and the polymerases (or other downstream enzymes) can’t keep up with the repair, then the DNA may end up with too many unrepaired strand breaks, which can be fatal to the cell. However, if AAG activity is too low, damaged adenines persist and can be read incorrectly by the polymerase, causing the wrong base to be paired with it. Incorrect insertion of a new base produces a mutation, and accumulated mutations are known to cause cancer.
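The trade-off described above (too little AAG leaves mutagenic lesions, too much overwhelms downstream repair) can be caricatured in a few lines of code. This is a purely hypothetical toy model; none of its numbers or functional forms come from the paper:

```python
def toy_outcomes(aag_level, lesions=100.0, downstream_capacity=60.0):
    """Toy model of the AAG repair trade-off (illustrative only).

    A fixed dose of damaged bases is either excised by AAG or left in
    place: unexcised lesions carry mutation risk, while excised sites
    become strand-break intermediates, and any excess beyond the
    downstream repair capacity carries toxicity risk.
    """
    # Fraction excised rises with AAG level but saturates below 1.
    excised = lesions * aag_level / (aag_level + 1.0)
    unexcised = lesions - excised                       # mutation risk
    backlog = max(0.0, excised - downstream_capacity)   # toxicity risk
    return unexcised, backlog

# No AAG: all mutation risk. Normal AAG: balanced. Sixfold AAG:
# few mutations, but a backlog of toxic strand-break intermediates.
for level in (0.0, 1.0, 6.0):
    mutations, toxicity = toy_outcomes(level)
    print(f"AAG x{level:g}: mutation risk {mutations:.0f}, toxicity risk {toxicity:.0f}")
```

Even this crude sketch reproduces the qualitative pattern the mouse experiments revealed: harm is high at both extremes, and an intermediate level minimizes the combined risk.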

    In the new study, the MIT team studied mice with high levels of AAG — six times the normal amount — and mice with AAG knocked out. After exposure to NDMA, the mice with no AAG had many more mutations and higher rates of cancer in the liver, where NDMA has its greatest effect. Mice with sixfold levels of AAG had fewer mutations and lower cancer rates, at first glance appearing to be beneficial. However, in those mice, the researchers found a great deal of tissue damage and cell death in the liver.

    Mice with normal amounts of AAG (“wild-type” mice) showed some mutations after NDMA exposure but overall were much better protected against both cancer and liver damage.

    “Nature did a really good job establishing the optimal levels of AAG, at least for our animal model,” Engelward says. “What is striking is that the levels of one gene out of 23,000 dictates disease outcome, yielding opposite effects depending on low or high expression.” If too low, there are too many mutations; if too high, there is too much cell death.

    Varying responses

    In humans, there is a great deal of variation in AAG levels between different people: Studies have found that some people can have up to 20 times more AAG activity than others. This suggests that people may respond very differently to damage caused by NDMA, Kay says. Measuring those levels could potentially allow doctors to predict how people may respond to NDMA exposure in the environment or in contaminated medicines, she says.

    The researchers next plan to study the effects of chronic, low-level exposure to NDMA in mice, which they hope will shed light on how such exposures might affect humans. “That’s one of the top priorities for us, to figure out what happens in a real world, everyday exposure scenario,” Kay says.

    Another population for which measuring AAG levels could be useful is cancer patients who take temozolomide, a chemotherapy drug that causes the same kind of DNA damage as NDMA. It’s possible that people with high levels of AAG could experience more severe toxic side effects from taking the drug, while people with lower levels of AAG could be susceptible to mutations that might lead to a recurrence of cancer later in life, Kay says, adding that more studies are needed to investigate these potential outcomes.

    The research was funded primarily by the National Institute of Environmental Health Sciences Superfund Basic Research Program, with additional support from the National Cancer Institute and the MIT Center for Environmental Health Sciences.

    Other authors of the paper include Joshua Corrigan, an MIT technical associate, who is second author; Amanda Armijo, an MIT postdoc; Ilana Nazari, an MIT undergraduate; Ishwar Kohale, an MIT graduate student; Robert Croy, an MIT research scientist; Sebastian Carrasco, an MIT comparative pathologist; Dushan Wadduwage, a fellow at the Center for Advanced Imaging at Harvard University; Dorothea Torous, Svetlana Avlasevich, and Stephen Dertinger of Litron Laboratories; Forest White, an MIT professor of biological engineering; John Essigmann, a professor of chemistry and biological engineering at MIT; and Samson, a professor emerita of biology and biological engineering at MIT.