More stories

  • Using soap to remove micropollutants from water

    Imagine millions of soapy sponges the size of human cells that can clean water by soaking up contaminants. This simple model describes technology that MIT chemical engineers have recently developed to remove micropollutants from water — a concerning, worldwide problem.

    Patrick S. Doyle, the Robert T. Haslam Professor of Chemical Engineering, PhD student Devashish Pratap Gokhale, and undergraduate Ian Chen recently published their research on micropollutant removal in the journal ACS Applied Polymer Materials. The work is funded by MIT’s Abdul Latif Jameel Water and Food Systems Lab (J-WAFS).

    In spite of their low concentrations (about 0.01–100 micrograms per liter), micropollutants can be hazardous to ecosystems and to human health. They come from a variety of sources and have been detected in almost all bodies of water, says Gokhale. Pharmaceuticals passing through people and animals, for example, can end up as micropollutants in the water supply. Others, like the endocrine disruptor bisphenol A (BPA), can leach from plastics during industrial manufacturing. Pesticides, dyes, petrochemicals, and per- and polyfluoroalkyl substances, more commonly known as PFAS, are also examples of micropollutants, as are some heavy metals like lead and arsenic. All of these can be toxic to humans and animals over time, potentially causing cancer, organ damage, developmental defects, or other adverse effects.

    Micropollutants are numerous, but because their collective mass is small, they are difficult to remove from water. Currently, the most common practice for removing them is activated carbon adsorption, in which water passes through a carbon filter that removes only about 30 percent of micropollutants. Producing and regenerating activated carbon requires high temperatures, specialized equipment, and large amounts of energy. Reverse osmosis can also be used to remove micropollutants from water; however, “it doesn’t lead to good elimination of this class of molecules, because of both their concentration and their molecular structure,” explains Doyle.

    Inspired by soap

    When devising their solution for how to remove micropollutants from water, the MIT researchers were inspired by a common household cleaning supply — soap. Soap cleans everything from our hands and bodies to dirty dishes to clothes, so perhaps its chemistry could also be applied to sanitizing water. Soap contains molecules called surfactants, which have both hydrophobic (water-hating) and hydrophilic (water-loving) components. When water comes in contact with soap, the hydrophobic parts of the surfactants stick together, assembling into spherical structures called micelles, with the hydrophobic portions of the molecules in the interior. The hydrophobic micelle cores trap and help carry away oily substances like dirt.

    Doyle’s lab synthesized micelle-laden hydrogel particles to essentially cleanse water. Gokhale explains that they used microfluidics, which “involve processing fluids on very small, micron-like scales,” to generate uniform polymeric hydrogel particles continuously and reproducibly. These hydrogels, which are porous and absorbent, incorporate a surfactant, a photoinitiator (a molecule that creates reactive species), and a cross-linking agent known as PEGDA (polyethylene glycol diacrylate). The surfactant assembles into micelles that are chemically bonded to the hydrogel using ultraviolet light. When water flows through this micro-particle system, micropollutants latch onto the micelles and separate from the water. The physical interaction used in the system is strong enough to pull micropollutants from water, but weak enough that the hydrogel particles can be separated from the micropollutants, restabilized, and reused. Lab testing shows that both the speed and extent of pollutant removal increase as more surfactant is incorporated into the hydrogels.
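
    The reported trend (faster and more complete removal at higher surfactant loading) can be pictured with a minimal first-order uptake model. The Python sketch below is purely illustrative: the rate constant, its assumed linear scaling with surfactant fraction, and all numbers are invented for demonstration rather than taken from the study.

    ```python
    import numpy as np

    # Illustrative first-order uptake model (not the authors' model): the
    # pollutant concentration decays as micelle-laden hydrogel particles
    # absorb it. The rate constant k is assumed, for illustration only,
    # to scale linearly with the surfactant loading of the hydrogel.
    def pollutant_removal(c0, surfactant_frac, t, k_per_frac=0.8):
        """Concentration over time t (hours) from initial concentration c0."""
        k = k_per_frac * surfactant_frac  # assumed linear scaling (hypothetical)
        return c0 * np.exp(-k * t)

    t = np.linspace(0, 24, 49)    # one day of treatment
    for frac in (0.1, 0.3, 0.5):  # surfactant mass fractions (hypothetical)
        c = pollutant_removal(100.0, frac, t)
        print(f"surfactant {frac:.0%}: {100 * (1 - c[-1] / 100):.0f}% removed in 24 h")
    ```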

    “We’ve shown that in terms of rate of pullout, which is what really matters when you scale this up for industrial use, that with our initial format, we can already outperform the activated carbon,” says Doyle. “We can actually regenerate these particles very easily at room temperature. Nearly 10 regeneration cycles with minimal change in performance,” he adds.

    Regeneration of the particles occurs by soaking them in 90 percent ethanol, whereby “all the pollutants just come out of the particles and back into the ethanol,” says Gokhale. Ethanol is biosafe at low concentrations, inexpensive, and combustible, allowing for safe and economically feasible disposal. The recycling of the hydrogel particles makes this technology sustainable, which is a large advantage over activated carbon. The hydrogels can also be tuned to any hydrophobic micropollutant, making this system a novel, flexible approach to water purification.

    Scaling up

    The team experimented in the lab using 2-naphthol, an organic micropollutant of concern that is known to be difficult to remove with conventional water filtration methods. They hope to continue testing with real water samples.

    “Right now, we spike one micropollutant into pure lab water. We’d like to get water samples from the natural environment, that we can study and look at experimentally,” says Doyle. 

    By using microfluidics to increase particle production, Doyle and his lab hope to make household-scale filters to be tested with real wastewater. They then anticipate scaling up to municipal water treatment or even industrial wastewater treatment. 

    The lab recently filed an international patent application for their hydrogel technology that uses immobilized micelles. They plan to continue this work by experimenting with different kinds of hydrogels for the removal of heavy metal contaminants like lead from water. 

    Societal impacts

    Funded by a 2019 J-WAFS seed grant, this ongoing research has the potential to improve the speed, precision, efficiency, and environmental sustainability of water purification systems across the world.

    “I always wanted to do work which had a social impact, and I was also always interested in water, because I think it’s really cool,” says Gokhale. He notes, “it’s really interesting how water sort of fits into different kinds of fields … we have to consider the cultures of peoples, how we’re going to use this, and then just the equity of these water processes.” Originally from India, Gokhale says he’s seen places that have barely any water at all and others that have floods year after year. “There’s a lot of interesting work to be done, and I think it’s work in this area that’s really going to impact a lot of people’s lives in years to come,” Gokhale says.

    Doyle adds, “Water is the most important thing, perhaps for the next decades to come, so it’s very fulfilling to work on something that is so important to the whole world.”

  • Overcoming a bottleneck in carbon dioxide conversion

    If researchers could find a way to chemically convert carbon dioxide into fuels or other products, they might make a major dent in greenhouse gas emissions. But many such processes that have seemed promising in the lab haven’t performed as expected in scaled-up formats that would be suitable for use with a power plant or other emissions sources.

    Now, researchers at MIT have identified, quantified, and modeled a major reason for poor performance in such conversion systems. The culprit turns out to be a local depletion of the carbon dioxide gas right next to the electrodes being used to catalyze the conversion. The problem can be alleviated, the team found, by simply pulsing the current off and on at specific intervals, allowing time for the gas to build back up to the needed levels next to the electrode.

    The findings, which could spur progress on developing a variety of materials and designs for electrochemical carbon dioxide conversion systems, were published today in the journal Langmuir, in a paper by MIT postdoc Álvaro Moreno Soto, graduate student Jack Lake, and professor of mechanical engineering Kripa Varanasi.

    “Carbon dioxide mitigation is, I think, one of the important challenges of our time,” Varanasi says. While much of the research in the area has focused on carbon capture and sequestration, in which the gas is pumped into some kind of deep underground reservoir or converted to an inert solid such as limestone, another promising avenue has been converting the gas into other carbon compounds such as methane or ethanol, to be used as fuel, or ethylene, which serves as a precursor to useful polymers.

    There are several ways to do such conversions, including electrochemical, thermocatalytic, photothermal, or photochemical processes. “Each of these has problems or challenges,” Varanasi says. The thermal processes require very high temperatures and don’t produce very high-value chemical products, which is a challenge with the light-activated processes as well, he says. “Efficiency is always at play, always an issue.”

    The team has focused on the electrochemical approaches, with a goal of getting “higher-C products” — compounds that contain more carbon atoms and tend to be higher-value fuels because of their energy per weight or volume. In these reactions, the biggest challenge has been curbing competing reactions that can take place at the same time, especially the splitting of water molecules into oxygen and hydrogen.

    The reactions take place as a stream of liquid electrolyte with the carbon dioxide dissolved in it passes over a metal catalytic surface that is electrically charged. But as the carbon dioxide gets converted, it leaves behind a region in the electrolyte stream where it has essentially been used up, and so the reaction within this depleted zone turns toward water splitting instead. This unwanted reaction uses up energy and greatly reduces the overall efficiency of the conversion process, the researchers found.

    “There’s a number of groups working on this, and a number of catalysts that are out there,” Varanasi says. “In all of these, I think the hydrogen co-evolution becomes a bottleneck.”

    One way of counteracting this depletion, they found, can be achieved by a pulsed system — a cycle of simply turning off the voltage, stopping the reaction and giving the carbon dioxide time to spread back into the depleted zone and reach usable levels again, and then resuming the reaction.
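
    The pulsing idea lends itself to a toy simulation: deplete the near-electrode CO2 while the current is on, then let diffusion refill the zone while it is off. In the Python sketch below, every rate constant and the duty cycle are hypothetical placeholders rather than values from the paper.

    ```python
    import numpy as np

    # Toy simulation of pulsed operation: local CO2 near the electrode is
    # consumed while the voltage is on and recovers by diffusion from the
    # bulk while it is off. All constants and the duty cycle are hypothetical.
    def simulate(on_s=30, off_s=30, cycles=10, c_bulk=1.0,
                 k_consume=0.05, k_diffuse=0.02, dt=1.0):
        c, trace = c_bulk, []
        for _ in range(cycles):
            for _ in range(int(on_s / dt)):   # voltage on: reaction depletes CO2
                c = max(c - k_consume * c * dt, 0.0)
                trace.append(c)
            for _ in range(int(off_s / dt)):  # voltage off: diffusion refills zone
                c += k_diffuse * (c_bulk - c) * dt
                trace.append(c)
        return np.array(trace)

    trace = simulate()
    print(f"minimum local CO2 during cycling: {trace.min():.2f} x bulk")
    ```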

    Often, the researchers say, groups have found promising catalyst materials but haven’t run their lab tests long enough to observe these depletion effects, and thus have been frustrated in trying to scale up their systems. Furthermore, the concentration of carbon dioxide next to the catalyst dictates the products that are made. Hence, depletion can also change the mix of products that are produced and can make the process unreliable. “If you want to be able to make a system that works at industrial scale, you need to be able to run things over a long period of time,” Varanasi says, “and you need to not have these kinds of effects that reduce the efficiency or reliability of the process.”

    The team studied three different catalyst materials, including copper, and “we really focused on making sure that we understood and can quantify the depletion effects,” Lake says. In the process they were able to develop a simple and reliable way of monitoring the efficiency of the conversion process as it happens, by measuring the changing pH levels, a measure of acidity, in the system’s electrolyte.

    In their tests, they used more sophisticated analytical tools to characterize reaction products, including gas chromatography for analysis of the gaseous products, and nuclear magnetic resonance characterization for the system’s liquid products. But their analysis showed that the simple pH measurement of the electrolyte next to the electrode during operation could provide a sufficient measure of the efficiency of the reaction as it progressed.

    This ability to easily monitor the reaction in real time could ultimately lead to a system optimized by machine-learning methods, controlling the production rate of the desired compounds through continuous feedback, Moreno Soto says.

    Now that the process is understood and quantified, other approaches to mitigating the carbon dioxide depletion might be developed, the researchers say, and could easily be tested using their methods.

    This work shows, Lake says, that “no matter what your catalyst material is” in such an electrocatalytic system, “you’ll be affected by this problem.” And now, by using the model they developed, it’s possible to determine exactly what kind of time window needs to be evaluated to get an accurate sense of the material’s overall efficiency and what kind of system operations could maximize its effectiveness.

    The research was supported by Shell, through the MIT Energy Initiative.

  • Understanding air pollution from space

    Climate change and air pollution are interlocking crises that threaten human health. Reducing emissions of some air pollutants can help achieve climate goals, and some climate mitigation efforts can in turn improve air quality.

    One part of MIT Professor Arlene Fiore’s research program is to investigate the fundamental science in understanding air pollutants — how long they persist and move through our environment to affect air quality.

    “We need to understand the conditions under which pollutants, such as ozone, form. How much ozone is formed locally and how much is transported long distances?” says Fiore, who notes that Asian air pollution can be transported across the Pacific Ocean to North America. “We need to think about processes spanning local to global dimensions.”

    Fiore, the Peter H. Stone and Paola Malanotte Stone Professor in Earth, Atmospheric and Planetary Sciences, analyzes data from on-the-ground readings and from satellites, along with models, to better understand the chemistry and behavior of air pollutants — which ultimately can inform mitigation strategies and policy setting.

    A global concern

    At the United Nations’ most recent climate change conference, COP26, air quality management was a topic discussed over two days of presentations.

    “Breathing is vital. It’s life. But for the vast majority of people on this planet right now, the air that they breathe is not giving life, but cutting it short,” said Sarah Vogel, senior vice president for health at the Environmental Defense Fund, at the COP26 session.

    “We need to confront this twin challenge now through both a climate and clean air lens, of targeting those pollutants that warm both the air and harm our health.”

    Earlier this year, the World Health Organization (WHO) updated the global air quality guidelines it had issued 15 years earlier for six key pollutants, including ozone (O3), nitrogen dioxide (NO2), sulfur dioxide (SO2), and carbon monoxide (CO). The new guidelines are more stringent, based on what the WHO called the “quality and quantity of evidence” of how these pollutants affect human health. The WHO estimates that roughly 7 million premature deaths each year are attributable to the joint effects of air pollution.

    “We’ve had all these health-motivated reductions of aerosol and ozone precursor emissions. What are the implications for the climate system, both locally but also around the globe? How does air quality respond to climate change? We study these two-way interactions between air pollution and the climate system,” says Fiore.

    But fundamental science is still required to understand how gases, such as ozone and nitrogen dioxide, linger and move throughout the troposphere — the lowermost layer of our atmosphere, containing the air we breathe.

    “We care about ozone in the air we’re breathing where we live at the Earth’s surface,” says Fiore. “Ozone reacts with biological tissue, and can be damaging to plants and human lungs. Even if you’re a healthy adult, if you’re out running hard during an ozone smog event, you might feel an extra weight on your lungs.”

    Telltale signs from space

    Ozone is not emitted directly, but instead forms through chemical reactions catalyzed by radiation from the sun interacting with nitrogen oxides — pollutants released in large part from burning fossil fuels — and volatile organic compounds. However, current satellite instruments cannot sense ground-level ozone.

    “We can’t retrieve surface- or even near-surface ozone from space,” says Fiore of the satellite data, “although the anticipated launch of a new instrument looks promising for new advances in retrieving lower-tropospheric ozone.” Instead, scientists can look at signatures from other gas emissions to get a sense of ozone formation. “Nitrogen dioxide and formaldehyde are a heavy focus of our research because they serve as proxies for two of the key ingredients that go on to form ozone in the atmosphere.”
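
    One common way to turn these two proxies into a statement about ozone chemistry is the formaldehyde-to-NO2 column ratio, which indicates whether local ozone production is limited by nitrogen oxides or by volatile organic compounds. The Python sketch below illustrates the idea; the threshold values and example inputs are assumptions for demonstration, since published cutoffs vary by region and instrument.

    ```python
    # A common diagnostic built from these two proxies is the formaldehyde-
    # to-NO2 column ratio: low values suggest ozone production is VOC-limited,
    # high values suggest it is NOx-limited. The thresholds below are
    # placeholders; published cutoffs vary by region and instrument.
    def ozone_regime(hcho_column, no2_column, low=1.0, high=2.0):
        """Classify the ozone-production regime from satellite column densities."""
        ratio = hcho_column / no2_column
        if ratio < low:
            return "VOC-limited (reducing VOCs curbs ozone)"
        if ratio > high:
            return "NOx-limited (reducing NOx curbs ozone)"
        return "transitional"

    # Hypothetical column densities (1e15 molecules per cm^2)
    print(ozone_regime(hcho_column=8.0, no2_column=10.0))  # dense urban core
    print(ozone_regime(hcho_column=9.0, no2_column=3.0))   # downwind area
    ```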

    To understand ozone formation via these precursor pollutants, scientists have gathered data for more than two decades using spectrometer instruments aboard satellites that measure sunlight in ultraviolet and visible wavelengths that interact with these pollutants in the Earth’s atmosphere — known as solar backscatter radiation.

    Satellites, such as NASA’s Aura, carry instruments like the Ozone Monitoring Instrument (OMI). OMI, along with European-launched satellites such as the Global Ozone Monitoring Experiment (GOME) and the Scanning Imaging Absorption spectroMeter for Atmospheric CartograpHY (SCIAMACHY), and the newest-generation TROPOspheric Monitoring Instrument (TROPOMI), all orbit the Earth, collecting data during daylight hours when sunlight is interacting with the atmosphere over a particular location.

    In a recent paper from Fiore’s group, former graduate student Xiaomeng Jin (now a postdoc at the University of California at Berkeley) demonstrated that she could bring together and “beat down the noise in the data,” as Fiore says, to identify trends in ozone formation chemistry over several U.S. metropolitan areas that “are consistent with our on-the-ground understanding from in situ ozone measurements.”

    “This finding implies that we can use these records to learn about changes in surface ozone chemistry in places where we lack on-the-ground monitoring,” says Fiore. Extracting these signals by stringing together satellite data — OMI, GOME, and SCIAMACHY — to produce a two-decade record required reconciling the instruments’ differing orbit days, times, and fields of view on the ground, or spatial resolutions. 
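
    One step in reconciling records from instruments with different fields of view is putting all retrievals on a common grid. The Python sketch below shows only that averaging step, over synthetic data; the grid size and values are placeholders, and real reconciliation also handles overpass times, viewing geometry, and retrieval differences.

    ```python
    import numpy as np

    # Sketch of one harmonization step: averaging retrievals from instruments
    # with different footprints onto a common latitude-longitude grid so the
    # records can be compared. The grid resolution and data are placeholders.
    def grid_average(lats, lons, values, res=0.5):
        lat_edges = np.arange(-90, 90 + res, res)
        lon_edges = np.arange(-180, 180 + res, res)
        total, _, _ = np.histogram2d(lats, lons, [lat_edges, lon_edges],
                                     weights=values)
        count, _, _ = np.histogram2d(lats, lons, [lat_edges, lon_edges])
        with np.errstate(invalid="ignore"):
            return total / count  # NaN wherever a grid cell has no observations

    rng = np.random.default_rng(0)
    lats = rng.uniform(30, 50, 1000)    # synthetic mid-latitude retrievals
    lons = rng.uniform(-90, -70, 1000)
    no2 = rng.lognormal(0.0, 0.5, 1000)
    grid = grid_average(lats, lons, no2)
    print(f"grid cells with data: {np.isfinite(grid).sum()}")
    ```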

    Currently, spectrometer instruments aboard satellites retrieve data once per day. However, newer instruments, such as the Geostationary Environment Monitoring Spectrometer launched in February 2020 by the National Institute of Environmental Research in the Ministry of Environment of South Korea, will monitor a particular region continuously, providing much more data in real time.

    Over North America, the Tropospheric Emissions: Monitoring of Pollution (TEMPO) collaboration between NASA and the Smithsonian Astrophysical Observatory, led by Kelly Chance of Harvard University, will provide not only a stationary view of the atmospheric chemistry over the continent, but also a finer-resolution view — with the instrument recording pollution data from only a few square miles per pixel (with an anticipated launch in 2022).

    “What we’re very excited about is the opportunity to have continuous coverage where we get hourly measurements that allow us to follow pollution from morning rush hour through the course of the day and see how plumes of pollution are evolving in real time,” says Fiore.

    Data for the people

    Providing Earth-observing data to people in addition to scientists — namely environmental managers, city planners, and other government officials — is the goal for the NASA Health and Air Quality Applied Sciences Team (HAQAST).

    Since 2016, Fiore has been part of HAQAST, including collaborative “tiger teams” — projects that bring together scientists, nongovernment entities, and government officials — to bring data to bear on real issues.

    For example, in 2017, Fiore led a tiger team that provided guidance to state air management agencies on how satellite data can be incorporated into state implementation plans (SIPs). “Submission of a SIP is required for any state with a region in non-attainment of U.S. National Ambient Air Quality Standards to demonstrate their approach to achieving compliance with the standard,” says Fiore. “What we found is that small tweaks in, for example, the metrics we use to convey the science findings, can go a long way to making the science more usable, especially when there are detailed policy frameworks in place that must be followed.”

    Now, in 2021, Fiore is part of two tiger teams announced by HAQAST in late September. One team is looking at data to address environmental justice issues, by providing data to assess communities disproportionately affected by environmental health risks. Such information can be used to estimate the benefits of governmental investments in environmental improvements for disproportionately burdened communities. The other team is looking at urban emissions of nitrogen oxides to try to better quantify and communicate uncertainties in the estimates of anthropogenic sources of pollution.

    “For our HAQAST work, we’re looking at not just the estimate of the exposure to air pollutants, or in other words their concentrations,” says Fiore, “but how confident are we in our exposure estimates, which in turn affect our understanding of the public health burden due to exposure. We have stakeholder partners at the New York Department of Health who will pair exposure datasets with health data to help prioritize decisions around public health.

    “I enjoy working with stakeholders who have questions that require science to answer and can make a difference in their decisions,” Fiore says.

  • J-PAL North America announces five new partnerships with state and local governments

    J-PAL North America, a research center in MIT’s Department of Economics, has announced five new partnerships with state and local governments across the United States after a call for proposals in early February. Over the next year, these partners will work with J-PAL North America’s State and Local Innovation Initiative to evaluate policy-relevant questions critical to alleviating poverty in the United States.

    J-PAL North America will work with the Colorado Department of Higher Education, Ohio’s Franklin County Department of Job and Family Services, the New Mexico Public Education Department, Puerto Rico’s Department of Economic Development and Commerce, and Oregon’s Jackson County Fire District 3. Each partner will leverage support from J-PAL North America to develop randomized evaluations, which have the potential to reveal widely applicable lessons about which programs and policies are most effective. 

    State and local leaders are vital stakeholders in developing rigorous evidence about which policies and programs work to reduce poverty, and why. By supporting each government partner in developing these five evaluation projects, J-PAL North America will keep the voices of policymakers and practitioners central to the research process. Each of this year’s selected projects seeks to address policy concerns that state and local governments have identified in J-PAL North America’s State and Local Learning Agenda as key areas for addressing barriers to mobility from poverty, including environment, education, economic security, and housing stability.

    One project looks to mitigate the emission of carbon co-pollutants, which cause disproportionately high rates of health problems among communities experiencing poverty. 

    Oregon’s Jackson County Fire District 3 will investigate the impact of subsidies on the uptake of wildfire risk reduction activities in a county severely affected by wildfires. “Wildfires have become more prevalent, longer lasting, and more destructive in Oregon and across the western United States. We also know that wildfire is disproportionately impacting our most vulnerable populations,” says Bob Horton, fire chief of Jackson County Fire District 3. “With technical support from J-PAL North America’s staff and this grant funding, we will devise the most current and effective strategy, deeply rooted in the evidence, to drive the take-up of home-hardening behaviors — methods to increase a home’s resistance to fire — and lower the risk to homes when faced with wildfire.”

    This project is in alignment with the priorities of J-PAL’s Environment, Energy, and Climate Change sector and its agenda for catalyzing more policy-relevant research on adaptation strategies. 

    Policymakers and researchers have also identified programs aimed at increasing opportunity within education as a key priority for evaluation. In partnering with J-PAL North America, the Colorado Department of Higher Education will assess the impact of My Colorado Journey, an online platform available to all Coloradans that provides information on education, training, and career pathways. 

    “As Colorado builds back stronger from the pandemic, we know that education and workforce development are at the center of Colorado’s recovery agenda,” shares Executive Director Angie Paccione of the Colorado Department of Higher Education. “Platforms like My Colorado Journey are key to supporting the education, training, and workforce exploration of Coloradans of any age. With support from J-PAL North America, we can better understand how to effectively serve Coloradans, further enhance this vital platform, and continue to build a Colorado for all.”

    Similarly, the New Mexico Public Education Department proposes their intervention within the context of New Mexico’s community school state initiative. They will look at the impact of case management and cash transfers on students at risk of multiple school transfers throughout their education, which include children who are experiencing homelessness, migrant children, children in the foster care system, and military-connected children, among others. “New Mexico is delighted to partner with J-PAL North America to explore visionary pathways to success for highly mobile students,” says New Mexico Public Education Secretary (Designate) Kurt Steinhaus. “We look forward to implementing and testing innovative solutions, such as cash transfers, that can expand our current nationally recognized community schools strategy. Together, we aim to find solutions that meet the needs of highly mobile students and families who lack stable housing.”

    Another key priority for the intersection of policy and research is economic security — fostering upward mobility by providing individuals with resources to promote stable incomes and increase standards of living. By adjusting caseworker employment services to better align with local needs, Puerto Rico’s Department of Economic Development and Commerce (DEDC) looks to understand how individualized services can impact employment and earnings. 

    “The commitment of the government of Puerto Rico is to develop human resources to the highest quality standards,” says DEDC Secretary Cidre Miranda, whose statement was provided in Spanish and translated. “For the DEDC, it is fundamental to contribute to the development of initiatives like this one, because they have the objective of forging the future professionals that Puerto Rico requires and needs.” J-PAL North America’s partnership with DEDC has the potential to provide valuable lessons for other state and local programs also seeking to promote economic security. 

    Finally, Ohio’s Franklin County Department of Job and Family Services seeks to understand the impact of an eviction prevention workshop in a county with eviction rates that are higher than both the state and national average. “Stable housing should not be a luxury, but for far too many Franklin County families it has become one,” Deputy Franklin County Administrator Joy Bivens says. “We need to view our community’s affordable housing crisis through both a social determinants of health and racial equity lens. We are grateful for the opportunity to partner with J-PAL North America to ensure we are pursuing research-based interventions that, yes, address immediate housing needs, but also provide long-term stability so they can climb the economic ladder.”

    Franklin County Department of Job and Family Services’ evaluation aligns with policymaker and researcher interests in ensuring safe and affordable housing. This partnership not only has great potential to improve resources local to Franklin County but, along with each of the other four partnerships, can also provide a useful model for other government agencies facing similar challenges. For more information on state and local policy priorities, see J-PAL North America’s State and Local Learning Agenda. To learn more about the State and Local Innovation Initiative, please visit the Initiative webpage or contact Initiative Manager Louise Geraghty.

  • Q&A: Can the world change course on climate?

    In this ongoing series on climate issues, MIT faculty, students, and alumni in the humanistic fields share perspectives that are significant for solving climate change and mitigating its myriad social and ecological impacts. Nazli Choucri is a professor of political science and an expert on climate issues, who also focuses on international relations and cyberpolitics. She is the architect and director of the Global System for Sustainable Development, an evolving knowledge networking system centered on sustainability problems and solution strategies. The author and/or editor of 12 books, she is also the founding editor of the MIT Press book series “Global Environmental Accord: Strategies for Sustainability and Institutional Innovation.”

    Q: The impacts of climate change — including storms, floods, wildfires, and droughts — have the potential to destabilize nations, yet they are not constrained by borders. What international developments most concern you in terms of addressing climate change and its myriad ecological and social impacts?

    A: Climate change is a global issue. By definition, and by a long history of practice, countries focus on their own priorities and challenges. Over time, we have seen the gradual development of norms reflecting shared interests, and the institutional arrangements to support and pursue the global good.

    What concerns me most is that general responses to the climate crisis are being framed in broad terms; the overall pace of change remains perilously slow; and uncertainty remains about operational action and implementation of stated intent. We have just seen the completion of the 26th meeting of states devoted to climate change, the United Nations Climate Change Conference (COP26). In some ways this is positive. Yet past commitments remain unfulfilled, creating added stress in an already stressful political situation.

    Industrial countries are uneven in their recognition of, and responses to, climate change. This may signal uncertainty about whether climate matters are sufficiently compelling to call for immediate action. Alternatively, the push for changing course may seem too costly at a time when other imperatives — such as employment, economic growth, or protecting borders — inevitably dominate discourse and decisions. Whatever the cause, the result has been an unwillingness to take strong action. Unfortunately, climate change remains within the domain of “low politics,” although there are signs the issue is making a slow but steady shift to “high politics” — those issues deemed vital to the existence of the state. This means that short-term priorities, such as those noted above, continue to shape national politics and international positions and, by extension, to obscure the existential threat revealed by scientific evidence.

    As for developing countries, these are overwhelmed by internal challenges, and managing the difficulties of daily life always takes priority over other challenges, however compelling. Long-term thinking is a luxury, but daily bread is a necessity.

    Non-state actors — including registered nongovernmental organizations, climate organizations, sustainability support groups, activists of various sorts, and in some cases much of civil society — have been left with a large share of the responsibility for educating and convincing diverse constituencies of the consequences of inaction on climate change. But many of these institutions carry their own burdens and struggle to manage current pressures.

    The international community, through its formal and informal institutions, continues to articulate the perils of climate change and to search for a powerful consensus that can prove effective both in form and in function. The general contours are agreed upon — more or less. But leadership of, for, and by the global collective is elusive and difficult to shape.

    Most concerning of all is the clear reluctance to address head-on the challenge of planning for changes that we know will occur. The reality that we are all being affected — in different ways and to different degrees — has yet to be sufficiently appreciated by everyone, everywhere. Yet, in many parts of the world, major shifts in climate will create pressures on human settlements, spur forced migrations, or generate social dislocations. Some small island states, for example, may not survive a sea-level surge. Everywhere there is a need to cut emissions, and this means adaptation and/or major changes in economic activity and in lifestyle.

    The discourse and debate at COP26 reflect all of these persistent features of the international system. So far, the largest achievements center on the common consensus that more must be done to prevent the rise in temperature from creating a global catastrophe. This is not enough, however. Differences remain, and countries have yet to specify what cuts in emissions they are willing to make.

    Echoes of who is responsible for what remain strong. The thorny matter of the unfulfilled pledge of $100 billion once promised by rich countries to help developing countries reduce their emissions remained unresolved. At the same time, however, some important agreements were reached. The United States and China announced they would make greater efforts to cut methane, a powerful greenhouse gas. More than 100 countries agreed to end deforestation. India joined the countries committed to attaining zero emissions by 2070. And on matters of finance, countries agreed to a two-year plan to determine how to meet the needs of the most-vulnerable countries.

    Q: In what ways do you think the tools and insights from political science can advance efforts to address climate change and its impacts?

    A: I prefer to take a multidisciplinary view of the issues at hand, rather than focus on the tools of political science alone. Disciplinary perspectives can create siloed views and positions that undermine any overall drive toward consensus. The scientific evidence is pointing to, even anticipating, pervasive changes that transcend known and established parameters of social order all across the globe.

    That said, political science provides important insight, even guidance, for addressing the impacts of climate change in some notable ways. One is understanding the extent to which our formal institutions enable discussion, debate, and decisions about the directions we can take collectively to adapt, adjust, or even depart from the established practices of managing social order.

    If we consider politics as the allocation of values in terms of who gets what, when, and how, then it becomes clear that the current allocation requires a change in course. Coordination and cooperation across the jurisdictions of sovereign states is foundational for any response to climate change impacts.

    We have already recognized, and to some extent developed, targets for reducing carbon emissions — a central impact from traditional forms of energy use — and are making notable efforts to shift toward alternatives. This move is an easy one compared to all the work that needs to be done to address climate change. But in taking this step we have learned quite a bit that might help in creating a necessary consensus for cross-jurisdiction coordination and response.

    Respecting individuals and protecting life is increasingly recognized as a global value — at least in principle. As we work to change course, new norms will be developed, and political science provides important perspectives on how to establish such norms. We will be faced with demands for institutional design, and these will need to embody our guiding values. For example, having learned to recognize the burdens of inequity, we can establish the value of equity as foundational for our social order both now and as we recognize and address the impacts of climate change.

    Q: You teach a class on “Sustainability Development: Theory and Practice.” Broadly speaking, what are the goals of this class? What lessons do you hope students will carry with them into the future?

    A: The goal of 17.181, my class on sustainability, is to frame as clearly as possible the concept of sustainable development (sustainability) with attention to conceptual, empirical, institutional, and policy issues.

    The course centers on human activities. Individuals are embedded in complex interactive systems: the social system, the natural environment, and the constructed cyber domain — each with distinct temporal, spatial, and dynamic features. Sustainability issues intersect with, but cannot be folded into, the impacts of climate change. Sustainability places human beings in social systems at the core of what must be done to respect the imperatives of a highly complex natural environment.

    We consider sustainability an evolving knowledge domain with attendant policy implications. It is driven by events on the ground, not by revolutions in academic or theoretical concerns per se. Overall, sustainable development refers to the process of meeting the needs of current and future generations without undermining the resilience of the life-supporting properties, the integrity of social systems, or the supports of the human-constructed cyberspace.

    More specifically, we differentiate among four fundamental dimensions and their necessary conditions:

    (a) ecological systems — exhibiting balance and resilience;
    (b) economic production and consumption — with equity and efficiency;
    (c) governance and politics — with participation and responsiveness; and
    (d) institutional performance — demonstrating adaptation and incorporating feedback.

    The core proposition is this: If all conditions hold, then the system is (or can be) sustainable. Then we must examine the critical drivers — people, resources, technology, and their interactions — followed by a review and assessment of evolving policy responses. Then we ask: What are the new opportunities?

    I would like students to carry forward these ideas and issues: What has been deemed “normal” in modern Western societies, and in developing societies seeking to emulate the Western model, is damaging humans in many ways — all well-known. Yet only recently have alternatives begun to be considered to the traditional economic growth model based on industrialization and high levels of energy use. To make changes, we must first understand the underlying incentives, realities, and choices that shape a whole set of dysfunctional behaviors and outcomes. We then need to delve deep into the driving sources and consequences, and to consider the many ways in which our known “normal” can be adjusted — in theory and in practice.

    Q: In confronting an issue as formidable as global climate change, what gives you hope?

    A: I see a few hopeful signs; among them:

    The scientific evidence is clear and compelling. We are no longer discussing whether there is climate change, or if we will face major challenges of unprecedented proportions, or even how to bring about an international consensus on the salience of such threats.

    Climate change has been recognized as a global phenomenon. Imperatives for cooperation are necessary. No one can go it alone. Major efforts have been and are being made in world politics to forge action agendas with specific targets.

    The issue appears to be on the verge of becoming one of “high politics” in the United States.

    Younger generations are more sensitive to the reality that we are altering the life-supporting properties of our planet. They are generally more educated, skilled, and open to addressing such challenges than their elders.

    However disappointing the results of COP26 might seem, the global community is moving in the right direction.

    None of the above points, individually or jointly, translates into an effective response to the known impacts of climate change — let alone the unknown. But this is what gives me hope.

    Interview prepared by MIT SHASS Communications
    Editorial, design, and series director: Emily Hiestand
    Senior writer: Kathryn O’Neill

  • Q&A: More-sustainable concrete with machine learning

    As a building material, concrete withstands the test of time. Its use dates back to early civilizations, and today it is the most popular composite choice in the world. However, it’s not without its faults. Production of its key ingredient, cement, contributes 8-9 percent of global anthropogenic CO2 emissions and 2-3 percent of energy consumption, figures that are only projected to increase in the coming years. With United States infrastructure aging, the federal government recently passed a milestone bill to revitalize and upgrade it; along with the push to reduce greenhouse gas emissions where possible, this puts concrete in the crosshairs for modernization, too.

    Elsa Olivetti, the Esther and Harold E. Edgerton Associate Professor in the MIT Department of Materials Science and Engineering, and Jie Chen, MIT-IBM Watson AI Lab research scientist and manager, think artificial intelligence can help meet this need by designing and formulating new, more sustainable concrete mixtures, with lower costs and carbon dioxide emissions, while improving material performance and reusing manufacturing byproducts in the material itself. Olivetti’s research improves environmental and economic sustainability of materials, and Chen develops and optimizes machine learning and computational techniques, which he can apply to materials reformulation. Olivetti and Chen, along with their collaborators, have recently teamed up for an MIT-IBM Watson AI Lab project to make concrete more sustainable for the benefit of society, the climate, and the economy.

    Q: What applications does concrete have, and what properties make it a preferred building material?

    Olivetti: Concrete is the dominant building material globally with an annual consumption of 30 billion metric tons. That is over 20 times the next most produced material, steel, and the scale of its use leads to considerable environmental impact, approximately 5-8 percent of global greenhouse gas (GHG) emissions. It can be made locally, has a broad range of structural applications, and is cost-effective. Concrete is a mixture of fine and coarse aggregate, water, cement binder (the glue), and other additives.

    Q: Why isn’t it sustainable, and what research problems are you trying to tackle with this project?

    Olivetti: The community is working on several ways to reduce the impact of this material — including the use of alternative fuels for heating the cement mixture, improvements in energy and materials efficiency, and carbon sequestration at production facilities — but one important opportunity is to develop an alternative to the cement binder.

    While cement is 10 percent of the concrete mass, it accounts for 80 percent of the GHG footprint. This impact derives from the fuel burned to heat and run the chemical reaction required in manufacturing, but the chemical reaction itself also releases CO2 through the calcination of limestone. Therefore, partially replacing the input ingredients to cement (traditionally ordinary Portland cement, or OPC) with alternative materials from waste and byproducts can reduce the GHG footprint. But use of these alternatives is not inherently more sustainable, because wastes might have to travel long distances, which adds to fuel emissions and cost, or might require pretreatment processes. The optimal way to make use of these alternate materials will be situation-dependent. But because of the vast scale, we also need solutions that account for the huge volumes of concrete needed. This project is trying to develop novel concrete mixtures that will decrease the GHG impact of the cement and concrete, moving away from trial-and-error processes toward those that are more predictive.

    Chen: If we want to fight climate change and make our environment better, are there alternative ingredients or a reformulation we could use so that less greenhouse gas is emitted? We hope that through this project using machine learning we’ll be able to find a good answer.

    Q: Why is this problem important to address now, at this point in history?

    Olivetti: There is urgent need to address greenhouse gas emissions as aggressively as possible, and the road to doing so isn’t necessarily straightforward for all areas of industry. For transportation and electricity generation, there are paths that have been identified to decarbonize those sectors. We need to move much more aggressively to achieve those in the time needed; further, the technological approaches to achieve that are more clear. However, for tough-to-decarbonize sectors, such as industrial materials production, the pathways to decarbonization are not as mapped out.

    Q: How are you planning to address this problem to produce better concrete?

    Olivetti: The goal is to predict mixtures that will both meet performance criteria, such as strength and durability, with those that also balance economic and environmental impact. A key to this is to use industrial wastes in blended cements and concretes. To do this, we need to understand the glass and mineral reactivity of constituent materials. This reactivity not only determines the limit of the possible use in cement systems but also controls concrete processing, and the development of strength and pore structure, which ultimately control concrete durability and life-cycle CO2 emissions.

    Chen: We investigate using waste materials to replace part of the cement component. This is something that we’ve hypothesized would be more sustainable and economic — actually waste materials are common, and they cost less. Because of the reduction in the use of cement, the final concrete product would be responsible for much less carbon dioxide production. Figuring out the right concrete mixture proportion that makes durable concretes while achieving other goals is a very challenging problem. Machine learning is giving us an opportunity to explore the advancement of predictive modeling, uncertainty quantification, and optimization to solve the issue. What we are doing is exploring options using deep learning as well as multi-objective optimization techniques to find an answer. These efforts are now more feasible to carry out, and they will produce results with reliability estimates that we need to understand what makes a good concrete.

    Q: What kinds of AI and computational techniques are you employing for this?

    Olivetti: We use AI techniques to collect data on individual concrete ingredients, mix proportions, and concrete performance from the literature through natural language processing. We also add data obtained from industry and/or high throughput atomistic modeling and experiments to optimize the design of concrete mixtures. Then we use this information to develop insight into the reactivity of possible waste and byproduct materials as alternatives to cement materials for low-CO2 concrete. By incorporating generic information on concrete ingredients, the resulting concrete performance predictors are expected to be more reliable and transformative than existing AI models.
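
    As an illustration of the literature-mining step, the Python sketch below pulls ingredient quantities out of a sentence with a regular expression. It is a deliberately toy stand-in: the actual pipeline uses far richer natural language processing, and the pattern, units, and example sentence here are invented.

    ```python
    import re

    # Toy stand-in for the literature-mining step: pull ingredient quantities
    # out of free text with a regular expression. The real pipeline uses far
    # richer NLP; the pattern and the sentence below are invented examples.
    INGREDIENTS = r"(cement|fly ash|slag|water|fine aggregate|coarse aggregate)"
    PATTERN = re.compile(
        rf"(\d+(?:\.\d+)?)\s*(kg/m3)\s+(?:of\s+)?{INGREDIENTS}", re.IGNORECASE)

    sentence = ("Mixtures contained 350 kg/m3 of cement, 120 kg/m3 of fly ash, "
                "and 175 kg/m3 of water.")
    for amount, unit, ingredient in PATTERN.findall(sentence):
        print(f"{ingredient:<8} {amount} {unit}")
    ```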

    Chen: The final objective is to figure out what constituents, and how much of each, to put into the recipe for producing the concrete that optimizes the various factors: strength, cost, environmental impact, performance, etc. For each of the objectives, we need certain models: We need a model to predict the performance of the concrete (like, how long does it last and how much weight does it sustain?), a model to estimate the cost, and a model to estimate how much carbon dioxide is generated. We will need to build these models by using data from literature, from industry, and from lab experiments.

    We are exploring Gaussian process models to predict the concrete strength, going forward into days and weeks. This model can give us an uncertainty estimate of the prediction as well. Such a model requires its parameters to be specified, and we will use another model to calculate them. At the same time, we also explore neural network models because we can inject domain knowledge from human experience into them. Some models are as simple as multilayer perceptrons, while some are more complex, like graph neural networks. The goal here is that we want to have a model that is not only accurate but also robust — the input data is noisy, and the model must embrace the noise, so that its prediction is still accurate and reliable for the multi-objective optimization.
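
    A minimal sketch of the Gaussian process idea described here, using scikit-learn: a model trained on mixture features predicts strength together with an uncertainty estimate. The feature set, the synthetic “ground truth,” and the kernel settings are all placeholder assumptions for illustration, not the project’s actual models or data.

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    # Synthetic stand-in data: [cement fraction, waste-replacement fraction,
    # water/binder ratio, age in days] -> strength. Ranges, the invented
    # "ground truth," and the kernel length scales are all assumptions.
    rng = np.random.default_rng(1)
    X = rng.uniform([0.10, 0.0, 0.3, 1], [0.20, 0.5, 0.6, 90], size=(60, 4))
    y = (40 * X[:, 0] / 0.15 - 15 * X[:, 2] + 8 * np.log(X[:, 3])
         + rng.normal(0, 1.5, 60))  # hypothetical strength in MPa

    gp = GaussianProcessRegressor(
        kernel=RBF(length_scale=[0.05, 0.2, 0.1, 30.0]) + WhiteKernel(),
        normalize_y=True)
    gp.fit(X, y)

    candidate = np.array([[0.12, 0.35, 0.45, 28]])  # one candidate mix, 28 days
    mean, std = gp.predict(candidate, return_std=True)
    print(f"predicted strength: {mean[0]:.1f} +/- {std[0]:.1f} MPa")
    ```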

    Once we have built models that we are confident with, we will inject their predictions and uncertainty estimates into the optimization of multiple objectives, under constraints and under uncertainties.

    Q: How do you balance cost-benefit trade-offs?

    Chen: The multiple objectives we consider are not necessarily consistent, and sometimes they are at odds with each other. The goal is to identify scenarios where the values of our objectives cannot all be improved simultaneously without compromising one or a few. For example, if you want to further reduce the cost, you probably have to sacrifice some performance or accept a greater environmental impact. Eventually, we will give the results to policymakers, who will look into them and weigh the options. For example, they may be able to tolerate a slightly higher cost in exchange for a significant reduction in greenhouse gas. Alternatively, if the cost varies little but the concrete performance changes drastically, say, doubles or triples, then that is definitely a favorable outcome.
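
    The screening Chen describes amounts to finding the Pareto set: the candidate mixes for which no other candidate is at least as good on every objective and strictly better on one. A small Python sketch of that filter, over synthetic candidate values, might look like this.

    ```python
    import numpy as np

    # Sketch of the trade-off screening: keep the candidate mixes that no
    # other candidate beats on every objective at once (the Pareto set).
    # Objectives are (cost, CO2, -strength), all minimized; values synthetic.
    def pareto_mask(objs):
        keep = np.ones(len(objs), dtype=bool)
        for i in range(len(objs)):
            dominated = (np.all(objs <= objs[i], axis=1)
                         & np.any(objs < objs[i], axis=1))
            keep[i] = not dominated.any()
        return keep

    rng = np.random.default_rng(2)
    candidates = rng.uniform([80, 200, -60], [140, 400, -20], size=(200, 3))
    front = candidates[pareto_mask(candidates)]
    print(f"{len(front)} non-dominated mixes out of {len(candidates)}")
    ```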

    Q: What kinds of challenges do you face in this work?

    Chen: The data we get either from industry or from literature are very noisy; the concrete measurements can vary a lot, depending on where and when they are taken. There are also substantial missing data when we integrate them from different sources, so we need to spend a lot of effort to organize and make the data usable for building and training machine learning models. We also explore imputation techniques that substitute missing features, as well as models that tolerate missing features, in our predictive modeling and uncertainty estimation.
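
    As an example of the imputation step, the Python sketch below fills missing entries in a small synthetic table with scikit-learn’s IterativeImputer, which models each incomplete feature from the observed ones. The columns and values are invented stand-ins for the merged literature and industry data.

    ```python
    import numpy as np
    from sklearn.experimental import enable_iterative_imputer  # noqa: F401
    from sklearn.impute import IterativeImputer

    # Synthetic stand-in for merged literature/industry data with columns
    # [cement, fly ash, water, 28-day strength]; NaN marks a missing value.
    X = np.array([
        [350.0, 120.0, 175.0, 42.0],
        [400.0, np.nan, 180.0, 48.0],
        [300.0, 150.0, np.nan, 38.0],
        [np.nan, 100.0, 170.0, 40.0],
        [380.0, 110.0, 178.0, np.nan],
    ])
    # Each missing entry is estimated from the observed features of the rows.
    X_filled = IterativeImputer(random_state=0).fit_transform(X)
    print(np.round(X_filled, 1))
    ```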

    Q: What do you hope to achieve through this work?

    Chen: In the end, we are suggesting either one or a few concrete recipes, or a continuum of recipes, to manufacturers and policymakers. We hope that this will provide invaluable information for both the construction industry and for the effort of protecting our beloved Earth.

    Olivetti: We’d like to develop a robust way to design cements that make use of waste materials to lower their CO2 footprint. Nobody is trying to make waste, so we can’t rely on one stream as a feedstock if we want this to be massively scalable. We have to be flexible and robust enough to shift with feedstock changes, and for that we need improved understanding. Our approach to developing local, dynamic, and flexible alternatives is to learn what makes these wastes reactive, so we know how to optimize their use and do so as broadly as possible. We do that through predictive model development, using software my group has built to automatically extract data from over 5 million texts and patents on various topics. We link this to the creative capabilities of our IBM collaborators to design methods that predict the final impact of new cements. If we are successful, we can lower the emissions of this ubiquitous material and play our part in achieving carbon emissions mitigation goals.

    Other researchers involved with this project include Stefanie Jegelka, the X-Window Consortium Career Development Associate Professor in the MIT Department of Electrical Engineering and Computer Science; Richard Goodwin, IBM principal researcher; Soumya Ghosh, MIT-IBM Watson AI Lab research staff member; and Kristen Severson, former research staff member. Collaborators included Nghia Hoang, former research staff member with MIT-IBM Watson AI Lab and IBM Research; and Jeremy Gregory, research scientist in the MIT Department of Civil and Environmental Engineering and executive director of the MIT Concrete Sustainability Hub.

    This research is supported by the MIT-IBM Watson AI Lab.

  • Timber or steel? Study helps builders reduce carbon footprint of truss structures

    Buildings are a big contributor to global warming, not just in their ongoing operations but in the materials used in their construction. Truss structures — those crisscross arrays of diagonal struts used throughout modern construction, in everything from antenna towers to support beams for large buildings — are typically made of steel or wood or a combination of both. But little quantitative research has been done on how to pick the right materials to minimize these structures’ contribution to global warming.

    The “embodied carbon” in a construction material includes the fuel used in the material’s production (for mining and smelting steel, for example, or for felling and processing trees) and in transporting the materials to a site. It also includes the equipment used for the construction itself.

    Now, researchers at MIT have done a detailed analysis and created a set of computational tools to enable architects and engineers to design truss structures in a way that can minimize their embodied carbon while maintaining all needed properties for a given building application. While in general wood produces a much lower carbon footprint, using steel in places where its properties can provide maximum benefit can provide an optimized result, they say.

    The analysis is described in a paper published today in the journal Engineering Structures, by graduate student Ernest Ching and MIT assistant professor of civil and environmental engineering Josephine Carstensen.

    “Construction is a huge greenhouse gas emitter that has kind of been flying under the radar for the past decades,” says Carstensen. But in recent years building designers “are starting to be more focused on how to not just reduce the operating energy associated with building use, but also the important carbon associated with the structure itself.” And that’s where this new analysis comes in.

    The two main options in reducing the carbon emissions associated with truss structures, she says, are substituting materials or changing the structure. However, there has been “surprisingly little work” on tools to help designers figure out emissions-minimizing strategies for a given situation, she says.

    The new system makes use of a technique called topology optimization, which allows for the input of basic parameters, such as the amount of load to be supported and the dimensions of the structure, and can be used to produce designs optimized for different characteristics, such as weight, cost, or, in this case, global warming impact.

    Wood performs very well under forces of compression, but not as well as steel when it comes to tension — that is, a tendency to pull the structure apart. Carstensen says that in general, wood is far better than steel in terms of embodied carbon, so “especially if you have a structure that doesn’t have any tension, then you should definitely only use timber” in order to minimize emissions. One tradeoff is that “the weight of the structure is going to be bigger than it would be with steel,” she says.
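
    That rule of thumb (timber where members are in compression, steel where they are in tension) can be expressed directly; the Python sketch below applies it to a few members and tallies a toy embodied-carbon estimate. The member forces, sections, densities, and carbon factors are hypothetical placeholders, not values from the study.

    ```python
    # Illustrative rule distilled from the passage above: timber for members
    # in compression, steel for members in tension, then a toy embodied-
    # carbon tally. All numbers below are hypothetical placeholders.
    CARBON_PER_KG = {"timber": 0.4, "steel": 1.9}  # assumed kgCO2e per kg
    DENSITY = {"timber": 500.0, "steel": 7850.0}   # kg per m^3

    # (axial force in kN: negative = compression, area in m^2, length in m)
    members = [(-120.0, 0.010, 3.0), (80.0, 0.001, 3.0), (-60.0, 0.008, 4.2)]

    total = 0.0
    for force, area, length in members:
        material = "timber" if force < 0 else "steel"
        mass = DENSITY[material] * area * length
        total += CARBON_PER_KG[material] * mass
        print(f"{material:6s} member, axial force {force:+.0f} kN")
    print(f"toy embodied-carbon estimate: {total:.0f} kgCO2e")
    ```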

    The tools they developed, which were the basis for Ching’s master’s thesis, can be applied at different stages, either in the early planning phase of a structure, or later on in the final stages of a design.

    As an exercise, the team developed a proposal for reengineering several trusses using these optimization tools, and demonstrated that significant savings in embodied greenhouse gas emissions could be achieved with no loss of performance. While they have shown that improvements of at least 10 percent can be achieved, she says those estimates are “not exactly apples to apples,” and the likely savings could actually be two to three times that.

    “It’s about choosing materials more smartly,” she says, for the specifics of a given application. Often in existing buildings “you will have timber where there’s compression, and where that makes sense, and then it will have really skinny steel members, in tension, where that makes sense. And that’s also what we see in our design solutions that are suggested, but perhaps we can see it even more clearly.” The tools are not yet ready for commercial use, though, she says, because the team has not yet added a user interface.

    Carstensen sees a trend toward increasing use of timber in large construction, which represents an important potential for reducing the world’s overall carbon emissions. “There’s a big interest in the construction industry in mass timber structures, and this speaks right into that area. So, the hope is that this would make inroads into the construction business and actually make a dent in that very large contribution to greenhouse gas emissions.”

  •

    New “risk triage” platform pinpoints compounding threats to US infrastructure

    Over a 36-hour period in August, Hurricane Henri delivered record rainfall in New York City, where an aging storm-sewer system was not built to handle the deluge, resulting in street flooding. Meanwhile, an ongoing drought in California continued to overburden aquifers and extend statewide water restrictions. As climate change amplifies the frequency and intensity of extreme events in the United States and around the world, and the populations and economies they threaten grow and change, there is a critical need to make infrastructure more resilient. But how can this be done in a timely, cost-effective way?

    An emerging discipline called multi-sector dynamics (MSD) offers a promising solution. MSD homes in on compounding risks and potential tipping points across interconnected natural and human systems. Tipping points occur when these systems can no longer sustain multiple, co-evolving stresses, such as extreme events, population growth, land degradation, drinkable water shortages, air pollution, aging infrastructure, and increased human demands. MSD researchers use observations and computer models to identify key precursory indicators of such tipping points, providing decision-makers with critical information that can be applied to mitigate risks and boost resilience in infrastructure and managed resources.

    At MIT, the Joint Program on the Science and Policy of Global Change has since 2018 been developing MSD expertise and modeling tools and using them to explore compounding risks and potential tipping points in selected regions of the United States. In a two-hour webinar on Sept. 15, MIT Joint Program researchers presented an overview of the program’s MSD research tool set and its applications.  

    MSD and the risk triage platform

    “Multi-sector dynamics explores interactions and interdependencies among human and natural systems, and how these systems may adapt, interact, and co-evolve in response to short-term shocks and long-term influences and stresses,” says MIT Joint Program Deputy Director C. Adam Schlosser, noting that such analysis can reveal and quantify potential risks that would likely evade detection in siloed investigations. “These systems can experience cascading effects or failures after crossing tipping points. The real question is not just where these tipping points are in each system, but how they manifest and interact across all systems.”

    To address that question, the program’s MSD researchers have developed the MIT Socio-Environmental Triage (MST) platform, now publicly available for the first time. Focused on the continental United States, the first version of the platform analyzes present-day risks related to water, land, climate, the economy, energy, demographics, health, and infrastructure, and where these compound to create risk hot spots. It’s essentially a screening-level visualization tool that allows users to examine risks, identify hot spots when combining risks, and make decisions about how to deploy more in-depth analysis to solve complex problems at regional and local levels. For example, MST can identify hot spots for combined flood and poverty risks in the lower Mississippi River basin, and thereby alert decision-makers as to where more concentrated flood-control resources are needed.
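
    One way to picture that screening step is to overlay normalized risk layers on a common grid, combine them, and flag cells in the top tail of the combined index. The sketch below uses synthetic data and an arbitrary threshold; MST’s actual indices, weights, and thresholds are more involved.

        import numpy as np

        # Hypothetical gridded risk layers over a region (0 = low, 1 = high).
        rng = np.random.default_rng(0)
        flood_risk = rng.random((50, 50))
        poverty_risk = rng.random((50, 50))

        def normalize(layer):
            """Rescale a layer to [0, 1] so different risks are comparable."""
            return (layer - layer.min()) / (layer.max() - layer.min())

        # Equal-weight composite of the two normalized layers.
        composite = 0.5 * normalize(flood_risk) + 0.5 * normalize(poverty_risk)

        # Flag "hot spots": cells in the top decile of the combined index.
        hot_spots = composite >= np.quantile(composite, 0.9)
        print(f"{hot_spots.sum()} of {hot_spots.size} grid cells flagged")

    The same overlay pattern applies to the other pairings discussed below, such as fossil-fuel employment with poverty, or air quality with poverty.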

    Successive versions of the platform will incorporate projections based on the MIT Joint Program’s Integrated Global System Modeling (IGSM) framework of how different systems and stressors may co-evolve into the future and thereby change the risk landscape. This enhanced capability could help uncover cost-effective pathways for mitigating and adapting to a wide range of environmental and economic risks.  

    MSD applications

    Five webinar presentations explored how MIT Joint Program researchers are applying the program’s risk triage platform and other MSD modeling tools to identify potential tipping points and risks in five key domains: water quality, land use, economics and energy, health, and infrastructure. 

    Joint Program Principal Research Scientist Xiang Gao described her efforts to apply a high-resolution U.S. water-quality model to calculate a location-specific, water-quality index over more than 2,000 river basins in the country. By accounting for interactions among climate, agriculture, and socioeconomic systems, various water-quality measures can be obtained ranging from nitrate and phosphate levels to phytoplankton concentrations. This modeling approach advances a unique capability to identify potential water-quality risk hot spots for freshwater resources.
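
    The article does not give the model’s formulation, but one common way to build such an index is to score each measured concentration against a guideline value and let the worst sub-index flag the basin. A hedged sketch, with hypothetical guideline values:

        # Illustrative basin-level water-quality index (hypothetical guideline
        # values; not the formulation used in the Joint Program model).

        GUIDELINES = {"nitrate": 10.0,        # mg/L
                      "phosphate": 0.1,       # mg/L
                      "phytoplankton": 40.0}  # ug/L

        def water_quality_index(concentrations):
            """Score each measure as guideline/observed (capped at 1.0) and
            return the worst sub-index, so one bad pollutant flags the basin."""
            sub = {k: min(1.0, GUIDELINES[k] / max(c, 1e-9))
                   for k, c in concentrations.items()}
            return min(sub.values()), sub

        score, detail = water_quality_index({"nitrate": 25.0, "phosphate": 0.05,
                                             "phytoplankton": 12.0})
        print(round(score, 2), detail)  # nitrate exceedance drives the low score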

    Joint Program Research Scientist Angelo Gurgel discussed his MSD-based analysis of how climate change, population growth, changing diets, crop-yield improvements, and other forces that drive land-use change at the global level may ultimately impact how land is used in the United States. Drawing upon national observational data and the IGSM framework, the analysis shows that while current U.S. land-use trends are projected to persist or intensify between now and 2050, there is no evidence of any concerning tipping points arising throughout this period.

    MIT Joint Program Research Scientist Jennifer Morris presented several examples of how the risk triage platform can be used to combine existing U.S. datasets and the IGSM framework to assess energy and economic risks at the regional level. For example, by aggregating separate data streams on fossil-fuel employment and poverty, one can target selected counties for clean energy job training programs as the nation moves toward a low-carbon future. 

    “Our modeling and risk triage frameworks can provide pictures of current and projected future economic and energy landscapes,” says Morris. “They can also highlight interactions among different human, built, and natural systems, including compounding risks that occur in the same location.”  

    MIT Joint Program research affiliate Sebastian Eastham, a research scientist at the MIT Laboratory for Aviation and the Environment, described an MSD approach to the study of air pollution and public health. Linking the IGSM with an atmospheric chemistry model, Eastham ultimately aims to better understand where the greatest health risks are in the United States and how they may compound throughout this century under different policy scenarios. Using the risk triage tool to combine current risk metrics for air quality and poverty in a selected county based on current population and air-quality data, he showed how one can rapidly identify cardiovascular and other air-pollution-induced disease risk hot spots.

    Finally, MIT Joint Program research affiliate Alyssa McCluskey, a lecturer at the University of Colorado at Boulder, showed how the risk triage tool can be used to pinpoint potential risks to roadways, waterways, and power distribution lines from flooding, extreme temperatures, population growth, and other stressors. In addition, McCluskey described how transportation and energy infrastructure development and expansion can threaten critical wildlife habitats.

    Enabling comprehensive, location-specific analyses of risks and hot spots within and among multiple domains, the Joint Program’s MSD modeling tools can be used to inform policymaking and investment from the municipal to the global level.

    “MSD takes on the challenge of linking human, natural, and infrastructure systems in order to inform risk analysis and decision-making,” says Schlosser. “Through our risk triage platform and other MSD models, we plan to assess important interactions and tipping points, and to provide foresight that supports action toward a sustainable, resilient, and prosperous world.”

    This research is funded by the U.S. Department of Energy’s Office of Science as an ongoing project.