More stories

  • Technologies for water conservation and treatment move closer to commercialization

    The Abdul Latif Jameel Water and Food Systems Lab (J-WAFS) provides Solutions Grants to help MIT researchers launch startup companies or products to commercialize breakthrough technologies in water and food systems. The Solutions Grant Program began in 2015 and is supported by Community Jameel. In addition to one-year, renewable grants of up to $150,000, the program also matches grantees with industry mentors and facilitates introductions to potential investors. Since its inception, the J-WAFS Solutions Program has awarded over $3 million in funding to the MIT community. Numerous startups and products, including a portable desalination device and a company commercializing a novel food safety sensor, have spun out of this support.

    The 2023 J-WAFS Solutions Grantees are Professor C. Cem Tasan of the Department of Materials Science and Engineering and Professor Andrew Whittle of the Department of Civil and Environmental Engineering. Tasan’s project involves reducing water use in steel manufacturing and Whittle’s project tackles harmful algal blooms in water. Project work commences this September.

    “This year’s Solutions Grants are being awarded to professors Tasan and Whittle to help commercialize technologies they have been developing at MIT,” says J-WAFS executive director Renee J. Robins. “With J-WAFS’ support, we hope to see the teams move their technologies from the lab to the market, so they can have a beneficial impact on water use and water quality challenges,” Robins adds.

    Reducing water consumption by solid-state steelmaking

    Water is a major requirement for steel production. The steel industry ranks fourth in industrial freshwater consumption worldwide, since large amounts of water are needed mainly for cooling purposes in the process. Unfortunately, a strong correlation has also been shown to exist between freshwater use in steelmaking and water contamination. As the global demand for steel increases and freshwater availability decreases due to climate change, improved methods for more sustainable steel production are needed.

    A strategy to reduce the water footprint of steelmaking is to explore steel recycling processes that avoid liquid metal processing. With this motivation, Cem Tasan, the Thomas B. King Associate Professor of Metallurgy in the Department of Materials Science and Engineering, and postdoc Onur Guvenc PhD created a new process called Scrap Metal Consolidation (SMC). SMC is based on a well-established metal forming process known as roll bonding. Conventionally, roll bonding requires intensive prior surface treatment of the raw material, specific atmospheric conditions, and high deformation levels. Tasan and Guvenc’s research revealed that SMC can overcome these restrictions by enabling the solid-state bonding of scrap into a sheet metal form, even when the surface quality, atmospheric conditions, and deformation levels are suboptimal. Through lab-scale proof-of-principle investigations, they have already identified SMC process conditions and validated the mechanical formability of resulting steel sheets, focusing on mild steel, the most common sheet metal scrap.

    The J-WAFS Solutions Grant will help the team to build customer product prototypes, design the processing unit, and develop a scale-up strategy and business model. By simultaneously decreasing water usage, energy demand, contamination risk, and carbon dioxide burden, SMC has the potential to decrease the energy need for steel recycling by up to 86 percent, as well as reduce the linked carbon dioxide emissions and safeguard the freshwater resources that would otherwise be directed to industrial consumption. 

    Detecting harmful algal blooms in water before it’s too late

    Harmful algal blooms (HABs) are a growing problem in both freshwater and saltwater environments worldwide, causing an estimated $13 billion in annual damage to drinking water, water for recreational use, commercial fishing areas, and desalination activities. HABs pose a threat to both human health and aquaculture, thereby threatening the food supply. Toxins in HABs are produced by some cyanobacteria, or blue-green algae, whose communities change in composition in response to eutrophication from agricultural runoff, sewer overflows, or other events. Mitigation of risks from HABs is most effective when there is advance warning of these changes in algal communities.

    Most in situ measurements of algae are based on fluorescence spectroscopy that is conducted with LED-induced fluorescence (LEDIF) devices, or probes that induce fluorescence of specific algal pigments using LED light sources. While LEDIF devices provide reasonable estimates of the concentrations of individual pigments, they lack the resolution to discriminate algal classes within complex mixtures found in natural water bodies. In prior research, Andrew Whittle, the Edmund K. Turner Professor of Civil and Environmental Engineering, worked with colleagues to design REMORA, a low-cost, field-deployable prototype spectrofluorometer for measuring induced fluorescence. This research was part of a collaboration between MIT and the AMS Institute. Whittle and the team successfully trained a machine learning model to discriminate and quantify cell concentrations for mixtures of different algal groups in water samples through an extensive laboratory calibration program using various algae cultures. The group demonstrated these capabilities in a series of field measurements at locations in Boston and Amsterdam.
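    To give a flavor of how such a spectral classifier works (this story does not detail the team’s actual model), the sketch below trains an off-the-shelf classifier on synthetic fluorescence spectra; the wavelength band, emission peaks, and class labels are all invented for illustration.

    ```python
    # Minimal sketch: discriminating algal classes from fluorescence spectra.
    # All data here are synthetic; the real REMORA calibration used lab-grown
    # algae cultures and a model whose details are not described in this story.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    wavelengths = np.linspace(400, 700, 64)  # nm; hypothetical emission band
    classes = ["cyanobacteria", "green_algae", "diatoms"]
    peaks = {"cyanobacteria": 650, "green_algae": 685, "diatoms": 580}  # invented

    def synthetic_spectrum(peak_nm, width_nm=30.0):
        """A Gaussian emission peak plus noise, standing in for a real spectrum."""
        return np.exp(-((wavelengths - peak_nm) / width_nm) ** 2) \
            + 0.05 * rng.standard_normal(wavelengths.size)

    # Build a toy training set: each class gets a characteristic emission peak.
    X = np.array([synthetic_spectrum(peaks[c]) for c in classes for _ in range(100)])
    y = np.array([c for c in classes for _ in range(100)])

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(X_train, y_train)
    print("held-out accuracy:", model.score(X_test, y_test))
    ```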

    Whittle will work with Fábio Duarte of the Department of Urban Studies and Planning, the Senseable City Lab, and MIT’s Center for Real Estate to refine the design of REMORA. They will develop software for autonomous operation of the sensor so that it can be deployed remotely on mobile vessels or platforms to enable high-resolution spatiotemporal monitoring for harmful algae. Commercialization will aim to exploit REMORA’s unique capabilities for long-term monitoring applications by water utilities, environmental regulatory agencies, and water-intensive industries.

  • Study suggests energy-efficient route to capturing and converting CO2

    In the race to draw down greenhouse gas emissions around the world, scientists at MIT are looking to carbon-capture technologies to decarbonize the most stubborn industrial emitters.

    Steel, cement, and chemical manufacturing are especially difficult industries to decarbonize, as carbon and fossil fuels are inherent ingredients in their production. Technologies that can capture carbon emissions and convert them into forms that feed back into the production process could help to reduce the overall emissions from these “hard-to-abate” sectors.

    But thus far, experimental technologies that capture and convert carbon dioxide do so as two separate processes that themselves require a huge amount of energy to run. The MIT team is looking to combine the two processes into one integrated and far more energy-efficient system that could potentially run on renewable energy to both capture and convert carbon dioxide from concentrated, industrial sources.

    In a study appearing today in ACS Catalysis, the researchers reveal the hidden functioning of how carbon dioxide can be both captured and converted through a single electrochemical process. The process involves using an electrode to attract carbon dioxide released from a sorbent, and to convert it into a reduced, reusable form.

    Others have reported similar demonstrations, but the mechanisms driving the electrochemical reaction have remained unclear. The MIT team carried out extensive experiments to determine that driver, and found that, in the end, it came down to the partial pressure of carbon dioxide. In other words, the more pure carbon dioxide that makes contact with the electrode, the more efficiently the electrode can capture and convert the molecule.

    Knowledge of this main driver, or “active species,” can help scientists tune and optimize similar electrochemical systems to efficiently capture and convert carbon dioxide in an integrated process.

    The study’s results imply that, while these electrochemical systems would probably not work for very dilute environments (for instance, to capture and convert carbon emissions directly from the air), they would be well-suited to the highly concentrated emissions generated by industrial processes, particularly those that have no obvious renewable alternative.

    “We can and should switch to renewables for electricity production. But deeply decarbonizing industries like cement or steel production is challenging and will take a longer time,” says study author Betar Gallant, the Class of 1922 Career Development Associate Professor at MIT. “Even if we get rid of all our power plants, we need some solutions to deal with the emissions from other industries in the shorter term, before we can fully decarbonize them. That’s where we see a sweet spot, where something like this system could fit.”

    The study’s MIT co-authors are lead author and postdoc Graham Leverick and graduate student Elizabeth Bernhardt, along with Aisyah Illyani Ismail, Jun Hui Law, Arif Arifutzzaman, and Mohamed Kheireddine Aroua of Sunway University in Malaysia.

    Breaking bonds

    Carbon-capture technologies are designed to capture emissions, or “flue gas,” from the smokestacks of power plants and manufacturing facilities. This is done primarily using large retrofits to funnel emissions into chambers filled with a “capture” solution — a mix of amines, or ammonia-based compounds, that chemically bind with carbon dioxide, producing a stable form that can be separated out from the rest of the flue gas.

    High temperatures are then applied, typically in the form of fossil-fuel-generated steam, to release the captured carbon dioxide from its amine bond. In its pure form, the gas can then be pumped into storage tanks or underground, mineralized, or further converted into chemicals or fuels.

    “Carbon capture is a mature technology, in that the chemistry has been known for about 100 years, but it requires really large installations, and is quite expensive and energy-intensive to run,” Gallant notes. “What we want are technologies that are more modular and flexible and can be adapted to more diverse sources of carbon dioxide. Electrochemical systems can help to address that.”

    Her group at MIT is developing an electrochemical system that both recovers the captured carbon dioxide and converts it into a reduced, usable product. Such an integrated system, rather than a decoupled one, she says, could be entirely powered with renewable electricity rather than fossil-fuel-derived steam.

    Their concept centers on an electrode that would fit into existing chambers of carbon-capture solutions. When a voltage is applied to the electrode, electrons flow onto the reactive form of carbon dioxide and convert it to a product using protons supplied from water. This makes the sorbent available to bind more carbon dioxide, rather than using steam to do the same.

    Gallant previously demonstrated this electrochemical process could work to capture and convert carbon dioxide into a solid carbonate form.

    “We showed that this electrochemical process was feasible in very early concepts,” she says. “Since then, there have been other studies focused on using this process to attempt to produce useful chemicals and fuels. But there’s been inconsistent explanations of how these reactions work, under the hood.”

    Solo CO2

    In the new study, the MIT team took a magnifying glass under the hood to tease out the specific reactions driving the electrochemical process. In the lab, they generated amine solutions that resemble the industrial capture solutions used to extract carbon dioxide from flue gas. They methodically altered various properties of each solution, such as the pH, concentration, and type of amine, then ran each solution past an electrode made from silver — a metal that is widely used in electrolysis studies and known to efficiently convert carbon dioxide to carbon monoxide. They then measured the concentration of carbon monoxide that was converted at the end of the reaction, and compared this number against that of every other solution they tested, to see which parameter had the most influence on how much carbon monoxide was produced.
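    For reference, the conversion silver is known for in such electrolysis studies is the two-electron reduction of carbon dioxide to carbon monoxide, commonly written as CO2 + 2H+ + 2e− → CO + H2O, which matches the earlier description of electrons flowing onto the reactive carbon dioxide while protons are supplied from water.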

    In the end, they found that what mattered most was not the type of amine used to initially capture carbon dioxide, as many have suspected. Instead, it was the concentration of solo, free-floating carbon dioxide molecules, which avoided bonding with amines but were nevertheless present in the solution. This “solo-CO2” determined the concentration of carbon monoxide that was ultimately produced.

    “We found that it’s easier to react this ‘solo’ CO2, as compared to CO2 that has been captured by the amine,” Leverick offers. “This tells future researchers that this process could be feasible for industrial streams, where high concentrations of carbon dioxide could efficiently be captured and converted into useful chemicals and fuels.”

    “This is not a removal technology, and it’s important to state that,” Gallant stresses. “The value that it does bring is that it allows us to recycle carbon dioxide some number of times while sustaining existing industrial processes, for fewer associated emissions. Ultimately, my dream is that electrochemical systems can be used to facilitate mineralization, and permanent storage of CO2 — a true removal technology. That’s a longer-term vision. And a lot of the science we’re starting to understand is a first step toward designing those processes.”

    This research is supported by Sunway University in Malaysia.

  • Device offers long-distance, low-power underwater communication

    MIT researchers have demonstrated the first system for ultra-low-power underwater networking and communication, which can transmit signals across kilometer-scale distances.

    This technique, which the researchers began developing several years ago, uses about one-millionth the power that existing underwater communication methods use. By expanding their battery-free system’s communication range, the researchers have made the technology more feasible for applications such as aquaculture, coastal hurricane prediction, and climate change modeling.

    “What started as a very exciting intellectual idea a few years ago — underwater communication with a million times lower power — is now practical and realistic. There are still a few interesting technical challenges to address, but there is a clear path from where we are now to deployment,” says Fadel Adib, associate professor in the Department of Electrical Engineering and Computer Science and director of the Signal Kinetics group in the MIT Media Lab.

    Underwater backscatter enables low-power communication by encoding data in sound waves that a device reflects, or scatters, back toward a receiver. The team’s innovations enable these reflected signals to be more precisely directed at their source.

    Due to this “retrodirectivity,” less signal scatters in the wrong directions, allowing for more efficient and longer-range communication.

    When tested in a river and an ocean, the retrodirective device exhibited a communication range that was more than 15 times farther than previous devices. However, the experiments were limited by the length of the docks available to the researchers.

    To better understand the limits of underwater backscatter, the team also developed an analytical model to predict the technology’s maximum range. The model, which they validated using experimental data, showed that their retrodirective system could communicate across kilometer-scale distances.

    The researchers shared these findings in two papers which will be presented at this year’s ACM SIGCOMM and MobiCom conferences. Adib, senior author on both papers, is joined on the SIGCOMM paper by co-lead authors Aline Eid, a former postdoc who is now an assistant professor at the University of Michigan, and Jack Rademacher, a research assistant; as well as research assistants Waleed Akbar and Purui Wang, and postdoc Ahmed Allam. The MobiCom paper is also written by co-lead authors Akbar and Allam.

    Communicating with sound waves

    Underwater backscatter communication devices utilize an array of nodes made from “piezoelectric” materials to receive and reflect sound waves. These materials produce an electric signal when mechanical force is applied to them.

    When sound waves strike the nodes, they vibrate and convert the mechanical energy to an electric charge. The nodes use that charge to scatter some of the acoustic energy back to the source, transmitting data that a receiver decodes based on the sequence of reflections.

    But because the backscattered signal travels in all directions, only a small fraction reaches the source, reducing the signal strength and limiting the communication range.

    To overcome this challenge, the researchers leveraged a 70-year-old radio device called a Van Atta array, in which symmetric pairs of antennas are connected in such a way that the array reflects energy back in the direction it came from.

    But connecting piezoelectric nodes to make a Van Atta array reduces their efficiency. The researchers avoided this problem by placing a transformer between pairs of connected nodes. The transformer, which transfers electric energy from one circuit to another, allows the nodes to reflect the maximum amount of energy back to the source.

    “Both nodes are receiving and both nodes are reflecting, so it is a very interesting system. As you increase the number of elements in that system, you build an array that allows you to achieve much longer communication ranges,” Eid explains.

    In addition, they used a technique called cross-polarity switching to encode binary data in the reflected signal. Each node has a positive and a negative terminal (like a car battery), so when the positive terminals of two nodes are connected and the negative terminals of two nodes are connected, that reflected signal is a “bit one.”

    But if the researchers switch the polarity, and the negative and positive terminals are connected to each other instead, then the reflection is a “bit zero.”

    “Just connecting the piezoelectric nodes together is not enough. By alternating the polarities between the two nodes, we are able to transmit data back to the remote receiver,” Rademacher explains.
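    A toy simulation makes the encoding concrete: flipping the terminal connections flips the sign (phase) of the reflected waveform, and a receiver recovers each bit from the sign of a matched-filter correlation. The carrier shape, noise level, and decoding rule here are illustrative assumptions, not the team’s actual signal processing.

    ```python
    # Toy model of cross-polarity switching: a "bit one" reflects the carrier
    # as-is; a "bit zero" reflects it with inverted polarity (a 180-degree
    # phase flip). The receiver correlates against the carrier and reads the sign.
    import numpy as np

    rng = np.random.default_rng(1)
    samples_per_bit = 200
    t = np.arange(samples_per_bit)
    carrier = np.sin(2 * np.pi * t / 20)  # hypothetical acoustic carrier

    bits = [1, 0, 1, 1, 0]
    polarity = np.repeat([1 if b else -1 for b in bits], samples_per_bit)
    reflected = polarity * np.tile(carrier, len(bits))
    received = reflected + 0.5 * rng.standard_normal(reflected.size)  # channel noise

    decoded = []
    for i in range(len(bits)):
        chunk = received[i * samples_per_bit : (i + 1) * samples_per_bit]
        corr = float(np.dot(chunk, carrier))  # matched-filter correlation
        decoded.append(1 if corr > 0 else 0)

    print("sent:   ", bits)
    print("decoded:", decoded)
    ```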

    When building the Van Atta array, the researchers found that if the connected nodes were too close, they would block each other’s signals. They devised a new design with staggered nodes that enables signals to reach the array from any direction. With this scalable design, the more nodes an array has, the greater its communication range.

    They tested the array in more than 1,500 experimental trials in the Charles River in Cambridge, Massachusetts, and in the Atlantic Ocean, off the coast of Falmouth, Massachusetts, in collaboration with the Woods Hole Oceanographic Institution. The device achieved communication ranges of 300 meters, more than 15 times longer than they previously demonstrated.

    However, they had to cut the experiments short because they ran out of space on the dock.

    Modeling the maximum

    That inspired the researchers to build an analytical model to determine the theoretical and practical communication limits of this new underwater backscatter technology.

    Building off their group’s work on RFIDs, the team carefully crafted a model that captured the impact of system parameters, like the size of the piezoelectric nodes and the input power of the signal, on the underwater operation range of the device.

    “It is not a traditional communication technology, so you need to understand how you can quantify the reflection. What are the roles of the different components in that process?” Akbar says.

    For instance, the researchers needed to derive a function that captures the amount of signal reflected out of an underwater piezoelectric node with a specific size, which was among the biggest challenges of developing the model, he adds.

    They used these insights to create a plug-and-play model into which a user could enter information like input power and piezoelectric node dimensions and receive an output that shows the expected range of the system.

    They evaluated the model on data from their experimental trials and found that it could accurately predict the range of retrodirected acoustic signals with an average error of less than one decibel.

    Using this model, they showed that an underwater backscatter array can potentially achieve kilometer-long communication ranges.
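    The analytical model itself is not reproduced in this article, but its plug-and-play character can be sketched: given an input power and node size, estimate the round-trip loss and report the farthest distance at which the echo still clears a detection threshold. Every constant below is a placeholder, and simple spherical spreading stands in for the model’s real underwater acoustics.

    ```python
    # Hypothetical sketch of a plug-and-play range estimator for underwater
    # backscatter. The real model's equations are not given in this article;
    # here we assume spherical spreading loss on both legs of the round trip
    # and a reflection gain that grows with node area.
    import math

    def estimate_range_m(input_power_w, node_area_m2, detect_threshold_w=1e-15):
        """Return the largest distance (m) at which the echo is still detectable."""
        reflection_gain = 1e-3 * node_area_m2  # placeholder scaling
        for d in range(1, 20000):
            spreading = 1.0 / (4 * math.pi * d ** 2)  # one-way spherical loss
            echo_power = input_power_w * spreading * reflection_gain * spreading
            if echo_power < detect_threshold_w:
                return d - 1
        return 20000

    print(estimate_range_m(input_power_w=10.0, node_area_m2=0.01))
    ```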

    “We are creating a new ocean technology and propelling it into the realm of the things we have been doing for 6G cellular networks. For us, it is very rewarding because we are starting to see this now very close to reality,” Adib says.

    The researchers plan to continue studying underwater backscatter Van Atta arrays, perhaps using boats so they could evaluate longer communication ranges. Along the way, they intend to release tools and datasets so other researchers can build on their work. At the same time, they are beginning to move toward commercialization of this technology.

    “Limited range has been an open problem in underwater backscatter networks, preventing them from being used in real-world applications. This paper takes a significant step forward in the future of underwater communication, by enabling them to operate on minimum energy while achieving long range,” says Omid Abari, assistant professor of computer science at the University of California at Los Angeles, who was not involved with this work. “The paper is the first to bring Van Atta Reflector array technique into underwater backscatter settings and demonstrate its benefits in improving the communication range by orders of magnitude. This can take battery-free underwater communication one step closer to reality, enabling applications such as underwater climate change monitoring and coastal monitoring.”

    This research was funded, in part, by the Office of Naval Research, the Sloan Research Fellowship, the National Science Foundation, the MIT Media Lab, and the Doherty Chair in Ocean Utilization.

  • Fast-tracking fusion energy’s arrival with AI and accessibility

    As the impacts of climate change continue to grow, so does interest in fusion’s potential as a clean energy source. While fusion reactions have been studied in laboratories since the 1930s, there are still many critical questions scientists must answer to make fusion power a reality, and time is of the essence. As part of its strategy to accelerate fusion energy’s arrival and reach carbon neutrality by 2050, the U.S. Department of Energy (DoE) has announced new funding for a project led by researchers at MIT’s Plasma Science and Fusion Center (PSFC) and four collaborating institutions.

    Cristina Rea, a research scientist and group leader at the PSFC, will serve as the primary investigator for the newly funded three-year collaboration to pilot the integration of fusion data into a system that can be read by AI-powered tools. The PSFC, together with scientists from the College of William and Mary, the University of Wisconsin at Madison, Auburn University, and the nonprofit HDF Group, plans to create a holistic fusion data platform, the elements of which could offer unprecedented access for researchers, especially underrepresented students. The project aims to encourage diverse participation in fusion and data science, both in academia and the workforce, through outreach programs led by the group’s co-investigators, of whom four out of five are women.

    The DoE’s award, part of a $29 million funding package for seven projects across 19 institutions, will support the group’s efforts to distribute data produced by fusion devices like the PSFC’s Alcator C-Mod, a donut-shaped “tokamak” that utilized powerful magnets to control and confine fusion reactions. Alcator C-Mod operated from 1991 to 2016 and its data are still being studied, thanks in part to the PSFC’s commitment to the free exchange of knowledge.

    Currently, there are nearly 50 public experimental magnetic confinement-type fusion devices; however, both historical and current data from these devices can be difficult to access. Some fusion databases require signing user agreements, and not all data are catalogued and organized the same way. Moreover, it can be difficult to leverage machine learning, a class of AI tools, for data analysis and to enable scientific discovery without time-consuming data reorganization. The result is fewer scientists working on fusion, greater barriers to discovery, and a bottleneck in harnessing AI to accelerate progress.

    The project’s proposed data platform addresses technical barriers by being FAIR — Findable, Accessible, Interoperable, Reusable — and by adhering to UNESCO’s Open Science (OS) recommendations to improve the transparency and inclusivity of science; all of the researchers’ deliverables will adhere to FAIR and OS principles, as required by the DoE. The platform’s databases will be built using MDSplusML, an upgraded version of the MDSplus open-source software developed by PSFC researchers in the 1980s to catalogue the results of Alcator C-Mod’s experiments. Today, nearly 40 fusion research institutes use MDSplus to store and provide external access to their fusion data. The release of MDSplusML aims to continue that legacy of open collaboration.
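    For a sense of what programmatic access to MDSplus-hosted data looks like today, the sketch below uses the existing MDSplus Python client; the server address, tree name, shot number, and node path are placeholders, and the forthcoming MDSplusML interface may differ.

    ```python
    # Sketch of reading fusion data with the existing MDSplus Python client.
    # The server, tree, shot, and node path below are hypothetical; actual
    # access details vary by institution.
    from MDSplus import Connection

    conn = Connection("mds.example.edu")  # placeholder data server
    conn.openTree("cmod", 1150901000)     # tree name and shot are illustrative

    ip = conn.get(r"\ip")                 # plasma current node (hypothetical path)
    time = conn.get(r"dim_of(\ip)")       # its time base

    print(ip.data().shape, time.data().shape)
    ```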

    The researchers intend to address barriers to participation for women and disadvantaged groups not only by improving general access to fusion data, but also through a subsidized summer school that will focus on topics at the intersection of fusion and machine learning, which will be held at William and Mary for the next three years.

    Of the importance of their research, Rea says, “This project is about responding to the fusion community’s needs and setting ourselves up for success. Scientific advancements in fusion are enabled via multidisciplinary collaboration and cross-pollination, so accessibility is absolutely essential. I think we all understand now that diverse communities have more diverse ideas, and they allow faster problem-solving.”

    The collaboration’s work also aligns with vital areas of research identified in the International Atomic Energy Agency’s “AI for Fusion” Coordinated Research Project (CRP). Rea was selected as the technical coordinator for the IAEA’s CRP emphasizing community engagement and knowledge access to accelerate fusion research and development. In a letter of support written for the group’s proposed project, the IAEA stated that, “the work [the researchers] will carry out […] will be beneficial not only to our CRP but also to the international fusion community in large.”

    PSFC Director and Hitachi America Professor of Engineering Dennis Whyte adds, “I am thrilled to see PSFC and our collaborators be at the forefront of applying new AI tools while simultaneously encouraging and enabling extraction of critical data from our experiments.”

    “Having the opportunity to lead such an important project is extremely meaningful, and I feel a responsibility to show that women are leaders in STEM,” says Rea. “We have an incredible team, strongly motivated to improve our fusion ecosystem and to contribute to making fusion energy a reality.”

  • New clean air and water labs to bring together researchers, policymakers to find climate solutions

    MIT’s Abdul Latif Jameel Poverty Action Lab (J-PAL) is launching the Clean Air and Water Labs, with support from Community Jameel, to generate evidence-based solutions aimed at increasing access to clean air and water.

    Led by J-PAL’s Africa, Middle East and North Africa (MENA), and South Asia regional offices, the labs will partner with government agencies to bring together researchers and policymakers in areas where impactful clean air and water solutions are most urgently needed.

    Together, the labs aim to improve clean air and water access by informing the scaling of evidence-based policies and decisions of city, state, and national governments that serve nearly 260 million people combined.

    The Clean Air and Water Labs expand the work of J-PAL’s King Climate Action Initiative, building on the foundational support of King Philanthropies, which significantly expanded J-PAL’s work at the nexus of climate change and poverty alleviation worldwide. 

    Air pollution, water scarcity, and the need for evidence

    Africa, MENA, and South Asia are on the front lines of global air and water crises. 

    “There is no time to waste investing in solutions that do not achieve their desired effects,” says Iqbal Dhaliwal, global executive director of J-PAL. “By co-generating rigorous real-world evidence with researchers, policymakers can have the information they need to dedicate resources to scaling up solutions that have been shown to be effective.”

    In India, about 75 percent of households did not have drinking water on premises in 2018. In MENA, nearly 90 percent of children live in areas facing high or extreme water stress. Across Africa, almost 400 million people lack access to safe drinking water. 

    Simultaneously, air pollution is one of the greatest threats to human health globally. In India, extraordinary levels of air pollution are shortening the average life expectancy by five years. In Africa, rising indoor and ambient air pollution contributed to 1.1 million premature deaths in 2019. 

    There is increasing urgency to find high-impact and cost-effective solutions to the worsening threats to human health and resources caused by climate change. However, data and evidence on potential solutions are limited.

    Fostering collaboration to generate policy-relevant evidence 

    The Clean Air and Water Labs will foster deep collaboration between government stakeholders, J-PAL regional offices, and researchers in the J-PAL network. 

    Through the labs, J-PAL will work with policymakers to:

    co-diagnose the most pressing air and water challenges and opportunities for policy innovation;
    expand policymakers’ access to and use of high-quality air and water data;
    co-design potential solutions informed by existing evidence;
    co-generate evidence on promising solutions through rigorous evaluation, leveraging existing and new data sources; and
    support scaling of air and water policies and programs that are found to be effective through evaluation. 
    A research and scaling fund for each lab will prioritize resources for co-generated pilot studies, randomized evaluations, and scaling projects. 

    The labs will also collaborate with C40 Cities, a global network of mayors of the world’s leading cities that are united in action to confront the climate crisis, to share policy-relevant evidence and identify opportunities for potential new connections and research opportunities within India and across Africa.

    This model aims to strengthen the use of evidence in decision-making to ensure solutions are highly effective and to guide research to answer policymakers’ most urgent questions. J-PAL Africa, MENA, and South Asia’s strong on-the-ground presence will further bridge research and policy work by anchoring activities within local contexts. 

    “Communities across the world continue to face challenges in accessing clean air and water, a threat to human safety that has only been exacerbated by the climate crisis, along with rising temperatures and other hazards,” says George Richards, director of Community Jameel. “Through our collaboration with J-PAL and C40 in creating climate policy labs embedded in city, state, and national governments in Africa and South Asia, we are committed to innovative and science-based approaches that can help hundreds of millions of people enjoy healthier lives.”

    J-PAL Africa, MENA, and South Asia will formally launch Clean Air and Water Labs with government partners over the coming months. J-PAL is housed in the MIT Department of Economics, within the School of Humanities, Arts, and Social Sciences.

  • Tiny magnetic beads produce an optical signal that could be used to quickly detect pathogens

    Getting results from a blood test can take anywhere from one day to a week, depending on what a test is targeting. The same goes for tests of water pollution and food contamination. And in most cases, the wait time has to do with time-consuming steps in sample processing and analysis.

    Now, MIT engineers have identified a new optical signature in a widely used class of magnetic beads, which could be used to quickly detect contaminants in a variety of diagnostic tests. For example, the team showed the signature could be used to detect signs of the food contaminant Salmonella.

    The so-called Dynabeads are microscopic magnetic beads that can be coated with antibodies that bind to target molecules, such as a specific pathogen. Dynabeads are typically used in experiments in which they are mixed into solutions to capture molecules of interest. But from there, scientists have to take additional, time-consuming steps to confirm that the molecules are indeed present and bound to the beads.

    The MIT team found a faster way to confirm the presence of Dynabead-bound pathogens using optics, specifically Raman spectroscopy. This optical technique identifies specific molecules based on their “Raman signature,” or the unique way in which a molecule scatters light.

    The researchers found that Dynabeads have an unusually strong Raman signature that can be easily detected, much like a fluorescent tag. This signature, they found, can act as a “reporter.” If detected, the signal can serve as a quick confirmation, within less than one second, that a target pathogen is indeed present in a given sample. The team is currently working to develop a portable device for quickly detecting a range of bacterial pathogens, and their results will appear in an Emerging Investigators special issue of the Journal of Raman Spectroscopy.
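    Conceptually, the “reporter” readout can be as simple as checking whether a measured spectrum contains the beads’ known signature. The sketch below scores a synthetic spectrum against a stored reference by normalized correlation; the reference peaks, noise level, and threshold are illustrative assumptions rather than the team’s published procedure.

    ```python
    # Toy version of using the Dynabeads' strong Raman signature as a reporter:
    # correlate a measured spectrum against a stored bead reference and call
    # the sample positive if the match clears a threshold. Spectra are synthetic.
    import numpy as np

    rng = np.random.default_rng(2)
    shifts = np.linspace(200, 1800, 400)  # Raman shift axis, 1/cm

    def peak(center, width=15.0):
        return np.exp(-((shifts - center) / width) ** 2)

    bead_reference = peak(1001) + 0.6 * peak(1450)  # hypothetical signature

    def is_positive(measured, threshold=0.4):
        """Pearson-style correlation between measured spectrum and reference."""
        m = (measured - measured.mean()) / measured.std()
        r = (bead_reference - bead_reference.mean()) / bead_reference.std()
        return float(np.dot(m, r)) / m.size > threshold

    contaminated = bead_reference + 0.1 * rng.standard_normal(shifts.size)
    clean = 0.1 * rng.standard_normal(shifts.size)
    print(is_positive(contaminated), is_positive(clean))  # True False
    ```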

    “This technique would be useful in a situation where a doctor is trying to narrow down the source of an infection in order to better inform antibiotic prescription, as well as for the detection of known pathogens in food and water,” says study co-author Marissa McDonald, a graduate student in the Harvard-MIT Program in Health Sciences and Technology. “Additionally, we hope this approach will eventually lead to expanded access to advanced diagnostics in resource-limited environments.”

    Study co-authors at MIT include Postdoctoral Associate Jongwan Lee; Visiting Scholar Nikiwe Mhlanga; Research Scientist Jeon Woong Kang; Tata Professor Rohit Karnik, who is also the associate director of the Abdul Latif Jameel Water and Food Systems Lab; and Assistant Professor Loza Tadesse of the Department of Mechanical Engineering.

    Oil and water

    Looking for diseased cells and pathogens in fluid samples is an exercise in patience.

    “It’s kind of a needle-in-a-haystack problem,” Tadesse says.

    The pathogens present are in such small numbers that they must be grown to sufficient numbers in controlled environments, their cultures stained, and then studied under a microscope. The entire process can take several days to a week to yield a confident positive or negative result.

    Karnik’s and Tadesse’s labs have independently been developing techniques to speed up various parts of the pathogen-testing process and make the process portable, using Dynabeads.

    Dynabeads are commercially available microscopic beads made from a magnetic iron core and a polymer shell that can be coated with antibodies. The surface antibodies act as hooks to bind specific target molecules. When mixed with a fluid, such as a vial of blood or water, any target molecules present will glom onto the Dynabeads. Using a magnet, scientists can gently coax the beads to the bottom of a vial and filter them out of a solution. Karnik’s lab is investigating ways to then further separate the beads into those that are bound to a target molecule, and those that are not. “Still, the challenge is, how do we know that we have what we’re looking for?” Tadesse says.

    The beads themselves are not visible by eye. That’s where Tadesse’s work comes in. Her lab uses Raman spectroscopy as a way to “fingerprint” pathogens. She has found that different cell types scatter light in unique ways that can be used as a signature to identify them.

    In the team’s new work, she and her colleagues found that Dynabeads also have a unique and strong Raman signature that can act as a surprisingly clear beacon.

    “We were initially seeking to identify the signatures of bacteria, but the signature of the Dynabeads was actually very strong,” Tadesse says. “We realized this signal could be a means of reporting to you whether you have that bacteria or not.”

    Testing beacon

    As a practical demonstration, the researchers mixed Dynabeads into vials of water contaminated with Salmonella. They then magnetically isolated these beads onto microscope slides and measured the way light scattered through the fluid when exposed to laser light. Within half a second, they quickly detected the Dynabeads’ Raman signature — a confirmation that bound Dynabeads, and by inference, Salmonella, were present in the fluid.

    “This is something that can be used to rapidly give a positive or negative answer: Is there a contaminant or not?” Tadesse says. “Because even a handful of pathogens can cause clinical symptoms.”

    The team’s new technique is significantly faster than conventional methods and uses elements that could be adapted into smaller, more portable forms — a goal that the researchers are currently working toward. The approach is also highly versatile.

    “Salmonella is the proof of concept,” Tadesse says. “You could purchase Dynabeads with E. coli antibodies, and the same thing would happen: It would bind to the bacteria, and we’d be able to detect the Dynabead signature because the signal is super strong.”

    The team is particularly keen to apply the test to conditions such as sepsis, where time is of the essence, and where pathogens that trigger the condition are not rapidly detected using conventional lab tests.

    “There are a lot of cases, like in sepsis, where pathogenic cells cannot always be grown on a plate,” says Lee, a member of Karnik’s lab. “In that case, our technique could rapidly detect these pathogens.”

    This research was supported, in part, by the MIT Laser Biomedical Research Center, the National Cancer Institute, and the Abdul Latif Jameel Water and Food Systems Lab at MIT.

  • Supporting sustainability, digital health, and the future of work

    The MIT and Accenture Convergence Initiative for Industry and Technology has selected three new research projects that will receive support from the initiative. The projects aim to accelerate progress in meeting complex societal needs through new insights at the convergence of business and technology.

    Established in MIT’s School of Engineering and now in its third year, the MIT and Accenture Convergence Initiative is furthering its mission to bring together technological experts from across business and academia to share insights and learn from one another. Recently, Thomas W. Malone, the Patrick J. McGovern (1959) Professor of Management, joined the initiative as its first-ever faculty lead. The research projects relate to three of the initiative’s key focus areas: sustainability, digital health, and the future of work.

    “The solutions these research teams are developing have the potential to have tremendous impact,” says Anantha Chandrakasan, dean of the School of Engineering and the Vannevar Bush Professor of Electrical Engineering and Computer Science. “They embody the initiative’s focus on advancing data-driven research that addresses technology and industry convergence.”

    “The convergence of science and technology driven by advancements in generative AI, digital twins, quantum computing, and other technologies makes this an especially exciting time for Accenture and MIT to be undertaking this joint research,” says Kenneth Munie, senior managing director at Accenture Strategy, Life Sciences. “Our three new research projects focusing on sustainability, digital health, and the future of work have the potential to help guide and shape future innovations that will benefit the way we work and live.”

    The MIT and Accenture Convergence Initiative charter project researchers are described below.

    Accelerating the journey to net zero with industrial clusters

    Jessika Trancik is a professor at the Institute for Data, Systems, and Society (IDSS). Trancik’s research examines the dynamic costs, performance, and environmental impacts of energy systems to inform climate policy and accelerate beneficial and equitable technology innovation. Trancik’s project aims to identify how industrial clusters can enable companies to derive greater value from decarbonization, potentially making companies more willing to invest in the clean energy transition.

    To meet the ambitious climate goals that have been set by countries around the world, rising greenhouse gas emissions trends must be rapidly reversed. Industrial clusters — geographically co-located or otherwise-aligned groups of companies representing one or more industries — account for a significant portion of greenhouse gas emissions globally. With major energy consumers “clustered” in proximity, industrial clusters provide a potential platform to scale low-carbon solutions by enabling the aggregation of demand and the coordinated investment in physical energy supply infrastructure.

    In addition to Trancik, the research team working on this project will include Aliza Khurram, a postdoc in IDSS; Micah Ziegler, an IDSS research scientist; Melissa Stark, global energy transition services lead at Accenture; Laura Sanderfer, strategy consulting manager at Accenture; and Maria De Miguel, strategy senior analyst at Accenture.

    Eliminating childhood obesity

    Anette “Peko” Hosoi is the Neil and Jane Pappalardo Professor of Mechanical Engineering. A common theme in her work is the fundamental study of shape, kinematic, and rheological optimization of biological systems with applications to the emergent field of soft robotics. Her project will use both data from existing studies and synthetic data to create a return-on-investment (ROI) calculator for childhood obesity interventions so that companies can identify earlier returns on their investment beyond reduced health-care costs.

    Childhood obesity is too prevalent to be solved by a single company, industry, drug, application, or program. In addition to the physical and emotional impact on children, society bears a cost through excess health care spending, lost workforce productivity, poor school performance, and increased family trauma. Meaningful solutions require multiple organizations, representing different parts of society, working together with a common understanding of the problem, the economic benefits, and the return on investment. ROI is particularly difficult to defend for any single organization because investment and return can be separated by many years and involve asymmetric investments, returns, and allocation of risk. Hosoi’s project will consider the incentives for a particular entity to invest in programs in order to reduce childhood obesity.
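    The article does not spell out the calculator’s internals, but the accounting it describes, with investments and returns separated by many years, reduces to a discounted cash-flow comparison. The sketch below computes a discounted ROI for a hypothetical intervention; every cash flow and the discount rate are invented.

    ```python
    # Minimal discounted-ROI sketch for a multi-year intervention: costs come
    # early, while benefits (avoided health-care spending, productivity gains)
    # arrive years later, so both are discounted to present value.
    def npv(cash_flows, rate):
        """Present value of year-indexed cash flows at a given discount rate."""
        return sum(cf / (1 + rate) ** year for year, cf in enumerate(cash_flows))

    # Years 0-2: program investment; years 5-9: estimated benefits to one payer.
    costs = [-2.0e6, -1.5e6, -1.5e6, 0, 0, 0, 0, 0, 0, 0]
    benefits = [0, 0, 0, 0, 0, 1.2e6, 1.4e6, 1.6e6, 1.8e6, 2.0e6]

    rate = 0.05
    roi = (npv(benefits, rate) + npv(costs, rate)) / -npv(costs, rate)
    print(f"discounted ROI: {roi:.1%}")
    ```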

    Hosoi will be joined by graduate students Pragya Neupane and Rachael Kha, both of IDSS, as well as a team from Accenture that includes Kenneth Munie, senior managing director at Accenture Strategy, Life Sciences; Kaveh Safavi, senior managing director in Accenture Health Industry; and Elizabeth Naik, global health and public service research lead.

    Generating innovative organizational configurations and algorithms for dealing with the problem of post-pandemic employment

    Thomas Malone is the Patrick J. McGovern (1959) Professor of Management at the MIT Sloan School of Management and the founding director of the MIT Center for Collective Intelligence. His research focuses on how new organizations can be designed to take advantage of the possibilities provided by information technology. Malone will be joined in this project by John Horton, the Richard S. Leghorn (1939) Career Development Professor at the MIT Sloan School of Management, whose research focuses on the intersection of labor economics, market design, and information systems. Malone and Horton’s project will look to reshape the future of work with the help of lessons learned in the wake of the pandemic.

    The Covid-19 pandemic has been a major disrupter of work and employment, and it is not at all obvious how governments, businesses, and other organizations should manage the transition to a desirable state of employment as the pandemic recedes. Using natural language processing algorithms such as GPT-4, this project will look to identify new ways that companies can use AI to better match applicants to necessary jobs, create new types of jobs, assess skill training needed, and identify interventions to help include women and other groups whose employment was disproportionately affected by the pandemic.
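    As a toy stand-in for the kind of applicant-to-job matching described (the project’s actual methods are not detailed here), the sketch below scores one applicant against several postings by text similarity, using TF-IDF in place of a large language model so the example stays self-contained.

    ```python
    # Toy applicant-to-job matching by text similarity. The project described
    # above would use models such as GPT-4; TF-IDF cosine similarity stands in
    # here so the sketch runs without any external services.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    jobs = [
        "remote customer support specialist, communication skills, CRM tools",
        "data analyst, SQL, dashboards, statistics",
        "warehouse logistics coordinator, scheduling, inventory systems",
    ]
    applicant = "five years in retail customer service, strong communication, CRM experience"

    matrix = TfidfVectorizer().fit_transform(jobs + [applicant])
    scores = cosine_similarity(matrix[len(jobs)], matrix[: len(jobs)]).ravel()

    for job, score in sorted(zip(jobs, scores), key=lambda pair: -pair[1]):
        print(f"{score:.2f}  {job}")
    ```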

    In addition to Malone and Horton, the research team will include Rob Laubacher, associate director and research scientist at the MIT Center for Collective Intelligence, and Kathleen Kennedy, executive director at the MIT Center for Collective Intelligence and senior director at MIT Horizon. The team will also include Nitu Nivedita, managing director of artificial intelligence at Accenture, and Thomas Hancock, data science senior manager at Accenture.

  • Making aviation fuel from biomass

    In 2021, nearly a quarter of the world’s carbon dioxide emissions came from the transportation sector, with aviation being a significant contributor. While the growing use of electric vehicles is helping to clean up ground transportation, today’s batteries can’t compete with fossil fuel-derived liquid hydrocarbons in terms of energy delivered per pound of weight — a major concern when it comes to flying. Meanwhile, based on projected growth in travel demand, consumption of jet fuel is projected to double between now and 2050 — the year by which the international aviation industry has pledged to be carbon neutral.

    Many groups have targeted a 100 percent sustainable hydrocarbon fuel for aircraft, but without much success. Part of the challenge is that aviation fuels are so tightly regulated. “This is a subclass of fuels that has very specific requirements in terms of the chemistry and the physical properties of the fuel, because you can’t risk something going wrong in an airplane engine,” says Yuriy Román-Leshkov, the Robert T. Haslam Professor of Chemical Engineering. “If you’re flying at 30,000 feet, it’s very cold outside, and you don’t want the fuel to thicken or freeze. That’s why the formulation is very specific.”

    Aviation fuel is a combination of two large classes of chemical compounds. Some 75 to 90 percent of it is made up of “aliphatic” molecules, which consist of long chains of carbon atoms linked together. “This is similar to what we would find in diesel fuels, so it’s a classic hydrocarbon that is out there,” explains Román-Leshkov. The remaining 10 to 25 percent consists of “aromatic” molecules, each of which includes at least one ring made up of six connected carbon atoms.

    In most transportation fuels, aromatic hydrocarbons are viewed as a source of pollution, so they’re removed as much as possible. However, in aviation fuels, some aromatic molecules must remain because they set the necessary physical and combustion properties of the overall mixture. They also perform one more critical task: They ensure that seals between various components in the aircraft’s fuel system are tight. “The aromatics get absorbed by the plastic seals and make them swell,” explains Román-Leshkov. “If for some reason the fuel changes, so can the seals, and that’s very dangerous.”

    As a result, aromatics are a necessary component — but they’re also a stumbling block in the move to create sustainable aviation fuels, or SAFs. Companies know how to make the aliphatic fraction from inedible parts of plants and other renewables, but they haven’t yet developed an approved method of generating the aromatic fraction from sustainable sources. As a result, there’s a “blending wall,” explains Román-Leshkov. “Since we need that aromatic content — regardless of its source — there will always be a limit on how much of the sustainable aliphatic hydrocarbons we can use without changing the properties of the mixture.” He notes a similar blending wall with gasoline. “We have a lot of ethanol, but we can’t add more than 10 percent without changing the properties of the gasoline. In fact, current engines can’t handle even 15 percent ethanol without modification.”

    No shortage of renewable source material — or attempts to convert it

    For the past five years, understanding and solving the SAF problem has been the goal of research by Román-Leshkov and his MIT team — Michael L. Stone PhD ’21, Matthew S. Webber, and others — as well as their collaborators at Washington State University, the National Renewable Energy Laboratory (NREL), and the Pacific Northwest National Laboratory. Their work has focused on lignin, a tough material that gives plants structural support and protection against microbes and fungi. About 30 percent of the carbon in biomass is in lignin, yet when ethanol is generated from biomass, the lignin is left behind as a waste product.

    Despite valiant efforts, no one has found an economically viable, scalable way to turn lignin into useful products, including the aromatic molecules needed to make jet fuel 100 percent sustainable. Why not? As Román-Leshkov says, “It’s because of its chemical recalcitrance.” It’s difficult to make it chemically react in useful ways. As a result, every year millions of tons of waste lignin are burned as a low-grade fuel, used as fertilizer, or simply thrown away.

    Understanding the problem requires understanding what’s happening at the atomic level. A single lignin molecule — the starting point of the challenge — is a big “macromolecule” made up of a network of many aromatic rings connected by oxygen and hydrogen atoms. Put simply, the key to converting lignin into the aromatic fraction of SAF is to break that macromolecule into smaller pieces while in the process getting rid of all of the oxygen atoms.

    In general, most industrial processes begin with a chemical reaction that prevents the subsequent upgrading of lignin: As the lignin is extracted from the biomass, the aromatic molecules in it react with one another, linking together to form strong networks that won’t react further. As a result, the lignin is no longer useful for making aviation fuels.

    To avoid that outcome, Román-Leshkov and his team utilize another approach: They use a catalyst to induce a chemical reaction that wouldn’t normally occur during extraction. By reacting the biomass in the presence of a ruthenium-based catalyst, they are able to remove the lignin from the biomass and produce a black liquid called lignin oil. That product is chemically stable, meaning that the aromatic molecules in it will no longer react with one another.

    So the researchers have now successfully broken the original lignin macromolecule into fragments that contain just one or two aromatic rings each. However, while the isolated fragments don’t chemically react, they still contain oxygen atoms. Therefore, one task remains: finding a way to remove the oxygen atoms.

    In fact, says Román-Leshkov, getting from the molecules in the lignin oil to the targeted aromatic molecules required them to accomplish three things in a single step: They needed to selectively break the carbon-oxygen bonds to free the oxygen atoms; they needed to avoid incorporating noncarbon atoms into the aromatic rings (for example, atoms from the hydrogen gas that must be present for all of the chemical transformations to occur); and they needed to preserve the carbon backbone of the molecule — that is, the series of linked carbon atoms that connect the aromatic rings that remain.

    Ultimately, Román-Leshkov and his team found a special ingredient that would do the trick: a molybdenum carbide catalyst. “It’s actually a really amazing catalyst because it can perform those three actions very well,” says Román-Leshkov. “In addition to that, it’s extremely resistant to poisons. Plants can contain a lot of components like proteins, salts, and sulfur, which often poison catalysts so they don’t work anymore. But molybdenum carbide is very robust and isn’t strongly influenced by such impurities.”

    Trying it out on lignin from poplar trees

    To test their approach in the lab, the researchers first designed and built a specialized “trickle-bed” reactor, a type of chemical reactor in which both liquids and gases flow downward through a packed bed of catalyst particles. They then obtained biomass from a poplar, a type of tree known as an “energy crop” because it grows quickly and doesn’t require a lot of fertilizer.

    To begin, they reacted the poplar biomass in the presence of their ruthenium-based catalyst to extract the lignin and produce the lignin oil. They then flowed the oil through their trickle-bed reactor containing the molybdenum carbide catalyst. The mixture that formed contained some of the targeted product but also a lot of others that still contained oxygen atoms.

    Román-Leshkov notes that in a trickle-bed reactor, the time during which the lignin oil is exposed to the catalyst depends entirely on how quickly it drips down through the packed bed. To increase the exposure time, they tried passing the oil through the same catalyst twice. However, the distribution of products that formed in the second pass wasn’t as they had predicted based on the outcome of the first pass.

    With further investigation, they figured out why. The first time the lignin oil drips through the reactor, it deposits oxygen onto the catalyst. The deposition of the oxygen changes the behavior of the catalyst such that certain products appear or disappear — with the temperature being critical. “The temperature and oxygen content set the condition of the catalyst in the first pass,” says Román-Leshkov. “Then, on the second pass, the oxygen content in the flow is lower, and the catalyst can fully break the remaining carbon-oxygen bonds.” The process can thus operate continuously: Two separate reactors containing independent catalyst beds would be connected in series, with the first pretreating the lignin oil and the second removing any oxygen that remains.

    Based on a series of experiments involving lignin oil from poplar biomass, the researchers determined the operating conditions yielding the best outcome: 350 degrees Celsius in the first step and 375 degrees Celsius in the second step. Under those optimized conditions, the mixture that forms is dominated by the targeted aromatic products, with the remainder consisting of small amounts of other jet-fuel aliphatic molecules and some remaining oxygen-containing molecules. The catalyst remains stable while generating more than 87 percent (by weight) of aromatic molecules.

    “When we do our chemistry with the molybdenum carbide catalyst, our total carbon yields are nearly 85 percent of the theoretical carbon yield,” says Román-Leshkov. “In most lignin-conversion processes, the carbon yields are very low, on the order of 10 percent. That’s why the catalysis community got very excited about our results — because people had not seen carbon yields as high as the ones we generated with this catalyst.”
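    For clarity on the metric in that comparison, carbon yield measures the carbon recovered in products against the carbon fed in. The masses and carbon fractions below are invented purely to show the arithmetic.

    ```python
    # Carbon yield as used above: carbon recovered in products divided by
    # carbon fed in. The masses and carbon fractions here are illustrative only.
    feed_mass_g, feed_carbon_frac = 100.0, 0.60        # lignin oil feed (hypothetical)
    product_mass_g, product_carbon_frac = 56.0, 0.91   # aromatic products (hypothetical)

    carbon_yield = (product_mass_g * product_carbon_frac) / (feed_mass_g * feed_carbon_frac)
    print(f"carbon yield: {carbon_yield:.0%}")         # about 85 percent
    ```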

    There remains one key question: Does the mixture of components that forms have the properties required for aviation fuel? “When we work with these new substrates to make new fuels, the blend that we create is different from standard jet fuel,” says Román-Leshkov. “Unless it has the exact properties required, it will not qualify for certification as jet fuel.”

    To check their products, Román-Leshkov and his team send samples to Washington State University, where a team operates a combustion lab devoted to testing fuels. Results from initial testing of the composition and properties of the samples have been encouraging. Based on the composition and published prescreening tools and procedures, the researchers have made initial property predictions for their samples, and they looked good. For example, the freezing point, viscosity, and threshold sooting index are predicted to be lower than the values for conventional aviation aromatics. (In other words, their material should flow more easily and be less likely to freeze than conventional aromatics while also generating less soot in the atmosphere when they burn.) Overall, the predicted properties are near to or more favorable than those of conventional fuel aromatics.

    Next steps

    The researchers are continuing to study how their sample blends behave at different temperatures and, in particular, how well they perform that key task: soaking into and swelling the seals inside jet engines. “These molecules are not the typical aromatic molecules that you use in jet fuel,” says Román-Leshkov. “Preliminary tests with sample seals show that there’s no difference in how our lignin-derived aromatics swell the seals, but we need to confirm that. There’s no room for error.”

    In addition, he and his team are working with their NREL collaborators to scale up their methods. NREL has much larger reactors and other infrastructure needed to produce large quantities of the new sustainable blend. Based on the promising results thus far, the team wants to be prepared for the further testing required for the certification of jet fuels. In addition to testing samples of the fuel, the full certification procedure calls for demonstrating its behavior in an operating engine — “not while flying, but in a lab,” clarifies Román-Leshkov. In addition to requiring large samples, that demonstration is both time-consuming and expensive — which is why it’s the very last step in the strict testing required for a new sustainable aviation fuel to be approved.

    Román-Leshkov and his colleagues are now exploring the use of their approach with other types of biomass, including pine, switchgrass, and corn stover (the leaves, stalks, and cobs left after corn is harvested). But their results with poplar biomass are promising. If further testing confirms that their aromatic products can replace the aromatics now in jet fuel, “the blending wall could disappear,” says Román-Leshkov. “We’ll have a means of producing all the components in aviation fuel from renewable material, potentially leading to aircraft fuel that’s 100 percent sustainable.”

    This research was initially funded by the Center for Bioenergy Innovation, a U.S. Department of Energy (DOE) Research Center supported by the Office of Biological and Environmental Research in the DOE Office of Science. More recent funding came from the DOE Bioenergy Technologies Office and from Eni S.p.A. through the MIT Energy Initiative. Michael L. Stone PhD ’21 is now a postdoc in chemical engineering at Stanford University. Matthew S. Webber is a graduate student in the Román-Leshkov group, now on leave for an internship at the National Renewable Energy Laboratory.

    This article appears in the Spring 2023 issue of Energy Futures, the magazine of the MIT Energy Initiative.