More stories

  • Inaugural J-WAFS Grand Challenge aims to develop enhanced crop variants and move them from lab to land

    According to MIT’s charter, established in 1861, part of the Institute’s mission is to advance the “development and practical application of science in connection with arts, agriculture, manufactures, and commerce.” Today, the Abdul Latif Jameel Water and Food Systems Lab (J-WAFS) is one of the driving forces behind water and food-related research on campus, much of which relates to agriculture. In 2022, J-WAFS established the Water and Food Grand Challenge Grant to inspire MIT researchers to work toward a water-secure and food-secure future for our changing planet. Not unlike MIT’s Climate Grand Challenges, the J-WAFS Grand Challenge seeks to leverage multiple areas of expertise, programs, and Institute resources. The initial call for statements of interest returned 23 letters from MIT researchers spanning 18 departments, labs, and centers. J-WAFS hosted workshops for the proposers to present and discuss their initial ideas. These were winnowed down to a smaller set of invited concept papers, followed by the final proposal stage.

    Today, J-WAFS is delighted to report that the inaugural J-WAFS Grand Challenge Grant has been awarded to a team of researchers led by Professor Matt Shoulders and research scientist Robert Wilson of the Department of Chemistry. A panel of external expert reviewers strongly endorsed their proposal, which tackles a longstanding problem in crop biology — how to make photosynthesis more efficient. The team will receive $1.5 million over three years to facilitate a multistage research project that combines cutting-edge innovations in synthetic and computational biology. If successful, this project could create major benefits for agriculture and food systems worldwide.

    “Food systems are a major source of global greenhouse gas emissions, and they are also increasingly vulnerable to the impacts of climate change. That’s why when we talk about climate change, we have to talk about food systems, and vice versa,” says Maria T. Zuber, MIT’s vice president for research. “J-WAFS is central to MIT’s efforts to address the interlocking challenges of climate, water, and food. This new grant program aims to catalyze innovative projects that will have real and meaningful impacts on water and food. I congratulate Professor Shoulders and the rest of the research team on being the inaugural recipients of this grant.”

    Shoulders will work with Bryan Bryson, associate professor of biological engineering, as well as Bin Zhang, associate professor of chemistry, and Mary Gehring, a professor in the Department of Biology and the Whitehead Institute for Biomedical Research. Robert Wilson from the Shoulders lab will be coordinating the research effort. The team at MIT will work with outside collaborators Spencer Whitney, a professor from the Australian National University, and Ahmed Badran, an assistant professor at the Scripps Research Institute. A milestone-based collaboration will also take place with Stephen Long, a professor from the University of Illinois at Urbana-Champaign. The group consists of experts in continuous directed evolution, machine learning, molecular dynamics simulations, translational plant biochemistry, and field trials.

    “This project seeks to fundamentally improve the RuBisCO enzyme that plants use to convert carbon dioxide into the energy-rich molecules that constitute our food,” says J-WAFS Director John H. Lienhard V. “This difficult problem is a true grand challenge, calling for extensive resources. With J-WAFS’ support, this long-sought goal may finally be achieved through MIT’s leading-edge research,” he adds.

    RuBisCO: No, it’s not a new breakfast cereal; it just might be the key to an agricultural revolution

    A growing global population, the effects of climate change, and social and political conflicts like the war in Ukraine are all threatening food supplies, particularly grain crops. Current projections estimate that crop production must increase by at least 50 percent over the next 30 years to meet food demands. One key barrier to increased crop yields is a photosynthetic enzyme called Ribulose-1,5-Bisphosphate Carboxylase/Oxygenase (RuBisCO). During photosynthesis, crops use energy gathered from light to draw carbon dioxide (CO2) from the atmosphere and transform it into sugars and cellulose for growth, a process known as carbon fixation. RuBisCO is essential for capturing the CO2 from the air to initiate conversion of CO2 into energy-rich molecules like glucose. This reaction occurs during the second stage of photosynthesis, also known as the Calvin cycle. Without RuBisCO, the chemical reactions that account for virtually all carbon acquisition in life could not occur.

    Unfortunately, RuBisCO has biochemical shortcomings. Notably, the enzyme acts slowly. Many other enzymes can process a thousand molecules per second, but RuBisCO in chloroplasts fixes fewer than six carbon dioxide molecules per second, often limiting the rate of plant photosynthesis. Another problem is that oxygen (O2) molecules and carbon dioxide molecules are relatively similar in shape and chemical properties, and RuBisCO is unable to fully discriminate between the two. The inadvertent fixation of oxygen by RuBisCO leads to energy and carbon loss. What’s more, at higher temperatures RuBisCO reacts even more frequently with oxygen, which will contribute to decreased photosynthetic efficiency in many staple crops as our climate warms.

    The scientific consensus is that genetic engineering and synthetic biology approaches could revolutionize photosynthesis and offer protection against crop losses. To date, crop RuBisCO engineering has been impaired by technological obstacles that have limited any success in significantly enhancing crop production. Excitingly, genetic engineering and synthetic biology tools are now at a point where they can be applied and tested with the aim of creating crops with new or improved biological pathways for producing more food for the growing population.

    An epic plan for fighting food insecurity

    The 2023 J-WAFS Grand Challenge project will use state-of-the-art, transformative protein engineering techniques drawn from biomedicine to improve the biochemistry of photosynthesis, specifically focusing on RuBisCO. Shoulders and his team are planning to build what they call the Enhanced Photosynthesis in Crops (EPiC) platform. The project will evolve and design better crop RuBisCO in the laboratory, followed by validation of the improved enzymes in plants, ultimately resulting in the deployment of enhanced RuBisCO in field trials to evaluate the impact on crop yield. 

    Several recent developments make high-throughput engineering of crop RuBisCO possible. RuBisCO requires a complex chaperone network for proper assembly and function in plants. Chaperones are like helpers that guide proteins during their maturation process, shielding them from aggregation while coordinating their correct assembly. Wilson and his collaborators previously unlocked the ability to recombinantly produce plant RuBisCO outside of plant chloroplasts by reconstructing this chaperone network in Escherichia coli (E. coli). Whitney has now established that the RuBisCO enzymes from a range of agriculturally relevant crops, including potato, carrot, strawberry, and tobacco, can also be expressed using this technology. Whitney and Wilson have further developed a range of RuBisCO-dependent E. coli screens that can identify improved RuBisCO from complex gene libraries. Moreover, Shoulders and his lab have developed sophisticated in vivo mutagenesis technologies that enable efficient continuous directed evolution campaigns. Continuous directed evolution refers to a protein engineering process that can accelerate the steps of natural evolution simultaneously in an uninterrupted cycle in the lab, allowing for rapid testing of protein sequences. While Shoulders and Badran both have prior experience with cutting-edge directed evolution platforms, this will be the first time directed evolution is applied to RuBisCO from plants.

    Artificial intelligence is changing the way enzyme engineering is undertaken by researchers. Principal investigators Zhang and Bryson will leverage modern computational methods to simulate the dynamics of RuBisCO structure and explore its evolutionary landscape. Specifically, Zhang will use molecular dynamics simulations to simulate and monitor the conformational dynamics of the atoms in a protein and its programmed environment over time. This approach will help the team evaluate the effect of mutations and new chemical functionalities on the properties of RuBisCO. Bryson will employ artificial intelligence and machine learning to search the RuBisCO activity landscape for optimal sequences. The computational and biological arms of the EPiC platform will work together to both validate and inform each other’s approaches to accelerate the overall engineering effort.

    Shoulders and the group will deploy their designed enzymes in tobacco plants to evaluate their effects on growth and yield relative to natural RuBisCO. Gehring, a plant biologist, will assist with screening improved RuBisCO variants using Nicotiana benthamiana, a tobacco relative in which transient expression can be deployed. Transient expression is a speedy approach to test whether novel engineered RuBisCO variants can be correctly synthesized in leaf chloroplasts. Variants that pass this quality-control checkpoint at MIT will be passed to the Whitney Lab at the Australian National University for stable transformation into Nicotiana tabacum (tobacco), enabling robust measurements of photosynthetic improvement. In a final step, Professor Long at the University of Illinois at Urbana-Champaign will perform field trials of the most promising variants.

    Even small improvements could have a big impact

    A common criticism of efforts to improve RuBisCO is that natural evolution has not already identified a better enzyme, possibly implying that none will be found. Traditional views have posited a catalytic trade-off between RuBisCO’s CO2/O2 specificity factor and its CO2 fixation efficiency, leading to the belief that improvements in specificity might be offset by even slower carbon fixation, or vice versa. This trade-off has been suggested to explain why natural evolution has been slow to achieve a better RuBisCO. But Shoulders and the team are convinced that the EPiC platform can unlock significant overall improvements to plant RuBisCO. This view is supported by the fact that Wilson and Whitney have previously used directed evolution to improve CO2 fixation efficiency by 50 percent in RuBisCO from cyanobacteria (the ancient progenitors of plant chloroplasts) while simultaneously increasing the specificity factor.
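
    For context, the “specificity factor” referenced throughout this section is conventionally defined from the enzyme’s carboxylation (C) and oxygenation (O) kinetics. The expression below is the standard definition used in the enzymology literature, not a formula taken from the team’s proposal; V denotes a maximal rate and K a Michaelis constant for the respective reaction.

$$
S_{C/O} \;=\; \frac{V_C / K_C}{V_O / K_O} \;=\; \frac{V_C \, K_O}{V_O \, K_C}
$$

    A higher specificity factor means proportionally more CO2 is fixed relative to O2 under the same conditions; the hypothesized trade-off is that raising it tends to come at the cost of a slower carboxylation turnover rate.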

    The EPiC researchers anticipate that their initial variants could yield 20 percent increases in RuBisCO’s specificity factor without impairing other aspects of catalysis. More sophisticated variants could lift RuBisCO out of its evolutionary trap and display attributes not currently observed in nature. “If we achieve anywhere close to such an improvement and it translates to crops, the results could help transform agriculture,” Shoulders says. “If our accomplishments are more modest, it will still recruit massive new investments to this essential field.”

    Successful engineering of RuBisCO would be a scientific feat of its own and ignite renewed enthusiasm for improving plant CO2 fixation. Combined with other advances in photosynthetic engineering, such as improved light usage, a new green revolution in agriculture could be achieved. Long-term impacts of the technology’s success will be measured in improvements to crop yield and grain availability, as well as resilience against yield losses under higher field temperatures. Moreover, improved land productivity together with policy initiatives would assist in reducing the environmental footprint of agriculture. With more “crop per drop,” reductions in water consumption from agriculture would be a major boost to sustainable farming practices.

    “Our collaborative team of biochemists and synthetic biologists, computational biologists, and chemists is deeply integrated with plant biologists and field trial experts, yielding a robust feedback loop for enzyme engineering,” Shoulders adds. “Together, this team will be able to make a concerted effort using the most modern, state-of-the-art techniques to engineer crop RuBisCO with an eye to helping make meaningful gains in securing a stable crop supply, hopefully with accompanying improvements in both food and water security.”

  • Moving perovskite advancements from the lab to the manufacturing floor

    The following was issued as a joint announcement from MIT.nano and the MIT Research Laboratory for Electronics; CubicPV; Verde Technologies; Princeton University; and the University of California at San Diego.

    Tandem solar cells are made of stacked materials — such as silicon paired with perovskites — that together absorb more of the solar spectrum than single materials, resulting in a dramatic increase in efficiency. Their potential to generate significantly more power than conventional cells could make a meaningful difference in the race to combat climate change and the transition to a clean-energy future.

    However, current methods to create stable and efficient perovskite layers require time-consuming, painstaking rounds of design iteration and testing, inhibiting their development for commercial use. Today, the U.S. Department of Energy Solar Energy Technologies Office (SETO) announced that MIT has been selected to receive an $11.25 million cost-shared award to establish a new research center to address this challenge by using a co-optimization framework guided by machine learning and automation.

    A collaborative effort with lead industry participant CubicPV, solar startup Verde Technologies, and academic partners Princeton University and the University of California San Diego (UC San Diego), the center will bring together teams of researchers to support the creation of perovskite-silicon tandem solar modules that are co-designed for both stability and performance, with goals to significantly accelerate R&D and the transfer of these achievements into commercial environments.

    “Urgent challenges demand rapid action. This center will accelerate the development of tandem solar modules by bringing academia and industry into closer partnership,” says MIT professor of mechanical engineering Tonio Buonassisi, who will direct the center. “We’re grateful to the Department of Energy for supporting this powerful new model and excited to get to work.”

    Adam Lorenz, CTO of solar energy technology company CubicPV, stresses the importance of thinking about scale, alongside quality and efficiency, to accelerate the perovskite effort into the commercial environment. “Instead of chasing record efficiencies with tiny pixel-sized devices and later attempting to stabilize them, we will simultaneously target stability, reproducibility, and efficiency,” he says. “It’s a module-centric approach that creates a direct channel for R&D advancements into industry.”

    The center will be named Accelerated Co-Design of Durable, Reproducible, and Efficient Perovskite Tandems, or ADDEPT. The grant will be administered through the MIT Research Laboratory for Electronics (RLE).

    David Fenning, associate professor of nanoengineering at UC San Diego, has worked with Buonassisi since 2014 on the idea of merging materials, automation, and computation, specifically at the intersection of artificial intelligence and solar. Now, a central thrust of the ADDEPT project will be to deploy machine learning and robotic screening to optimize processing of perovskite-based solar materials for efficiency and durability.

    “We have already seen early indications of successful technology transfer between our UC San Diego robot PASCAL and industry,” says Fenning. “With this new center, we will bring research labs and the emerging perovskite industry together to improve reproducibility and reduce time to market.”

    “Our generation has an obligation to work collaboratively in the fight against climate change,” says Skylar Bagdon, CEO of Verde Technologies, which received the American-Made Perovskite Startup Prize. “Throughout the course of this center, Verde will do everything in our power to help this brilliant team transition lab-scale breakthroughs into the world where they can have an impact.”

    Several of the academic partners echoed the importance of the joint effort between academia and industry. Barry Rand, professor of electrical and computer engineering at the Andlinger Center for Energy and the Environment at Princeton University, pointed to the intersection of scientific knowledge and market awareness. “Understanding how chemistry affects films and interfaces will empower us to co-design for stability and performance,” he says. “The center will accelerate this use-inspired science, with close guidance from our end customers, the industry partners.”

    A critical resource for the center will be MIT.nano, a 200,000-square-foot research facility set in the heart of the campus. MIT.nano Director Vladimir Bulović, the Fariborz Maseeh (1990) Professor of Emerging Technology, says he envisions MIT.nano as a hub for industry and academic partners, facilitating technology development and transfer through shared lab space, open-access equipment, and streamlined intellectual property frameworks.

    “MIT has a history of groundbreaking innovation using perovskite materials for solar applications,” says Bulović. “We’re thrilled to help build on that history by anchoring ADDEPT at MIT.nano and working to help the nation advance the future of these promising materials.”

    MIT was selected as a part of the SETO Fiscal Year 2022 Photovoltaics (PV) funding program, an effort to reduce costs and supply chain vulnerabilities, further develop durable and recyclable solar technologies, and advance perovskite PV technologies toward commercialization. ADDEPT is one project that will tackle perovskite durability, which will extend module life. The overarching goal of these projects is to lower the levelized cost of electricity generated by PV.

    Research groups involved with the ADDEPT project at MIT include Buonassisi’s Accelerated Materials Laboratory for Sustainability (AMLS), Bulović’s Organic and Nanostructured Electronics (ONE) Lab, and the Bawendi Group led by Lester Wolfe Professor in Chemistry Moungi Bawendi. Also working on the project is Jeremiah Mwaura, research scientist in the ONE Lab.

  • Detailed images from space offer clearer picture of drought effects on plants

    “MIT is a place where dreams come true,” says César Terrer, an assistant professor in the Department of Civil and Environmental Engineering. Here at MIT, Terrer says he’s been given the resources needed to explore the ideas he finds most exciting, and at the top of his list is climate science. In particular, he is interested in plant-soil interactions, and how the two can mitigate impacts of climate change. In 2022, Terrer received seed grant funding from the Abdul Latif Jameel Water and Food Systems Lab (J-WAFS) to produce drought monitoring systems for farmers. The project is leveraging a new generation of remote sensing devices to provide high-resolution estimates of plant water stress at regional to global scales.

    Growing up in Granada, Spain, Terrer always had an aptitude and passion for science. He studied environmental science at the University of Murcia, where he interned in the Department of Ecology. Using computational analysis tools, he worked on modeling species distribution in response to human development. Early on in his undergraduate experience, Terrer says he regarded his professors as “superheroes” with a kind of scholarly prowess. He knew he wanted to follow in their footsteps by one day working as a faculty member in academia. Of course, there would be many steps along the way before achieving that dream. 

    Upon completing his undergraduate studies, Terrer set his sights on exciting and adventurous research roles. He thought perhaps he would conduct field work in the Amazon, engaging with native communities. But when the opportunity arose to work in Australia on a state-of-the-art climate change experiment that simulates future levels of carbon dioxide, he headed south to study how plants react to CO2 in a biome of native Australian eucalyptus trees. It was during this experience that Terrer started to take a keen interest in the carbon cycle and the capacity of ecosystems to buffer rising levels of CO2 caused by human activity.

    Around 2014, he began to delve deeper into the carbon cycle as he began his doctoral studies at Imperial College London. The primary question Terrer sought to answer during his PhD was “will plants be able to absorb predicted future levels of CO2 in the atmosphere?” To answer the question, Terrer became an early adopter of artificial intelligence, machine learning, and remote sensing to analyze data from real-life, global climate change experiments. His findings from these “ground truth” values and observations resulted in a paper in the journal Science. In it, he claimed that climate models most likely overestimated how much carbon plants will be able to absorb by the end of the century, by a factor of three. 

    After postdoctoral positions at Stanford University and the Universitat Autonoma de Barcelona, followed by a prestigious Lawrence Fellowship, Terrer says he had “too many ideas and not enough time to accomplish all those ideas.” He knew it was time to lead his own group. Not long after applying for faculty positions, he landed at MIT. 

    New ways to monitor drought

    Terrer is employing similar methods to those he used during his PhD to analyze data from all over the world for his J-WAFS project. He and postdoc Wenzhe Jiao collect data from remote sensing satellites and field experiments and use machine learning to come up with new ways to monitor drought. Terrer says Jiao is a “remote sensing wizard,” who fuses data from different satellite products to understand the water cycle. With Jiao’s hydrology expertise and Terrer’s knowledge of plants, soil, and the carbon cycle, the duo is a formidable team to tackle this project.

    According to the U.N. World Meteorological Organization, the number and duration of droughts have increased by 29 percent since 2000, as compared to the two previous decades. From the Horn of Africa to the Western United States, drought is devastating vegetation and severely stressing water supplies, compromising food production and spiking food insecurity. Drought monitoring can offer fundamental information on drought location, frequency, and severity, but assessing the impact of drought on vegetation is extremely challenging. This is because plants’ sensitivity to water deficits varies across species and ecosystems.

    Terrer and Jiao are able to obtain a clearer picture of how drought is affecting plants by employing the latest generation of remote sensing observations, which offer images of the planet with incredible spatial and temporal resolution. Satellite products such as Sentinel, Landsat, and Planet can provide daily images from space with such high resolution that individual trees can be discerned. Along with the images and datasets from satellites, the team is using ground-based observations from meteorological data. They are also using the MIT SuperCloud at MIT Lincoln Laboratory to process and analyze all of the data sets. The J-WAFS project is among the first to leverage high-resolution data to quantitatively measure plant drought impacts in the United States, with the hopes of expanding to a global assessment in the future.

    Assisting farmers and resource managers 

    Every week, the U.S. Drought Monitor provides a map of drought conditions in the United States. The map has coarse spatial resolution and serves more as a drought recap or summary; it cannot predict future drought scenarios. The lack of a comprehensive spatiotemporal evaluation of historic and future drought impacts on global vegetation productivity is detrimental to farmers both in the United States and worldwide.

    Terrer and Jiao plan to generate metrics for plant water stress at an unprecedented resolution of 10-30 meters. This means that they will be able to provide drought monitoring maps at the scale of a typical U.S. farm, giving farmers more precise, useful data every one to two days. The team will use the information from the satellites to monitor plant growth and soil moisture, as well as the time lag of plant growth response to soil moisture. In this way, Terrer and Jiao say they will eventually be able to create a kind of “plant water stress forecast” that may be able to predict adverse impacts of drought four weeks in advance. “According to the current soil moisture and lagged response time, we hope to predict plant water stress in the future,” says Jiao. 
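
    As a rough illustration of the lag-based idea Jiao describes (not the team’s actual pipeline), the sketch below estimates the delay at which soil moisture best correlates with a plant stress signal and then flags likely stress that many days ahead. The variable names, the dryness threshold, and the synthetic data are all hypothetical.

```python
import numpy as np

def best_lag(soil_moisture, stress_index, max_lag_days=60):
    """Find the lag (in days) at which soil moisture best predicts plant stress.

    soil_moisture, stress_index: one-dimensional daily time series of equal length.
    """
    lags = range(1, max_lag_days + 1)
    scores = []
    for lag in lags:
        # Correlate soil moisture with the stress index `lag` days later.
        r = np.corrcoef(soil_moisture[:-lag], stress_index[lag:])[0, 1]
        scores.append(abs(r))
    return lags[int(np.argmax(scores))]

def forecast_stress(soil_moisture, lag_days, dry_threshold=0.25):
    """Flag upcoming stress: days in the next `lag_days` window are marked risky
    when recent soil moisture fell below a (hypothetical) dryness threshold."""
    return soil_moisture[-lag_days:] < dry_threshold

# Synthetic example: the stress signal follows dryness with a ~20-day delay.
rng = np.random.default_rng(0)
sm = 0.3 + 0.1 * np.sin(np.linspace(0, 6 * np.pi, 365)) + 0.02 * rng.standard_normal(365)
stress = np.roll(1.0 - sm, 20) + 0.02 * rng.standard_normal(365)

lag = best_lag(sm, stress)
alerts = forecast_stress(sm, lag)
print(f"Estimated lag: {lag} days; stress alerts in the next {lag} days: {int(alerts.sum())}")
```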

    The expected outcomes of this project will give farmers, land and water resource managers, and decision-makers more accurate data at the farm-specific level, allowing for better drought preparation, mitigation, and adaptation. “We expect to make our data open-access online, after we finish the project, so that farmers and other stakeholders can use the maps as tools,” says Jiao. 

    Terrer adds that the project “has the potential to help us better understand the future states of climate systems, and also identify the regional hot spots more likely to experience water crises at the national, state, local, and tribal government scales.” He also expects the project will enhance our understanding of global carbon-water-energy cycle responses to drought, with applications in determining climate change impacts on natural ecosystems as a whole.

  • Computers that power self-driving cars could be a huge driver of global carbon emissions

    In the future, the energy needed to run the powerful computers on board a global fleet of autonomous vehicles could generate as many greenhouse gas emissions as all the data centers in the world today.

    That is one key finding of a new study from MIT researchers that explored the potential energy consumption and related carbon emissions if autonomous vehicles are widely adopted.

    The data centers that house the physical computing infrastructure used for running applications are widely known for their large carbon footprint: They currently account for about 0.3 percent of global greenhouse gas emissions, or about as much carbon as the country of Argentina produces annually, according to the International Energy Agency. Realizing that less attention has been paid to the potential footprint of autonomous vehicles, the MIT researchers built a statistical model to study the problem. They determined that 1 billion autonomous vehicles, each driving for one hour per day with a computer consuming 840 watts, would consume enough energy to generate about the same amount of emissions as data centers currently do.

    The researchers also found that in over 90 percent of modeled scenarios, to keep autonomous vehicle emissions from zooming past current data center emissions, each vehicle must use less than 1.2 kilowatts of power for computing, which would require more efficient hardware. In one scenario — where 95 percent of the global fleet of vehicles is autonomous in 2050, computational workloads double every three years, and the world continues to decarbonize at the current rate — they found that hardware efficiency would need to double faster than every 1.1 years to keep emissions under those levels.

    “If we just keep the business-as-usual trends in decarbonization and the current rate of hardware efficiency improvements, it doesn’t seem like it is going to be enough to constrain the emissions from computing onboard autonomous vehicles. This has the potential to become an enormous problem. But if we get ahead of it, we could design more efficient autonomous vehicles that have a smaller carbon footprint from the start,” says first author Soumya Sudhakar, a graduate student in aeronautics and astronautics.

    Sudhakar wrote the paper with her co-advisors Vivienne Sze, associate professor in the Department of Electrical Engineering and Computer Science (EECS) and a member of the Research Laboratory of Electronics (RLE); and Sertac Karaman, associate professor of aeronautics and astronautics and director of the Laboratory for Information and Decision Systems (LIDS). The research appears today in the January-February issue of IEEE Micro.

    Modeling emissions

    The researchers built a framework to explore the operational emissions from computers on board a global fleet of electric vehicles that are fully autonomous, meaning they don’t require a back-up human driver.

    The model is a function of the number of vehicles in the global fleet, the power of each computer on each vehicle, the hours driven by each vehicle, and the carbon intensity of the electricity powering each computer.

    “On its own, that looks like a deceptively simple equation. But each of those variables contains a lot of uncertainty because we are considering an emerging application that is not here yet,” Sudhakar says.
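
    Written out, the model is essentially a product of those four variables. The sketch below is a minimal back-of-the-envelope version using the figures quoted earlier in this article (1 billion vehicles, one hour of driving per day, an 840-watt onboard computer); the grid carbon intensity is an assumed round number, not a value from the study.

```python
# Back-of-the-envelope version of the emissions model described above.
# Fleet size, driving hours, and computer power are the figures quoted in
# this article; the grid carbon intensity is an assumed illustrative value.

N_VEHICLES = 1_000_000_000      # global fleet of autonomous vehicles
HOURS_PER_DAY = 1.0             # driving (and computing) hours per vehicle per day
COMPUTER_POWER_KW = 0.840       # onboard computer power, in kilowatts
CARBON_INTENSITY = 0.475        # kg CO2 per kWh of electricity (assumed grid average)

energy_kwh_per_year = N_VEHICLES * HOURS_PER_DAY * COMPUTER_POWER_KW * 365
emissions_mt_per_year = energy_kwh_per_year * CARBON_INTENSITY / 1e9  # megatonnes of CO2

print(f"Energy: {energy_kwh_per_year / 1e9:.0f} TWh per year")
print(f"Emissions: {emissions_mt_per_year:.0f} Mt CO2 per year")
```

    With those assumed inputs, the fleet’s onboard computing would draw roughly 300 terawatt-hours per year and emit on the order of 150 megatonnes of CO2, the same ballpark as the data-center emissions the researchers cite.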

    For instance, some research suggests that the amount of time driven in autonomous vehicles might increase because people can multitask while driving and the young and the elderly could drive more. But other research suggests that time spent driving might decrease because algorithms could find optimal routes that get people to their destinations faster.

    In addition to considering these uncertainties, the researchers also needed to model advanced computing hardware and software that doesn’t exist yet.

    To accomplish that, they modeled the workload of a popular algorithm for autonomous vehicles, known as a multitask deep neural network because it can perform many tasks at once. They explored how much energy this deep neural network would consume if it were processing many high-resolution inputs from many cameras with high frame rates, simultaneously.

    When they used the probabilistic model to explore different scenarios, Sudhakar was surprised by how quickly the algorithms’ workload added up.

    For example, if an autonomous vehicle has 10 deep neural networks processing images from 10 cameras, and that vehicle drives for one hour a day, it will make 21.6 million inferences each day. One billion vehicles would make 21.6 quadrillion inferences. To put that into perspective, all of Facebook’s data centers worldwide make a few trillion inferences each day (1 quadrillion is 1,000 trillion).
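
    The article does not spell out the frame rate behind those figures, but the arithmetic can be reconstructed. The sketch below assumes each of the 10 networks processes every one of the 10 camera streams at 60 frames per second, an assumption chosen only because it reproduces the 21.6 million figure.

```python
# Reconstructing the inference count quoted above. The 60 fps figure and the
# assumption that every network sees every camera stream are ours, chosen
# because they reproduce the article's 21.6 million inferences per day.

FPS = 60                  # assumed camera frame rate
CAMERAS = 10
NETWORKS = 10
SECONDS_PER_HOUR = 3_600  # one hour of driving per day

per_vehicle_per_day = FPS * CAMERAS * NETWORKS * SECONDS_PER_HOUR
fleet_per_day = per_vehicle_per_day * 1_000_000_000

print(f"{per_vehicle_per_day:,} inferences per vehicle per day")         # 21,600,000
print(f"{fleet_per_day:.2e} inferences per day for 1 billion vehicles")  # ~2.16e+16, i.e., 21.6 quadrillion
```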

    “After seeing the results, this makes a lot of sense, but it is not something that is on a lot of people’s radar. These vehicles could actually be using a ton of computer power. They have a 360-degree view of the world, so while we have two eyes, they may have 20 eyes, looking all over the place and trying to understand all the things that are happening at the same time,” Karaman says.

    Autonomous vehicles would be used for moving goods, as well as people, so there could be a massive amount of computing power distributed along global supply chains, he says. And their model only considers computing — it doesn’t take into account the energy consumed by vehicle sensors or the emissions generated during manufacturing.

    Keeping emissions in check

    To keep emissions from spiraling out of control, the researchers found that each autonomous vehicle needs to consume less than 1.2 kilowatts of power for computing. For that to be possible, computing hardware must become more efficient at a significantly faster pace, doubling in efficiency about every 1.1 years.

    One way to boost that efficiency could be to use more specialized hardware, which is designed to run specific driving algorithms. Because researchers know the navigation and perception tasks required for autonomous driving, it could be easier to design specialized hardware for those tasks, Sudhakar says. But vehicles tend to have 10- or 20-year lifespans, so one challenge in developing specialized hardware would be to “future-proof” it so it can run new algorithms.

    In the future, researchers could also make the algorithms more efficient, so they would need less computing power. However, this is also challenging because trading off some accuracy for more efficiency could hamper vehicle safety.

    Now that they have demonstrated this framework, the researchers want to continue exploring hardware efficiency and algorithm improvements. In addition, they say their model can be enhanced by characterizing embodied carbon from autonomous vehicles — the carbon emissions generated when a car is manufactured — and emissions from a vehicle’s sensors.

    While there are still many scenarios to explore, the researchers hope that this work sheds light on a potential problem people may not have considered.

    “We are hoping that people will think of emissions and carbon efficiency as important metrics to consider in their designs. The energy consumption of an autonomous vehicle is really critical, not just for extending the battery life, but also for sustainability,” says Sze.

    This research was funded, in part, by the National Science Foundation and the MIT-Accenture Fellowship.

  • Strengthening electron-triggered light emission

    The way electrons interact with photons of light is a key part of many modern technologies, from lasers to solar panels to LEDs. But the interaction is inherently a weak one because of a major mismatch in scale: A wavelength of visible light is about 1,000 times larger than an electron, so the way the two things affect each other is limited by that disparity.

    Now, researchers at MIT and elsewhere have come up with an innovative way to make much stronger interactions between photons and electrons possible, in the process producing a hundredfold increase in the emission of light from a phenomenon called Smith-Purcell radiation. The finding has potential implications for both commercial applications and fundamental scientific research, although it will require more years of research to make it practical.

    The findings are reported today in the journal Nature, in a paper by MIT postdocs Yi Yang (now an assistant professor at the University of Hong Kong) and Charles Roques-Carmes, MIT professors Marin Soljačić and John Joannopoulos, and five others at MIT, Harvard University, and Technion-Israel Institute of Technology.

    In a combination of computer simulations and laboratory experiments, the team found that using a beam of electrons in combination with a specially designed photonic crystal — a slab of silicon on an insulator, etched with an array of nanometer-scale holes — they could theoretically predict emission many orders of magnitude stronger than would ordinarily be possible in conventional Smith-Purcell radiation. They also experimentally recorded a one hundredfold increase in radiation in their proof-of-concept measurements.

    Unlike other approaches to producing sources of light or other electromagnetic radiation, the free-electron-based method is fully tunable — it can produce emissions of any desired wavelength, simply by adjusting the size of the photonic structure and the speed of the electrons. This may make it especially valuable for making sources of emission at wavelengths that are difficult to produce efficiently, including terahertz waves, ultraviolet light, and X-rays.
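
    To see why the emission is tunable, it helps to look at the textbook Smith-Purcell relation for a simple grating, in which the emitted wavelength is set by the structure’s period, the electron’s speed, and the observation angle. The sketch below evaluates that standard relation with hypothetical numbers; it illustrates the scaling only and is not the dispersion relation of the flatband photonic crystal used in this work.

```python
import math

def smith_purcell_wavelength(period_nm, electron_ke_ev, order=1, angle_deg=90.0):
    """Textbook Smith-Purcell relation for a simple grating:
    wavelength = (period / order) * (1 / beta - cos(theta)),
    where beta is the electron speed as a fraction of the speed of light and
    theta is measured from the electron beam direction."""
    ELECTRON_REST_EV = 510_998.95  # electron rest energy in eV
    gamma = 1.0 + electron_ke_ev / ELECTRON_REST_EV
    beta = math.sqrt(1.0 - 1.0 / gamma**2)
    theta = math.radians(angle_deg)
    return (period_nm / order) * (1.0 / beta - math.cos(theta))

# Hypothetical numbers: a 200 nm period structure and a 30 keV electron beam
# (roughly electron-microscope energies), observed perpendicular to the beam.
print(f"{smith_purcell_wavelength(200, 30_000):.0f} nm")  # lands in the visible range
```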

    The team has so far demonstrated the hundredfold enhancement in emission using a repurposed electron microscope to function as an electron beam source. But they say that the basic principle involved could potentially enable far greater enhancements using devices specifically adapted for this function.

    The approach is based on a concept called flatbands, which have been widely explored in recent years for condensed matter physics and photonics but had never before been applied to the basic interaction of photons and free electrons. The underlying principle involves the transfer of momentum from the electron to a group of photons, or vice versa. Whereas conventional light-electron interactions produce light at a single angle, the photonic crystal is tuned in such a way that it enables the production of light over a whole range of angles.

    The same process could also be used in the opposite direction, using resonant light waves to propel electrons, increasing their velocity in a way that could potentially be harnessed to build miniaturized particle accelerators on a chip. These might ultimately be able to perform some functions that currently require giant underground tunnels, such as the 27-kilometer Large Hadron Collider in Switzerland.

    “If you could actually build electron accelerators on a chip,” Soljačić says, “you could make much more compact accelerators for some of the applications of interest, which would still produce very energetic electrons. That obviously would be huge. For many applications, you wouldn’t have to build these huge facilities.”

    The new system could also potentially provide a highly controllable X-ray beam for radiotherapy purposes, Roques-Carmes says.

    And the system could be used to generate multiple entangled photons, a quantum effect that could be useful in the creation of quantum-based computational and communications systems, the researchers say. “You can use electrons to couple many photons together, which is a considerably hard problem if using a purely optical approach,” says Yang. “That is one of the most exciting future directions of our work.”

    Much work remains to translate these new findings into practical devices, Soljačić cautions. It may take some years to develop the necessary interfaces between the optical and electronic components, to work out how to connect them on a single chip, and to develop an on-chip electron source capable of producing a continuous wavefront, among other challenges.

    “The reason this is exciting,” Roques-Carmes adds, “is because this is quite a different type of source.” Most technologies for generating light are restricted to very specific ranges of color or wavelength, and “it’s usually difficult to move that emission frequency. Here it’s completely tunable. Simply by changing the velocity of the electrons, you can change the emission frequency. … That excites us about the potential of these sources. Because they’re different, they offer new types of opportunities.”

    But, Soljačić concludes, “in order for them to become truly competitive with other types of sources, I think it will require some more years of research. I would say that with some serious effort, in two to five years they might start competing in at least some areas of radiation.”

    The research team also included Steven Kooi at MIT’s Institute for Soldier Nanotechnologies, Haoning Tang and Eric Mazur at Harvard University, Justin Beroz at MIT, and Ido Kaminer at Technion-Israel Institute of Technology. The work was supported by the U.S. Army Research Office through the Institute for Soldier Nanotechnologies, the U.S. Air Force Office of Scientific Research, and the U.S. Office of Naval Research.

  • Machine learning facilitates “turbulence tracking” in fusion reactors

    Fusion, which promises practically unlimited, carbon-free energy using the same processes that power the sun, is at the heart of a worldwide research effort that could help mitigate climate change.

    A multidisciplinary team of researchers is now bringing tools and insights from machine learning to aid this effort. Scientists from MIT and elsewhere have used computer-vision models to identify and track turbulent structures that appear under the conditions needed to facilitate fusion reactions.

    Monitoring the formation and movements of these structures, called filaments or “blobs,” is important for understanding the heat and particle flows exiting from the reacting fuel, which ultimately determine the engineering requirements for the reactor walls that must withstand those flows. However, scientists typically study blobs using averaging techniques, which trade details of individual structures in favor of aggregate statistics. Information about individual blobs must instead be obtained by marking them manually in video data.

    The researchers built a synthetic video dataset of plasma turbulence to make this process more effective and efficient. They used it to train four computer vision models, each of which identifies and tracks blobs. They trained the models to pinpoint blobs in the same ways that humans would.

    When the researchers tested the trained models using real video clips, the models could identify blobs with high accuracy — more than 80 percent in some cases. The models were also able to effectively estimate the size of blobs and the speeds at which they moved.

    Because millions of video frames are captured during just one fusion experiment, using machine-learning models to track blobs could give scientists much more detailed information.

    “Before, we could get a macroscopic picture of what these structures are doing on average. Now, we have a microscope and the computational power to analyze one event at a time. If we take a step back, what this reveals is the power available from these machine-learning techniques, and ways to use these computational resources to make progress,” says Theodore Golfinopoulos, a research scientist at the MIT Plasma Science and Fusion Center and co-author of a paper detailing these approaches.

    His fellow co-authors include lead author Woonghee “Harry” Han, a physics PhD candidate; senior author Iddo Drori, a visiting professor in the Computer Science and Artificial Intelligence Laboratory (CSAIL), faculty associate professor at Boston University, and adjunct at Columbia University; as well as others from the MIT Plasma Science and Fusion Center, the MIT Department of Civil and Environmental Engineering, and the Swiss Federal Institute of Technology at Lausanne in Switzerland. The research appears today in Nature Scientific Reports.

    Heating things up

    For more than 70 years, scientists have sought to use controlled thermonuclear fusion reactions to develop an energy source. To reach the conditions necessary for a fusion reaction, fuel must be heated to temperatures above 100 million degrees Celsius. (The core of the sun is about 15 million degrees Celsius.)

    A common method for containing this super-hot fuel, called plasma, is to use a tokamak. These devices utilize extremely powerful magnetic fields to hold the plasma in place and control the interaction between the exhaust heat from the plasma and the reactor walls.

    However, blobs appear like filaments falling out of the plasma at the very edge, between the plasma and the reactor walls. These random, turbulent structures affect how energy flows between the plasma and the reactor.

    “Knowing what the blobs are doing strongly constrains the engineering performance that your tokamak power plant needs at the edge,” adds Golfinopoulos.

    Researchers use a unique imaging technique to capture video of the plasma’s turbulent edge during experiments. An experimental campaign may last months; a typical day will produce about 30 seconds of data, corresponding to roughly 60 million video frames, with thousands of blobs appearing each second. This makes it impossible to track all blobs manually, so researchers rely on average sampling techniques that only provide broad characteristics of blob size, speed, and frequency.

    “On the other hand, machine learning provides a solution to this by blob-by-blob tracking for every frame, not just average quantities. This gives us much more knowledge about what is happening at the boundary of the plasma,” Han says.

    He and his co-authors took four well-established computer vision models, which are commonly used for applications like autonomous driving, and trained them to tackle this problem.

    Simulating blobs

    To train these models, they created a vast dataset of synthetic video clips that captured the blobs’ random and unpredictable nature.

    “Sometimes they change direction or speed, sometimes multiple blobs merge, or they split apart. These kinds of events were not considered before with traditional approaches, but we could freely simulate those behaviors in the synthetic data,” Han says.

    Creating synthetic data also allowed them to label each blob, which made the training process more effective, Drori adds.

    Using these synthetic data, they trained the models to draw boundaries around blobs, teaching them to closely mimic what a human scientist would draw.
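
    As a toy illustration of what a labeled synthetic frame might look like (not the authors’ dataset generator), the sketch below builds a single frame from a few Gaussian “blobs” of random position, size, and brightness, along with a ground-truth mask for each one. The blob shapes, counts, and noise model are all assumptions.

```python
import numpy as np

def synthetic_blob_frame(size=64, n_blobs=3, rng=None):
    """Generate one toy synthetic frame plus per-blob ground-truth masks.

    Each 'blob' is a Gaussian bright spot of random position, width, and
    amplitude; the mask marks pixels above half the blob's peak brightness.
    """
    rng = np.random.default_rng() if rng is None else rng
    yy, xx = np.mgrid[0:size, 0:size]
    frame = np.zeros((size, size))
    masks = []
    for _ in range(n_blobs):
        cx, cy = rng.uniform(5, size - 5, size=2)   # blob center
        sigma = rng.uniform(2.0, 5.0)               # blob width
        amp = rng.uniform(0.5, 1.0)                 # blob brightness
        blob = amp * np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2) / (2 * sigma**2))
        frame += blob
        masks.append(blob > 0.5 * amp)              # ground-truth region for this blob
    frame += 0.05 * rng.standard_normal((size, size))  # background noise
    return frame, masks

frame, masks = synthetic_blob_frame(rng=np.random.default_rng(1))
print(frame.shape, len(masks), int(masks[0].sum()), "pixels in the first blob mask")
```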

    Then they tested the models using real video data from experiments. First, they measured how closely the boundaries the models drew matched up with actual blob contours.

    But they also wanted to see if the models predicted objects that humans would identify. They asked three human experts to pinpoint the centers of blobs in video frames and checked to see if the models predicted blobs in those same locations.

    The models were able to draw accurate blob boundaries, overlapping with the brightness contours that are considered ground truth, about 80 percent of the time. Their assessments were similar to those of the human experts, and the models successfully predicted the theory-defined regime of the blobs, in agreement with results from a traditional method.
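
    The article does not name the overlap metric used for this comparison; intersection-over-union (IoU) is a common choice in computer vision and is shown here only to illustrate how agreement between a predicted blob region and a ground-truth region could be scored.

```python
import numpy as np

def iou(pred_mask, true_mask):
    """Intersection-over-union between a predicted region and a ground-truth
    region, both given as boolean arrays. Illustrative only; the paper's exact
    scoring of boundary agreement may differ."""
    intersection = np.logical_and(pred_mask, true_mask).sum()
    union = np.logical_or(pred_mask, true_mask).sum()
    return intersection / union if union else 0.0

# Hypothetical example: two overlapping square regions in a 100 x 100 frame.
pred = np.zeros((100, 100), dtype=bool)
pred[20:60, 20:60] = True
true = np.zeros((100, 100), dtype=bool)
true[30:70, 30:70] = True
print(f"IoU = {iou(pred, true):.2f}")
```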

    Now that they have shown the success of using synthetic data and computer vision models for tracking blobs, the researchers plan to apply these techniques to other problems in fusion research, such as estimating particle transport at the boundary of a plasma, Han says.

    They also made the dataset and models publicly available, and look forward to seeing how other research groups apply these tools to study the dynamics of blobs, says Drori.

    “Prior to this, there was a barrier to entry that mostly the only people working on this problem were plasma physicists, who had the datasets and were using their methods. There is a huge machine-learning and computer-vision community. One goal of this work is to encourage participation in fusion research from the broader machine-learning community toward the broader goal of helping solve the critical problem of climate change,” he adds.

    This research is supported, in part, by the U.S. Department of Energy and the Swiss National Science Foundation.

  • Computing for the health of the planet

    The health of the planet is one of the most important challenges facing humankind today. From climate change to unsafe levels of air and water pollution to coastal and agricultural land erosion, a number of serious challenges threaten human and ecosystem health.

    Ensuring the health and safety of our planet necessitates approaches that connect scientific, engineering, social, economic, and political aspects. New computational methods can play a critical role by providing data-driven models and solutions for cleaner air, usable water, resilient food, efficient transportation systems, better-preserved biodiversity, and sustainable sources of energy.

    The MIT Schwarzman College of Computing is committed to hiring multiple new faculty in computing for climate and the environment, as part of MIT’s plan to recruit 20 climate-focused faculty under its climate action plan. This year the college undertook searches with several departments in the schools of Engineering and Science for shared faculty in computing for the health of the planet, one of the six strategic areas of inquiry identified in an MIT-wide planning process to help focus shared hiring efforts. The college also undertook searches for core computing faculty in the Department of Electrical Engineering and Computer Science (EECS).

    The searches are part of an ongoing effort by the MIT Schwarzman College of Computing to hire 50 new faculty — 25 shared with other academic departments and 25 in computer science and artificial intelligence and decision-making. The goal is to build capacity at MIT to more deeply infuse computing into other disciplines across the Institute’s departments.

    Four interdisciplinary scholars were hired in these searches. They will join the MIT faculty in the coming year to engage in research and teaching that will advance physical understanding of low-carbon energy solutions, Earth-climate modeling, biodiversity monitoring and conservation, and agricultural management through high-performance computing, transformational numerical methods, and machine-learning techniques.

    “By coordinating hiring efforts with multiple departments and schools, we were able to attract a cohort of exceptional scholars in this area to MIT. Each of them is developing and using advanced computational methods and tools to help find solutions for a range of climate and environmental issues,” says Daniel Huttenlocher, dean of the MIT Schwarzman College of Computing and the Henry Warren Ellis Professor of Electrical Engineering and Computer Science. “They will also help strengthen cross-departmental ties in computing across an important, critical area for MIT and the world.”

    “These strategic hires in the area of computing for climate and the environment are an incredible opportunity for the college to deepen its academic offerings and create new opportunity for collaboration across MIT,” says Anantha P. Chandrakasan, dean of the MIT School of Engineering and the Vannevar Bush Professor of Electrical Engineering and Computer Science. “The college plays a pivotal role in MIT’s overarching effort to hire climate-focused faculty — introducing the critical role of computing to address the health of the planet through innovative research and curriculum.”

    The four new faculty members are:

    Sara Beery will join MIT as an assistant professor in the Faculty of Artificial Intelligence and Decision-Making in EECS in September 2023. Beery received her PhD in computing and mathematical sciences at Caltech in 2022, where she was advised by Pietro Perona. Her research focuses on building computer vision methods that enable global-scale environmental and biodiversity monitoring across data modalities, tackling real-world challenges including strong spatiotemporal correlations, imperfect data quality, fine-grained categories, and long-tailed distributions. She partners with nongovernmental organizations and government agencies to deploy her methods in the wild worldwide and works toward increasing the diversity and accessibility of academic research in artificial intelligence through interdisciplinary capacity building and education.

    Priya Donti will join MIT as an assistant professor in the faculties of Electrical Engineering and Artificial Intelligence and Decision-Making in EECS in academic year 2023-24. Donti recently finished her PhD in the Computer Science Department and the Department of Engineering and Public Policy at Carnegie Mellon University, co-advised by Zico Kolter and Inês Azevedo. Her work focuses on machine learning for forecasting, optimization, and control in high-renewables power grids. Specifically, her research explores methods to incorporate the physics and hard constraints associated with electric power systems into deep learning models. Donti is also co-founder and chair of Climate Change AI, a nonprofit initiative to catalyze impactful work at the intersection of climate change and machine learning that is currently running through the Cornell Tech Runway Startup Postdoc Program.

    Ericmoore Jossou will join MIT as an assistant professor in a shared position between the Department of Nuclear Science and Engineering and the faculty of electrical engineering in EECS in July 2023. He is currently an assistant scientist at the Brookhaven National Laboratory, a U.S. Department of Energy-affiliated lab that conducts research in nuclear and high energy physics, energy science and technology, environmental and bioscience, nanoscience, and national security. His research at MIT will focus on understanding the processing-structure-properties correlation of materials for nuclear energy applications through advanced experiments, multiscale simulations, and data science. Jossou obtained his PhD in mechanical engineering in 2019 from the University of Saskatchewan.

    Sherrie Wang will join MIT as an assistant professor in a shared position between the Department of Mechanical Engineering and the Institute for Data, Systems, and Society in academic year 2023-24. Wang is currently a Ciriacy-Wantrup Postdoctoral Fellow at the University of California at Berkeley, hosted by Solomon Hsiang and the Global Policy Lab. She develops machine learning for Earth observation data. Her primary application areas are improving agricultural management and forecasting climate phenomena. She obtained her PhD in computational and mathematical engineering from Stanford University in 2021, where she was advised by David Lobell.

  • J-WAFS awards $150K Solutions grant to Patrick Doyle and team for rapid removal of micropollutants from water

    The Abdul Latif Jameel Water and Food Systems Lab (J-WAFS) has awarded a 2022 J-WAFS Solutions grant to Patrick S. Doyle, the Robert T. Haslam Professor of Chemical Engineering at MIT, for his innovative system to tackle water pollution. Doyle will be working with co-Principal Investigator Rafael Gomez-Bombarelli, assistant professor in materials processing in the Department of Materials Science and Engineering, as well as PhD students Devashish Gokhale and Tynan Perez. Building on findings from a 2019 J-WAFS seed grant, Doyle and the research team will create cost-effective industry-scale processes to remove micropollutants from water. Project work will commence this month.

    The J-WAFS Solutions program provides one-year, renewable, commercialization grants to help move MIT technology from the laboratory to market. Grants of up to $150,000 are awarded to researchers with breakthrough technologies and inventions in water or food. Since its launch in 2015, J-WAFS Solutions grants have led to seven spinout companies and helped commercialize two products as open-source technologies. The grant program is supported by Community Jameel.

    A widespread problem 

    Micropollutants are contaminants that occur in low concentrations in the environment, yet continuous exposure and bioaccumulation of micropollutants make them a cause for concern. According to the U.S. Environmental Protection Agency, the plastics derivative Bisphenol A (BPA), the “forever chemicals” per- and polyfluoroalkyl substances (PFAS), and heavy metals like lead are common micropollutants known to be found in more than 85 percent of rivers, ponds, and lakes in the United States. Many of these bodies of water are sources of drinking water. Over long periods of time, exposure to micropollutants through drinking water can cause physiological damage in humans, increasing the risk of cancer, developmental disorders, and reproductive failure.

    Since micropollutants occur in low concentrations, it is difficult to detect and monitor their presence, and the chemical diversity of micropollutants makes it difficult to inexpensively remove them from water. Currently, activated carbon is the industry standard for micropollutant elimination, but this method cannot efficiently remove contaminants at parts-per-billion and parts-per-trillion concentrations. There are also strong sustainability concerns associated with activated carbon production, which is energy-intensive and releases large volumes of carbon dioxide.

    A solution with societal and economic benefits

    Doyle and his team are developing a technology that uses sustainable hydrogel microparticles to remove micropollutants from water. The polymeric hydrogel microparticles use chemically anchored structures, including micelles and chelating agents, that act like a sponge, absorbing organic micropollutants and heavy metal ions. The microparticles are large enough to separate from water using simple gravitational settling. The system is sustainable because the microparticles can be recycled for continuous use. In testing, the long-lasting, reusable microparticles show quicker removal of contaminants than commercial activated carbon. The researchers plan to utilize machine learning to find optimal microparticle compositions that maximize performance on complex combinations of micropollutants in simulated and real wastewater samples.

    Economically, the technology is a new offering that has applications in numerous large markets where micropollutant elimination is vital, including municipal and industrial water treatment equipment, as well as household water purification systems. The J-WAFS Solutions grant will allow the team to build and test prototypes of the water treatment system, identify the best use cases and customers, and perform technoeconomic analyses and market research to formulate a preliminary business plan. With J-WAFS commercialization support, the project could eventually lead to a startup company.

    “Emerging micropollutants are a growing threat to drinking water supplies worldwide,” says J-WAFS Director John H. Lienhard, the Abdul Latif Jameel Professor of Water at MIT. “Cost-effective and scalable technologies for micropollutant removal are urgently needed. This project will develop and commercialize a promising new tool for water treatment, with the goal of improving water quality for millions of people.”