More stories

  • Fast-tracking fusion energy’s arrival with AI and accessibility

    As the impacts of climate change continue to grow, so does interest in fusion’s potential as a clean energy source. While fusion reactions have been studied in laboratories since the 1930s, there are still many critical questions scientists must answer to make fusion power a reality, and time is of the essence. As part of its strategy to accelerate fusion energy’s arrival and reach carbon neutrality by 2050, the U.S. Department of Energy (DoE) has announced new funding for a project led by researchers at MIT’s Plasma Science and Fusion Center (PSFC) and four collaborating institutions.

    Cristina Rea, a research scientist and group leader at the PSFC, will serve as the principal investigator for the newly funded three-year collaboration to pilot the integration of fusion data into a system that can be read by AI-powered tools. The PSFC, together with scientists from the College of William and Mary, the University of Wisconsin at Madison, Auburn University, and the nonprofit HDF Group, plans to create a holistic fusion data platform, the elements of which could offer unprecedented access for researchers, especially underrepresented students. The project aims to encourage diverse participation in fusion and data science, both in academia and the workforce, through outreach programs led by the group’s five co-investigators, four of whom are women.

    The DoE’s award, part of a $29 million funding package for seven projects across 19 institutions, will support the group’s efforts to distribute data produced by fusion devices like the PSFC’s Alcator C-Mod, a donut-shaped “tokamak” that used powerful magnets to control and confine fusion reactions. Alcator C-Mod operated from 1991 to 2016, and its data are still being studied, thanks in part to the PSFC’s commitment to the free exchange of knowledge.

    Currently, there are nearly 50 public experimental magnetic confinement-type fusion devices; however, both historical and current data from these devices can be difficult to access. Some fusion databases require signing user agreements, and not all data are catalogued and organized the same way. Moreover, it can be difficult to leverage machine learning, a class of AI tools, for data analysis and to enable scientific discovery without time-consuming data reorganization. The result is fewer scientists working on fusion, greater barriers to discovery, and a bottleneck in harnessing AI to accelerate progress.

    The project’s proposed data platform addresses technical barriers by being FAIR — Findable, Accessible, Interoperable, Reusable — and by adhering to UNESCO’s Open Science (OS) recommendations to improve the transparency and inclusivity of science; all of the researchers’ deliverables will adhere to FAIR and OS principles, as required by the DoE. The platform’s databases will be built using MDSplusML, an upgraded version of the MDSplus open-source software developed by PSFC researchers in the 1980s to catalogue the results of Alcator C-Mod’s experiments. Today, nearly 40 fusion research institutes use MDSplus to store and provide external access to their fusion data. The release of MDSplusML aims to continue that legacy of open collaboration.
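
    For readers unfamiliar with MDSplus, the snippet below sketches how fusion data hosted on an MDSplus server is commonly pulled into Python for analysis or machine-learning work. The server address, tree name, shot number, and signal path are placeholders chosen for illustration; they are not endpoints of the new platform.

    ```python
    # Minimal sketch of remote data access with the MDSplus Python client.
    # Server, tree, shot, and node names below are illustrative placeholders.
    from MDSplus import Connection

    conn = Connection("fusion-data.example.edu")   # hypothetical MDSplus server
    conn.openTree("cmod", 1160930033)              # tree name and shot number are made up
    current = conn.get(r"\ip").data()              # plasma current signal (node path assumed)
    time_base = conn.get(r"dim_of(\ip)").data()    # its time base
    print(current.shape, time_base.shape)
    ```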

    The researchers intend to address barriers to participation for women and disadvantaged groups not only by improving general access to fusion data, but also through a subsidized summer school that will focus on topics at the intersection of fusion and machine learning, which will be held at William and Mary for the next three years.

    Of the importance of their research, Rea says, “This project is about responding to the fusion community’s needs and setting ourselves up for success. Scientific advancements in fusion are enabled via multidisciplinary collaboration and cross-pollination, so accessibility is absolutely essential. I think we all understand now that diverse communities have more diverse ideas, and they allow faster problem-solving.”

    The collaboration’s work also aligns with vital areas of research identified in the International Atomic Energy Agency’s “AI for Fusion” Coordinated Research Project (CRP). Rea was selected as the technical coordinator for the IAEA’s CRP emphasizing community engagement and knowledge access to accelerate fusion research and development. In a letter of support written for the group’s proposed project, the IAEA stated that, “the work [the researchers] will carry out […] will be beneficial not only to our CRP but also to the international fusion community in large.”

    PSFC Director and Hitachi America Professor of Engineering Dennis Whyte adds, “I am thrilled to see PSFC and our collaborators be at the forefront of applying new AI tools while simultaneously encouraging and enabling extraction of critical data from our experiments.”

    “Having the opportunity to lead such an important project is extremely meaningful, and I feel a responsibility to show that women are leaders in STEM,” says Rea. “We have an incredible team, strongly motivated to improve our fusion ecosystem and to contribute to making fusion energy a reality.”

  • To improve solar and other clean energy tech, look beyond hardware

    To continue reducing the costs of solar energy and other clean energy technologies, scientists and engineers will likely need to focus, at least in part, on improving technology features that are not based on hardware, according to MIT researchers. They describe this finding and the mechanisms behind it today in Nature Energy.

    While the cost of installing a solar energy system has dropped by more than 99 percent since 1980, this new analysis shows that “soft technology” features, such as the codified permitting practices, supply chain management techniques, and system design processes that go into deploying a solar energy plant, contributed only 10 to 15 percent of total cost declines. Improvements to hardware features were responsible for the lion’s share.

    But because soft technology is increasingly dominating the total costs of installing solar energy systems, this trend threatens to slow future cost savings and hamper the global transition to clean energy, says the study’s senior author, Jessika Trancik, a professor in MIT’s Institute for Data, Systems, and Society (IDSS).

    Trancik’s co-authors include lead author Magdalena M. Klemun, a former IDSS graduate student and postdoc who is now an assistant professor at the Hong Kong University of Science and Technology; Goksin Kavlak, a former IDSS graduate student and postdoc who is now an associate at the Brattle Group; and James McNerney, a former IDSS postdoc and now senior research fellow at the Harvard Kennedy School.

    The team created a quantitative model to analyze the cost evolution of solar energy systems, which captures the contributions of both hardware technology features and soft technology features.

    The framework shows that soft technology hasn’t improved much over time — and that soft technology features contributed even less to overall cost declines than previously estimated.

    Their findings indicate that to reverse this trend and accelerate cost declines, engineers could look at making solar energy systems less reliant on soft technology to begin with, or they could tackle the problem directly by improving inefficient deployment processes.  

    “Really understanding where the efficiencies and inefficiencies are, and how to address those inefficiencies, is critical in supporting the clean energy transition. We are making huge investments of public dollars into this, and soft technology is going to be absolutely essential to making those funds count,” says Trancik.

    “However,” Klemun adds, “we haven’t been thinking about soft technology design as systematically as we have for hardware. That needs to change.”

    The hard truth about soft costs

    Researchers have observed that the so-called “soft costs” of building a solar power plant — the costs of designing and installing the plant — are becoming a much larger share of total costs. In fact, the share of soft costs now typically ranges from 35 to 64 percent.

    “We wanted to take a closer look at where these soft costs were coming from and why they weren’t coming down over time as quickly as the hardware costs,” Trancik says.

    In the past, scientists have modeled the change in solar energy costs by dividing total costs into additive components — hardware components and nonhardware components — and then tracking how these components changed over time.

    “But if you really want to understand where those rates of change are coming from, you need to go one level deeper to look at the technology features. Then things split out differently,” Trancik says.

    The researchers developed a quantitative approach that models the change in solar energy costs over time by assigning contributions to the individual technology features, including both hardware features and soft technology features.

    For instance, their framework would capture how much of the decline in system installation costs — a soft cost — is due to standardized practices of certified installers — a soft technology feature. It would also capture how that same soft cost is affected by increased photovoltaic module efficiency — a hardware technology feature.

    With this approach, the researchers saw that improvements in hardware had the greatest impacts on driving down soft costs in solar energy systems. For example, the efficiency of photovoltaic modules doubled between 1980 and 2017, reducing overall system costs by 17 percent. But about 40 percent of that overall decline could be attributed to reductions in soft costs tied to improved module efficiency.
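
    As a rough illustration of that kind of feature-level accounting (a toy example, not the authors’ actual framework or data), a soft cost can be written as a function of both hardware and soft-technology features, and the change in that cost can then be attributed to each feature:

    ```python
    # Toy attribution exercise with invented numbers: installation (a soft cost)
    # depends on module efficiency (hardware feature) and on installer labor
    # productivity and wages (soft technology features).
    def install_cost(efficiency_w_per_m2, hours_per_m2, wage_per_hour):
        # Labor scales with installed area; area per watt shrinks as efficiency rises.
        return hours_per_m2 * wage_per_hour / efficiency_w_per_m2

    before = dict(efficiency_w_per_m2=100.0, hours_per_m2=1.0, wage_per_hour=40.0)
    after = dict(efficiency_w_per_m2=200.0, hours_per_m2=0.9, wage_per_hour=40.0)

    total_change = install_cost(**after) - install_cost(**before)

    # One-at-a-time attribution (interaction terms ignored for simplicity).
    for feature in before:
        probe = dict(before, **{feature: after[feature]})
        effect = install_cost(**probe) - install_cost(**before)
        print(f"{feature:>22}: {effect:+.3f} $/W")
    print(f"{'total change':>22}: {total_change:+.3f} $/W")
    ```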

    The framework shows that, while hardware technology features tend to improve many cost components, soft technology features affect only a few.

    “You can see this structural difference even before you collect data on how the technologies have changed over time. That’s why mapping out a technology’s network of cost dependencies is a useful first step to identify levers of change, for solar PV and for other technologies as well,” Klemun notes.  

    Static soft technology

    The researchers used their model to study several countries, since soft costs can vary widely around the world. For instance, solar energy soft costs in Germany are about 50 percent less than those in the U.S.

    The fact that hardware technology improvements are often shared globally led to dramatic declines in costs over the past few decades across locations, the analysis showed. Soft technology innovations typically aren’t shared across borders. Moreover, the team found that countries with better soft technology performance 20 years ago still have better performance today, while those with worse performance didn’t see much improvement.

    This country-by-country difference could be driven by regulation and permitting processes, cultural factors, or by market dynamics such as how firms interact with each other, Trancik says.

    “But not all soft technology variables are ones that you would want to change in a cost-reducing direction, like lower wages. So, there are other considerations, beyond just bringing the cost of the technology down, that we need to think about when interpreting these results,” she says.

    Their analysis points to two strategies for reducing soft costs. For one, scientists could focus on developing hardware improvements that make soft costs more dependent on hardware technology variables and less on soft technology variables, such as by creating simpler, more standardized equipment that could reduce on-site installation time.

    Or researchers could directly target soft technology features without changing hardware, perhaps by creating more efficient workflows for system installation or automated permitting platforms.

    “In practice, engineers will often pursue both approaches, but separating the two in a formal model makes it easier to target innovation efforts by leveraging specific relationships between technology characteristics and costs,” Klemun says.

    “Often, when we think about information processing, we are leaving out processes that still happen in a very low-tech way through people communicating with one another. But it is just as important to think about that as a technology as it is to design fancy software,” Trancik notes.

    In the future, she and her collaborators want to apply their quantitative model to study the soft costs related to other technologies, such as electric vehicle charging and nuclear fission. They are also interested in better understanding the limits of soft technology improvement, and how one could design better soft technology from the outset.

    This research is funded by the U.S. Department of Energy Solar Energy Technologies Office.

  • Chemists discover why photosynthetic light-harvesting is so efficient

    When photosynthetic cells absorb light from the sun, packets of energy called photons leap between a series of light-harvesting proteins until they reach the photosynthetic reaction center. There, cells convert the energy into electrons, which eventually power the production of sugar molecules.

    This transfer of energy through the light-harvesting complex occurs with extremely high efficiency: Nearly every photon of light absorbed generates an electron, a phenomenon known as near-unity quantum efficiency.

    A new study from MIT chemists offers a potential explanation for how proteins of the light-harvesting complex, also called the antenna, achieve that high efficiency. For the first time, the researchers were able to measure the energy transfer between light-harvesting proteins, allowing them to discover that the disorganized arrangement of these proteins boosts the efficiency of the energy transduction.

    “In order for that antenna to work, you need long-distance energy transduction. Our key finding is that the disordered organization of the light-harvesting proteins enhances the efficiency of that long-distance energy transduction,” says Gabriela Schlau-Cohen, an associate professor of chemistry at MIT and the senior author of the new study.

    MIT postdocs Dihao Wang and Dvir Harris and former MIT graduate student Olivia Fiebig PhD ’22 are the lead authors of the paper, which appears this week in the Proceedings of the National Academy of Sciences. Jianshu Cao, an MIT professor of chemistry, is also an author of the paper.

    Energy capture

    For this study, the MIT team focused on purple bacteria, which are often found in oxygen-poor aquatic environments and are commonly used as a model for studies of photosynthetic light-harvesting.

    Within these cells, captured photons travel through light-harvesting complexes consisting of proteins and light-absorbing pigments such as chlorophyll. Using ultrafast spectroscopy, a technique that uses extremely short laser pulses to study events that happen on timescales of femtoseconds to nanoseconds, scientists have been able to study how energy moves within a single one of these proteins. However, studying how energy travels between these proteins has proven much more challenging because it requires positioning multiple proteins in a controlled way.

    To create an experimental setup where they could measure how energy travels between two proteins, the MIT team designed synthetic nanoscale membranes with a composition similar to those of naturally occurring cell membranes. By controlling the size of these membranes, known as nanodiscs, they were able to control the distance between two proteins embedded within the discs.

    For this study, the researchers embedded two versions of the primary light-harvesting protein found in purple bacteria, known as LH2 and LH3, into their nanodiscs. LH2 is the protein that is present during normal light conditions, and LH3 is a variant that is usually expressed only during low light conditions.

    Using the cryo-electron microscope at the MIT.nano facility, the researchers could image their membrane-embedded proteins and show that they were positioned at distances similar to those seen in the native membrane. They were also able to measure the distances between the light-harvesting proteins, which were on the scale of 2.5 to 3 nanometers.

    Disordered is better

    Because LH2 and LH3 absorb slightly different wavelengths of light, it is possible to use ultrafast spectroscopy to observe the energy transfer between them. For proteins spaced closely together, the researchers found that it takes about 6 picoseconds for a photon of energy to travel between them. For proteins farther apart, the transfer takes up to 15 picoseconds.

    Faster travel translates to more efficient energy transfer, because the longer the journey takes, the more energy is lost during the transfer.

    “When a photon gets absorbed, you only have so long before that energy gets lost through unwanted processes such as nonradiative decay, so the faster it can get converted, the more efficient it will be,” Schlau-Cohen says.
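
    That tradeoff can be made concrete with a simple kinetic estimate: the transfer step competes with loss channels, so the efficiency is the transfer rate divided by the sum of the rates. The roughly 1-nanosecond loss timescale below is an assumed, typical excited-state lifetime, not a value reported in the study.

    ```python
    # Back-of-the-envelope competition between inter-protein transfer and loss.
    def transfer_efficiency(tau_transfer_ps, tau_loss_ps=1000.0):
        k_transfer = 1.0 / tau_transfer_ps   # transfer rate (1/ps)
        k_loss = 1.0 / tau_loss_ps           # assumed loss rate (1/ps)
        return k_transfer / (k_transfer + k_loss)

    for tau in (6.0, 15.0):   # picoseconds, the measured transfer times quoted above
        print(f"tau = {tau:4.1f} ps -> efficiency ~ {transfer_efficiency(tau):.3f}")
    ```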

    The researchers also found that proteins arranged in a lattice structure showed less efficient energy transfer than proteins that were arranged in randomly organized structures, as they usually are in living cells.

    “Ordered organization is actually less efficient than the disordered organization of biology, which we think is really interesting because biology tends to be disordered. This finding tells us that that may not just be an inevitable downside of biology, but organisms may have evolved to take advantage of it,” Schlau-Cohen says.

    Now that they have established the ability to measure inter-protein energy transfer, the researchers plan to explore energy transfer between other proteins, such as the transfer between proteins of the antenna to proteins of the reaction center. They also plan to study energy transfer between antenna proteins found in organisms other than purple bacteria, such as green plants.

    The research was funded primarily by the U.S. Department of Energy.

  • Exploring the bow shock and beyond

    For most people, the night sky conjures a sense of stillness, an occasional shooting star the only visible movement. A conversation with Rishabh Datta, however, unveils the supersonic drama crashing above planet Earth. The PhD candidate has focused his recent study on the plasma speeding through space, flung from sources like the sun’s corona and headed toward Earth, halted abruptly by colliding with the planet’s magnetosphere. The resulting shock wave is similar to the “bow shock” that forms around the nose cone of a supersonic jet, which manifests as the familiar sonic boom.

    The bow shock phenomenon has been well studied. “It’s probably one of the things that’s keeping life alive,” says Datta, “protecting us from the solar wind.” While he feels the magnetosphere provides “a very interesting space laboratory,” Datta’s main focus is, “Can we create this high-energy plasma that is moving supersonically in a laboratory, and can we study it? And can we learn things that are hard to diagnose in an astrophysical plasma?”

    Datta’s research journey to the bow shock and beyond began when he joined a research program for high school students at the National University of Singapore. Tasked with culturing bacteria and measuring the amount of methane they produced in a biogas tank, Datta found his first research experience “quite nasty.”

    “I was working with chicken manure, and every day I would come home smelling completely awful,” he says.

    As an undergraduate at Georgia Tech, Datta’s interests turned toward solar power, compelled by a new technology he felt could generate sustainable energy. By the time he joined MIT’s Department of Mechanical Engineering, though, his interests had morphed into researching the heat and mass transfer from airborne droplets. After a year of study, he felt the need to go in yet another direction.

    The subject of astrophysical plasmas had recently piqued his interest, and he followed his curiosity to Department of Nuclear Science and Engineering Professor Nuno Loureiro’s introductory plasma class. There he encountered Professor Jack Hare, who was sitting in on the class and looking for students to work with him.

    “And that’s how I ended up doing plasma physics and studying bow shocks,” he says, “a long and circuitous route that started with culturing bacteria.”

    Gathering measurements from MAGPIE

    Datta is interested in what he can learn about plasma from gathering measurements of a laboratory-created bow shock, seeking to verify theoretical models. He uses data already collected from experiments on a pulsed-power generator known as MAGPIE (the Mega Ampere Generator for Plasma Implosion Experiments), located at Imperial College London. By observing how long it takes a plasma to reach an obstacle, in this case a probe that measures magnetic fields, Datta was able to determine its velocity.

    With the velocity established, an interferometry system was able to provide images of the probe and the plasma around it, allowing Datta to characterize the structure of the bow shock.

    “The shape depends on how fast sound waves can travel in a plasma,” says Datta. “And this ‘sound speed’ depends on the temperature.”

    The interdependency of these characteristics means that by imaging a shock it’s possible to determine temperature, sound speed, and other measurements more easily and cheaply than with other methods.
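
    A back-of-the-envelope version of that chain of inference, with invented numbers, might look like the following. It assumes the standard Mach-angle relation for the far-field shock and a simple ideal-gas sound speed, which are simplifications of the actual analysis.

    ```python
    # Illustrative only: infer flow speed, Mach number, sound speed, and a rough
    # temperature from time-of-flight and shock geometry. Numbers are invented.
    import math

    distance_m = 0.01            # source-to-probe distance (assumed)
    time_of_flight_s = 2e-7      # plasma arrival time at the probe (assumed)
    v = distance_m / time_of_flight_s              # flow velocity

    half_angle_deg = 30.0                          # shock opening half-angle (assumed)
    mach = 1.0 / math.sin(math.radians(half_angle_deg))   # Mach-angle relation
    c_s = v / mach                                 # inferred sound speed

    # Sound speed sets the temperature: c_s ~ sqrt(gamma * k_B * T / m_i)
    k_B, m_i, gamma = 1.38e-23, 1.67e-27, 5.0 / 3.0       # SI units, hydrogen ions
    T_kelvin = c_s**2 * m_i / (gamma * k_B)
    print(f"v = {v:.3g} m/s, M = {mach:.2f}, c_s = {c_s:.3g} m/s, T ~ {T_kelvin:.3g} K")
    ```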

    “And knowing more about your plasma allows you to make predictions about, for example, electrical resistivity, which can be important for understanding other physics that might interest you,” says Datta, “like magnetic reconnection.”

    This phenomenon, which controls the evolution of such violent events as solar flares, coronal mass ejections, magnetic storms that drive auroras, and even disruptions in fusion tokamaks, has become the focus of his recent research. It happens when opposing magnetic fields in a plasma break and then reconnect, generating vast quantities of heat and accelerating the plasma to high velocities.

    Onward to Z

    Datta travels to Sandia National Laboratories in Albuquerque, New Mexico, to work on the largest pulsed power facility in the world, informally known as “the Z machine,” to research how the properties of magnetic reconnection change when a plasma emits strong radiation and cools rapidly.

    In future years, Datta will only have to travel across Albany Street on the MIT campus to work on yet another machine, PUFFIN, currently being built at the Plasma Science and Fusion Center (PSFC). Like MAGPIE and Z, PUFFIN is a pulsed power facility, but with the ability to drive the current 10 times longer than other machines, opening up new opportunities in high-energy-density laboratory astrophysics.

    Hare, who leads the PUFFIN team, is pleased with Datta’s increasing experience.

    “Working with Rishabh is a real pleasure,” he says. “He has quickly learned the ins and outs of experimental plasma physics, often analyzing data from machines he hasn’t even yet had the chance to see! While we build PUFFIN it’s really useful for us to carry out experiments at other pulsed-power facilities worldwide, and Rishabh has already written papers on results from MAGPIE, COBRA at Cornell in Ithaca, New York, and the Z machine.”

    Pursuing climate action at MIT

    Hand-in-hand with Datta’s quest to understand plasma is his pursuit of sustainability, including carbon-free energy solutions. A member of the Graduate Student Council’s Sustainability Committee since he arrived in 2019, he was heartened when MIT, revising its climate action plan, provided him and other students the chance to be involved in decision-making. He led focus groups to provide graduate student input on the plan, raising issues surrounding campus decarbonization, the need to expand hiring of early-career researchers working on climate and sustainability, and waste reduction and management for MIT laboratories.

    When not focused on bringing astrophysics to the laboratory, Datta sometimes experiments in a lab closer to home — the kitchen — where he often challenges himself to duplicate a recipe he has recently tried at a favorite restaurant. His stated ambition could apply to his sustainability work as well as to his pursuit of understanding plasma.

    “The goal is to try and make it better,” he says. “I try my best to get there.”

    Datta’s work has been funded, in part, by the National Science Foundation, National Nuclear Security Administration, and the Department of Energy.

  • Moving perovskite advancements from the lab to the manufacturing floor

    The following was issued as a joint announcement from MIT.nano and the MIT Research Laboratory for Electronics; CubicPV; Verde Technologies; Princeton University; and the University of California at San Diego.

    Tandem solar cells are made of stacked materials — such as silicon paired with perovskites — that together absorb more of the solar spectrum than single materials, resulting in a dramatic increase in efficiency. Their potential to generate significantly more power than conventional cells could make a meaningful difference in the race to combat climate change and the transition to a clean-energy future.

    However, current methods to create stable and efficient perovskite layers require time-consuming, painstaking rounds of design iteration and testing, inhibiting their development for commercial use. Today, the U.S. Department of Energy Solar Energy Technologies Office (SETO) announced that MIT has been selected to receive an $11.25 million cost-shared award to establish a new research center to address this challenge by using a co-optimization framework guided by machine learning and automation.

    A collaborative effort with lead industry participant CubicPV, solar startup Verde Technologies, and academic partners Princeton University and the University of California San Diego (UC San Diego), the center will bring together teams of researchers to support the creation of perovskite-silicon tandem solar modules that are co-designed for both stability and performance, with goals to significantly accelerate R&D and the transfer of these achievements into commercial environments.

    “Urgent challenges demand rapid action. This center will accelerate the development of tandem solar modules by bringing academia and industry into closer partnership,” says MIT professor of mechanical engineering Tonio Buonassisi, who will direct the center. “We’re grateful to the Department of Energy for supporting this powerful new model and excited to get to work.”

    Adam Lorenz, CTO of solar energy technology company CubicPV, stresses the importance of thinking about scale, alongside quality and efficiency, to accelerate the perovskite effort into the commercial environment. “Instead of chasing record efficiencies with tiny pixel-sized devices and later attempting to stabilize them, we will simultaneously target stability, reproducibility, and efficiency,” he says. “It’s a module-centric approach that creates a direct channel for R&D advancements into industry.”

    The center will be named Accelerated Co-Design of Durable, Reproducible, and Efficient Perovskite Tandems, or ADDEPT. The grant will be administered through the MIT Research Laboratory for Electronics (RLE).

    David Fenning, associate professor of nanoengineering at UC San Diego, has worked with Buonassisi on the idea of merging materials, automation, and computation, specifically in this field of artificial intelligence and solar, since 2014. Now, a central thrust of the ADDEPT project will be to deploy machine learning and robotic screening to optimize processing of perovskite-based solar materials for efficiency and durability.

    “We have already seen early indications of successful technology transfer between our UC San Diego robot PASCAL and industry,” says Fenning. “With this new center, we will bring research labs and the emerging perovskite industry together to improve reproducibility and reduce time to market.”

    “Our generation has an obligation to work collaboratively in the fight against climate change,” says Skylar Bagdon, CEO of Verde Technologies, which received the American-Made Perovskite Startup Prize. “Throughout the course of this center, Verde will do everything in our power to help this brilliant team transition lab-scale breakthroughs into the world where they can have an impact.”

    Several of the academic partners echoed the importance of the joint effort between academia and industry. Barry Rand, professor of electrical and computer engineering at the Andlinger Center for Energy and the Environment at Princeton University, pointed to the intersection of scientific knowledge and market awareness. “Understanding how chemistry affects films and interfaces will empower us to co-design for stability and performance,” he says. “The center will accelerate this use-inspired science, with close guidance from our end customers, the industry partners.”

    A critical resource for the center will be MIT.nano, a 200,000-square-foot research facility set in the heart of the campus. MIT.nano Director Vladimir Bulović, the Fariborz Maseeh (1990) Professor of Emerging Technology, says he envisions MIT.nano as a hub for industry and academic partners, facilitating technology development and transfer through shared lab space, open-access equipment, and streamlined intellectual property frameworks.

    “MIT has a history of groundbreaking innovation using perovskite materials for solar applications,” says Bulović. “We’re thrilled to help build on that history by anchoring ADDEPT at MIT.nano and working to help the nation advance the future of these promising materials.”

    MIT was selected as a part of the SETO Fiscal Year 2022 Photovoltaics (PV) funding program, an effort to reduce costs and supply chain vulnerabilities, further develop durable and recyclable solar technologies, and advance perovskite PV technologies toward commercialization. ADDEPT is one project that will tackle perovskite durability, which will extend module life. The overarching goal of these projects is to lower the levelized cost of electricity generated by PV.

    Research groups involved with the ADDEPT project at MIT include Buonassisi’s Accelerated Materials Laboratory for Sustainability (AMLS), Bulović’s Organic and Nanostructured Electronics (ONE) Lab, and the Bawendi Group, led by Lester Wolfe Professor of Chemistry Moungi Bawendi. Also working on the project is Jeremiah Mwaura, research scientist in the ONE Lab.

  • Even as temperatures rise, this hydrogel material keeps absorbing moisture

    The vast majority of absorbent materials will lose their ability to retain water as temperatures rise. This is why our skin starts to sweat and why plants dry out in the heat. Even materials that are designed to soak up moisture, such as the silica gel packs in consumer packaging, will lose their sponge-like properties as their environment heats up.

    But one material appears to uniquely resist heat’s drying effects. MIT engineers have now found that polyethylene glycol (PEG) — a hydrogel commonly used in cosmetic creams, industrial coatings, and pharmaceutical capsules — can absorb moisture from the atmosphere even as temperatures climb.

    The material doubles its water absorption as temperatures climb from 25 to 50 degrees Celsius (77 to 122 degrees Fahrenheit), the team reports.

    PEG’s resilience stems from a heat-triggered transformation. As its surroundings heat up, the hydrogel’s microstructure morphs from a crystal to a less organized “amorphous” phase, which enhances the material’s ability to capture water.

    Based on PEG’s unique properties, the team developed a model that can be used to engineer other heat-resistant, water-absorbing materials. The group envisions such materials could one day be made into devices that harvest moisture from the air for drinking water, particularly in arid desert regions. The materials could also be incorporated into heat pumps and air conditioners to more efficiently regulate temperature and humidity.

    “A huge amount of energy consumption in buildings is used for thermal regulation,” says Lenan Zhang, a research scientist in MIT’s Department of Mechanical Engineering. “This material could be a key component of passive climate-control systems.”

    Zhang and his colleagues detail their work in a study appearing today in Advanced Materials. MIT co-authors include Xinyue Liu, Bachir El Fil, Carlos Diaz-Marin, Yang Zhong, Xiangyu Li, and Evelyn Wang, along with Shaoting Lin of Michigan State University.

    Against intuition

    Evelyn Wang’s group in MIT’s Device Research Lab aims to address energy and water challenges through the design of new materials and devices that sustainably manage water and heat. The team discovered PEG’s unusual properties as they were assessing a slew of similar hydrogels for their water-harvesting abilities.

    “We were looking for a high-performance material that could capture water for different applications,” Zhang says. “Hydrogels are a perfect candidate, because they are mostly made of water and a polymer network. They can simultaneously expand as they absorb water, making them ideal for regulating humidity and water vapor.”

    The team analyzed a variety of hydrogels, including PEG, by placing each material on a scale that was set within a climate-controlled chamber. A material became heavier as it absorbed more moisture. By recording a material’s changing weight, the researchers could track its ability to absorb moisture as they tuned the chamber’s temperature and humidity.
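
    The bookkeeping behind those measurements is simple: water uptake is the extra mass relative to the dry gel at each chamber condition. A minimal sketch with invented masses:

    ```python
    # Gravimetric uptake from chamber measurements (all masses invented).
    dry_mass_g = 1.00
    recorded = {                # (temperature in C, relative humidity in %) -> measured mass in g
        (25, 30): 1.10,
        (50, 30): 1.20,         # a PEG-like doubling of uptake as temperature rises
    }
    for (temp_c, rh_percent), mass_g in recorded.items():
        uptake = (mass_g - dry_mass_g) / dry_mass_g
        print(f"{temp_c} C, {rh_percent}% RH: {uptake:.2f} g water per g dry gel")
    ```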

    What they observed was typical of most materials: as the temperature increased, the hydrogels’ ability to capture moisture from the air decreased. The reason for this temperature dependence is well understood: With heat comes motion, and at higher temperatures, water molecules move faster and are therefore more difficult to contain in most materials.

    “Our intuition tells us that at higher temperatures, materials tend to lose their ability to capture water,” says co-author Xinyue Liu. “So, we were very surprised by PEG because it has this inverse relationship.”

    In fact, they found that PEG grew heavier and continued to absorb water as the researchers raised the chamber’s temperature from 25 to 50 degrees Celsius.

    “At first, we thought we had measured some errors, and thought this could not be possible,” Liu says. “After we double-checked everything was correct in the experiment, we realized this was really happening, and this is the only known material that shows increasing water absorbing ability with higher temperature.”

    A lucky catch

    The group zeroed in on PEG to try and identify the reason for its unusual, heat-resilient performance. They found that the material has a natural melting point at around 50 degrees Celsius, meaning that the hydrogel’s normally crystal-like microstructure completely breaks down and transforms into an amorphous phase. Zhang says that this melted, amorphous phase provides more opportunity for polymers in the material to grab hold of any fast-moving water molecules.

    “In the crystal phase, there might be only a few sites on a polymer available to attract water and bind,” Zhang says. “But in the amorphous phase, you might have many more sites available. So, the overall performance can increase with increased temperature.”

    The team then developed a theory to predict how hydrogels absorb water, and showed that the theory could also explain PEG’s unusual behavior if the researchers added a “missing term” to the theory. That missing term was the effect of phase transformation. They found that when they included this effect, the theory could predict PEG’s behavior, along with that of other temperature-limiting hydrogels.
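
    The flavor of that missing term can be shown with a deliberately crude toy model (not the authors’ theory): let the number of water-binding sites rise smoothly as the crystalline fraction melts away near the transition temperature, while the usual thermal penalty pushes uptake down. All parameters are invented.

    ```python
    # Toy model only: equilibrium uptake proportional to available binding sites,
    # with a smooth crystal-to-amorphous transition adding sites near the melting
    # point and an ordinary thermal penalty reducing uptake as molecules speed up.
    import math

    def relative_uptake(temp_c, t_melt_c=50.0, width_c=5.0,
                        sites_crystal=1.0, sites_amorphous=2.0, thermal_slope=0.004):
        amorphous_fraction = 1.0 / (1.0 + math.exp(-(temp_c - t_melt_c) / width_c))
        sites = sites_crystal + (sites_amorphous - sites_crystal) * amorphous_fraction
        return sites * (1.0 - thermal_slope * (temp_c - 25.0))

    for t in (25, 35, 45, 50):
        print(f"{t} C: relative uptake {relative_uptake(t):.2f}")
    ```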

    The discovery of PEG’s unique properties came about in large part by chance. The material’s melting temperature just happens to fall within the range where water is a liquid, enabling the researchers to catch PEG’s phase transformation and its resulting super-soaking behavior. The other hydrogels happen to have melting temperatures that fall outside this range. But the researchers suspect that these materials are also capable of similar phase transformations once they hit their melting temperatures.

    “Other polymers could in theory exhibit this same behavior, if we can engineer their melting points within a selected temperature range,” says team member Shaoting Lin.

    Now that the group has worked out a theory, they plan to use it as a blueprint to design materials specifically for capturing water at higher temperatures.

    “We want to customize our design to make sure a material can absorb a relatively high amount of water, at low humidity and high temperatures,” Liu says. “Then it could be used for atmospheric water harvesting, to bring people potable water in hot, arid environments.”

    This research was supported, in part, by the U.S. Department of Energy’s Office of Energy Efficiency and Renewable Energy.

  • Evelyn Wang appointed as director of US Department of Energy’s Advanced Research Projects Agency-Energy

    On Thursday, the United States Senate confirmed the appointment of Evelyn Wang, the Ford Professor of Engineering and head of the Department of Mechanical Engineering, as director of the Department of Energy’s (DOE) Advanced Research Projects Agency-Energy (ARPA-E).

    “I am deeply honored by the opportunity to serve as the director of ARPA-E. I’d like to thank President Biden, for his nomination to this important role, and Secretary Granholm, for her confidence in my abilities. I am thrilled to be joining the incredibly talented team at ARPA-E and look forward to helping bring innovative energy technologies that bolster our nation’s economy and national security to market,” says Wang. 

    An internationally recognized leader in applying nanotechnology to heat transfer, Wang has developed a number of high-efficiency, clean energy, and clean water solutions. Wang received a bachelor’s degree in mechanical engineering from MIT in 2000. After receiving her master’s degree and PhD from Stanford University, she returned to MIT as a faculty member in 2007. In 2018, she was named department head of MIT’s Department of Mechanical Engineering.

    As director of ARPA-E, Wang will advance the agency’s mission to fund and support early-stage energy research that has the potential to impact energy generation, storage, and use. The agency helps researchers commercialize innovative technologies that, according to ARPA-E, “have the potential to radically improve U.S. economic prosperity, national security, and environmental well-being.”

    “I am so grateful to the Senate for confirming Dr. Evelyn Wang to serve as Director of DOE’s Advanced Research Projects Agency-Energy,” U.S. Secretary of Energy Jennifer M. Granholm said in a statement today. “Now more than ever, we rely on ARPA-E to support early-stage energy technologies that will help us tackle climate change and strengthen American competitiveness. Dr. Wang’s experience and expertise with groundbreaking research will ensure that ARPA-E continues its role as a key engine of innovation and climate action. I am deeply grateful for Dr. Wang’s willingness to serve the American people, and we’re so excited to welcome her to DOE.” 

    Wang has served as principal investigator of MIT’s Device Research Lab. She and her team have developed a number of devices that offer solutions to the world’s many energy and water challenges. These devices include an aerogel that drastically improves window insulation, a high-efficiency solar powered desalination system, a radiative cooling device that requires no electricity, and a system that pulls potable water out of air, even in arid conditions.

    Throughout her career, Wang has been recognized with multiple awards and honors. In 2021, she was elected as a Fellow of the American Association for the Advancement of Science. She received the American Society of Mechanical Engineers (ASME) Gustus L. Larson Memorial Award for outstanding achievement in mechanical engineering in 2017 and was named an ASME Fellow in 2015. Having mentored and advised hundreds of students at MIT, Wang was honored with an MIT Committed to Caring Award for her commitment to mentoring graduate students. She has also served as co-chair of the inaugural Rising Stars in Mechanical Engineering program to encourage women graduate students and postdocs considering future careers in academia.

    As department head, Wang has led and implemented a variety of strategic research, educational, and community initiatives in MIT’s Department of Mechanical Engineering. Alongside other departmental leaders, she led a focus on groundbreaking research advances that help address several “grand challenges” that our world faces. She worked closely with faculty and teaching staff on developing educational offerings that prepare the next generation of mechanical engineers for the workforce. She also championed new initiatives to make the department a more diverse, equitable, and inclusive community for students, faculty, and staff. 

    Wang, who is stepping down as department head effective immediately in light of her confirmation, will be taking a temporary leave as a faculty member at MIT while she serves in this role. MIT School of Engineering Dean Anantha Chandrakasan will share plans for the search for her replacement with the mechanical engineering community in the coming days.

    Once sworn in, Wang will officially assume her role as director of ARPA-E.

  • MIT scientists contribute to National Ignition Facility fusion milestone

    On Monday, Dec. 5, at around 1 a.m., a tiny sphere of deuterium-tritium fuel surrounded by a cylindrical can of gold called a hohlraum was targeted by 192 lasers at the National Ignition Facility (NIF) at Lawrence Livermore National Laboratory (LLNL) in California. Over the course of billionths of a second, the lasers fired, generating X-rays inside the gold can, and imploding the sphere of fuel.

    On that morning, for the first time ever, the lasers delivered 2.1 megajoules of energy and yielded 3.15 megajoules in return, achieving a historic fusion energy gain well above 1 — a result verified by diagnostic tools developed by the MIT Plasma Science and Fusion Center (PSFC). The use and importance of these tools were referenced by Arthur Pak, an LLNL staff scientist who spoke at a U.S. Department of Energy press event on Dec. 13 announcing the NIF’s success.
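
    The headline gain follows directly from those two figures; as a quick check:

    ```python
    # Target gain: fusion energy released divided by laser energy delivered.
    laser_energy_mj = 2.1
    fusion_yield_mj = 3.15
    print(f"target gain = {fusion_yield_mj / laser_energy_mj:.2f}")   # ~1.5
    ```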

    Johan Frenje, head of the PSFC High-Energy-Density Physics division, notes that this milestone “will have profound implications for laboratory fusion research in general.”

    Since the late 1950s, researchers worldwide have pursued fusion ignition and energy gain in a laboratory, considering it one of the grand challenges of the 21st century. Ignition can only be reached when the internal fusion heating power is high enough to overcome the physical processes that cool the fusion plasma, creating a positive thermodynamic feedback loop that very rapidly increases the plasma temperature. In the case of inertial confinement fusion, the method used at the NIF, ignition can initiate a “fuel burn propagation” into the surrounding dense and cold fuel, and when done correctly, enable fusion-energy gain.

    Frenje and his PSFC division initially designed dozens of diagnostic systems that were implemented at the NIF, including the vitally important magnetic recoil neutron spectrometer (MRS), which measures the neutron energy spectrum, from which fusion yield, plasma ion temperature, and spherical fuel pellet compression (“fuel areal density”) can be determined. Overseen by PSFC Research Scientist Maria Gatu Johnson since 2013, the MRS is one of two systems at the NIF relied upon to measure the absolute neutron yield from the Dec. 5 experiment because of its unique ability to accurately interpret an implosion’s neutron signals.
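
    As one concrete example of what such a spectrum encodes (a textbook-level approximation, not the MRS analysis itself), the thermal broadening of the 14.1 MeV deuterium-tritium neutron peak grows with ion temperature roughly as 177 keV times the square root of the temperature in keV, so a measured peak width can be inverted for an estimate of the ion temperature:

    ```python
    # Textbook-style estimate, not the actual MRS analysis: DT neutron peak width
    # (FWHM) broadens approximately as 177 keV * sqrt(T_i [keV]).
    measured_fwhm_kev = 350.0                      # hypothetical measured peak width
    t_ion_kev = (measured_fwhm_kev / 177.0) ** 2   # inferred ion temperature
    print(f"inferred ion temperature ~ {t_ion_kev:.1f} keV")
    ```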

    “Before the announcement of this historic achievement could be made, the LLNL team wanted to wait until Maria had analyzed the MRS data to an adequate level for a fusion yield to be determined,” says Frenje.

    Response around MIT to NIF’s announcement has been enthusiastic and hopeful. “This is the kind of breakthrough that ignites the imagination,” says Vice President for Research Maria Zuber, “reminding us of the wonder of discovery and the possibilities of human ingenuity. Although we have a long, hard path ahead of us before fusion can deliver clean energy to the electrical grid, we should find much reason for optimism in today’s announcement. Innovation in science and technology holds great power and promise to address some of the world’s biggest challenges, including climate change.”

    Frenje also credits the rest of the team at the PSFC’s High-Energy-Density Physics division, the Laboratory for Laser Energetics at the University of Rochester, LLNL, and other collaborators for their support and involvement in this research, as well as the National Nuclear Security Administration of the Department of Energy, which has funded much of their work since the early 1990s. He is also proud of the number of MIT PhDs that have been generated by the High-Energy-Density Physics Division and subsequently hired by LLNL, including the experimental lead for this experiment, Alex Zylstra PhD ’15.

    “This is really a team effort,” says Frenje. “Without the scientific dialogue and the extensive know-how at the HEDP Division, the critical contributions made by the MRS system would not have happened.”