More stories

  • Advancing the energy transition amidst global crises

    “The past six years have been the warmest on the planet, and our track record on climate change mitigation is drastically short of what it needs to be,” said Robert C. Armstrong, MIT Energy Initiative (MITEI) director and the Chevron Professor of Chemical Engineering, introducing MITEI’s 15th Annual Research Conference.

    At the symposium, participants from academia, industry, and finance acknowledged the deepening difficulties of decarbonizing a world rocked by geopolitical conflicts and suffering from supply chain disruptions, energy insecurity, inflation, and a persistent pandemic. In spite of this grim backdrop, the conference offered evidence of significant progress in the energy transition. Researchers provided glimpses of a low-carbon future, presenting advances in such areas as long-duration energy storage, carbon capture, and renewable technologies.

    In his keynote remarks, Ernest J. Moniz, the Cecil and Ida Green Professor of Physics and Engineering Systems Emeritus, founding director of MITEI, and former U.S. secretary of energy, highlighted “four areas that have materially changed in the last year” that could shake up, and possibly accelerate, efforts to address climate change.

    Extreme weather seems to be propelling the public and policy makers of both U.S. parties toward “convergence … at least in recognition of the challenge,” Moniz said. He perceives a growing consensus that climate goals will require — in diminishing order of certainty — firm (always-on) power to complement renewable energy sources, a fuel (such as hydrogen) flowing alongside electricity, and removal of atmospheric carbon dioxide (CO2).

    Russia’s invasion of Ukraine, with its “weaponization of natural gas” and global energy impacts, underscores the idea that climate, energy security, and geopolitics “are now more or less recognized widely as one conversation.” Moniz pointed as well to new U.S. laws on climate change and infrastructure that will amplify the role of science and technology and “address the drive to technological dominance by China.”

    The rapid transformation of energy systems will require a comprehensive industrial policy, Moniz said. Government and industry must select and rapidly develop low-carbon fuels, firm power sources (possibly including nuclear power), CO2 removal systems, and long-duration energy storage technologies. “We will need to make progress on all fronts literally in this decade to come close to our goals for climate change mitigation,” he concluded.

    Global cooperation?

    Over two days, conference participants delved into many of the issues Moniz raised. In one of the first panels, scholars pondered whether the international community could forge a coordinated climate change response. The United States’ rift with China, especially over technology trade policies, loomed large.

    “Hatred of China is a bipartisan hobby and passion, but a blanket approach isn’t right, even for the sake of national security,” said Yasheng Huang, the Epoch Foundation Professor of Global Economics and Management at the MIT Sloan School of Management. “Although the United States and China working together would have huge effects for both countries, it is politically unpalatable in the short term,” said F. Taylor Fravel, the Arthur and Ruth Sloan Professor of Political Science and director of the MIT Security Studies Program. John E. Parsons, deputy director for research at the MIT Center for Energy and Environmental Policy Research, suggested that the United States should use this moment “to get our own act together … and start doing things,” such as building nuclear power plants in a cost-effective way.

    Debating carbon removal

    Several panels took up the matter of carbon emissions and the most promising technologies for contending with them. Charles Harvey, MIT professor of civil and environmental engineering, and Howard Herzog, a senior research engineer at MITEI, set the stage early, debating whether capturing carbon was essential to reaching net-zero targets.

    “I have no trouble getting to net zero without carbon capture and storage,” said David Keith, the Gordon McKay Professor of Applied Physics at Harvard University, in a subsequent roundtable. To Keith, carbon capture seems riskier than solar geoengineering, which involves injecting sulfur into the stratosphere to counteract the warming effects of CO2.

    There are new ways of moving carbon from where it’s a problem to where it’s safer. Kripa K. Varanasi, MIT professor of mechanical engineering, described a process for modulating the pH of ocean water to remove CO2. Timothy Krysiek, managing director for Equinor Ventures, talked about construction of a 900-kilometer pipeline transporting CO2 from northern Germany to a large-scale storage site located in Norwegian waters 3,000 meters below the seabed. “We can use these offshore Norwegian assets as a giant carbon sink for Europe,” he said.

    A startup showcase featured additional approaches to the carbon challenge. Mantel, which received MITEI Seed Fund money, is developing molten salt material to capture carbon for long-term storage or for use in generating electricity. Verdox has come up with an electrochemical process for capturing dilute CO2 from the atmosphere.

    But while much of the global warming discussion focuses on CO2, other greenhouse gases are menacing. Another panel discussed measuring and mitigating these pollutants. “Methane has 82 times more warming power than CO2 from the point of emission,” said Desirée L. Plata, MIT associate professor of civil and environmental engineering. “Cutting methane is the strongest lever we have to slow climate change in the next 25 years — really the only lever.”
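
    To put that multiplier in concrete terms, the short sketch below converts a methane release into CO2-equivalent emissions using a 20-year global warming potential of about 82, consistent with the figure Plata cites; the leak size is a made-up example, not a number from the panel.

    ```python
    # Back-of-the-envelope CO2-equivalent conversion for methane.
    # GWP20 ~ 82 matches the "82 times more warming power" figure quoted above;
    # the 100-tonne leak is purely illustrative.

    GWP20_CH4 = 82.0

    def co2_equivalent_tonnes(ch4_tonnes, gwp=GWP20_CH4):
        """Tonnes of CO2 with the same 20-year warming impact as the given methane."""
        return ch4_tonnes * gwp

    leak_t = 100.0  # hypothetical annual methane leak, in tonnes
    print(f"{leak_t:.0f} t CH4/yr is roughly {co2_equivalent_tonnes(leak_t):,.0f} t CO2-eq/yr over 20 years")
    ```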

    Steven Hamburg, chief scientist and senior vice president of the Environmental Defense Fund, cautioned that emission of hydrogen molecules into the atmosphere can cause increases in other greenhouse gases such as methane, ozone, and water vapor. As researchers and industry turn to hydrogen as a fuel or as a feedstock for commercial processes, “we will need to minimize leakage … or risk increasing warming,” he said.

    Supply chains, markets, and new energy ventures

    In panels on energy storage and the clean energy supply chain, there were interesting discussions of challenges ahead. Materials such as lithium, cobalt, nickel, copper, and vanadium, needed for grid-scale energy storage, electric vehicles (EVs), and other clean energy technologies, can be difficult to source. “These often come from water-stressed regions, and we need to be super thoughtful about environmental stresses,” said Elsa Olivetti, the Esther and Harold E. Edgerton Associate Professor in Materials Science and Engineering. She also noted that, in light of the explosive growth in demand for metals such as lithium, recycling EVs won’t be of much help until EVs are much further along in their adoption cycle. “The amount of material coming back from end-of-life batteries is minor,” she said.

    Arvind Sanger, founder and managing partner of Geosphere Capital, said that the United States should be developing its own rare earths and minerals, although gaining the know-how will take time, and overcoming “NIMBYism” (not in my backyard-ism) is a challenge. Sanger emphasized that we must continue to use “denser sources of energy” to catalyze the energy transition over the next decade. In particular, Sanger noted that “for every transition technology, steel is needed,” and steel is made in furnaces that use coal and natural gas. “It’s completely woolly-headed to think we can just go to a zero-fossil fuel future in a hurry,” he said.

    The topic of power markets occupied another panel, which focused on ways to ensure the distribution of reliable and affordable zero-carbon energy. Integrating intermittent resources such as wind and solar into the grid requires a suite of retail markets and new digital tools, said Anuradha Annaswamy, director of MIT’s Active-Adaptive Control Laboratory. Tim Schittekatte, a postdoc at the MIT Sloan School of Management, proposed auctions as a way of insuring consumers against periods of high market costs.

    Another panel described the very different investment needs of new energy startups, such as longer research and development phases. Hooisweng Ow, technology principal at Eni Next LLC Ventures, which is developing drilling technology for geothermal energy, recommends joint development and partnerships to reduce risk. Michael Kearney SM ’11, PhD ’19, SM ’19 is a partner at The Engine, a venture firm built by MIT that invests in path-breaking technology to address the toughest challenges in climate and other areas. The emergence of new technologies and markets, Kearney believes, will bring on “a labor transition on an order of magnitude never seen before in this country.” He added, “Workforce development is not a natural zone for startups … and this will have to change.”

    Supporting the global South

    The opportunities and challenges of the energy transition look quite different in the developing world. In conversation with Robert Armstrong, Luhut Binsar Pandjaitan, the coordinating minister for maritime affairs and investment of the Republic of Indonesia, reported that his “nation is rich with solar, wind, and energy transition minerals like nickel and copper,” but cannot on its own develop renewable energy, reduce carbon emissions, and improve grid infrastructure. “Education is a top priority, and we are very far behind in high technologies,” he said. “We need help and support from MIT to achieve our target.”

    Technologies that could springboard Indonesia and other nations of the global South toward their climate goals are emerging in MITEI-supported projects and at young companies MITEI helped spawn. Among the promising innovations unveiled at the conference are new materials and designs for cooling buildings in hot climates and reducing the environmental costs of construction, and a sponge-like substance that passively sucks moisture out of the air to lower the energy required for running air conditioners in humid climates.

    Other ideas on the move from lab to market have great potential for industrialized nations as well, such as a computational framework for maximizing the energy output of ocean-based wind farms; a process for using ammonia as a renewable fuel with no CO2 emissions; long-duration energy storage derived from the oxidation of iron; and a laser-based method for unlocking geothermal steam to drive power plants.

  • New materials could enable longer-lasting implantable batteries

    For the last few decades, battery research has largely focused on rechargeable lithium-ion batteries, which are used in everything from electric cars to portable electronics and have improved dramatically in terms of affordability and capacity. But nonrechargeable batteries have seen little improvement during that time, despite their crucial role in many important uses such as implantable medical devices like pacemakers.

    Now, researchers at MIT have come up with a way to improve the energy density of these nonrechargeable, or “primary,” batteries. They say it could enable up to a 50 percent increase in useful lifetime, or a corresponding decrease in size and weight for a given amount of power or energy capacity, while also improving safety, with little or no increase in cost.

    The new findings, which involve substituting the conventionally inactive battery electrolyte with a material that is active for energy delivery, are reported today in the journal Proceedings of the National Academy of Sciences, in a paper by MIT Kavanaugh Postdoctoral Fellow Haining Gao, graduate student Alejandro Sevilla, associate professor of mechanical engineering Betar Gallant, and four others at MIT and Caltech.

    Replacing the battery in a pacemaker or other medical implant requires a surgical procedure, so any increase in the longevity of their batteries could have a significant impact on the patient’s quality of life, Gallant says. Primary batteries are used for such essential applications because they can provide about three times as much energy for a given size and weight as rechargeable batteries.

    That difference in capacity, Gao says, makes primary batteries “critical for applications where charging is not possible or is impractical.” The new materials work at human body temperature, so they would be suitable for medical implants. With further development to make the batteries operate efficiently at cooler temperatures, applications could extend beyond implantable devices to sensors in tracking devices for shipments, for example to ensure that temperature and humidity requirements for food or drug shipments are maintained throughout the shipping process. The batteries might also be used in remotely operated aerial or underwater vehicles that need to remain ready for deployment over long periods.

    Pacemaker batteries typically last from five to 10 years, and even less if they require high-voltage functions such as defibrillation. Yet for such batteries, Gao says, the technology is considered mature, and “there haven’t been any major innovations in fundamental cell chemistries in the past 40 years.”

    The key to the team’s innovation is a new kind of electrolyte — the material that lies between the two electrical poles of the battery, the cathode and the anode, and allows charge carriers to pass through from one side to the other. Using a new liquid fluorinated compound, the team found that they could combine some of the functions of the cathode and the electrolyte in one compound, called a catholyte. This allows for saving much of the weight of typical primary batteries, Gao says.

    While there are other materials besides this new compound that could theoretically function in a similar catholyte role in a high-capacity battery, Gallant explains, those materials have lower inherent voltages that do not match that of the remainder of the material in a conventional pacemaker battery, a type known as CFx. Because a battery’s overall output is limited by the lesser of its two electrode materials, the extra capacity would go to waste as a result of the voltage mismatch. But with the new material, “one of the key merits of our fluorinated liquids is that their voltage aligns very well with that of CFx,” Gallant says.

    In a conventional CFx battery, the liquid electrolyte is essential because it allows charged particles to pass through from one electrode to the other. But “those electrolytes are actually chemically inactive, so they’re basically dead weight,” Gao says. This means that about 50 percent of the battery’s key components, mainly the electrolyte, is inactive material. But in the new design with the fluorinated catholyte material, the amount of dead weight can be reduced to about 20 percent, she says.
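
    To make the weight arithmetic concrete, here is a minimal, illustrative sketch under the simplifying assumption that cell-level energy density scales with the electrochemically active mass fraction; the specific-energy figure is a placeholder, not a value from the study. The toy model slightly overstates the benefit (about 60 percent rather than the roughly 50 percent the team projects) because it ignores how the catholyte’s capacity and voltage differ from those of a conventional cathode.

    ```python
    # Illustrative only: how shrinking the inactive ("dead weight") fraction of a cell
    # raises energy density, all else held equal. Numbers are placeholders.

    ACTIVE_SPECIFIC_ENERGY_WH_PER_KG = 1000.0  # hypothetical energy per kg of active material

    def cell_energy_density(inactive_fraction):
        """Wh per kg of total cell mass, assuming only the active fraction stores energy."""
        return ACTIVE_SPECIFIC_ENERGY_WH_PER_KG * (1.0 - inactive_fraction)

    conventional = cell_energy_density(0.50)  # ~50% inactive, as in a conventional CFx cell
    new_design = cell_energy_density(0.20)    # ~20% inactive, as in the catholyte design

    print(f"Conventional: {conventional:.0f} Wh/kg")
    print(f"New design:   {new_design:.0f} Wh/kg")
    print(f"Relative gain: {new_design / conventional - 1:.0%}")
    ```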

    The new cells also offer safety improvements over other proposed chemistries that rely on toxic and corrosive catholyte materials, which the team’s formula does not, Gallant says. And preliminary tests have demonstrated a stable shelf life of more than a year, an important characteristic for primary batteries, she says.

    The team has not yet experimentally achieved the full 50 percent improvement in energy density predicted by their analysis. They have demonstrated a 20 percent improvement, which in itself would be an important gain for some applications, Gallant says. The design of the cell itself has not yet been fully optimized, but the researchers can project the cell performance based on the performance of the active material itself. “We can see the projected cell-level performance when it’s scaled up can reach around 50 percent higher than the CFx cell,” she says. Achieving that level experimentally is the team’s next goal.

    Sevilla, a doctoral student in the mechanical engineering department, will be focusing on that work in the coming year. “I was brought into this project to try to understand some of the limitations of why we haven’t been able to attain the full energy density possible,” he says. “My role has been trying to fill in the gaps in terms of understanding the underlying reaction.”

    One big advantage of the new material, Gao says, is that it can easily be integrated into existing battery manufacturing processes, as a simple substitution of one material for another. Preliminary discussions with manufacturers confirm this potentially easy substitution, Gao says. The basic starting material, used for other purposes, has already been scaled up for production, she says, and its price is comparable to that of the materials currently used in CFx batteries. The cost of batteries using the new material is likely to be comparable to the existing batteries as well, she says. The team has already applied for a patent on the catholyte, and they expect that the medical applications are likely to be the first to be commercialized, perhaps with a full-scale prototype ready for testing in real devices within about a year.

    Further down the road, other applications could likely take advantage of the new materials as well, such as smart water or gas meters that can be read remotely, or devices like E-ZPass transponders, increasing their usable lifetime, the researchers say. Drone aircraft or undersea vehicles would require higher power and so may take longer to develop. Other uses could include batteries for equipment used at remote sites, such as drilling rigs for oil and gas, including devices sent down into wells to monitor conditions.

    The team also included Gustavo Hobold, Aaron Melemed, and Rui Guo at MIT and Simon Jones at Caltech. The work was supported by MIT Lincoln Laboratory and the Army Research Office.

  • Machine learning facilitates “turbulence tracking” in fusion reactors

    Fusion, which promises practically unlimited, carbon-free energy using the same processes that power the sun, is at the heart of a worldwide research effort that could help mitigate climate change.

    A multidisciplinary team of researchers is now bringing tools and insights from machine learning to aid this effort. Scientists from MIT and elsewhere have used computer-vision models to identify and track turbulent structures that appear under the conditions needed to facilitate fusion reactions.

    Monitoring the formation and movements of these structures, called filaments or “blobs,” is important for understanding the heat and particle flows exiting the reacting fuel, which ultimately determine the engineering requirements for the reactor walls to withstand those flows. However, scientists typically study blobs using averaging techniques, which trade details of individual structures for aggregate statistics. Tracking individual blobs requires marking them manually in video data.

    The researchers built a synthetic video dataset of plasma turbulence to make this process more effective and efficient. They used it to train four computer vision models, each of which identifies and tracks blobs. They trained the models to pinpoint blobs in the same ways that humans would.

    When the researchers tested the trained models using real video clips, the models could identify blobs with high accuracy — more than 80 percent in some cases. The models were also able to effectively estimate the size of blobs and the speeds at which they moved.
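
    The paper’s exact models and training pipeline are not reproduced here, but the sketch below illustrates the general pattern: apply an instance-segmentation network to a single camera frame and keep high-confidence masks as candidate blobs, from which sizes and frame-to-frame centroid speeds can be derived. The use of torchvision’s off-the-shelf Mask R-CNN with generic pretrained weights, the random stand-in frame, and the confidence threshold are all assumptions for illustration; the study trained its models on synthetic blob data.

    ```python
    import torch
    import torchvision

    # Hypothetical sketch: per-frame blob identification with an off-the-shelf
    # instance-segmentation model. The study fine-tuned its models on synthetic
    # plasma-turbulence video; here we only show generic inference.
    model = torchvision.models.detection.maskrcnn_resnet50_fpn(weights="DEFAULT")
    model.eval()

    frame = torch.rand(3, 256, 256)        # stand-in for one imaging frame (values in [0, 1])
    with torch.no_grad():
        detections = model([frame])[0]     # dict with "boxes", "scores", "masks", ...

    keep = detections["scores"] > 0.8      # assumed confidence threshold
    blob_masks = detections["masks"][keep] # one soft mask per candidate blob
    blob_sizes = blob_masks.sum(dim=(1, 2, 3))  # rough blob areas in pixels

    print(f"{len(blob_masks)} candidate blobs; pixel areas: {blob_sizes.tolist()}")
    ```

    Tracking would then link masks across consecutive frames, for example by matching centroids, to estimate blob speeds.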

    Because millions of video frames are captured during just one fusion experiment, using machine-learning models to track blobs could give scientists much more detailed information.

    “Before, we could get a macroscopic picture of what these structures are doing on average. Now, we have a microscope and the computational power to analyze one event at a time. If we take a step back, what this reveals is the power available from these machine-learning techniques, and ways to use these computational resources to make progress,” says Theodore Golfinopoulos, a research scientist at the MIT Plasma Science and Fusion Center and co-author of a paper detailing these approaches.

    His fellow co-authors include lead author Woonghee “Harry” Han, a physics PhD candidate; senior author Iddo Drori, a visiting professor in the Computer Science and Artificial Intelligence Laboratory (CSAIL), faculty associate professor at Boston University, and adjunct at Columbia University; as well as others from the MIT Plasma Science and Fusion Center, the MIT Department of Civil and Environmental Engineering, and the Swiss Federal Institute of Technology at Lausanne in Switzerland. The research appears today in Nature Scientific Reports.

    Heating things up

    For more than 70 years, scientists have sought to use controlled thermonuclear fusion reactions to develop an energy source. To reach the conditions necessary for a fusion reaction, fuel must be heated to temperatures above 100 million degrees Celsius. (The core of the sun is about 15 million degrees Celsius.)

    A common method for containing this super-hot fuel, called plasma, is to use a tokamak. These devices utilize extremely powerful magnetic fields to hold the plasma in place and control the interaction between the exhaust heat from the plasma and the reactor walls.

    However, blobs appear like filaments falling out of the plasma at the very edge, between the plasma and the reactor walls. These random, turbulent structures affect how energy flows between the plasma and the reactor.

    “Knowing what the blobs are doing strongly constrains the engineering performance that your tokamak power plant needs at the edge,” adds Golfinopoulos.

    Researchers use a unique imaging technique to capture video of the plasma’s turbulent edge during experiments. An experimental campaign may last months; a typical day will produce about 30 seconds of data, corresponding to roughly 60 million video frames, with thousands of blobs appearing each second. This makes it impossible to track all blobs manually, so researchers rely on average sampling techniques that only provide broad characteristics of blob size, speed, and frequency.

    “On the other hand, machine learning provides a solution to this by blob-by-blob tracking for every frame, not just average quantities. This gives us much more knowledge about what is happening at the boundary of the plasma,” Han says.

    He and his co-authors took four well-established computer vision models, which are commonly used for applications like autonomous driving, and trained them to tackle this problem.

    Simulating blobs

    To train these models, they created a vast dataset of synthetic video clips that captured the blobs’ random and unpredictable nature.

    “Sometimes they change direction or speed, sometimes multiple blobs merge, or they split apart. These kinds of events were not considered before with traditional approaches, but we could freely simulate those behaviors in the synthetic data,” Han says.

    Creating synthetic data also allowed them to label each blob, which made the training process more effective, Drori adds.

    Using these synthetic data, they trained the models to draw boundaries around blobs, teaching them to closely mimic what a human scientist would draw.

    Then they tested the models using real video data from experiments. First, they measured how closely the boundaries the models drew matched up with actual blob contours.

    But they also wanted to see if the models predicted objects that humans would identify. They asked three human experts to pinpoint the centers of blobs in video frames and checked to see if the models predicted blobs in those same locations.

    The models were able to draw accurate blob boundaries, overlapping with the brightness contours considered ground truth, about 80 percent of the time. Their judgments were similar to those of the human experts, and the models successfully predicted the theory-defined regime of the blobs, in agreement with the results from a traditional method.
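
    As a hedged illustration of those two checks, the sketch below computes an intersection-over-union score between a predicted blob mask and a brightness-contour mask, and counts how many expert-marked centers fall inside the prediction; all arrays and coordinates are invented for demonstration and do not come from the paper.

    ```python
    import numpy as np

    # Invented example data: a model-drawn blob mask and a brightness-contour mask.
    pred_mask = np.zeros((64, 64), dtype=bool)
    pred_mask[20:40, 22:38] = True           # hypothetical predicted blob region
    truth_mask = np.zeros((64, 64), dtype=bool)
    truth_mask[18:38, 20:36] = True          # hypothetical ground-truth contour region

    # Check 1: boundary agreement, measured as intersection over union (IoU).
    iou = np.logical_and(pred_mask, truth_mask).sum() / np.logical_or(pred_mask, truth_mask).sum()

    # Check 2: do expert-marked blob centers land inside the predicted mask?
    expert_centers = [(30, 30), (55, 10)]    # hypothetical (row, col) picks by human experts
    hits = sum(int(pred_mask[r, c]) for r, c in expert_centers)

    print(f"IoU with brightness contour: {iou:.2f}")
    print(f"{hits}/{len(expert_centers)} expert-marked centers fall inside the predicted blob")
    ```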

    Now that they have shown the success of using synthetic data and computer vision models for tracking blobs, the researchers plan to apply these techniques to other problems in fusion research, such as estimating particle transport at the boundary of a plasma, Han says.

    They also made the dataset and models publicly available, and look forward to seeing how other research groups apply these tools to study the dynamics of blobs, says Drori.

    “Prior to this, there was a barrier to entry that mostly the only people working on this problem were plasma physicists, who had the datasets and were using their methods. There is a huge machine-learning and computer-vision community. One goal of this work is to encourage participation in fusion research from the broader machine-learning community toward the broader goal of helping solve the critical problem of climate change,” he adds.

    This research is supported, in part, by the U.S. Department of Energy and the Swiss National Science Foundation.

  • In nanotube science, is boron nitride the new carbon?

    Engineers at MIT and the University of Tokyo have produced centimeter-scale structures, large enough for the eye to see, that are packed with hundreds of billions of hollow aligned fibers, or nanotubes, made from hexagonal boron nitride.

    Hexagonal boron nitride, or hBN, is a single-atom-thin material that has been dubbed “white graphene” for its transparent appearance and its similarity to carbon-based graphene in molecular structure and strength. It can also withstand higher temperatures than graphene, and is electrically insulating, rather than conductive. When hBN is rolled into nanometer-scale tubes, or nanotubes, its exceptional properties are significantly enhanced.

    The team’s results, published today in the journal ACS Nano, provide a route toward fabricating aligned boron nitride nanotubes (A-BNNTs) in bulk. The researchers plan to harness the technique to fabricate bulk-scale arrays of these nanotubes, which can then be combined with other materials to make stronger, more heat-resistant composites, for instance to shield space structures and hypersonic aircraft.

    As hBN is transparent and electrically insulating, the team also envisions incorporating the BNNTs into transparent windows and using them to electrically insulate sensors within electronic devices. The team is also investigating ways to weave the nanofibers into membranes for water filtration and for “blue energy” — a concept for renewable energy in which electricity is produced from the ionic filtering of salt water into fresh water.

    Brian Wardle, professor of aeronautics and astronautics at MIT, likens the team’s results to scientists’ decades-long, ongoing pursuit of manufacturing bulk-scale carbon nanotubes.

    “In 1991, a single carbon nanotube was identified as an interesting thing, but it’s been 30 years getting to bulk aligned carbon nanotubes, and the world’s not even fully there yet,” Wardle says. “With the work we’re doing, we’ve just short-circuited about 20 years in getting to bulk-scale versions of aligned boron nitride nanotubes.”

    Wardle is the senior author of the new study, which includes lead author and MIT research scientist Luiz Acauan, former MIT postdoc Haozhe Wang, and collaborators at the University of Tokyo.

    A vision, aligned

    Like graphene, hexagonal boron nitride has a molecular structure resembling chicken wire. In graphene, this chicken wire configuration is made entirely of carbon atoms, arranged in a repeating pattern of hexagons. For hBN, the hexagons are composed of alternating atoms of boron and nitrogen. In recent years, researchers have found that two-dimensional sheets of hBN exhibit exceptional properties of strength, stiffness, and resilience at high temperatures. When sheets of hBN are rolled into nanotube form, these properties are further enhanced, particularly when the nanotubes are aligned, like tiny trees in a densely packed forest.

    But finding ways to synthesize stable, high quality BNNTs has proven challenging. A handful of efforts to do so have produced low-quality, nonaligned fibers.

    “If you can align them, you have a much better chance of harnessing BNNTs’ properties at the bulk scale to make actual physical devices, composites, and membranes,” Wardle says.

    In 2020, Rong Xiang and colleagues at the University of Tokyo found they could produce high-quality boron nitride nanotubes by first using a conventional chemical vapor deposition approach to grow a forest of short carbon nanotubes, each a few microns long. They then coated the carbon-based forest with “precursors” of boron and nitrogen gas, which, when baked in an oven at high temperatures, crystallized onto the carbon nanotubes to form high-quality nanotubes of hexagonal boron nitride with carbon nanotubes inside.

    Burning scaffolds

    In the new study, Wardle and Acauan have extended and scaled Xiang’s approach, essentially removing the underlying carbon nanotubes and leaving the long boron nitride nanotubes to stand on their own. The team drew on the expertise of Wardle’s group, which has focused for years on fabricating high-quality aligned arrays of carbon nanotubes. In their current work, the researchers looked for ways to tweak the temperatures and pressures of the chemical vapor deposition process in order to remove the carbon nanotubes while leaving the boron nitride nanotubes intact.

    “The first few times we did it, it was completely ugly garbage,” Wardle recalls. “The tubes curled up into a ball, and they didn’t work.”

    Eventually, the team hit on a combination of temperatures, pressures, and precursors that did the trick. With this combination of processes, the researchers first reproduced the steps that Xiang took to synthesize the boron-nitride-coated carbon nanotubes. As hBN is resistant to higher temperatures than graphene, the team then cranked up the heat to burn away the underlying black carbon nanotube scaffold, while leaving the transparent, freestanding boron nitride nanotubes intact.
    [Image: By using carbon nanotubes as a scaffold, MIT engineers grow forests of “white graphene” that emerge (in an MIT pattern) after burning away the black carbon scaffold. Courtesy of the researchers.]

    In microscopic images, the team observed clear crystalline structures — evidence that the boron nitride nanotubes are of high quality. The structures were also dense: Within a square centimeter, the researchers were able to synthesize a forest of more than 100 billion aligned boron nitride nanotubes measuring about a millimeter in height — large enough to be visible by eye. By nanotube engineering standards, these dimensions are considered “bulk” in scale.

    “We are now able to make these nanoscale fibers at bulk scale, which has never been shown before,” Acauan says.

    To demonstrate the flexibility of their technique, the team synthesized larger carbon-based structures, including a weave of carbon fibers, a mat of “fuzzy” carbon nanotubes, and sheets of randomly oriented carbon nanotubes known as “buckypaper.” They coated each carbon-based sample with boron and nitrogen precursors, then went through their process to burn away the underlying carbon. In each demonstration, they were left with a boron-nitride replica of the original black carbon scaffold.

    They also were able to “knock down” the forests of BNNTs, producing horizontally aligned fiber films that are a preferred configuration for incorporating into composite materials.

    “We are now working toward fibers to reinforce ceramic matrix composites, for hypersonic and space applications where there are very high temperatures, and for windows for devices that need to be optically transparent,” Wardle says. “You could make transparent materials that are reinforced with these very strong nanotubes.”

    This research was supported, in part, by Airbus, ANSYS, Boeing, Embraer, Lockheed Martin, Saab AB, and Teijin Carbon America through MIT’s Nano-Engineered Composite aerospace STructures (NECST) Consortium.

  • Simplifying the production of lithium-ion batteries

    When it comes to battery innovations, much attention gets paid to potential new chemistries and materials. Often overlooked is the importance of production processes for bringing down costs.

    Now the MIT spinout 24M Technologies has simplified lithium-ion battery production with a new design that requires fewer materials and fewer steps to manufacture each cell. The company says the design, which it calls “SemiSolid” for its use of gooey electrodes, reduces production costs by up to 40 percent. The approach also improves the batteries’ energy density, safety, and recyclability.

    Judging by industry interest, 24M is onto something. Since coming out of stealth mode in 2015, 24M has licensed its technology to multinational companies including Volkswagen, Fujifilm, Lucas TVS, Axxiva, and Freyr. Those last three companies are planning to build gigafactories (factories with gigawatt-scale annual production capacity) based on 24M’s technology in India, China, Norway, and the United States.

    “The SemiSolid platform has been proven at the scale of hundreds of megawatts being produced for residential energy-storage systems. Now we want to prove it at the gigawatt scale,” says 24M CEO Naoki Ota, whose team includes 24M co-founder, chief scientist, and MIT Professor Yet-Ming Chiang.

    Establishing large-scale production lines is only the first phase of 24M’s plan. Another key draw of its battery design is that it can work with different combinations of lithium-ion chemistries. That means 24M’s partners can incorporate better-performing materials down the line without substantially changing manufacturing processes.

    The kind of quick, large-scale production of next-generation batteries that 24M hopes to enable could have a dramatic impact on battery adoption across society — from the cost and performance of electric cars to the ability of renewable energy to replace fossil fuels.

    “This is a platform technology,” Ota says. “We’re not just a low-cost and high-reliability operator. That’s what we are today, but we can also be competitive with next-generation chemistry. We can use any chemistry in the market without customers changing their supply chains. Other startups are trying to address that issue tomorrow, not today. Our tech can address the issue today and tomorrow.”

    A simplified design

    Chiang, who is MIT’s Kyocera Professor of Materials Science and Engineering, got his first glimpse into large-scale battery production after co-founding another battery company, A123 Systems, in 2001. As that company was preparing to go public in the late 2000s, Chiang began wondering if he could design a battery that would be easier to manufacture.

    “I got this window into what battery manufacturing looked like, and what struck me was that even though we pulled it off, it was an incredibly complicated manufacturing process,” Chiang says. “It derived from magnetic tape manufacturing that was adapted to batteries in the late 1980s.”

    In his lab at MIT, where he’s been a professor since 1985, Chiang started from scratch with a new kind of device he called a “semi-solid flow battery” that pumps liquids carrying particle-based electrodes to and from tanks to store a charge.

    In 2010, Chiang partnered with W. Craig Carter, who is MIT’s POSCO Professor of Materials Science and Engineering, and the two professors supervised a student, Mihai Duduta ’11, who explored flow batteries for his undergraduate thesis. Within a month, Duduta had developed a prototype in Chiang’s lab, and 24M was born. (Duduta was the company’s first hire.)

    But even as 24M worked with MIT’s Technology Licensing Office (TLO) to commercialize research done in Chiang’s lab, people in the company, including Duduta, began rethinking the flow battery concept. An internal cost analysis by Carter, who consulted for 24M for several years, ultimately led the researchers to change directions.

    That left the company with loads of the gooey slurry that made up the electrodes in their flow batteries. A few weeks after Carter’s cost analysis, Duduta, then a senior research scientist at 24M, decided to start using the slurry to assemble batteries by hand, mixing the gooey electrodes directly into the electrolyte. The idea caught on.

    The main components of batteries are the positively and negatively charged electrodes and the electrolyte material that allows ions to flow between them. Traditional lithium-ion batteries use solid electrodes separated from the electrolyte by layers of inert plastics and metals, which hold the electrodes in place.

    Stripping away the inert materials of traditional batteries and embracing the gooey electrode mix gives 24M’s design a number of advantages.

    For one, it eliminates the energy-intensive process of drying and solidifying the electrodes in traditional lithium-ion production. The company says it also reduces the need for more than 80 percent of the inactive materials in traditional batteries, including expensive ones like copper and aluminum. The design also requires no binder and features extra thick electrodes, improving the energy density of the batteries.

    “When you start a company, the smart thing to do is to revisit all of your assumptions and ask what is the best way to accomplish your objectives, which in our case was simply manufactured, low-cost batteries,” Chiang says. “We decided our real value was in making a lithium-ion suspension that was electrochemically active from the beginning, with electrolyte in it, and you just use the electrolyte as the processing solvent.”

    In 2017, 24M participated in the MIT Industrial Liaison Program’s STEX25 Startup Accelerator, in which Chiang and collaborators made critical industry connections that would help it secure early partnerships. 24M has also collaborated with MIT researchers on projects funded by the Department of Energy.

    Enabling the battery revolution

    Most of 24M’s partners are eyeing the rapidly growing electric vehicle (EV) market for their batteries, and the founders believe their technology will accelerate EV adoption. (Battery costs make up 30 to 40 percent of the price of EVs, according to the Institute for Energy Research).

    “Lithium-ion batteries have made huge improvements over the years, but even Elon Musk says we need some breakthrough technology,” Ota says, referring to the CEO of EV firm Tesla. “To make EVs more common, we need a production cost breakthrough; we can’t just rely on cost reduction through scaling because we already make a lot of batteries today.”

    24M is also working to prove out new battery chemistries that its partners could quickly incorporate into their gigafactories. In January of this year, 24M received a grant from the Department of Energy’s ARPA-E program to develop and scale a high-energy-density battery that uses a lithium metal anode and semi-solid cathode for use in electric aviation.

    That project is one of many around the world designed to validate new lithium-ion battery chemistries that could enable a long-sought battery revolution. As 24M continues to foster the creation of large-scale, global production lines, the team believes it is well-positioned to turn lab innovations into ubiquitous, world-changing products.

    “This technology is a platform, and our vision is to be like Google’s Android [operating system], where other people can build things on our platform,” Ota says. “We want to do that but with hardware. That’s why we’re licensing the technology. Our partners can use the same production lines to get the benefits of new chemistries and approaches. This platform gives everyone more options.”

  • Doubling down on sustainability innovation in Kendall Square

    From its new headquarters in Cambridge’s Kendall Square, The Engine is investing in a number of “tough tech” startups seeking to transform the world’s energy systems. A few blocks away, the startup Inari is using gene editing to improve seeds’ resilience to climate change. On the MIT campus nearby, researchers are working on groundbreaking innovations to meet the urgent challenges our planet faces.

    Kendall Square is known as the biotech capital of the world, but as the latest annual meeting of the Kendall Square Association (KSA) made clear, it’s also a thriving hub of sustainability-related innovation.

    The Oct. 20 event, which began at MIT’s Welcome Center before moving to the MIT Museum for a panel discussion, brought together professionals from across Cambridge’s prolific innovation ecosystem — not just entrepreneurs working at startups, but also students, restaurant and retail shop owners, and people from local nonprofits.

    Titled “[Re] Imagining a Sustainable Future,” the meeting highlighted advances in climate change technologies that are afoot in Kendall Square, to help inspire and connect the community as it works toward common sustainability goals.

    “Our focus is on building a better future together — and together is the most important word there,” KSA Executive Director Beth O’Neill Maloney said in her opening remarks. “This is an incredibly innovative ecosystem and community that’s making changes that affect us here in Kendall Square and far, far beyond.”

    The pace of change

    The main event of the evening was a panel discussion moderated by Lee McGuire, the chief communications officer of the Broad Institute of MIT and Harvard. The panel featured Stuart Brown, chief financial officer at Inari; Emily Knight, chief operating officer at The Engine; and Joe Higgins, vice president for campus services and stewardship at MIT.

    “Sustainability is obviously one of the most important — if not the most important — challenge facing us as a society today,” said McGuire, opening the discussion. “Kendall Square is known for its work in biotech, life sciences, AI, and climate, and the more we dug into it the more we realized how interconnected all of those things are. The talent in Kendall Square wants to work on problems relevant for humanity, and the tools and skills you need for that can be very similar depending on the problem you’re working on.”

    Higgins, who oversees the creation of programs to reduce MIT’s environmental impact and improve the resilience of campus operations, focused on the enormity of the problem humanity is facing. He showed the audience a map of the U.S. power grid, with power plants and transmission lines illuminated in a complex web across the country, to underscore the scale of electrification that will be needed to mitigate the worst effects of climate change.

    “The U.S. power grid is the largest machine ever made by mankind,” Higgins said. “It’s been developed over 100 years; it has 7,000 generating plants that feed into it every day; it has 7 million miles of cable and wires; there are transformers and substations; and it lives in every single one of your walls. But people don’t think about it that much.”

    Many cities, states, and organizations like MIT have made commitments to shift to 100 percent clean energy in coming decades. Higgins wanted the audience to try to grasp what that’s going to take.

    “Hundreds of millions of devices and equipment across the planet are going to have to be swapped from fossil fuel to electric-based,” Higgins said. “Our cars, appliances, processes in industry, like making steel and concrete, are going to need to come from this grid. It’ll need to undergo a major modernization and transformation. The good news is it’s already changing.”

    Multiple panelists pointed to developments like the passing of the Inflation Reduction Act to show there was progress being made in reaching urgent sustainability goals.

    “There is a tide change coming, and it’s not only being driven by private capital,” Knight said. “There’s a huge opportunity here, and it’s a really important part of this [Kendall Square] ecosystem.”

    Chief among the topics of discussion was technology development. Even as leaders implement today’s technologies to decarbonize, people in Kendall Square keep a close eye on the new tech being developed and commercialized nearby.

    “I was trying to think about where we are with gene editing,” Brown said. “CRISPR’s been around for 10 years. Compare that to video games. Pong was the first video game when it came out in 1972. Today you have Chess.com using artificial intelligence to power chess games. On gene editing and a lot of these other technologies, we’re much closer to Pong than we are to where it’s going to be. We just can’t imagine today the technology changes we’re going to see over the next five to 10 years.”

    In that regard, Knight discussed some of the promising portfolio companies of The Engine, which invests in early stage, technologically innovative companies. In particular, she highlighted two companies seeking to transform the world’s energy systems with entirely new, 100 percent clean energy sources. MIT spinout Commonwealth Fusion Systems is working on nuclear fusion reactors that could provide abundant, safe, and constant streams of clean energy to our grids, while fellow MIT spinout Quaise Energy is seeking to harvest a new kind of deep geothermal energy using millimeter wave drilling technology.

    “All of our portfolio companies have a focus on sustainability in one way or another,” Knight said. “People who are working on these very hard technologies will change the world.”

    Knight says the kind of collaboration championed by the KSA is important for startups The Engine invests in.

    “We know these companies need a lot of people around them, whether from government, academia, advisors, corporate partners, anyone who can help them on their path, because for a lot of them this is a new path and a new market,” Knight said.

    Reasons for hope

    The KSA is made up of over 150 organizations across Kendall Square. From major employers like Sanofi, Pfizer, MIT, and the Broad Institute to local nonprofit organizations, startups, and independent shops and restaurants, the KSA represents the entire Kendall ecosystem.

    Early in the event, O’Neill Maloney celebrated a visible example of sustainability in Kendall Square from the Charles River Conservancy, which has built a floating wetland designed to naturally remove harmful algae blooms from the Charles River.

    Other examples of sustainability work in the neighborhood can be found at MIT. Under its “Fast Forward” climate action plan, the Institute has set a goal of eliminating direct emissions from its campus by 2050, with a near-term milestone of achieving net-zero emissions by 2026. Since 2014, when MIT launched a five-year plan for action on climate change, the Institute has cut net campus emissions by 20 percent by making campus buildings more energy efficient, transitioning to electric vehicles, and enabling large-scale renewable energy projects, among other strategies.

    In the face of a daunting global challenge, such milestones are reason for optimism.

    “If anybody’s going to be able to do this [shift to 100 percent clean energy] and show how it can be done at an urban, city scale, it’s probably MIT and the city of Cambridge,” McGuire said. “We have a lot of good ingredients to figure this out.”

    Throughout the night, many speakers, attendees, and panelists echoed that sentiment. They said they see plenty of reasons for hope.

    “I’m absolutely optimistic,” Higgins said. “I’m seeing utility companies working with businesses working with regulators — people are coming together on this topic. And one of these new technologies being commercialized is going to change things before 2030, whether it’s fusion, deep geothermal, or small modular nuclear reactors. The technology is just moving so quickly.”

  • Finding community in high-energy-density physics

    Skylar Dannhoff knew one thing: She did not want to be working alone.

    As an undergraduate at Case Western Reserve University, she had committed to a senior project that often felt like solitary lab work, a feeling heightened by the pandemic. Though it was an enriching experience, she was determined to find a graduate school environment that would foster community, one “with lots of people, lots of collaboration; where it’s impossible to work until 3 a.m. without anyone noticing.” A unique group at the Plasma Science and Fusion Center (PSFC) looked promising: the High-Energy-Density Physics (HEDP) division, a lead partner in the National Nuclear Security Administration’s Center for Excellence at MIT.

    “It was a shot in the dark, just more of a whim than anything,” she says of her request to join HEDP on her application to MIT’s Department of Physics. “And then, somehow, they reached out to me. I told them I’m willing to learn about plasma. I didn’t know anything about it.”

    What she did know was that the HEDP group collaborates with other U.S. laboratories on an approach to creating fusion energy known as inertial confinement fusion (ICF). One version of the technique, known as direct-drive ICF, aims multiple laser beams symmetrically onto a spherical capsule filled with nuclear fuel. The other, indirect-drive ICF, instead aims multiple laser beams into a gold cylindrical cavity called a hohlraum, within which the spherical fuel capsule is positioned. The laser beams are configured to hit the inner hohlraum wall, generating a “bath” of X-rays, which in turn compress the fuel capsule.

    Imploding the capsule generates intense fusion energy within a tiny fraction of a second (on the order of tens of picoseconds). In August 2021, the National Ignition Facility (NIF) at Lawrence Livermore National Laboratory (LLNL) used this method to produce a historic fusion yield of 1.3 megajoules, putting researchers within reach of “ignition,” the point where the self-sustained fusion burn spreads into the surrounding fuel, leading to a high fusion-energy gain.

    Joining the group just a month before this long-sought success, Dannhoff was impressed more with the response of her new teammates and the ICF community than with the scientific milestone. “I got a better appreciation for people who had spent their entire careers working on this project, just chugging along doing their best, ignoring the naysayers. I was excited for the people.”

    Dannhoff is now working toward extending the success of NIF and other ICF experiments, like the OMEGA laser at the University of Rochester’s Laboratory for Laser Energetics. Under the supervision of Senior Research Scientist Chikang Li, she is studying what happens to the flow of plasma within the hohlraum cavity during indirect ICF experiments, particularly for hohlraums with inner-wall aerogel foam linings. Experiments, over the last decade, have shown just how excruciatingly precise the symmetry in ICF targets must be. The more symmetric the X-ray drive, the more effective the implosion, and it is possible that these foam linings will improve the X-ray symmetry and drive efficiency.

    Dannhoff is specifically interested in studying the behavior of silicon and tantalum-based foam liners. She is as concerned with the challenges of the people at General Atomics (GA) and LLNL who are creating these targets as she is with the scientific outcome.

    “I just had a meeting with GA yesterday,” she notes. “And it’s a really tricky process. It’s kind of pushing the boundaries of what is doable at the moment. I got a much better sense of how demanding this project is for them, how much we’re asking of them.”

    What excites Dannhoff is the teamwork she observes, both at MIT and between ICF institutions around the United States. With roughly 10 graduate students and postdocs down the hall, each with an assigned lead role in lab management, she knows she can consult an expert on almost any question. And collaborators across the country are just an email away. “Any information that people can give you, they will give you, and usually very freely,” she notes. “Everyone just wants to see this work.”

    That Dannhoff is a natural team player is also evidenced in her hobbies. A hockey goalie, she prioritizes playing with MIT’s intramural teams, “because goalies are a little hard to come by. I just play with whoever needs a goalie on that night, and it’s a lot of fun.”

    She is also a member of the radio community, a fellowship she first embraced at Case Western — a moment she describes as a turning point in her life. “I literally don’t know who I would be today if I hadn’t figured out radio is something I’m interested in,” she admits. The MIT Radio Society provided the perfect landing pad for her arrival in Cambridge, full of the kinds of supportive, interesting, knowledgeable students she had befriended as an undergraduate. She credits radio with helping her realize that she could make her greatest contributions to science by focusing on engineering.

    Dannhoff gets philosophical as she marvels at the invisible waves that surround us.

    “Not just radio waves: every wave,” she asserts. “The voice is everywhere. Music, signal, space phenomena: it’s always around. And all we have to do is make the right little device and have the right circuit elements put in the right order to unmix and mix the signals and amplify them. And bada-bing, bada-boom, we’re talking with the universe.”

    “Maybe that epitomizes physics to me,” she adds. “We’re trying to listen to the universe, and it’s talking to us. We just have to come up with the right tools and hear what it’s trying to say.”

  • 3 Questions: Blue hydrogen and the world’s energy systems

    In the past several years, hydrogen energy has increasingly become a more central aspect of the clean energy transition. Hydrogen can produce clean, on-demand energy that could complement variable renewable energy sources such as wind and solar power. That being said, pathways for deploying hydrogen at scale have yet to be fully explored. In particular, the optimal form of hydrogen production remains in question.

    MIT Energy Initiative Research Scientist Emre Gençer and researchers from a wide range of global academic and research institutions recently published “On the climate impacts of blue hydrogen production,” a comprehensive life-cycle assessment analysis of blue hydrogen, a term referring to natural gas-based hydrogen production with carbon capture and storage. Here, Gençer describes blue hydrogen and the role that hydrogen will play more broadly in decarbonizing the world’s energy systems.

    Q: What are the differences between gray, green, and blue hydrogen?

    A: Though hydrogen does not generate any emissions directly when it is used, hydrogen production can have a huge environmental impact. Colors of hydrogen are increasingly used to distinguish different production methods and as a proxy to represent the associated environmental impact. Today, close to 95 percent of hydrogen production comes from fossil resources. As a result, the carbon dioxide (CO2) emissions from hydrogen production are quite high. Gray, black, and brown hydrogen refer to fossil-based production. Gray is the most common form of production and comes from natural gas, or methane, using steam methane reformation but without capturing CO2.

    There are two ways to move toward cleaner hydrogen production. One is applying carbon capture and storage to the fossil fuel-based hydrogen production processes. Natural gas-based hydrogen production with carbon capture and storage is referred to as blue hydrogen. If substantial amounts of CO2 from natural gas reforming are captured and permanently stored, such hydrogen could be a low-carbon energy carrier. The second way to produce cleaner hydrogen is by using electricity to produce hydrogen via electrolysis. In this case, the source of the electricity determines the environmental impact of the hydrogen, with the lowest impact being achieved when electricity is generated from renewable sources, such as wind and solar. This is known as green hydrogen.

    Q: What insights have you gleaned with a life cycle assessment (LCA) of blue hydrogen and other low-carbon energy systems?

    A: Mitigating climate change requires significant decarbonization of the global economy. Accurate estimation of cumulative greenhouse gas (GHG) emissions, and of the pathways to reduce them, is critical irrespective of the source of emissions. An LCA approach allows the environmental impact of a commercial product, process, or service to be quantified across all life-cycle stages (cradle to grave). LCA-based comparison of alternative energy pathways, fuel options, and so on provides an apples-to-apples comparison of low-carbon energy choices. In the context of low-carbon hydrogen, it is essential to understand the GHG impact of supply chain options. Depending on the production method, the contribution of each life-cycle stage to total emissions can vary. For example, with natural gas-based hydrogen production, emissions associated with the production and transport of natural gas can be a significant contributor, depending on leakage and flaring rates. If these rates are not precisely accounted for, the environmental impact of blue hydrogen can be underestimated. The same rationale holds for electricity-based hydrogen production: if the electricity is not supplied from low-carbon sources such as wind, solar, or nuclear, the carbon intensity of hydrogen can be significantly underestimated. In the case of nuclear, there are also other environmental impact considerations.

    An LCA approach — if performed with consistent system boundaries — can provide an accurate environmental impact comparison. It should also be noted that these estimates can only be as good as the assumptions and correlations used, unless they are supported by measurements.

    Q: What conditions are needed to make blue hydrogen production most effective, and how can it complement other decarbonization pathways?

    A: Hydrogen is considered one of the key vectors for the decarbonization of hard-to-abate sectors such as heavy-duty transportation. Currently, more than 95 percent of global hydrogen production is fossil-fuel based. In the next decade, massive amounts of hydrogen must be produced to meet this anticipated demand. It is very hard, if not impossible, to meet this demand without leveraging existing production assets. The immediate and relatively cost-effective option is to retrofit existing plants with carbon capture and storage (blue hydrogen).

    The environmental impact of blue hydrogen may vary over large ranges but depends on only a few key parameters: the methane emission rate of the natural gas supply chain, the CO2 removal rate at the hydrogen production plant, and the global warming metric applied. State-of-the-art reforming with high CO2 capture rates, combined with natural gas supply featuring low methane emissions, substantially reduces GHG emissions compared to conventional natural gas reforming. Under these conditions, blue hydrogen is compatible with low-carbon economies and exhibits climate change impacts at the upper end of the range of those caused by hydrogen production from renewable-based electricity. However, neither current blue nor green hydrogen production pathways render fully “net-zero” hydrogen without additional CO2 removal.
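
    As a rough illustration of how the first two of those parameters interact, here is a toy life-cycle calculation; every number in it is an assumption chosen for demonstration, not a result from the paper, and the model omits upstream CO2 from gas production and processing.

    ```python
    # Toy life-cycle GHG estimate for blue hydrogen (illustration only).
    # All values are assumptions, not figures from the study; upstream CO2 from
    # producing and processing the natural gas is deliberately ignored.

    GWP_CH4 = 30.0     # assumed 100-year global warming potential of methane
    DIRECT_CO2 = 9.0   # assumed kg CO2 from reforming per kg H2, before capture
    NG_PER_H2 = 3.3    # assumed kg natural gas consumed per kg H2

    def blue_h2_intensity(capture_rate, ch4_leak_rate):
        """Approximate kg CO2-eq per kg H2 for reforming with carbon capture."""
        uncaptured_co2 = DIRECT_CO2 * (1.0 - capture_rate)
        upstream_ch4 = NG_PER_H2 * ch4_leak_rate * GWP_CH4
        return uncaptured_co2 + upstream_ch4

    for capture in (0.55, 0.90):          # CO2 removal rate at the plant
        for leak in (0.002, 0.03):        # methane emission rate of the gas supply chain
            print(f"capture={capture:.0%}, leakage={leak:.1%}: "
                  f"{blue_h2_intensity(capture, leak):.1f} kg CO2-eq per kg H2")
    ```

    Changing GWP_CH4 from a 100-year to a 20-year value is where the third parameter, the global warming metric, would enter the comparison.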

    This article appears in the Spring 2022 issue of Energy Futures, the magazine of the MIT Energy Initiative.