More stories

  • Reversing the charge

    Owners of electric vehicles (EVs) are accustomed to plugging into charging stations at home and at work and filling up their batteries with electricity from the power grid. But someday soon, when these drivers plug in, their cars will also have the capacity to reverse the flow and send electrons back to the grid. As the number of EVs climbs, the fleet’s batteries could serve as a cost-effective, large-scale energy source, with potentially dramatic impacts on the energy transition, according to a new paper published by an MIT team in the journal Energy Advances.

    “At scale, vehicle-to-grid (V2G) can boost renewable energy growth, displacing the need for stationary energy storage and decreasing reliance on firm [always-on] generators, such as natural gas, that are traditionally used to balance wind and solar intermittency,” says Jim Owens, lead author and a doctoral student in the MIT Department of Chemical Engineering. Additional authors include Emre Gençer, a principal research scientist at the MIT Energy Initiative (MITEI), and Ian Miller, a research specialist for MITEI at the time of the study.

    The group’s work is the first comprehensive, systems-based analysis of future power systems, drawing on a novel mix of computational models integrating such factors as carbon emission goals, variable renewable energy (VRE) generation, and costs of building energy storage, production, and transmission infrastructure.

    “We explored not just how EVs could provide service back to the grid — thinking of these vehicles almost like energy storage on wheels — but also the value of V2G applications to the entire energy system and if EVs could reduce the cost of decarbonizing the power system,” says Gençer. “The results were surprising; I personally didn’t believe we’d have so much potential here.”

    Displacing new infrastructure

    As the United States and other nations pursue stringent goals to limit carbon emissions, electrification of transportation has taken off, with the rate of EV adoption rapidly accelerating. (Some projections show EVs supplanting internal combustion vehicles over the next 30 years.) With the rise of emission-free driving, though, there will be increased demand for energy. “The challenge is ensuring both that there’s enough electricity to charge the vehicles and that this electricity is coming from renewable sources,” says Gençer.

    But solar and wind power are intermittent. Without adequate backup for these sources — stationary energy storage facilities using lithium-ion batteries, for instance, or large-scale, natural gas- or hydrogen-fueled power plants — achieving clean energy goals will prove elusive. More vexing, the costs of building the necessary new energy infrastructure run to the hundreds of billions of dollars.

    This is precisely where V2G can play a critical, and welcome, role, the researchers reported. In their case study of a theoretical New England power system meeting strict carbon constraints, for instance, the team found that participation from just 13.9 percent of the region’s 8 million light-duty (passenger) EVs displaced 14.7 gigawatts of stationary energy storage. This added up to $700 million in savings — the anticipated costs of building new storage capacity.
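
    For a rough sense of what those headline numbers imply per vehicle — back-of-the-envelope arithmetic on the figures quoted above, not a result from the paper — the sketch below divides the displaced storage and avoided cost by the number of participating cars.

```python
# Back-of-the-envelope check on the New England case study figures quoted above.
# Illustrative arithmetic only; the paper's capacity-expansion models are far
# more detailed than a simple per-vehicle average.

fleet_size = 8_000_000        # light-duty EVs in the region (from the article)
participation = 0.139         # 13.9 percent of the fleet offering V2G
displaced_storage_gw = 14.7   # stationary storage displaced, in gigawatts
savings_usd = 700e6           # avoided storage build-out cost, in dollars

participating_evs = fleet_size * participation
kw_per_ev = displaced_storage_gw * 1e6 / participating_evs  # 1 GW = 1e6 kW
savings_per_ev = savings_usd / participating_evs

print(f"Participating EVs:          {participating_evs:,.0f}")
print(f"Displaced storage per EV:   {kw_per_ev:.1f} kW")
print(f"Avoided cost per EV:        ${savings_per_ev:,.0f}")
# -> roughly 1.1 million cars, ~13 kW and ~$630 of avoided storage per car
```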

    Their paper also described the role EV batteries could play at times of peak demand, such as hot summer days. “V2G technology has the ability to inject electricity back into the system to cover these episodes, so we don’t need to install or invest in additional natural gas turbines,” says Owens. “The way that EVs and V2G can influence the future of our power systems is one of the most exciting and novel aspects of our study.”

    Modeling power

    To investigate the impacts of V2G on their hypothetical New England power system, the researchers integrated their EV travel and V2G service models with two of MITEI’s existing modeling tools: the Sustainable Energy System Analysis Modeling Environment (SESAME) to project vehicle fleet and electricity demand growth, and GenX, which models the investment and operation costs of electricity generation, storage, and transmission systems. They incorporated such inputs as different EV participation rates, costs of generation for conventional and renewable power suppliers, charging infrastructure upgrades, travel demand for vehicles, changes in electricity demand, and EV battery costs.

    Their analysis found benefits from V2G applications in power systems (in terms of displacing energy storage and firm generation) at all levels of carbon emission restrictions, including one with no emissions caps at all. However, their models suggest that V2G delivers the greatest value to the power system when carbon constraints are most aggressive — at 10 grams of carbon dioxide per kilowatt-hour of load. Total system savings from V2G ranged from $183 million to $1,326 million, reflecting EV participation rates between 5 percent and 80 percent.
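
    Taken together, the two endpoints of that savings range also hint at diminishing average returns; the short calculation below — my own arithmetic on the quoted figures, not an analysis from the paper — makes the comparison explicit.

```python
# Average system savings per percentage point of EV participation, using only
# the two endpoints quoted above. Illustrative arithmetic, not the paper's
# full savings curve.
low_rate, low_savings_musd = 5, 183       # 5% participation -> $183 million
high_rate, high_savings_musd = 80, 1326   # 80% participation -> $1,326 million

print(f"At  5% participation: ${low_savings_musd / low_rate:.1f}M per point")
print(f"At 80% participation: ${high_savings_musd / high_rate:.1f}M per point")
# -> ~$36.6M vs. ~$16.6M per percentage point, so the average value of each
#    additional participating vehicle declines as participation rises.
```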

    “Our study has begun to uncover the inherent value V2G has for a future power system, demonstrating that there is a lot of money we can save that would otherwise be spent on storage and firm generation,” says Owens.

    Harnessing V2G

    For scientists seeking ways to decarbonize the economy, the vision of millions of EVs parked in garages or in office spaces and plugged into the grid for 90 percent of their operating lives proves an irresistible provocation. “There is all this storage sitting right there, a huge available capacity that will only grow, and it is wasted unless we take full advantage of it,” says Gençer.

    This is not a distant prospect. Startup companies are currently testing software that would allow two-way communication between EVs and grid operators or other entities. With the right algorithms, EVs would charge from and dispatch energy to the grid according to profiles tailored to each car owner’s needs, never depleting the battery and endangering a commute.
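
    The article doesn’t detail those algorithms, but a toy version of the owner-aware charge-and-dispatch rule it describes might look like the sketch below; the price thresholds, step size, and commute reserve are all invented for illustration.

```python
# Toy owner-aware dispatch rule: charge when grid prices are low, send energy
# back when prices are high, and never dip below the state of charge reserved
# for the owner's next trip. Thresholds and step size are invented numbers;
# real aggregator algorithms are considerably more sophisticated.

def v2g_dispatch(soc, price_usd_per_mwh, commute_reserve=0.40,
                 low_price=30.0, high_price=120.0, step=0.05):
    """Return (new state of charge, action) for one dispatch interval."""
    if price_usd_per_mwh <= low_price and soc < 1.0:
        return min(1.0, soc + step), "charge"
    if price_usd_per_mwh >= high_price and soc - step >= commute_reserve:
        return soc - step, "discharge to grid"
    return soc, "idle"

# Example: an evening price spike while the car sits at 70 percent charge.
print(v2g_dispatch(soc=0.70, price_usd_per_mwh=150.0))  # (0.65, 'discharge to grid')
```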

    “We don’t assume all vehicles will be available to send energy back to the grid at the same time, at 6 p.m. for instance, when most commuters return home in the early evening,” says Gençer. He believes that the vastly varied schedules of EV drivers will make enough battery power available to cover spikes in electricity use over an average 24-hour period. And there are other potential sources of battery power down the road, such as electric school buses that are employed only for short stints during the day and then sit idle.

    The MIT team acknowledges the challenges of V2G consumer buy-in. While EV owners relish a clean, green drive, they may not be as enthusiastic about handing over access to their car’s battery to a utility or an aggregator working with power system operators. Policies and incentives would help.

    “Since you’re providing a service to the grid, much as solar panel users do, you could be paid for your participation, and paid at a premium when electricity prices are very high,” says Gençer.

    “People may not be willing to participate ’round the clock, but if we have blackout scenarios like in Texas last year, or hot-day congestion on transmission lines, maybe we can turn on these vehicles for 24 to 48 hours, sending energy back to the system,” adds Owens. “If there’s a power outage and people wave a bunch of money at you, you might be willing to talk.”

    “Basically, I think this comes back to all of us being in this together, right?” says Gençer. “As you contribute to society by giving this service to the grid, you will get the full benefit of reducing system costs, and also help to decarbonize the system faster and to a greater extent.”

    Actionable insights

    Owens, who is building his dissertation on V2G research, is now investigating the potential impact of heavy-duty electric vehicles in decarbonizing the power system. “The last-mile delivery trucks of companies like Amazon and FedEx are likely to be the earliest adopters of EVs,” Owens says. “They are appealing because they have regularly scheduled routes during the day and go back to the depot at night, which makes them very useful for providing electricity and balancing services in the power system.”

    Owens is committed to “providing insights that are actionable by system planners, operators, and to a certain extent, investors,” he says. His work might come into play in determining what kind of charging infrastructure should be built, and where.

    “Our analysis is really timely because the EV market has not yet been developed,” says Gençer. “This means we can share our insights with vehicle manufacturers and system operators — potentially influencing them to invest in V2G technologies, avoiding the costs of building utility-scale storage, and enabling the transition to a cleaner future. It’s a huge win, within our grasp.”

    The research for this study was funded by MITEI’s Future Energy Systems Center.

  • Engineers solve a mystery on the path to smaller, lighter batteries

    A discovery by MIT researchers could finally unlock the door to the design of a new kind of rechargeable lithium battery that is more lightweight, compact, and safe than current versions, and that has been pursued by labs around the world for years.

    The key to this potential leap in battery technology is replacing the liquid electrolyte that sits between the positive and negative electrodes with a much thinner, lighter layer of solid ceramic material, and replacing one of the electrodes with solid lithium metal. This would greatly reduce the overall size and weight of the battery and remove the safety risk associated with liquid electrolytes, which are flammable. But that quest has been beset with one big problem: dendrites.

    Dendrites, whose name comes from the Greek word for tree, are branchlike projections of metal that can build up on the lithium surface and penetrate into the solid electrolyte, eventually crossing from one electrode to the other and shorting out the battery cell. Researchers haven’t been able to agree on what gives rise to these metal filaments, nor has there been much progress on how to prevent them and thus make lightweight solid-state batteries a practical option.

    The new research, being published today in the journal Joule in a paper by MIT Professor Yet-Ming Chiang, graduate student Cole Fincher, and five others at MIT and Brown University, seems to resolve the question of what causes dendrite formation. It also shows how dendrites can be prevented from crossing through the electrolyte.

    Chiang says that in the group’s earlier work, they made a “surprising and unexpected” finding: the hard, solid electrolyte material used for a solid-state battery can be penetrated by lithium, a very soft metal, during charging and discharging, as lithium ions move between the two sides.

    This shuttling back and forth of ions causes the volume of the electrodes to change. That inevitably causes stresses in the solid electrolyte, which has to remain fully in contact with both of the electrodes that it is sandwiched between. “To deposit this metal, there has to be an expansion of the volume because you’re adding new mass,” Chiang says. “So, there’s an increase in volume on the side of the cell where the lithium is being deposited. And if there are even microscopic flaws present, this will generate a pressure on those flaws that can cause cracking.”

    Those stresses, the team has now shown, cause the cracks that allow dendrites to form. The solution to the problem turns out to be more stress, applied in just the right direction and with the right amount of force.

    While previously, some researchers thought that dendrites formed by a purely electrochemical process, rather than a mechanical one, the team’s experiments demonstrate that it is mechanical stresses that cause the problem.

    The process of dendrite formation normally takes place deep within the opaque materials of the battery cell and cannot be observed directly, so Fincher developed a way of making thin cells using a transparent electrolyte, allowing the whole process to be directly seen and recorded. “You can see what happens when you put a compression on the system, and you can see whether or not the dendrites behave in a way that’s commensurate with a corrosion process or a fracture process,” he says.

    The team demonstrated that they could directly manipulate the growth of dendrites simply by applying and releasing pressure, causing the dendrites to zig and zag in perfect alignment with the direction of the force.

    Applying mechanical stresses to the solid electrolyte doesn’t eliminate the formation of dendrites, but it does control the direction of their growth. This means they can be directed to remain parallel to the two electrodes and prevented from ever crossing to the other side, and thus rendered harmless.

    In their tests, the researchers used pressure induced by bending the material, which was formed into a beam with a weight at one end. But they say that in practice, there could be many different ways of producing the needed stress. For example, the electrolyte could be made with two layers of material that have different amounts of thermal expansion, so that there is an inherent bending of the material, as is done in some thermostats.

    Another approach would be to “dope” the material with atoms that would become embedded in it, distorting it and leaving it in a permanently stressed state. This is the same method used to produce the super-hard glass used in the screens of smartphones and tablets, Chiang explains. And the amount of pressure needed is not extreme: The experiments showed that pressures of 150 to 200 megapascals were sufficient to stop the dendrites from crossing the electrolyte.

    The required pressure is “commensurate with stresses that are commonly induced in commercial film growth processes and many other manufacturing processes,” so should not be difficult to implement in practice, Fincher adds.

    In fact, a different kind of stress, called stack pressure, is often applied to battery cells, by essentially squishing the material in the direction perpendicular to the battery’s plates — somewhat like compressing a sandwich by putting a weight on top of it. It was thought that this might help prevent the layers from separating. But the experiments have now demonstrated that pressure in that direction actually exacerbates dendrite formation. “We showed that this type of stack pressure actually accelerates dendrite-induced failure,” Fincher says.

    What is needed instead is pressure along the plane of the plates, as if the sandwich were being squeezed from the sides. “What we have shown in this work is that when you apply a compressive force you can force the dendrites to travel in the direction of the compression,” Fincher says, and if that direction is along the plane of the plates, the dendrites “will never get to the other side.”

    That could finally make it practical to produce batteries using solid electrolyte and metallic lithium electrodes. Not only would these pack more energy into a given volume and weight, but they would eliminate the need for liquid electrolytes, which are flammable materials.

    Having demonstrated the basic principles involved, the team’s next step will be to try to apply these to the creation of a functional prototype battery, Chiang says, and then to figure out exactly what manufacturing processes would be needed to produce such batteries in quantity. Though they have filed for a patent, the researchers don’t plan to commercialize the system themselves, he says, as there are already companies working on the development of solid-state batteries. “I would say this is an understanding of failure modes in solid-state batteries that we believe the industry needs to be aware of and try to use in designing better products,” he says.

    The research team included Christos Athanasiou and Brian Sheldon at Brown University, and Colin Gilgenbach, Michael Wang, and W. Craig Carter at MIT. The work was supported by the U.S. National Science Foundation, the U.S. Department of Defense, the U.S. Defense Advanced Research Projects Agency, and the U.S. Department of Energy.

  • On batteries, teaching, and world peace

    Over his long career as an electrochemist and professor, Donald Sadoway has earned an impressive variety of honors, from being named one of Time magazine’s 100 most influential people in 2012 to appearing on “The Colbert Report,” where he talked about “renewable energy and world peace,” according to Comedy Central.

    What does he personally consider to be his top achievements?

    “That’s easy,” he says immediately. “For teaching, it’s 3.091,” the MIT course on solid-state chemistry he led for some 18 years. An MIT core requirement, 3.091 is also one of the largest classes at the Institute. In 2003 it was the largest, with 630 students. Sadoway, who retires this year after 45 years in the Department of Materials Science and Engineering, estimates that over the years he’s taught the course to some 10,000 undergraduates.

    A passion for teaching

    Along the way he turned the class into an MIT favorite, complete with music, art, and literature. “I brought in all that enrichment because I knew that 95 percent of the students in that room weren’t going to major in anything chemical and this might be the last class they’d take in the subject. But it’s a requirement. So they’re 18 years old, they’re very smart, and many of them are very bored. You have to find a hook [to reach them]. And I did.”

    In 1995, Sadoway was named a Margaret MacVicar Faculty Fellow, an honor that recognizes outstanding classroom teaching at the Institute. Among the communications in support of his nomination:

    “His contributions are enormous and the class is in rapt attention from beginning to end. His lectures are highly articulate yet animated and he has uncommon grace and style. I was awed by his ability to introduce playful and creative elements into a core lecture…”

    Bill Gates would agree. In the early 2000s Sadoway’s lectures were shared with the world through OpenCourseWare, the web-based publication of MIT course materials. Gates was so inspired by the lectures that he asked to meet with Sadoway to learn more about his research. (Sadoway initially ignored Gates’ email because he thought his account had been hacked by MIT pranksters.)

    Research breakthroughs

    Teaching is not Sadoway’s only passion. He’s also proud of his accomplishments in electrochemistry. The discipline, which involves electron transfer reactions, is key to everything from batteries to the primary extraction of metals like aluminum and magnesium. “It’s quite wide-ranging,” says the John F. Elliott Professor Emeritus of Materials Chemistry.

    Sadoway’s contributions include two battery breakthroughs. First came the liquid metal battery, which could enable the large-scale storage of renewable energy. “That represents a huge step forward in the transition to green energy,” said António Campinos, president of the European Patent Office, earlier this year when Sadoway won the 2022 European Inventor Award for the invention in the category for Non-European Patent Office Countries.

    On “The Colbert Report,” Sadoway alluded to that work when he told Stephen Colbert that electrochemistry is the key to world peace. Why? Because it could lead to a battery capable of storing energy from the sun when the sun doesn’t shine and otherwise make renewables an important part of the clean energy mix. And that in turn could “plummet the price of petroleum and depose dictators all over the world without one shot being fired,” he recently recalled.

    The liquid metal battery is the focus of Ambri, one of six companies based on Sadoway’s inventions. Bill Gates was the first funder of the company, which formed in 2010 and aims to install its first battery soon. That battery will store energy from a reported 500 megawatts of on-site renewable generation, the same output as a natural gas power plant.

    Then, in August of this year, Sadoway and colleagues published a paper in Nature about “one of the first new battery chemistries in 30 years,” Sadoway says. “I wanted to invent something that was better, much better,” than the expensive lithium-ion batteries used in, for example, today’s electric cars.

    That battery is the focus of Avanti, one of three Sadoway companies formed just last year. The other two are Pure Lithium, to commercialize his inventions related to that element, and Sadoway Labs. The latter, a nonprofit, is essentially “a space to try radical innovations. We’re gonna start working on wild ideas.”

    Another focus of Sadoway’s research: green steel. Steelmaking produces huge amounts of greenhouse gases. Enter Boston Metal, another Sadoway company. This one is developing a new approach to producing steel based on research begun some 25 years ago. Unlike the current technology for producing steel, the Boston Metal approach — molten oxide electrolysis — does not use the element at the root of steel’s problems: carbon. The principal byproduct of the new system? Oxygen.

    In 2012, Sadoway gave a TED talk to 2,000 people on the liquid metal battery. He believes that that talk, which has now been seen by almost 2.5 million people, led to the wider publicity of his work — and science overall — on “The Colbert Report” and elsewhere. “The moral here is that if you step out of your comfort zone, you might be surprised at what can happen,” he concludes.

    Colleagues’ reflections

    “I met Don in 2006 when I was working for the iron and steel industry in Europe on ways to reduce greenhouse gas emissions from the production of those materials,” says Antoine Allanore, professor of metallurgy, Department of Materials Science and Engineering. “He was the same Don Sadoway that you see in recordings of his lectures: very elegant, very charismatic, and passionate about the technical solutions and underlying science of the process we were all investigating: electrolysis. A few years later, when I decided to pursue an academic career, I contacted Don and became a postdoctoral associate in his lab. That ultimately led to my becoming an MIT professor. People don’t believe me, but before I came to MIT the only thing I knew about the Institute was that Noam Chomsky was there … and Don Sadoway. And I felt, that’s a great place to be. And I stayed because I saw the exceptional things that can be accomplished at MIT and Don is the perfect example of that.”

    “I had the joy of meeting Don when I first arrived on the MIT campus in 1994,” recalls Felice Frankel, research scientist in the MIT departments of Chemical Engineering and Mechanical Engineering. “I didn’t have to talk him into the idea that researchers needed to take their images and graphics more seriously. He got it — that it wasn’t just about pretty pictures. He was an important part of our five-year National Science Foundation project — Picturing to Learn — to bring that concept into the classroom. How lucky that was for me!”

    “Don has been a friend and mentor since we met in 1995 when I was an MIT senior,” says Luis Ortiz, co-founder and chief executive officer, Avanti Battery Co. “One story that is emblematic of Don’s insistence on excellence is from when he and I met with Bill Gates about the challenges in addressing climate change and how batteries could be the linchpin in solving them. I suggested that we create our presentation in PowerPoint [Microsoft software]. Don balked. He insisted that we present using Keynote on his MacBook Air, because ‘it looks so much better.’ I was incredulous that he wanted to walk into that venue exclusively using Apple products. Of course, he won the argument, but not without my admonition that there had better not be even a blip of an issue. In the meeting room, Microsoft’s former chief technology officer asked Don if he needed anything to hook up to the screen, ‘we have all those dongles.’ Don declined, but gave me that knowing look and whispered, ‘You see, they know, too.’ I ate my crow and we had a great long conversation without any issues.”

    “I remember when I first started working with Don on the liquid metal battery project at MIT, after I had chosen it as the topic for my master’s of engineering thesis,” adds David Bradwell, co-founder and chief technology officer, Ambri. “I was a wide-eyed graduate student, sitting in his office, amongst his art deco decorations, unique furniture, and historical and stylistic infographics, and from our first meeting, I could see Don’s passion for coming up with new and creative, yet practical scientific ideas, and for working on hard problems, in service of society. Don’s approaches always appear to be unconventional — wanting to stand out in a crowd, take the path less trodden, both based on his ideas, and his sense of style. It’s been an amazing journey working with him over the past decade-and-a-half, and I remain excited to see what other new, unconventional ideas, he can bring to this world.”

  • 3 Questions: Robert Stoner unpacks US climate and infrastructure laws

    This month, the 2022 United Nations Climate Change Conference (COP27) takes place in Sharm El Sheikh, Egypt, bringing together governments, experts, journalists, industry, and civil society to discuss climate action that could enable countries to collectively and sharply limit anthropogenic climate change. As MIT Energy Initiative Deputy Director for Science and Technology Robert Stoner attends the conference, he takes a moment to speak about the climate and infrastructure laws enacted in the United States over the past year, and about the impact these laws can have on the global energy transition.

    Q: COP27 is now underway. Can you set the scene?

    A: There’s a lot of interest among vulnerable countries about compensation for the impacts climate change has had on them, or “loss and damage,” a topic that the United States refused to address last year at COP26, for fear of opening up a floodgate and leaving U.S. taxpayers exposed to unlimited liability for our past (and future) emissions. This is a crucial issue of fairness for developed countries — and, well, of acknowledging our common humanity. But in a sense, it’s also a sideshow, and addressing it won’t prevent a climate catastrophe — we really need to focus on mitigation. With the passage of the bipartisan Infrastructure Investment and Jobs Act and the Inflation Reduction Act (IRA), the United States is now in a strong position to twist some arms. These laws are largely about subsidizing the deployment of low-carbon technologies — pretty much all of them. We’re going to do a lot in the United States in the next decade that will lead to dramatic cost reductions for these technologies and enable other countries with fewer resources to adopt them as well. It’s exactly the leadership role the United States has needed to assume. Now we have the opportunity to rally the rest of the world and get other countries to commit to more ambitious decarbonization goals, and to build practical programs that take advantage of the investable pathways we’re going to create for public and private actors.

    But that alone won’t get us there — money is still a huge problem, especially in emerging markets and developing countries. And I don’t think the institutions we rely on to help these countries fund infrastructure — energy and everything else — are adequately funded. Nor do these institutions have the right structures, incentives, and staffing to fund low-carbon development in these countries rapidly enough or on the necessary scale. I’m talking about the World Bank, for instance, but the other multilateral organizations have similar issues. I frankly don’t think the multilaterals can be reformed or sufficiently redirected on a short enough time frame. We definitely need new leadership for these organizations, and I think we probably need to quickly establish new multilaterals with new people, more money, and a clarity of purpose that is likely beyond what can be achieved incrementally. I don’t know if this is going to be an active public discussion at COP27, but I hope it takes place somewhere soon. Given the strong role our government plays in financing and selecting the leadership of these institutions, perhaps this is another opportunity for the United States to demonstrate courage and leadership.

    Q: What “investable pathways” are you talking about?

    A: Well, the pathways we’re implicitly trying to pursue with the Infrastructure Act and IRA are pretty clear, and I’ll come back to them. But first let me describe the landscape: There are three main sources of demand for energy in the economy — industry (meaning chemical production, fuel for electricity generation, cement production, materials and manufacturing, and so on), transportation (cars, trucks, ships, planes, and trains), and buildings (for heating and cooling, mostly). That’s about it, and these three sectors account for 75 percent of our total greenhouse gas emissions. So the pathways are all about how to decarbonize these three end-use sectors. There are a lot of technologies — some that exist, some that don’t — that will have to be brought to bear. And so it can be a little overwhelming to try to imagine how it will all transpire, but it’s pretty clear at a high level what our options are:

    First, generate a lot of low-carbon electricity and electrify as many industrial processes, vehicles, and building heating systems as we can.
    Second, develop and deploy at massive scale technologies that can capture carbon dioxide from smokestacks, or the air, and put it somewhere that it can never escape from — in other words, carbon capture and sequestration, or CCS.
    Third, for end uses like aviation that really need to use fuels because of their extraordinary energy density, develop low-carbon alternatives to fossil fuels.
    And fourth is energy efficiency across the board — but I don’t really count that as a separate pathway per se.
    So, by “investable pathways” I mean specific ways to pursue these options that will attract investors. What the Infrastructure Act and the IRA do is deploy carrots (in the form of subsidies) in a variety of ways to close the gap between what it costs to deploy technologies like CCS that aren’t yet at a commercial stage because they’re immature, and what energy markets will tolerate. A similar situation occurs for low-carbon production of hydrogen, one of the leading low-carbon fuel candidates. We can make it by splitting water with electricity (electrolysis), but that costs too much with present-day technology; or we can make it more cheaply by separating it from methane (which is what natural gas mainly is), but that creates CO2 that has to be transported and sequestered somewhere. And then we have to store the hydrogen until we’re ready to use it, and transport it by pipeline to the industrial facilities where it will be used. That requires infrastructure that doesn’t exist — pipelines, compression stations, big tanks! Come to think of it, the demand for all that hydrogen doesn’t exist either — at least not if industry has to pay what it actually costs.
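
    To make that cost gap concrete, the sketch below compares the two hydrogen routes Stoner mentions using round, illustrative numbers; the electricity price, electrolyzer energy use, reforming cost, and CO2 figures are assumptions, not values from the interview.

```python
# Rough comparison of the two hydrogen routes described above. Every number
# here is a round, illustrative assumption, not a figure from the interview.

electricity_price_usd_per_kwh = 0.05   # assumed delivered power price
electrolyzer_kwh_per_kg_h2 = 50.0      # ~50 kWh of electricity per kg of H2
electrolysis_other_usd_per_kg = 1.0    # allowance for equipment, stack wear, etc.

smr_usd_per_kg_h2 = 1.5                # assumed cost of H2 from methane reforming
smr_co2_kg_per_kg_h2 = 9.0             # ~9 kg CO2 emitted per kg H2 without capture
ccs_usd_per_tonne_co2 = 60.0           # assumed capture-and-storage cost

electrolysis = (electricity_price_usd_per_kwh * electrolyzer_kwh_per_kg_h2
                + electrolysis_other_usd_per_kg)
smr_with_ccs = smr_usd_per_kg_h2 + smr_co2_kg_per_kg_h2 / 1000 * ccs_usd_per_tonne_co2

print(f"Electrolytic H2:  ~${electrolysis:.2f}/kg")   # ~$3.50/kg
print(f"SMR H2 with CCS:  ~${smr_with_ccs:.2f}/kg")   # ~$2.04/kg
# The gap between figures like these is what production-side subsidies are
# meant to close while the technologies mature.
```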

    So, one very important thing these new acts do is subsidize production of hydrogen in various ways — and subsidize the creation of a CCS industry. The other thing they do is subsidize the deployment at enormous scale of low-carbon energy technologies. Some of them are already pretty cheap, like solar and wind, but they need to be supported by a lot of storage on the grid (which we don’t yet have) and by other sorts of grid infrastructure that, again, don’t exist. So, they now get subsidized, too, along with other carbon-free and low-carbon generation technologies — basically all of them. The idea is that by stimulating at-scale deployment of all these established and emerging technologies, and funding demonstrations of novel infrastructure — effectively lowering the cost of supply of low-carbon energy in the form of electricity and fuels — we will draw out the private sector to build out much more of the connective infrastructure and invest in new industrial processes, new home heating systems, and low-carbon transportation. This subsidized build-out will take place over a decade and then phase out as costs fall — hopefully, leaving the foundation for a thriving low-carbon energy economy in its wake, along with crucial technologies and knowledge that will benefit the whole world.

    Q: Is all of the federal investment in energy infrastructure in the United States relevant to the energy crisis in Europe right now?

    A: Not in a direct way — Europe is a near-term catastrophe with a long-term challenge that is in many ways more difficult than ours because Europe doesn’t have the level of primary energy resources like oil and gas that we have in abundance. Energy costs more in Europe, especially absent Russian pipelines. In a way, the narrowing of Europe’s options creates an impetus to invest in low-carbon technologies sooner than otherwise. The result either way will be expensive energy and quite a lot of economic suffering for years. The near-term challenge is to protect people from high energy prices. The big spikes in electricity prices we see now are driven by the natural gas market disruption, which will eventually dissipate as new sources of electricity come online (Sweden, for example, just announced a plan to develop new nuclear, and we’re seeing other countries like Germany soften their stance on nuclear) — and gas markets will sort themselves out. Meanwhile governments are trying to shield their people with electricity price caps and other subsidies, but that’s enormously burdensome.

    The EU recently announced price caps for imported gas to try to eliminate price-gouging by importers and reduce the subsidy burden. That may help to lower downstream prices, or it may make matters worse by reducing the flow of gas into the EU, fueling scarcity pricing, and ultimately adding to the subsidy burden. A lot of people are quite reasonably asking: if electricity prices are subject to crazy behavior in gas markets, why not disconnect from the grid and self-generate? Wouldn’t that also help reduce overall demand for gas and cut CO2 emissions? It would. But it’s expensive to put solar panels on your roof and batteries in your basement — so for those rich enough to do this, it would lead to higher average electricity costs that would live on far into the future, even when grid prices eventually come down.

    So, an interesting idea is taking hold, with considerable encouragement from national governments — the idea of “energy communities,” basically towns or cities that encourage local firms and homeowners to install solar and batteries, and make some sort of business arrangement with the local utility to allow the community to disconnect from the national grid at times of high prices and self-supply — in other words, use the utility’s wires to sell locally generated power locally. It’s interesting to think about — it takes less battery storage to handle the intermittency of solar when you have a lot of generators and consumers, so forming a community helps lower costs, and with a good deal from the utility for using their wires, it might not be that much more expensive. And of course, when the national grid is working well and prices are normal, the community would reconnect and buy power cheaply, while selling back its self-generated power to the grid. There are also potentially important social benefits that might accrue in these energy communities. It’s not a dumb idea, and we’ll see some interesting experimentation in this area in the coming years — as usual, the Germans are enthusiastic!
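
    The intuition that a community needs less battery storage per household than homes acting alone can be illustrated with a toy statistical sketch; the household count, correlation, and variability figures below are invented for illustration.

```python
# Toy illustration of why a community needs less battery capacity per household
# than homes acting alone: pooled solar/load mismatches partially cancel out.
# Household count, correlation, and variability are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
n_households, n_hours = 100, 8760

shared_weather = rng.normal(0.0, 1.0, n_hours)                  # common signal
individual = rng.normal(0.0, 1.0, (n_households, n_hours))      # household noise
shortfall_kw = 0.5 * shared_weather + 1.0 * individual          # hourly mismatch

single_sigma = shortfall_kw.std(axis=1).mean()
pooled_sigma_per_home = shortfall_kw.sum(axis=0).std() / n_households

print(f"Variability, single household:       {single_sigma:.2f} kW")
print(f"Variability per household, pooled:   {pooled_sigma_per_home:.2f} kW")
# Storage sized to ride through variability can therefore be much smaller per
# household when the community shares it over the utility's local wires.
```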

  • With new heat treatment, 3D-printed metals can withstand extreme conditions

    A new MIT-developed heat treatment transforms the microscopic structure of 3D-printed metals, making the materials stronger and more resilient in extreme thermal environments. The technique could make it possible to 3D print high-performance blades and vanes for power-generating gas turbines and jet engines, which would enable new designs with improved fuel consumption and energy efficiency.

    Today’s gas turbine blades are manufactured through conventional casting processes in which molten metal is poured into complex molds and directionally solidified. These components are made from some of the most heat-resistant metal alloys on Earth, as they are designed to rotate at high speeds in extremely hot gas, extracting work to generate electricity in power plants and thrust in jet engines.

    There is growing interest in manufacturing turbine blades through 3D-printing, which, in addition to its environmental and cost benefits, could allow manufacturers to quickly produce more intricate, energy-efficient blade geometries. But efforts to 3D-print turbine blades have yet to clear a big hurdle: creep.

    In metallurgy, creep refers to a metal’s tendency to permanently deform in the face of persistent mechanical stress and high temperatures. While researchers have explored printing turbine blades, they have found that the printing process produces fine grains on the order of tens to hundreds of microns in size — a microstructure that is especially vulnerable to creep.

    “In practice, this would mean a gas turbine would have a shorter life or less fuel efficiency,” says Zachary Cordero, the Boeing Career Development Professor in Aeronautics and Astronautics at MIT. “These are costly, undesirable outcomes.”

    Cordero and his colleagues found a way to improve the structure of 3D-printed alloys by adding an additional heat-treating step, which transforms the as-printed material’s fine grains into much larger “columnar” grains — a sturdier microstructure that should minimize the material’s creep potential, since the “columns” are aligned with the axis of greatest stress. The researchers say the method, outlined today in Additive Manufacturing, clears the way for industrial 3D-printing of gas turbine blades.

    “In the near future, we envision gas turbine manufacturers will print their blades and vanes at large-scale additive manufacturing plants, then post-process them using our heat treatment,” Cordero says. “3D-printing will enable new cooling architectures that can improve the thermal efficiency of a turbine, so that it produces the same amount of power while burning less fuel and ultimately emits less carbon dioxide.”

    Cordero’s co-authors on the study are lead author Dominic Peachey, Christopher Carter, and Andres Garcia-Jimenez at MIT, Anugrahaprada Mukundan and Marie-Agathe Charpagne of the University of Illinois at Urbana-Champaign, and Donovan Leonard of Oak Ridge National Laboratory.

    Triggering a transformation

    The team’s new method is a form of directional recrystallization — a heat treatment that passes a material through a hot zone at a precisely controlled speed to meld a material’s many microscopic grains into larger, sturdier, and more uniform crystals.

    Directional recrystallization was invented more than 80 years ago and has been applied to wrought materials. In their new study, the MIT team adapted directional recrystallization for 3D-printed superalloys.

    The team tested the method on 3D-printed nickel-based superalloys — metals that are typically cast and used in gas turbines. In a series of experiments, the researchers placed 3D-printed samples of rod-shaped superalloys in a room-temperature water bath placed just below an induction coil. They slowly drew each rod out of the water and through the coil at various speeds, dramatically heating the rods to temperatures varying between 1,200 and 1,245 degrees Celsius.

    They found that drawing the rods at a particular speed (2.5 millimeters per hour) and through a specific temperature (1,235 degrees Celsius) created a steep thermal gradient that triggered a transformation in the material’s printed, fine-grained microstructure.
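
    To put that draw rate in perspective, a quick calculation — my own arithmetic; the part length is a hypothetical example, not a dimension from the study — shows how long a single directional-recrystallization pass would take for a blade-scale rod.

```python
# How long one directional-recrystallization pass would take at the reported
# draw rate. The 100 mm part length is a hypothetical example, not a dimension
# from the study.
draw_rate_mm_per_hour = 2.5
part_length_mm = 100.0

hours = part_length_mm / draw_rate_mm_per_hour
print(f"One pass: {hours:.0f} hours ({hours / 24:.1f} days)")
# -> 40 hours, which is why the team is also exploring faster draw rates.
```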

    “The material starts as small grains with defects called dislocations, that are like a mangled spaghetti,” Cordero explains. “When you heat this material up, those defects can annihilate and reconfigure, and the grains are able to grow. We’re continuously elongating the grains by consuming the defective material and smaller grains — a process termed recrystallization.”

    Creep away

    After cooling the heat-treated rods, the researchers examined their microstructure using optical and electron microscopy, and found that the material’s printed microscopic grains were replaced with “columnar” grains, or long crystal-like regions that were significantly larger than the original grains.

    “We’ve completely transformed the structure,” says lead author Dominic Peachey. “We show we can increase the grain size by orders of magnitude, to massive columnar grains, which theoretically should lead to dramatic improvements in creep properties.”

    The team also showed they could manipulate the draw speed and temperature of the rod samples to tailor the material’s growing grains, creating regions of specific grain size and orientation. This level of control, Cordero says, can enable manufacturers to print turbine blades with site-specific microstructures that are resilient to specific operating conditions.

    Cordero plans to test the heat treatment on 3D-printed geometries that more closely resemble turbine blades. The team is also exploring ways to speed up the draw rate, as well as test a heat-treated structure’s resistance to creep. Then, they envision that the heat treatment could enable the practical application of 3D-printing to produce industrial-grade turbine blades, with more complex shapes and patterns.

    “New blade and vane geometries will enable more energy-efficient land-based gas turbines, as well as, eventually, aeroengines,” Cordero notes. “This could from a baseline perspective lead to lower carbon dioxide emissions, just through improved efficiency of these devices.”

    This research was supported, in part, by the U.S. Office of Naval Research.

  • Advancing the energy transition amidst global crises

    “The past six years have been the warmest on the planet, and our track record on climate change mitigation is drastically short of what it needs to be,” said Robert C. Armstrong, MIT Energy Initiative (MITEI) director and the Chevron Professor of Chemical Engineering, introducing MITEI’s 15th Annual Research Conference.

    At the symposium, participants from academia, industry, and finance acknowledged the deepening difficulties of decarbonizing a world rocked by geopolitical conflicts and suffering from supply chain disruptions, energy insecurity, inflation, and a persistent pandemic. In spite of this grim backdrop, the conference offered evidence of significant progress in the energy transition. Researchers provided glimpses of a low-carbon future, presenting advances in such areas as long-duration energy storage, carbon capture, and renewable technologies.

    In his keynote remarks, Ernest J. Moniz, the Cecil and Ida Green Professor of Physics and Engineering Systems Emeritus, founding director of MITEI, and former U.S. secretary of energy, highlighted “four areas that have materially changed in the last year” that could shake up, and possibly accelerate, efforts to address climate change.

    Extreme weather seems to be propelling the public and policy makers of both U.S. parties toward “convergence … at least in recognition of the challenge,” Moniz said. He perceives a growing consensus that climate goals will require — in diminishing order of certainty — firm (always-on) power to complement renewable energy sources, a fuel (such as hydrogen) flowing alongside electricity, and removal of atmospheric carbon dioxide (CO2).

    Russia’s invasion of Ukraine, with its “weaponization of natural gas” and global energy impacts, underscores the idea that climate, energy security, and geopolitics “are now more or less recognized widely as one conversation.” Moniz pointed as well to new U.S. laws on climate change and infrastructure that will amplify the role of science and technology and “address the drive to technological dominance by China.”

    The rapid transformation of energy systems will require a comprehensive industrial policy, Moniz said. Government and industry must select and rapidly develop low-carbon fuels, firm power sources (possibly including nuclear power), CO2 removal systems, and long-duration energy storage technologies. “We will need to make progress on all fronts literally in this decade to come close to our goals for climate change mitigation,” he concluded.

    Global cooperation?

    Over two days, conference participants delved into many of the issues Moniz raised. In one of the first panels, scholars pondered whether the international community could forge a coordinated climate change response. The United States’ rift with China, especially over technology trade policies, loomed large.

    “Hatred of China is a bipartisan hobby and passion, but a blanket approach isn’t right, even for the sake of national security,” said Yasheng Huang, the Epoch Foundation Professor of Global Economics and Management at the MIT Sloan School of Management. “Although the United States and China working together would have huge effects for both countries, it is politically unpalatable in the short term,” said F. Taylor Fravel, the Arthur and Ruth Sloan Professor of Political Science and director of the MIT Security Studies Program. John E. Parsons, deputy director for research at the MIT Center for Energy and Environmental Policy Research, suggested that the United States should use this moment “to get our own act together … and start doing things,” such as building nuclear power plants in a cost-effective way.

    Debating carbon removal

    Several panels took up the matter of carbon emissions and the most promising technologies for contending with them. Charles Harvey, MIT professor of civil and environmental engineering, and Howard Herzog, a senior research engineer at MITEI, set the stage early, debating whether capturing carbon was essential to reaching net-zero targets.

    “I have no trouble getting to net zero without carbon capture and storage,” said David Keith, the Gordon McKay Professor of Applied Physics at Harvard University, in a subsequent roundtable. Carbon capture seems more risky to Keith than solar geoengineering, which involves injecting sulfur into the stratosphere to reflect sunlight and offset the heat-trapping impacts of CO2.

    There are new ways of moving carbon from where it’s a problem to where it’s safer. Kripa K. Varanasi, MIT professor of mechanical engineering, described a process for modulating the pH of ocean water to remove CO2. Timothy Krysiek, managing director for Equinor Ventures, talked about construction of a 900-kilometer pipeline transporting CO2 from northern Germany to a large-scale storage site located in Norwegian waters 3,000 meters below the seabed. “We can use these offshore Norwegian assets as a giant carbon sink for Europe,” he said.

    A startup showcase featured additional approaches to the carbon challenge. Mantel, which received MITEI Seed Fund money, is developing molten salt material to capture carbon for long-term storage or for use in generating electricity. Verdox has come up with an electrochemical process for capturing dilute CO2 from the atmosphere.

    But while much of the global warming discussion focuses on CO2, other greenhouse gases are menacing. Another panel discussed measuring and mitigating these pollutants. “Methane has 82 times more warming power than CO2 from the point of emission,” said Desirée L. Plata, MIT associate professor of civil and environmental engineering. “Cutting methane is the strongest lever we have to slow climate change in the next 25 years — really the only lever.”

    Steven Hamburg, chief scientist and senior vice president of the Environmental Defense Fund, cautioned that emission of hydrogen molecules into the atmosphere can cause increases in other greenhouse gases such as methane, ozone, and water vapor. As researchers and industry turn to hydrogen as a fuel or as a feedstock for commercial processes, “we will need to minimize leakage … or risk increasing warming,” he said.

    Supply chains, markets, and new energy ventures

    In panels on energy storage and the clean energy supply chain, there were interesting discussions of challenges ahead. Materials needed for grid-scale energy storage, electric vehicles (EVs), and other clean energy technologies — such as lithium, cobalt, nickel, copper, and vanadium — can be difficult to source. “These often come from water-stressed regions, and we need to be super thoughtful about environmental stresses,” said Elsa Olivetti, the Esther and Harold E. Edgerton Associate Professor in Materials Science and Engineering. She also noted that in light of the explosive growth in demand for metals such as lithium, recycling EVs won’t be of much help. “The amount of material coming back from end-of-life batteries is minor,” she said, until EVs are much further along in their adoption cycle.
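
    Olivetti’s point about recycling can be illustrated with a toy stock-and-flow calculation; the growth rate, battery lifetime, and recovery rate below are assumptions for illustration, not figures from the panel.

```python
# Toy stock-and-flow view of battery recycling during rapid EV growth: the
# material returning today is what was sold one battery-lifetime ago.
# Growth rate, lifetime, and recovery rate are assumptions for illustration.

annual_demand_growth = 0.30     # assumed growth in battery-material demand
battery_lifetime_years = 12     # assumed years before a pack is retired
recovery_rate = 0.95            # assumed fraction of material recovered

demand_today = 1.0  # normalize today's demand to 1
sales_one_lifetime_ago = demand_today / (1 + annual_demand_growth) ** battery_lifetime_years
recycled_share = recovery_rate * sales_one_lifetime_ago / demand_today

print(f"Recycled material as a share of today's demand: {recycled_share:.1%}")
# -> roughly 4 percent under these assumptions, i.e. "minor" until EV adoption
#    stops growing so quickly.
```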

    Arvind Sanger, founder and managing partner of Geosphere Capital, said that the United States should be developing its own rare earths and minerals, although gaining the know-how will take time, and overcoming “NIMBYism” (not in my backyard-ism) is a challenge. Sanger emphasized that we must continue to use “denser sources of energy” to catalyze the energy transition over the next decade. In particular, Sanger noted that “for every transition technology, steel is needed,” and steel is made in furnaces that use coal and natural gas. “It’s completely woolly-headed to think we can just go to a zero-fossil fuel future in a hurry,” he said.

    The topic of power markets occupied another panel, which focused on ways to ensure the distribution of reliable and affordable zero-carbon energy. Integrating intermittent resources such as wind and solar into the grid requires a suite of retail markets and new digital tools, said Anuradha Annaswamy, director of MIT’s Active-Adaptive Control Laboratory. Tim Schittekatte, a postdoc at the MIT Sloan School of Management, proposed auctions as a way of insuring consumers against periods of high market costs.

    Another panel described the very different investment needs of new energy startups, such as longer research and development phases. Hooisweng Ow, technology principal at Eni Next LLC Ventures, which is developing drilling technology for geothermal energy, recommends joint development and partnerships to reduce risk. Michael Kearney SM ’11, PhD ’19, SM ’19 is a partner at The Engine, a venture firm built by MIT that invests in path-breaking technology to solve the toughest challenges in climate and other areas. Kearney believes the emergence of new technologies and markets will bring on “a labor transition on an order of magnitude never seen before in this country.” “Workforce development is not a natural zone for startups … and this will have to change,” he said.

    Supporting the global South

    The opportunities and challenges of the energy transition look quite different in the developing world. In conversation with Robert Armstrong, Luhut Binsar Pandjaitan, the coordinating minister for maritime affairs and investment of the Republic of Indonesia, reported that his “nation is rich with solar, wind, and energy transition minerals like nickel and copper,” but cannot on its own develop renewable energy, reduce carbon emissions, and improve grid infrastructure. “Education is a top priority, and we are very far behind in high technologies,” he said. “We need help and support from MIT to achieve our target.”

    Technologies that could springboard Indonesia and other nations of the global South toward their climate goals are emerging in MITEI-supported projects and at young companies MITEI helped spawn. Among the promising innovations unveiled at the conference are new materials and designs for cooling buildings in hot climates and reducing the environmental costs of construction, and a sponge-like substance that passively sucks moisture out of the air to lower the energy required for running air conditioners in humid climates.

    Other ideas on the move from lab to market have great potential for industrialized nations as well, such as a computational framework for maximizing the energy output of ocean-based wind farms; a process for using ammonia as a renewable fuel with no CO2 emissions; long-duration energy storage derived from the oxidation of iron; and a laser-based method for unlocking geothermal steam to drive power plants.

  • New materials could enable longer-lasting implantable batteries

    For the last few decades, battery research has largely focused on rechargeable lithium-ion batteries, which are used in everything from electric cars to portable electronics and have improved dramatically in terms of affordability and capacity. But nonrechargeable batteries have seen little improvement during that time, despite their crucial role in many important uses such as implantable medical devices like pacemakers.

    Now, researchers at MIT have come up with a way to improve the energy density of these nonrechargeable, or “primary,” batteries. They say it could enable up to a 50 percent increase in useful lifetime, or a corresponding decrease in size and weight for a given amount of power or energy capacity, while also improving safety, with little or no increase in cost.

    The new findings, which involve substituting the conventionally inactive battery electrolyte with a material that is active for energy delivery, are reported today in the journal Proceedings of the National Academy of Sciences, in a paper by MIT Kavanaugh Postdoctoral Fellow Haining Gao, graduate student Alejandro Sevilla, associate professor of mechanical engineering Betar Gallant, and four others at MIT and Caltech.

    Replacing the battery in a pacemaker or other medical implant requires a surgical procedure, so any increase in the longevity of their batteries could have a significant impact on the patient’s quality of life, Gallant says. Primary batteries are used for such essential applications because they can provide about three times as much energy for a given size and weight as rechargeable batteries.

    That difference in capacity, Gao says, makes primary batteries “critical for applications where charging is not possible or is impractical.” The new materials work at human body temperature, so would be suitable for medical implants. In addition to implantable devices, with further development to make the batteries operate efficiently at cooler temperatures, applications could also include sensors in tracking devices for shipments, for example to ensure that temperature and humidity requirements for food or drug shipments are properly maintained throughout the shipping process. Or, they might be used in remotely operated aerial or underwater vehicles that need to remain ready for deployment over long periods.

    Pacemaker batteries typically last from five to 10 years, and even less if they require high-voltage functions such as defibrillation. Yet for such batteries, Gao says, the technology is considered mature, and “there haven’t been any major innovations in fundamental cell chemistries in the past 40 years.”

    The key to the team’s innovation is a new kind of electrolyte — the material that lies between the two electrical poles of the battery, the cathode and the anode, and allows charge carriers to pass through from one side to the other. Using a new liquid fluorinated compound, the team found that they could combine some of the functions of the cathode and the electrolyte in one compound, called a catholyte. This allows for saving much of the weight of typical primary batteries, Gao says.

    While there are other materials besides this new compound that could theoretically function in a similar catholyte role in a high-capacity battery, Gallant explains, those materials have lower inherent voltages that do not match those of the remainder of the material in a conventional pacemaker battery, a type known as CFx. Because the overall output from the battery can’t be more than that of the lesser of the two electrode materials, the extra capacity would go to waste because of the voltage mismatch. But with the new material, “one of the key merits of our fluorinated liquids is that their voltage aligns very well with that of CFx,” Gallant says.

    In a conventional CFx battery, the liquid electrolyte is essential because it allows charged particles to pass from one electrode to the other. But “those electrolytes are actually chemically inactive, so they’re basically dead weight,” Gao says. As a result, inactive material, mainly the electrolyte, makes up about 50 percent of a typical cell. In the new design with the fluorinated catholyte, that dead weight drops to about 20 percent, she says.
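
    To get a feel for why trimming dead weight matters, here is a rough back-of-the-envelope sketch, not a calculation from the paper: if only the active fraction of a cell stores energy, cell-level specific energy scales roughly with that fraction. The active-material figure below is a hypothetical placeholder.

```python
# Rough illustration (not the authors' model): cell-level specific energy
# scales with the active mass fraction if only the active material stores
# energy. ACTIVE_WH_PER_KG is a hypothetical placeholder value.

def cell_specific_energy(active_wh_per_kg: float, inactive_fraction: float) -> float:
    """Cell-level Wh/kg, assuming only the active fraction stores energy."""
    return active_wh_per_kg * (1.0 - inactive_fraction)

ACTIVE_WH_PER_KG = 1000.0  # hypothetical active-material specific energy

conventional = cell_specific_energy(ACTIVE_WH_PER_KG, inactive_fraction=0.50)
catholyte_cell = cell_specific_energy(ACTIVE_WH_PER_KG, inactive_fraction=0.20)

print(f"conventional cell: {conventional:.0f} Wh/kg")    # 500 Wh/kg
print(f"catholyte cell:    {catholyte_cell:.0f} Wh/kg")  # 800 Wh/kg
print(f"relative gain:     {catholyte_cell / conventional - 1:.0%}")
# This crude proportionality suggests roughly a 60 percent gain; the paper's
# more detailed cell-level accounting projects about 50 percent.
```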

    The new cells also provide safety improvements over other kinds of proposed chemistries that would use toxic and corrosive catholyte materials, which their formula does not, Gallant says. And preliminary tests have demonstrated a stable shelf life over more than a year, an important characteristic for primary batteries, she says.

    So far, the team has not experimentally achieved the full 50 percent improvement in energy density predicted by their analysis, but they have demonstrated a 20 percent improvement, which in itself would be an important gain for some applications, Gallant says. The cell design has not yet been fully optimized, but the researchers can project cell-level performance from the performance of the active material. “We can see the projected cell-level performance when it’s scaled up can reach around 50 percent higher than the CFx cell,” she says. Achieving that level experimentally is the team’s next goal.

    Sevilla, a doctoral student in the mechanical engineering department, will be focusing on that work in the coming year. “I was brought into this project to try to understand some of the limitations of why we haven’t been able to attain the full energy density possible,” he says. “My role has been trying to fill in the gaps in terms of understanding the underlying reaction.”

    One big advantage of the new material, Gao says, is that it can easily be integrated into existing battery manufacturing processes, as a simple substitution of one material for another. Preliminary discussions with manufacturers suggest the substitution would indeed be straightforward, she says. The basic starting material, already produced at scale for other purposes, is priced comparably to the materials currently used in CFx batteries, and batteries made with it are likely to cost about the same as existing ones, she says. The team has already applied for a patent on the catholyte, and they expect medical applications to be the first to be commercialized, perhaps with a full-scale prototype ready for testing in real devices within about a year.

    Further down the road, other applications could take advantage of the new materials as well, the researchers say, such as smart water or gas meters that can be read remotely, or devices like E-ZPass transponders, extending their usable lifetimes. Drone aircraft or undersea vehicles, which demand higher power, may take longer to develop. Other uses could include batteries for equipment at remote sites, such as oil and gas drilling rigs, including devices sent down into wells to monitor conditions.

    The team also included Gustavo Hobold, Aaron Melemed, and Rui Guo at MIT and Simon Jones at Caltech. The work was supported by MIT Lincoln Laboratory and the Army Research Office.

  • in

    Machine learning facilitates “turbulence tracking” in fusion reactors

    Fusion, which promises practically unlimited, carbon-free energy using the same processes that power the sun, is at the heart of a worldwide research effort that could help mitigate climate change.

    A multidisciplinary team of researchers is now bringing tools and insights from machine learning to aid this effort. Scientists from MIT and elsewhere have used computer-vision models to identify and track turbulent structures that appear under the conditions needed to facilitate fusion reactions.

    Monitoring the formation and movement of these structures, called filaments or “blobs,” is important for understanding the heat and particle flows exiting the reacting fuel, which ultimately determine the engineering requirements for reactor walls that must withstand those flows. However, scientists typically study blobs using averaging techniques, which trade details of individual structures for aggregate statistics. Tracking individual blobs requires marking them manually in video data.

    The researchers built a synthetic video dataset of plasma turbulence to make this process more effective and efficient. They used it to train four computer vision models, each of which identifies and tracks blobs. They trained the models to pinpoint blobs in the same ways that humans would.
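
    The article does not name the four models, so purely as a general illustration, the sketch below fine-tunes an off-the-shelf instance-segmentation network (torchvision’s Mask R-CNN) with a single “blob” class on labeled synthetic frames; the dataset wrapper and training settings are placeholders, not the study’s actual setup.

```python
# Illustrative sketch only: fine-tune an off-the-shelf instance-segmentation
# model on labeled synthetic blob frames. Mask R-CNN is a stand-in; the study's
# four models and its data format may differ. All settings are placeholders.
import torch
from torch.utils.data import DataLoader, Dataset
from torchvision.models.detection import maskrcnn_resnet50_fpn


class SyntheticBlobFrames(Dataset):
    """Pre-generated synthetic frames with per-blob boxes, labels, and masks."""

    def __init__(self, frames, targets):
        self.frames = frames    # list of float tensors, shape (3, H, W)
        self.targets = targets  # list of dicts: "boxes", "labels", "masks"

    def __len__(self):
        return len(self.frames)

    def __getitem__(self, idx):
        return self.frames[idx], self.targets[idx]


def train(frames, targets, epochs=10, lr=1e-4):
    # Two classes: background (0) and blob (1).
    model = maskrcnn_resnet50_fpn(weights=None, num_classes=2)
    model.train()
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    loader = DataLoader(SyntheticBlobFrames(frames, targets), batch_size=2,
                        shuffle=True, collate_fn=lambda batch: tuple(zip(*batch)))
    for _ in range(epochs):
        for images, batch_targets in loader:
            # In training mode, torchvision detection models return a loss dict.
            losses = model(list(images), list(batch_targets))
            loss = sum(losses.values())
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
    return model
```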

    When the researchers tested the trained models using real video clips, the models could identify blobs with high accuracy — more than 80 percent in some cases. The models were also able to effectively estimate the size of blobs and the speeds at which they moved.

    Because millions of video frames are captured during just one fusion experiment, using machine-learning models to track blobs could give scientists much more detailed information.

    “Before, we could get a macroscopic picture of what these structures are doing on average. Now, we have a microscope and the computational power to analyze one event at a time. If we take a step back, what this reveals is the power available from these machine-learning techniques, and ways to use these computational resources to make progress,” says Theodore Golfinopoulos, a research scientist at the MIT Plasma Science and Fusion Center and co-author of a paper detailing these approaches.

    His fellow co-authors include lead author Woonghee “Harry” Han, a physics PhD candidate; senior author Iddo Drori, a visiting professor in the Computer Science and Artificial Intelligence Laboratory (CSAIL), faculty associate professor at Boston University, and adjunct at Columbia University; as well as others from the MIT Plasma Science and Fusion Center, the MIT Department of Civil and Environmental Engineering, and the Swiss Federal Institute of Technology at Lausanne in Switzerland. The research appears today in Nature Scientific Reports.

    Heating things up

    For more than 70 years, scientists have sought to use controlled thermonuclear fusion reactions to develop an energy source. To reach the conditions necessary for a fusion reaction, fuel must be heated to temperatures above 100 million degrees Celsius. (The core of the sun is about 15 million degrees Celsius.)

    A common method for containing this super-hot fuel, called plasma, is to use a tokamak. These devices utilize extremely powerful magnetic fields to hold the plasma in place and control the interaction between the exhaust heat from the plasma and the reactor walls.

    However, blobs appear like filaments falling out of the plasma at the very edge, between the plasma and the reactor walls. These random, turbulent structures affect how energy flows between the plasma and the reactor.

    “Knowing what the blobs are doing strongly constrains the engineering performance that your tokamak power plant needs at the edge,” adds Golfinopoulos.

    Researchers use a unique imaging technique to capture video of the plasma’s turbulent edge during experiments. An experimental campaign may last months; a typical day will produce about 30 seconds of data, corresponding to roughly 60 million video frames, with thousands of blobs appearing each second. This makes it impossible to track all blobs manually, so researchers rely on average sampling techniques that only provide broad characteristics of blob size, speed, and frequency.
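
    Those figures imply a scale that rules out manual analysis; a quick order-of-magnitude check makes the point (the per-second blob count below is just the “thousands” mentioned above, not a measured value):

```python
# Order-of-magnitude check using the figures quoted above.
frames_per_day = 60_000_000
seconds_of_data_per_day = 30
blobs_per_second = 2_000  # "thousands" per second; placeholder order of magnitude

print(frames_per_day / seconds_of_data_per_day)    # ~2 million frames per second
print(blobs_per_second * seconds_of_data_per_day)  # ~60,000 blobs in a day's data
```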

    “On the other hand, machine learning provides a solution to this by blob-by-blob tracking for every frame, not just average quantities. This gives us much more knowledge about what is happening at the boundary of the plasma,” Han says.

    He and his co-authors took four well-established computer vision models, which are commonly used for applications like autonomous driving, and trained them to tackle this problem.

    Simulating blobs

    To train these models, they created a vast dataset of synthetic video clips that captured the blobs’ random and unpredictable nature.

    “Sometimes they change direction or speed, sometimes multiple blobs merge, or they split apart. These kinds of events were not considered before with traditional approaches, but we could freely simulate those behaviors in the synthetic data,” Han says.

    Creating synthetic data also allowed them to label each blob, which made the training process more effective, Drori adds.
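
    The study’s synthetic clips come from plasma-turbulence simulations; purely to illustrate the idea of generating frames together with per-blob labels, the toy sketch below uses random Gaussian spots as stand-ins for blobs.

```python
# Toy illustration of producing a labeled synthetic frame: the actual study
# synthesizes clips from plasma-turbulence simulations; Gaussian spots here are
# stand-ins used only to show the frame-plus-per-blob-labels idea.
import numpy as np

def make_frame(height=64, width=64, n_blobs=3, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    yy, xx = np.mgrid[0:height, 0:width]
    frame = np.zeros((height, width), dtype=np.float32)
    masks, boxes = [], []
    for _ in range(n_blobs):
        cy, cx = rng.uniform(0, height), rng.uniform(0, width)
        sigma = rng.uniform(2.0, 6.0)
        blob = np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * sigma ** 2))
        frame += blob
        mask = blob > 0.5 * blob.max()           # per-blob binary mask (label)
        ys, xs = np.nonzero(mask)
        masks.append(mask)
        boxes.append((xs.min(), ys.min(), xs.max(), ys.max()))
    return frame, masks, boxes

frame, masks, boxes = make_frame()
print(frame.shape, len(masks), boxes[0])
```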

    Using these synthetic data, they trained the models to draw boundaries around blobs, teaching them to closely mimic what a human scientist would draw.

    Then they tested the models using real video data from experiments. First, they measured how closely the boundaries the models drew matched up with actual blob contours.

    But they also wanted to see if the models predicted objects that humans would identify. They asked three human experts to pinpoint the centers of blobs in video frames and checked to see if the models predicted blobs in those same locations.
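
    The paper’s exact metrics aren’t spelled out in this article, but both checks can be implemented in a standard way: intersection-over-union between predicted and reference blob masks, and the distance from each expert-marked center to the nearest predicted blob center. The sketch below shows one such implementation; the matching threshold is a placeholder.

```python
# Sketch of two standard evaluation checks (the paper's exact metrics may
# differ): mask overlap via intersection-over-union, and agreement with
# expert-marked blob centers. The distance threshold is a placeholder.
import numpy as np

def iou(pred_mask: np.ndarray, true_mask: np.ndarray) -> float:
    """Intersection-over-union of two binary masks."""
    inter = np.logical_and(pred_mask, true_mask).sum()
    union = np.logical_or(pred_mask, true_mask).sum()
    return float(inter / union) if union else 0.0

def center_agreement(pred_centers, expert_centers, max_dist=5.0) -> float:
    """Fraction of expert-marked centers that have a predicted blob nearby."""
    if not expert_centers:
        return 0.0
    hits = 0
    for ey, ex in expert_centers:
        dists = [np.hypot(ey - py, ex - px) for py, px in pred_centers]
        if dists and min(dists) <= max_dist:
            hits += 1
    return hits / len(expert_centers)
```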

    The models were able to draw accurate blob boundaries, overlapping with the brightness contours that are considered ground truth, about 80 percent of the time. The models’ assessments were also similar to those of the human experts, and they successfully predicted the theory-defined regime of each blob, in agreement with results from a traditional method.

    Now that they have shown the success of using synthetic data and computer vision models for tracking blobs, the researchers plan to apply these techniques to other problems in fusion research, such as estimating particle transport at the boundary of a plasma, Han says.

    They also made the dataset and models publicly available, and look forward to seeing how other research groups apply these tools to study the dynamics of blobs, says Drori.

    “Prior to this, there was a barrier to entry that mostly the only people working on this problem were plasma physicists, who had the datasets and were using their methods. There is a huge machine-learning and computer-vision community. One goal of this work is to encourage participation in fusion research from the broader machine-learning community toward the broader goal of helping solve the critical problem of climate change,” he adds.

    This research is supported, in part, by the U.S. Department of Energy and the Swiss National Science Foundation.