More stories

  • The role of modeling in the energy transition

    Joseph F. DeCarolis, administrator for the U.S. Energy Information Administration (EIA), has one overarching piece of advice for anyone poring over long-term energy projections. “Whatever you do, don’t start believing the numbers,” DeCarolis said at the MIT Energy Initiative (MITEI) Fall Colloquium. “There’s a tendency when you sit in front of the computer and you’re watching the model spit out numbers at you … that you’ll really start to believe those numbers with high precision. Don’t fall for it. Always remain skeptical.”

    This event was part of MITEI’s new speaker series, MITEI Presents: Advancing the Energy Transition, which connects the MIT community with the energy experts and leaders who are working on scientific, technological, and policy solutions that are urgently needed to accelerate the energy transition.

    The point of DeCarolis’s talk, titled “Stay humble and prepare for surprises: Lessons for the energy transition,” was not that energy models are unimportant. On the contrary, DeCarolis said, energy models give stakeholders a framework that allows them to consider present-day decisions in the context of potential future scenarios. However, he repeatedly stressed the importance of accounting for uncertainty, and not treating these projections as “crystal balls.”

    “We can use models to help inform decision strategies,” DeCarolis said. “We know there’s a bunch of future uncertainty. We don’t know what’s going to happen, but we can incorporate that uncertainty into our model and help come up with a path forward.”

    Dialogue, not forecasts

    EIA is the statistical and analytic agency within the U.S. Department of Energy, with a mission to collect, analyze, and disseminate independent and impartial energy information to help stakeholders make better-informed decisions. Although EIA analyzes the impacts of energy policies, the agency does not make or advise on policy itself.
    DeCarolis, who was previously professor and University Faculty Scholar in the Department of Civil, Construction, and Environmental Engineering at North Carolina State University, noted that EIA does not need to seek approval from anyone else in the federal government before publishing its data and reports. “That independence is very important to us, because it means that we can focus on doing our work and providing the best information we possibly can,” he said.

    Among the many reports produced by EIA is the agency’s Annual Energy Outlook (AEO), which projects U.S. energy production, consumption, and prices. Every other year, the agency also produces the AEO Retrospective, which shows the relationship between past projections and actual energy indicators.

    “The first question you might ask is, ‘Should we use these models to produce a forecast?’” DeCarolis said. “The answer for me to that question is: No, we should not do that. When models are used to produce forecasts, the results are generally pretty dismal.”

    DeCarolis pointed to wildly inaccurate past projections about the proliferation of nuclear energy in the United States as an example of the problems inherent in forecasting. However, he noted, there are “still lots of really valuable uses” for energy models. Rather than using them to predict future energy consumption and prices, DeCarolis said, stakeholders should use models to inform their own thinking.

    “[Models] can simply be an aid in helping us think and hypothesize about the future of energy,” DeCarolis said. “They can help us create a dialogue among different stakeholders on complex issues. If we’re thinking about something like the energy transition, and we want to start a dialogue, there has to be some basis for that dialogue. If you have a systematic representation of the energy system that you can advance into the future, we can start to have a debate about the model and what it means.
    We can also identify key sources of uncertainty and knowledge gaps.”

    Modeling uncertainty

    The key to working with energy models is not to try to eliminate uncertainty, DeCarolis said, but rather to account for it. One way to better understand uncertainty, he noted, is to look at past projections and consider how they ended up differing from real-world results. DeCarolis pointed to two “surprises” over the past several decades: the exponential growth of shale oil and natural gas production (which had the impact of limiting coal’s share of the energy market and therefore reducing carbon emissions), as well as the rapid rise in wind and solar energy. In both cases, market conditions changed far more quickly than energy modelers anticipated, leading to inaccurate projections.

    “For all those reasons, we ended up with [projected] CO2 [carbon dioxide] emissions that were quite high compared to actual,” DeCarolis said. “We’re a statistical agency, so we’re really looking carefully at the data, but it can take some time to identify the signal through the noise.”

    Although EIA does not produce forecasts in the AEO, people have sometimes interpreted the reference case in the agency’s reports as predictions. In an effort to illustrate the unpredictability of future outcomes in the 2023 edition of the AEO, the agency added “cones of uncertainty” to its projection of energy-related carbon dioxide emissions, with ranges of outcomes based on the difference between past projections and actual results. One cone captures 50 percent of historical projection errors, while another represents 95 percent of historical errors.

    “They capture whatever bias there is in our projections,” DeCarolis said of the uncertainty cones. “It’s being captured because we’re comparing actual [emissions] to projections. The weakness of this, though, is: Who’s to say that those historical projection errors apply to the future?
    We don’t know that, but I still think that there’s something useful to be learned from this exercise.”

    The future of energy modeling

    Looking ahead, DeCarolis said, there is a “laundry list of things that keep me up at night as a modeler.” These include the impacts of climate change; how those impacts will affect demand for renewable energy; how quickly industry and government will overcome obstacles to building out clean energy infrastructure and supply chains; technological innovation; and increased energy demand from data centers running compute-intensive workloads.

    “What about enhanced geothermal? Fusion? Space-based solar power?” DeCarolis asked. “Should those be in the model? What sorts of technology breakthroughs are we missing? And then, of course, there are the unknown unknowns — the things that I can’t conceive of to put on this list, but are probably going to happen.”

    In addition to capturing the fullest range of outcomes, DeCarolis said, EIA wants to be flexible, nimble, transparent, and accessible — creating reports that can easily incorporate new model features and produce timely analyses. To that end, the agency has undertaken two new initiatives. First, the 2025 AEO will use a revamped version of the National Energy Modeling System that includes modules for hydrogen production and pricing, carbon management, and hydrocarbon supply. Second, an effort called Project BlueSky aims to develop the agency’s next-generation energy system model, which DeCarolis said will be modular and open source.

    DeCarolis noted that the energy system is both highly complex and rapidly evolving, and he warned that “mental shortcuts” and the fear of being wrong can lead modelers to ignore possible future developments. “We have to remain humble and intellectually honest about what we know,” DeCarolis said. “That way, we can provide decision-makers with an honest assessment of what we think could happen in the future.”
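    The “cones of uncertainty” EIA describes amount to empirical quantiles of historical projection errors, reapplied to a new projection. A minimal sketch of the idea follows; the numbers are illustrative placeholders, not EIA data, and the function is not EIA’s actual methodology.

```python
def percentile(sorted_xs, q):
    """Linear-interpolation percentile of a pre-sorted list, q in [0, 100]."""
    idx = q / 100 * (len(sorted_xs) - 1)
    lo, frac = int(idx), idx - int(idx)
    hi = min(lo + 1, len(sorted_xs) - 1)
    return sorted_xs[lo] * (1 - frac) + sorted_xs[hi] * frac

def uncertainty_cone(past_projected, past_actual, new_projection, coverage):
    """Return (low, high) bounds that would have covered `coverage` percent
    of historical errors, measured as actual/projected ratios — so any
    systematic bias in past projections is carried into the cone."""
    ratios = sorted(a / p for p, a in zip(past_projected, past_actual))
    tail = (100 - coverage) / 2
    return (new_projection * percentile(ratios, tail),
            new_projection * percentile(ratios, 100 - tail))

# Illustrative numbers only: past projected vs. actual emissions (arbitrary units)
proj = [5.9, 6.1, 6.0, 5.8, 5.7, 5.6]
actual = [5.6, 5.5, 5.3, 5.0, 4.9, 4.8]
for cov in (50, 95):
    lo, hi = uncertainty_cone(proj, actual, new_projection=4.7, coverage=cov)
    print(f"{cov}% cone: {lo:.2f} – {hi:.2f}")
```

    Because every historical projection in this toy data ran high, both cones sit below the new projection — exactly the kind of bias DeCarolis says the construction is meant to capture.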

  • A new catalyst can turn methane into something useful

    Although it is less abundant than carbon dioxide, methane gas contributes disproportionately to global warming because it traps more heat in the atmosphere than carbon dioxide, due to its molecular structure.

    MIT chemical engineers have now designed a new catalyst that can convert methane into useful polymers, which could help reduce greenhouse gas emissions.

    “What to do with methane has been a longstanding problem,” says Michael Strano, the Carbon P. Dubbs Professor of Chemical Engineering at MIT and the senior author of the study. “It’s a source of carbon, and we want to keep it out of the atmosphere but also turn it into something useful.”

    The new catalyst works at room temperature and atmospheric pressure, which could make it easier and more economical to deploy at sites of methane production, such as power plants and cattle barns.

    Daniel Lundberg PhD ’24 and MIT postdoc Jimin Kim are the lead authors of the study, which appears today in Nature Catalysis. Former postdoc Yu-Ming Tu and postdoc Cody Ritt are also authors of the paper.

    Capturing methane

    Methane is produced by bacteria known as methanogens, which are often highly concentrated in landfills, swamps, and other sites of decaying biomass. Agriculture is a major source of methane, and methane gas is also generated as a byproduct of transporting, storing, and burning natural gas. Overall, it is believed to account for about 15 percent of global temperature increases.

    At the molecular level, methane is made of a single carbon atom bound to four hydrogen atoms. In theory, this molecule should be a good building block for making useful products such as polymers. However, converting methane to other compounds has proven difficult because getting it to react with other molecules usually requires high temperatures and pressures.

    To achieve methane conversion without that input of energy, the MIT team designed a hybrid catalyst with two components: a zeolite and a naturally occurring enzyme.
    Zeolites are abundant, inexpensive clay-like minerals, and previous work has found that they can be used to catalyze the conversion of methane to carbon dioxide.

    In this study, the researchers used a zeolite called iron-modified aluminum silicate, paired with an enzyme called alcohol oxidase. Bacteria, fungi, and plants use this enzyme to oxidize alcohols.

    This hybrid catalyst performs a two-step reaction in which the zeolite converts methane to methanol, and then the enzyme converts methanol to formaldehyde. That reaction also generates hydrogen peroxide, which is fed back into the zeolite to provide a source of oxygen for the conversion of methane to methanol.

    This series of reactions can occur at room temperature and doesn’t require high pressure. The catalyst particles are suspended in water, which can absorb methane from the surrounding air. For future applications, the researchers envision that the catalyst could be painted onto surfaces.

    “Other systems operate at high temperature and high pressure, and they use hydrogen peroxide, which is an expensive chemical, to drive the methane oxidation. But our enzyme produces hydrogen peroxide from oxygen, so I think our system could be very cost-effective and scalable,” Kim says.

    Creating a system that incorporates both enzymes and artificial catalysts is a “smart strategy,” says Damien Debecker, a professor at the Institute of Condensed Matter and Nanosciences at the University of Louvain, Belgium.

    “Combining these two families of catalysts is challenging, as they tend to operate in rather distinct operation conditions.
    By unlocking this constraint and mastering the art of chemo-enzymatic cooperation, hybrid catalysis becomes key-enabling: It opens new perspectives to run complex reaction systems in an intensified way,” says Debecker, who was not involved in the research.

    Building polymers

    Once formaldehyde is produced, the researchers showed they could use that molecule to generate polymers by adding urea, a nitrogen-containing molecule found in urine. This resin-like polymer, known as urea-formaldehyde, is now used in particle board, textiles, and other products.

    The researchers envision that this catalyst could be incorporated into pipes used to transport natural gas. Within those pipes, the catalyst could generate a polymer that could act as a sealant to heal cracks in the pipes, which are a common source of methane leakage. The catalyst could also be applied as a film to coat surfaces that are exposed to methane gas, producing polymers that could be collected for use in manufacturing, the researchers say.

    Strano’s lab is now working on catalysts that could be used to remove carbon dioxide from the atmosphere and combine it with nitrate to produce urea. That urea could then be mixed with the formaldehyde produced by the zeolite-enzyme catalyst to produce urea-formaldehyde.

    The research was funded by the U.S. Department of Energy.
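    The two-step cycle described above closes on itself: the enzyme regenerates the hydrogen peroxide the zeolite consumes. A quick way to sanity-check the net chemistry is an atom balance. This is a sketch based on a simplified reading of the article, not the authors’ published mechanism.

```python
from collections import Counter

# Each species is a Counter of its atoms, so "+" totals atoms on each side.
CH4   = Counter({"C": 1, "H": 4})          # methane
O2    = Counter({"O": 2})                  # oxygen
CH3OH = Counter({"C": 1, "H": 4, "O": 1})  # methanol
H2O2  = Counter({"H": 2, "O": 2})          # hydrogen peroxide
HCHO  = Counter({"C": 1, "H": 2, "O": 1})  # formaldehyde
H2O   = Counter({"H": 2, "O": 1})          # water

# Step 1 (zeolite):  CH4 + H2O2 -> CH3OH + H2O
assert CH4 + H2O2 == CH3OH + H2O
# Step 2 (enzyme):   CH3OH + O2 -> HCHO + H2O2   (regenerates the peroxide)
assert CH3OH + O2 == HCHO + H2O2
# Net cycle:         CH4 + O2 -> HCHO + H2O
assert CH4 + O2 == HCHO + H2O
print("all three reactions are atom-balanced")
```

    The net reaction consumes only methane and oxygen, which is why the peroxide can stay internal to the cycle rather than being purchased as a feedstock.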

  • Turning automotive engines into modular chemical plants to make green fuels

    Reducing methane emissions is a top priority in the fight against climate change because of methane’s propensity to trap heat in the atmosphere: Its warming effect is 84 times more potent than CO2’s over a 20-year timescale.

    And yet, as the main component of natural gas, methane is also a valuable fuel and a precursor to several important chemicals. The main barrier to using methane emissions to create carbon-negative materials is that human sources of methane gas — landfills, farms, and oil and gas wells — are relatively small and spread out across large areas, while traditional chemical processing facilities are huge and centralized. That makes it prohibitively expensive to capture, transport, and convert methane gas into anything useful. As a result, most companies burn, or “flare,” their methane at the site where it’s emitted, seeing it as a sunk cost and an environmental liability.

    The MIT spinout Emvolon is taking a new approach to processing methane by repurposing automotive engines to serve as modular, cost-effective chemical plants. The company’s systems can take methane gas and produce liquid fuels like methanol and ammonia on-site; these fuels can then be used or transported in standard truck containers.

    “We see this as a new way of chemical manufacturing,” Emvolon co-founder and CEO Emmanuel Kasseris SM ’07, PhD ’11 says. “We’re starting with methane because methane is an abundant emission that we can use as a resource. With methane, we can solve two problems at the same time: About 15 percent of global greenhouse gas emissions come from hard-to-abate sectors that need green fuel, like shipping, aviation, heavy-duty trucks, and rail. Then another 15 percent of emissions come from distributed methane emissions like landfills and oil wells.”

    By using mass-produced engines and eliminating the need to invest in infrastructure like pipelines, the company says it’s making methane conversion economically attractive enough to be adopted at scale.
    The system can also take green hydrogen produced by intermittent renewables and turn it into ammonia, another fuel that can also be used to decarbonize fertilizers.

    “In the future, we’re going to need green fuels because you can’t electrify a large ship or plane — you have to use a high-energy-density, low-carbon-footprint, low-cost liquid fuel,” Kasseris says. “The energy resources to produce those green fuels are either distributed, as is the case with methane, or variable, like wind. So, you cannot have a massive plant [producing green fuels] that has its own zip code. You either have to be distributed or variable, and both of those approaches lend themselves to this modular design.”

    From a “crazy idea” to a company

    Kasseris first came to MIT to study mechanical engineering as a graduate student in 2004, when he worked in the Sloan Automotive Lab on a report on the future of transportation. For his PhD, he developed a novel technology for improving internal combustion engine fuel efficiency for a consortium of automotive and energy companies, which he then went to work for after graduation.

    Around 2014, he was approached by Leslie Bromberg ’73, PhD ’77, a serial inventor with more than 100 patents, who has been a principal research engineer in MIT’s Plasma Science and Fusion Center for nearly 50 years.

    “Leslie had this crazy idea of repurposing an internal combustion engine as a reactor,” Kasseris recalls. “I had looked at that while working in industry, and I liked it, but my company at the time thought the work needed more validation.”

    Bromberg had done that validation through a U.S. Department of Energy-funded project in which he used a diesel engine to “reform” methane — a high-pressure chemical reaction in which methane is combined with steam and oxygen to produce hydrogen.
    The work impressed Kasseris enough to bring him back to MIT as a research scientist in 2016.

    “We worked on that idea in addition to some other projects, and eventually it had reached the point where we decided to license the work from MIT and go full throttle,” Kasseris recalls. “It’s very easy to work with MIT’s Technology Licensing Office when you are an MIT inventor. You can get a low-cost licensing option, and you can do a lot with that, which is important for a new company. Then, once you are ready, you can finalize the license, so MIT was instrumental.”

    Emvolon continued working with MIT’s research community, sponsoring projects with Professor Emeritus John Heywood and participating in the MIT Venture Mentoring Service and the MIT Industrial Liaison Program.

    An engine-powered chemical plant

    At the core of Emvolon’s system is an off-the-shelf automotive engine that runs “fuel rich” — with a higher ratio of fuel to air than what is needed for complete combustion.

    “That’s easy to say, but it takes a lot of [intellectual property], and that’s what was developed at MIT,” Kasseris says. “Instead of burning the methane in the gas to carbon dioxide and water, you partially burn it, or partially oxidize it, to carbon monoxide and hydrogen, which are the building blocks to synthesize a variety of chemicals.”

    The hydrogen and carbon monoxide are intermediate products used to synthesize different chemicals through further reactions. Those processing steps take place right next to the engine, which makes its own power. Each of Emvolon’s standalone systems fits within a 40-foot shipping container and can produce about 8 tons of methanol per day from 300,000 standard cubic feet of methane gas.

    The company is starting with green methanol because it’s an ideal fuel for hard-to-abate sectors such as shipping and heavy-duty transport, as well as an excellent feedstock for other high-value chemicals, such as sustainable aviation fuel.
    Many shipping vessels have already converted to run on green methanol in an effort to meet decarbonization goals.

    This summer, the company also received a grant from the Department of Energy to adapt its process to produce clean liquid fuels from power sources like solar and wind.

    “We’d like to expand to other chemicals like ammonia, but also other feedstocks, such as biomass and hydrogen from renewable electricity, and we already have promising results in that direction,” Kasseris says. “We think we have a good solution for the energy transition and, in the later stages of the transition, for e-manufacturing.”

    A scalable approach

    Emvolon has already built a system capable of producing up to six barrels of green methanol a day in its 5,000-square-foot headquarters in Woburn, Massachusetts.

    “For chemical technologies, people talk about scale-up risk, but with an engine, if it works in a single cylinder, we know it will work in a multicylinder engine,” Kasseris says. “It’s just engineering.”

    Last month, Emvolon announced an agreement with Montauk Renewables to build a commercial-scale demonstration unit next to a Texas landfill that will initially produce up to 15,000 gallons of green methanol a year and later scale up to 2.5 million gallons. That project could be expanded tenfold by scaling across Montauk’s other sites.

    “Our whole process was designed to be a very realistic approach to the energy transition,” Kasseris says. “Our solution is designed to produce green fuels and chemicals at prices that the markets are willing to pay today, without the need for subsidies. Using the engines as chemical plants, we can get the capital expenditure per unit output close to that of a large plant, but at a modular scale that enables us to be next to low-cost feedstock.
    Furthermore, our modular systems require small investments — of $1 million to $10 million — that are quickly deployed, one at a time, within weeks, as opposed to massive chemical plants that require multiyear capital construction projects and cost hundreds of millions.”
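    The quoted throughput — about 8 tons of methanol per day from 300,000 standard cubic feet of methane — can be sanity-checked with back-of-envelope arithmetic. The sketch below assumes ideal-gas behavior at standard conditions (one lb-mol of gas occupies roughly 379.5 standard cubic feet) and reads “tons” as metric tons; it is not Emvolon’s process model.

```python
# Rough plausibility check: how much methanol could 300,000 scf of methane
# yield if every carbon atom ended up in methanol?
SCF_PER_LBMOL = 379.5      # scf per lb-mol of ideal gas at ~60 °F, 1 atm (assumed)
KG_PER_LB = 0.45359237     # exact conversion, so 1 lb-mol = 0.4536 kmol
M_CH3OH = 32.04            # kg/kmol of methanol (one carbon per molecule, like CH4)

lbmol_ch4 = 300_000 / SCF_PER_LBMOL               # lb-mol of methane per day
kmol_ch4 = lbmol_ch4 * KG_PER_LB                  # convert lb-mol to kmol
max_methanol_tonnes = kmol_ch4 * M_CH3OH / 1000   # theoretical ceiling, t/day

print(f"theoretical ceiling: {max_methanol_tonnes:.1f} t/day")
print(f"implied carbon efficiency at 8 t/day: {8 / max_methanol_tonnes:.0%}")
```

    Under these assumptions the ceiling comes out near 11.5 t/day, so the quoted 8 t/day implies roughly 70 percent of the methane carbon reaching methanol — plausible for a partial-oxidation route, where some carbon is inevitably lost to combustion products.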

  • Ensuring a durable transition

    To fend off the worst impacts of climate change, “we have to decarbonize, and do it even faster,” said William H. Green, director of the MIT Energy Initiative (MITEI) and Hoyt C. Hottel Professor in the MIT Department of Chemical Engineering, at MITEI’s Annual Research Conference.

    “But how the heck do we actually achieve this goal when the United States is in the middle of a divisive election campaign, and globally, we’re facing all kinds of geopolitical conflicts, trade protectionism, weather disasters, increasing demand from developing countries building a middle class, and data centers in countries like the U.S.?”

    Researchers, government officials, and business leaders convened in Cambridge, Massachusetts, Sept. 25-26 to wrestle with this vexing question at the conference, which was themed “A durable energy transition: How to stay on track in the face of increasing demand and unpredictable obstacles.”

    “In this room we have a lot of power,” said Green, “if we work together, convey to all of society what we see as real pathways and policies to solve problems, and take collective action.”

    The critical role of consensus-building in driving the energy transition arose repeatedly in conference sessions, whether the topic involved developing and adopting new technologies, constructing and siting infrastructure, drafting and passing vital energy policies, or attracting and retaining a skilled workforce.

    Resolving conflicts

    There is “blowback and a social cost” in transitioning away from fossil fuels, said Stephen Ansolabehere, the Frank G. Thompson Professor of Government at Harvard University, in a panel on the social barriers to decarbonization. “Companies need to engage differently and recognize the rights of communities,” he said.

    Nora DeDontney, director of development at Vineyard Offshore, described her company’s two years of outreach and negotiations to bring large cables from ocean-based wind turbines onshore.

    “Our motto is, ‘community first,’” she said.
    Her company works to mitigate the impacts towns might feel from offshore wind infrastructure construction with projects such as sewer upgrades; provides workforce training to Tribal Nations; and lays out wind turbines in a manner that provides safe and reliable areas for local fisheries.

    Elsa A. Olivetti, professor in the Department of Materials Science and Engineering at MIT and the lead of the Decarbonization Mission of MIT’s new Climate Project, discussed the urgent need for rapid scale-up of mineral extraction. “Estimates indicate that to electrify the vehicle fleet by 2050, about six new large copper mines need to come on line each year,” she said. Meeting the demand for metals in the United States means pushing into Indigenous lands and environmentally sensitive habitats. “The timeline of permitting is not aligned with the temporal acceleration needed,” she said.

    Larry Susskind, the Ford Professor of Urban and Environmental Planning in the MIT Department of Urban Studies and Planning, is trying to resolve such tensions by having universities play the role of mediators. He is creating renewable energy clinics where students train to participate in emerging disputes over siting. “Talk to people before decisions are made, conduct joint fact finding, so that facilities reduce harms and share the benefits,” he said.

    Clean energy boom and pressure

    A relatively recent and unforeseen increase in demand for energy comes from data centers, which are being built by large technology companies for new offerings, such as artificial intelligence.

    “General energy demand was flat for 20 years — and now, boom,” said Sean James, Microsoft’s senior director of data center research. “It caught utilities flatfooted.” With the expansion of AI, the rush to provision data centers with upwards of 35 gigawatts of new (and mainly renewable) power in the near future intensifies pressure on big companies to balance the concerns of stakeholders across multiple domains.
    Google is pursuing 24/7 carbon-free energy by 2030, said Devon Swezey, the company’s senior manager for global energy and climate. “We’re pursuing this by purchasing more and different types of clean energy locally, and accelerating technological innovation such as next-generation geothermal projects,” he said.

    Pedro Gómez Lopez, strategy and development director at Ferrovial Digital, which designs and constructs data centers, said his company incorporates renewable energy into its projects, which contributes to decarbonization goals and benefits the locales where the centers are sited. “We can create a new supply of power, taking the heat generated by a data center to residences or industries in neighborhoods through District Heating initiatives,” he said.

    The Inflation Reduction Act and other legislation have ramped up employment opportunities in clean energy nationwide, touching every region, including those most tied to fossil fuels. “At the start of 2024 there were about 3.5 million clean energy jobs, with ‘red’ states showing the fastest growth in clean energy jobs,” said David S. Miller, managing partner at Clean Energy Ventures. “The majority (58 percent) of new jobs in energy are now in clean energy — that transition has happened. And one-in-16 new jobs nationwide were in clean energy, with clean energy jobs growing more than three times faster than job growth economy-wide.”

    In this rapid expansion, the U.S. Department of Energy (DoE) is prioritizing economically marginalized places, according to Zoe Lipman, lead for good jobs and labor standards in the Office of Energy Jobs at the DoE. “The community benefit process is integrated into our funding,” she said. “We are creating the foundation of a virtuous circle,” encouraging benefits to flow to disadvantaged and energy communities, spurring workforce training partnerships, and promoting well-paid union jobs.
    “These policies incentivize proactive community and labor engagement, and deliver community benefits, both of which are key to building support for technological change.”

    Hydrogen opportunity and challenge

    While engagement with stakeholders helps clear the path for implementation of technology and the spread of infrastructure, there remain enormous policy, scientific, and engineering challenges to solve, said multiple conference participants. In a “fireside chat,” Prasanna V. Joshi, vice president of low-carbon-solutions technology at ExxonMobil, and Ernest J. Moniz, professor of physics and special advisor to the president at MIT, discussed efforts to replace natural gas and coal with zero-carbon hydrogen in order to reduce greenhouse gas emissions in such major industries as steel and fertilizer manufacturing.

    “We have gone into an era of industrial policy,” said Moniz, citing a new DoE program offering incentives to generate demand for hydrogen — more costly than conventional fossil fuels — in end-use applications. “We are going to have to transition from our current approach, which I would call carrots-and-twigs, to ultimately, carrots-and-sticks,” Moniz warned, in order to create “a self-sustaining, major, scalable, affordable hydrogen economy.”

    To achieve net-zero emissions by 2050, ExxonMobil intends to use carbon capture and sequestration in natural gas-based hydrogen and ammonia production. Ammonia can also serve as a zero-carbon fuel. Industry is exploring burning ammonia directly in coal-fired power plants to extend the hydrogen value chain. But there are challenges. “How do you burn 100 percent ammonia?” asked Joshi.
    “That’s one of the key technology breakthroughs that’s needed.” Joshi believes that collaboration with MIT’s “ecosystem of breakthrough innovation” will be essential to breaking logjams around the hydrogen and ammonia-based industries.

    MIT ingenuity essential

    The energy transition is placing very different demands on different regions around the world. Take India, where per capita power consumption today is among the lowest in the world. But Indians “are an aspirational people … and with increasing urbanization and industrial activity, the growth in power demand is expected to triple by 2050,” said Praveer Sinha, CEO and managing director of the Tata Power Co. Ltd., in his keynote speech. For that nation, which currently relies on coal, the move to clean energy means bringing another 300 gigawatts of zero-carbon capacity online in the next five years. Sinha sees this power coming from wind, solar, and hydro, supplemented by nuclear energy.

    “India plans to triple nuclear power generation capacity by 2032, and is focusing on advancing small modular reactors,” said Sinha. “The country also needs the rapid deployment of storage solutions to firm up the intermittent power.” The goal is to provide reliable electricity 24/7 to a population living both in large cities and in geographically remote villages, with the help of long-range transmission lines and local microgrids. “India’s energy transition will require innovative and affordable technology solutions, and there is no better place to go than MIT, where you have the best brains, startups, and technology,” he said.

    These assets were on full display at the conference.
    Among them was a cluster of young businesses, including:

    • the MIT spinout Form Energy, which has developed a 100-hour iron battery as a backstop to renewable energy sources in case of multi-day interruptions;
    • the startup Noya, which aims for direct air capture of atmospheric CO2 using carbon-based materials;
    • the firm Active Surfaces, with a lightweight material for putting solar photovoltaics in previously inaccessible places;
    • Copernic Catalysts, with new chemistry for making ammonia and sustainable aviation fuel far less expensively than current processes; and
    • Sesame Sustainability, a software platform spun out of MITEI that gives industries a full financial analysis of the costs and benefits of decarbonization.

    The pipeline of research talent extended into the undergraduate ranks, with a conference “slam” competition showcasing students’ summer research projects in areas from carbon capture using enzymes to 3D design for the coils used in fusion energy confinement.

    “MIT students like me are looking to be the next generation of energy leaders, looking for careers where we can apply our engineering skills to tackle exciting climate problems and make a tangible impact,” said Trent Lee, a junior in mechanical engineering researching improvements in lithium-ion energy storage. “We are stoked by the energy transition, because it’s not just the future, but our chance to build it.”

  • Smart handling of neutrons is crucial to fusion power success

    In fall 2009, when Ethan Peterson ’13 arrived at MIT as an undergraduate, he already had some ideas about possible career options. He’d always liked building things, even as a child, so he imagined his future work would involve engineering of some sort. He also liked physics. And he’d recently become intent on reducing our dependence on fossil fuels and simultaneously curbing greenhouse gas emissions, which made him consider studying solar and wind energy, among other renewable sources.

    Things crystallized for him in the spring semester of 2010, when he took an introductory course on nuclear fusion, taught by Anne White, during which he discovered that when a deuterium nucleus and a tritium nucleus combine to produce a helium nucleus, an energetic (14 mega-electron-volt) neutron — traveling at one-sixth the speed of light — is released. Moreover, 10^20 (100 billion billion) of these neutrons would be produced every second that a 500-megawatt fusion power plant operates.

    “It was eye-opening for me to learn just how energy-dense the fusion process is,” says Peterson, who became the Class of 1956 Career Development Professor of nuclear science and engineering in July 2024. “I was struck by the richness and interdisciplinary nature of the fusion field. This was an engineering discipline where I could apply physics to solve a real-world problem in a way that was both interesting and beautiful.”

    He soon became a physics and nuclear engineering double major, and by the time he graduated from MIT in 2013, the U.S. Department of Energy (DoE) had already decided to cut funding for MIT’s Alcator C-Mod fusion project. In view of that facility’s impending closure, Peterson opted to pursue graduate studies at the University of Wisconsin.
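    The figures in that anecdote check out with back-of-envelope arithmetic, using standard physical constants: each D-T fusion reaction releases about 17.6 MeV in total, with 14.1 MeV carried by the neutron. A sketch, not a reactor calculation:

```python
import math

MEV_TO_J = 1.602176634e-13   # joules per MeV
E_DT_TOTAL_MEV = 17.6        # total energy per D-T reaction
E_NEUTRON_MEV = 14.1         # portion carried by the neutron
NEUTRON_REST_MEV = 939.565   # neutron rest-mass energy

# Reactions (one neutron each) per second at 500 MW of fusion power:
reactions_per_s = 500e6 / (E_DT_TOTAL_MEV * MEV_TO_J)
print(f"{reactions_per_s:.2e} neutrons/s")      # order 10^20, as quoted

# Fraction of the fusion energy carried by the neutron:
print(f"{E_NEUTRON_MEV / E_DT_TOTAL_MEV:.0%}")  # ~80 percent

# Relativistic speed of a 14.1-MeV neutron:
gamma = 1 + E_NEUTRON_MEV / NEUTRON_REST_MEV
beta = math.sqrt(1 - 1 / gamma**2)
print(f"{beta:.3f} c")                          # ~0.17 c, about one-sixth
```

    The same arithmetic yields the 80 percent figure cited later in the article for the share of fusion energy that neutrons carry.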
    There, he acquired a basic science background in plasma physics, which is central not only to nuclear fusion but also to astrophysical phenomena such as the solar wind.

    When Peterson received his PhD from Wisconsin in 2019, nuclear fusion had rebounded at MIT with the launch, a year earlier, of the SPARC project — a collaborative effort being carried out with the newly founded MIT spinout Commonwealth Fusion Systems. He returned to his alma mater as a postdoc and then a research scientist in the Plasma Science and Fusion Center, taking his time, at first, to figure out how to best make his mark in the field.

    Minding your neutrons

    Around that time, Peterson was participating in a community planning process, sponsored by the DoE, that focused on critical gaps that needed to be closed for a successful fusion program. In the course of these discussions, he came to realize that inadequate attention had been paid to the handling of neutrons, which carry 80 percent of the energy coming out of a fusion reaction — energy that needs to be harnessed for electrical generation. However, these neutrons are so energetic that they can penetrate through many tens of centimeters of material, potentially undermining the structural integrity of components and damaging vital equipment such as superconducting magnets. Shielding is also essential for protecting humans from harmful radiation.

    One goal, Peterson says, is to minimize the number of neutrons that escape and, in so doing, to reduce the amount of lost energy. A complementary objective, he adds, “is to get neutrons to deposit heat where you want them to and to stop them from depositing heat where you don’t want them to.” These considerations, in turn, can have a profound influence on fusion reactor design.
    This branch of nuclear engineering, called neutronics — which analyzes where neutrons are created and where they end up going — has become Peterson’s specialty.

    It was never a high-profile area of research in the fusion community — as plasma physics, for example, has always garnered more of the spotlight and more of the funding. That’s exactly why Peterson has stepped up. “The impacts of neutrons on fusion reactor design haven’t been a high priority for a long time,” he says. “I felt that some initiative needed to be taken,” and that prompted him to make the switch from plasma physics to neutronics. It has been his principal focus ever since — as a postdoc, a research scientist, and now as a faculty member.

    A code to design by

    The best way to get a neutron to transfer its energy is to make it collide with a light atom. Lithium, with an atomic number of three, or lithium-containing materials are normally good choices — and necessary for producing tritium fuel. The placement of lithium “blankets,” which are intended to absorb energy from neutrons and produce tritium, “is a critical part of the design of fusion reactors,” Peterson says. High-density materials, such as lead and tungsten, can be used, conversely, to block the passage of neutrons and other types of radiation. “You might want to layer these high- and low-density materials in a complicated way that isn’t immediately intuitive,” he adds. Determining which materials to put where — and of what thickness and mass — amounts to a tricky optimization problem, which will affect the size, cost, and efficiency of a fusion power plant.

    To that end, Peterson has developed modeling tools that can make analyses of these sorts easier and faster, thereby facilitating the design process. “This has traditionally been the step that takes the longest time and causes the biggest holdups,” he says.
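    The layered-shielding tradeoff described here can be illustrated with a toy one-dimensional attenuation estimate. This is only a sketch: the removal cross sections and thicknesses below are hypothetical placeholder values, and real neutronics analyses of the kind Peterson works on use full Monte Carlo transport rather than this simple exponential model.

```python
import math

# Toy 1D neutron attenuation through a stack of shielding layers.
# The numbers are illustrative placeholders, not real cross-section data.
layers = [
    # (material, macroscopic removal cross section in 1/cm, thickness in cm)
    ("lithium blanket", 0.10, 30.0),  # light element: absorbs energy, breeds tritium
    ("lead",            0.30, 10.0),  # dense element: blocks neutrons and gammas
    ("tungsten",        0.40,  5.0),
]

# The uncollided flux falls off as exp(-sum(sigma_i * t_i)) across the stack.
escaping_fraction = math.exp(-sum(sigma * t for _, sigma, t in layers))
print(f"fraction of neutrons escaping uncollided: {escaping_fraction:.1e}")
```

    Because the exponents add, thickening one layer trades off directly against the others, which is one reason the thickness-and-mass optimization has such a strong effect on the size and cost of a plant.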
    The models and algorithms that he and his colleagues are devising are general enough, moreover, to be compatible with a diverse range of fusion power plant concepts, including those that use magnets or lasers to confine the plasma.

    Now that he’s become a professor, Peterson is in a position to introduce more people to nuclear engineering, and to neutronics in particular. “I love teaching and mentoring students, sharing the things I’m excited about,” he says. “I was inspired by all the professors I had in physics and nuclear engineering at MIT, and I hope to give back to the community in the same way.”

    He also believes that if you are going to work on fusion, there is no better place to be than MIT, “where the facilities are second-to-none. People here are extremely innovative and passionate. And the sheer number of people who excel in their fields is staggering.” Great ideas can sometimes be sparked by off-the-cuff conversations in the hallway — something that happens more frequently than you expect, Peterson remarks. “All of these things taken together makes MIT a very special place.”
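    The neutron figures quoted earlier in this story are easy to sanity-check from standard physical constants. The sketch below is a back-of-envelope calculation, not anything drawn from Peterson's tools:

```python
# Back-of-envelope check of the D-T fusion numbers quoted above.
MEV_TO_J = 1.602176634e-13   # joules per MeV
E_PER_REACTION_MEV = 17.6    # total energy released per D-T fusion
E_NEUTRON_MEV = 14.1         # portion carried by the neutron (~80 percent)
NEUTRON_REST_MEV = 939.6     # neutron rest-mass energy in MeV

# Reactions (one neutron each) needed to sustain 500 MW of fusion power.
reactions_per_s = 500e6 / (E_PER_REACTION_MEV * MEV_TO_J)
print(f"neutrons per second: {reactions_per_s:.1e}")  # ~1.8e20, i.e. order 10^20

# Classical estimate of the neutron speed as a fraction of c
# (good enough at 14 MeV for an order-of-magnitude check).
beta = (2 * E_NEUTRON_MEV / NEUTRON_REST_MEV) ** 0.5
print(f"neutron speed: {beta:.2f} c")  # ~0.17 c, roughly one-sixth of light speed
```

    Both quoted numbers check out: on the order of 10²⁰ neutrons per second for a 500-megawatt plant, each moving at about a sixth of the speed of light.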

  • in

    Bubble findings could unlock better electrode and electrolyzer designs

    Industrial electrochemical processes that use electrodes to produce fuels and chemical products are hampered by the formation of bubbles that block parts of the electrode surface, reducing the area available for the active reaction. Such blockage reduces the performance of the electrodes by anywhere from 10 to 25 percent.

    But new research reveals a decades-long misunderstanding about the extent of that interference. The findings show exactly how the blocking effect works and could lead to new ways of designing electrode surfaces to minimize inefficiencies in these widely used electrochemical processes.

    It has long been assumed that the entire area of the electrode shadowed by each bubble would be effectively inactivated. But it turns out that a much smaller area — roughly the area where the bubble actually contacts the surface — is blocked from its electrochemical activity. The new insights could lead directly to new ways of patterning the surfaces to minimize the contact area and improve overall efficiency.

    The findings are reported today in the journal Nanoscale, in a paper by recent MIT graduate Jack Lake PhD ’23, graduate student Simon Rufer, professor of mechanical engineering Kripa Varanasi, research scientist Ben Blaiszik, and six others at the University of Chicago and Argonne National Laboratory. The team has made available an open-source, AI-based software tool that engineers and scientists can now use to automatically recognize and quantify bubbles formed on a given surface, as a first step toward controlling the electrode material’s properties.


    Gas-evolving electrodes, often with catalytic surfaces that promote chemical reactions, are used in a wide variety of processes, including the production of “green” hydrogen without the use of fossil fuels, carbon-capture processes that can reduce greenhouse gas emissions, aluminum production, and the chlor-alkali process that is used to make widely used chemical products.

    These are very widespread processes. The chlor-alkali process alone accounts for 2 percent of all U.S. electricity usage; aluminum production accounts for 3 percent of global electricity; and both carbon capture and hydrogen production are likely to grow rapidly in coming years as the world strives to meet greenhouse-gas reduction targets. So, the new findings could make a real difference, Varanasi says.

    “Our work demonstrates that engineering the contact and growth of bubbles on electrodes can have dramatic effects” on how bubbles form and how they leave the surface, he says. “The knowledge that the area under bubbles can be significantly active ushers in a new set of design rules for high-performance electrodes to avoid the deleterious effects of bubbles.”

    “The broader literature built over the last couple of decades has suggested that not only that small area of contact but the entire area under the bubble is passivated,” Rufer says. The new study reveals “a significant difference between the two models because it changes how you would develop and design an electrode to minimize these losses.”

    To test and demonstrate the implications of this effect, the team produced different versions of electrode surfaces with patterns of dots that nucleated and trapped bubbles at different sizes and spacings.
    They were able to show that surfaces with widely spaced dots promoted large bubble sizes but only tiny areas of surface contact, which helped to make clear the difference between the expected and actual effects of bubble coverage.

    Developing the software to detect and quantify bubble formation was necessary for the team’s analysis, Rufer explains. “We wanted to collect a lot of data and look at a lot of different electrodes and different reactions and different bubbles, and they all look slightly different,” he says. Creating a program that could deal with different materials and different lighting and reliably identify and track the bubbles was a tricky process, and machine learning was key to making it work, he says.

    Using that tool, they were able to collect “really significant amounts of data about the bubbles on a surface, where they are, how big they are, how fast they’re growing, all these different things.” The tool is now freely available for anyone to use via the GitHub repository.

    By using that tool to correlate the visual measures of bubble formation and evolution with electrical measurements of the electrode’s performance, the researchers were able to disprove the accepted theory and to show that only the area of direct contact is affected. Videos further proved the point, revealing new bubbles actively evolving directly under parts of a larger bubble.

    The researchers developed a very general methodology that can be applied to characterize and understand the impact of bubbles on any electrode or catalyst surface. They were able to quantify the bubble passivation effects in a new performance metric they call BECSA (bubble-induced electrochemically active surface area), as opposed to the ECSA (electrochemically active surface area) metric that is used in the field.
    “The BECSA metric was a concept we defined in an earlier study but did not have an effective method to estimate until this work,” says Varanasi.

    The knowledge that the area under bubbles can be significantly active ushers in a new set of design rules for high-performance electrodes. This means that electrode designers should seek to minimize bubble contact area rather than simply bubble coverage, which can be achieved by controlling the morphology and chemistry of the electrodes. Surfaces engineered to control bubbles can not only improve the overall efficiency of the processes, and thus reduce energy use, but also save on upfront materials costs. Many of these gas-evolving electrodes are coated with catalysts made of expensive metals like platinum or iridium, and the findings from this work can be used to engineer electrodes to reduce material wasted by reaction-blocking bubbles.

    Varanasi says that “the insights from this work could inspire new electrode architectures that not only reduce the usage of precious materials, but also improve the overall electrolyzer performance,” both of which would provide large-scale environmental benefits.

    The research team included Jim James, Nathan Pruyne, Aristana Scourtas, Marcus Schwarting, Aadit Ambalkar, Ian Foster, and Ben Blaiszik at the University of Chicago and Argonne National Laboratory. The work was supported by the U.S. Department of Energy under the ARPA-E program.
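    The gap between the two models is easy to see with a toy spherical-cap geometry. The bubble radius and contact angle below are hypothetical, illustrative values, not measurements from the study:

```python
import math

# Old assumption: the whole "shadow" of the bubble (pi * R^2) is inactive.
# Revised picture: only the contact patch (pi * (R * sin(theta))^2) is blocked.
R_um = 50.0       # bubble radius in micrometers (illustrative)
theta_deg = 30.0  # contact angle at the electrode (illustrative)

shadow_area = math.pi * R_um**2
contact_area = math.pi * (R_um * math.sin(math.radians(theta_deg)))**2

print(f"shadowed area (old model):  {shadow_area:.0f} um^2")
print(f"contact area (new model):   {contact_area:.0f} um^2")
print(f"blocked fraction vs old model: {contact_area / shadow_area:.2f}")
```

    In this picture the blocked fraction scales as sin²θ of the contact angle, so a bubble that sits on the surface over a small contact patch costs far less active area than the shadow-based rule of thumb would predict — consistent with the dot-patterned electrodes described above.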

  • in

    Applying risk and reliability analysis across industries

    On Feb. 1, 2003, the space shuttle Columbia disintegrated as it returned to Earth, killing all seven astronauts on board. The tragic incident compelled NASA to amp up its risk and safety assessments and protocols. They knew whom to call: Curtis Smith PhD ’02, who is now the KEPCO Professor of the Practice of Nuclear Science and Engineering at MIT.

    The nuclear community has always been a leader in probabilistic risk analysis, and Smith’s work in risk-related research had made him an established expert in the field. When NASA came knocking, Smith had been working for the Nuclear Regulatory Commission (NRC) at the Idaho National Laboratory (INL). He pivoted quickly. For the next decade, Smith worked with NASA’s Office of Safety and Mission Assurance supporting their increased use of risk analysis. It was a software tool that Smith helped develop, SAPHIRE, that NASA would adopt to bolster its own risk analysis program.

    At MIT, Smith’s focus is on both sides of system operation: risk and reliability. A research project he has proposed involves evaluating the reliability of 3D-printed components and parts for nuclear reactors.

    Growing up in Idaho

    MIT is a distance from where Smith grew up on the Shoshone-Bannock Native American reservation in Fort Hall, Idaho. His father worked at a chemical manufacturing plant, while his mother and grandmother operated a small restaurant on the reservation.

    Southeast Idaho had a significant population of migrant workers and Smith grew up with a diverse group of friends, mostly Native American and Hispanic. “It was a largely positive time and set a worldview for me in many wonderful ways,” Smith remembers. When he was a junior in high school, the family moved to Pingree, Idaho, a small town of barely 500. Smith attended Snake River High, a regional school, and remembered the deep impact his teachers had. “I learned a lot in grade school and had great teachers, so my love for education probably started there.
    I tried to emulate my teachers,” Smith says.

    Smith went to Idaho State University in Pocatello for college, a 45-minute drive from his family. Drawn to science, he decided he wanted to study a subject that would benefit humanity the most: nuclear engineering. Fortunately, Idaho State has a strong nuclear engineering program. Smith completed a master’s degree in the same field at ISU while working for the Federal Bureau of Investigation in the security department during the swing shift — 5 p.m. to 1 a.m. — at the FBI offices in Pocatello. “It was a perfect job while attending grad school,” Smith says.

    His KEPCO Professor of the Practice appointment is the second stint for Smith at MIT: He completed his PhD in the Department of Nuclear Science and Engineering (NSE) under the advisement of Professor George Apostolakis in 2002.

    A career in risk analysis and management

    After a doctorate at MIT, Smith returned to Idaho, conducting research in risk analysis for the NRC. He also taught technical courses and developed risk analysis software. “We did a whole host of work that supported the current fleet of nuclear reactors that we have,” Smith says.

    He was 10 years into his career at INL when NASA recruited him, leaning on his expertise in risk analysis to translate it into space missions. “I didn’t really have a background in aerospace, but I was able to bring all the engineering I knew, conducting risk analysis for nuclear missions. It was really exciting and I learned a lot about aerospace,” Smith says.

    Risk analysis uses statistics and data to answer complex questions involving safety. Among his projects: analyzing the risk involved in a Mars rover mission with a radioisotope-generated power source for the rover.
    Even if the necessary plutonium is encased in really strong material, calculations for risk have to factor in all eventualities, including the rocket blowing up.

    When the Fukushima incident happened in 2011, the Department of Energy (DoE) became more supportive of safety and risk analysis research. Smith found himself in the center of the action again, supporting large DoE research programs. He then moved to become the director of the Nuclear Safety and Regulatory Research Division at the INL. Smith found he loved the role, mentoring and nurturing the careers of a diverse set of scientists. “It turned out to be much more rewarding than I had expected,” Smith says. Under his leadership, the division grew from 45 to almost 90 research staff and won multiple national awards.

    Return to MIT

    MIT NSE came calling in 2022, looking to fill the position of professor of the practice, an offer Smith couldn’t refuse. The department was looking to bulk up its risk and reliability offerings and Smith made a great fit. The DoE division he had been supervising had grown wings enough for Smith to seek out something new.

    “Just getting back to Boston is exciting,” Smith says. The last go-around involved bringing the family to the city and included a lot of sleepless nights. Smith’s wife, Jacquie, is also excited about being closer to the New England fan base. The couple has invested in season tickets for the Patriots and look to attend as many sporting events as possible.

    Smith is most excited about adding to the risk and reliability offerings at MIT at a time when the subject has become especially important for nuclear power. “I’m grateful for the opportunity to bring my knowledge and expertise from the last 30 years to the field,” he says. Being a professor of the practice of NSE carries with it a responsibility to unite theory and practice, something Smith is especially good at.
    “We always have to answer the question of, ‘How do I take the research and make that practical,’ especially for something important like nuclear power, because we need much more of these ideas in industry,” he says.

    He is particularly excited about developing the next generation of nuclear scientists. “Having the ability to do this at a place like MIT is especially fulfilling and something I have been desiring my whole career,” Smith says.
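    The probabilistic risk analysis described in this story chains conditional failure probabilities, for instance a launch failure followed by a containment breach. A minimal Monte Carlo sketch of that logic, using made-up probabilities rather than figures from any NASA or INL assessment, might look like this:

```python
import random

# Toy two-stage risk model: a radioactive release requires a launch failure
# AND a containment breach given that failure. Probabilities are invented.
P_LAUNCH_FAILURE = 0.02
P_BREACH_GIVEN_FAILURE = 0.10

random.seed(0)
trials = 1_000_000
releases = sum(
    1 for _ in range(trials)
    if random.random() < P_LAUNCH_FAILURE and random.random() < P_BREACH_GIVEN_FAILURE
)

estimate = releases / trials
print(f"simulated release probability: {estimate:.4f}")
print(f"analytic value:                {P_LAUNCH_FAILURE * P_BREACH_GIVEN_FAILURE:.4f}")
```

    Production tools such as SAPHIRE evaluate much larger fault trees and event trees built from the same chained conditional probabilities, rather than relying on brute-force sampling like this sketch.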

  • in

    Affordable high-tech windows for comfort and energy savings

    Imagine if the windows of your home didn’t transmit heat. They’d keep the heat indoors in winter and outdoors on a hot summer’s day. Your heating and cooling bills would go down; your energy consumption and carbon emissions would drop; and you’d still be comfortable all year ’round.

    AeroShield, a startup spun out of MIT, is poised to start manufacturing such windows. Building operations make up 36 percent of global carbon dioxide emissions, and today’s windows are a major contributor to energy inefficiency in buildings. To improve building efficiency, AeroShield has developed a window technology that promises to reduce heat loss by up to 65 percent, significantly reducing energy use and carbon emissions in buildings, and the company just announced the opening of a new facility to manufacture its breakthrough energy-efficient windows.

    “Our mission is to decarbonize the built environment,” says Elise Strobach SM ’17, PhD ’20, co-founder and CEO of AeroShield. “The availability of affordable, thermally insulating windows will help us achieve that goal while also reducing homeowners’ heating and cooling bills.” According to the U.S. Department of Energy, for most homeowners, 30 percent of that bill results from window inefficiencies.

    Technology development at MIT

    Research on AeroShield’s window technology began a decade ago in the MIT lab of Evelyn Wang, Ford Professor of Engineering, now on leave to serve as director of the Advanced Research Projects Agency-Energy (ARPA-E). In late 2014, the MIT team received funding from ARPA-E, and other sponsors followed, including the MIT Energy Initiative through the MIT Tata Center for Technology and Design in 2016.

    The work focused on aerogels, remarkable materials that are ultra-porous, lighter than a marshmallow, strong enough to support a brick, and an unparalleled barrier to heat flow. Aerogels were invented in the 1930s and used by NASA and others as thermal insulation.
    The team at MIT saw the potential for incorporating aerogel sheets into windows to keep heat from escaping or entering buildings. But there was one problem: Nobody had been able to make aerogels transparent.

    An aerogel is made of transparent, loosely connected nanoscale silica particles and is 95 percent air. But an aerogel sheet isn’t transparent because light traveling through it gets scattered by the silica particles.

    After five years of theoretical and experimental work, the MIT team determined that the key to transparency was having the silica particles both small and uniform in size. This allows light to pass directly through, so the aerogel becomes transparent. Indeed, as long as the particle size is small and uniform, increasing the thickness of an aerogel sheet to achieve greater thermal insulation won’t make it less clear.

    Teams in the MIT lab looked at various applications for their super-insulating, transparent aerogels. Some focused on improving solar thermal collectors by making the systems more efficient and less expensive. But to Strobach, increasing the thermal efficiency of windows looked especially promising and potentially significant as a means of reducing climate change.

    The researchers determined that aerogel sheets could be inserted into the gap in double-pane windows, making them more than twice as insulating. The windows could then be manufactured on existing production lines with minor changes, and the resulting windows would be affordable and as wide-ranging in style as the window options available today. Best of all, once purchased and installed, the windows would reduce electricity bills, energy use, and carbon emissions.

    The impact on energy use in buildings could be considerable. “If we only consider winter, windows in the United States lose enough energy to power over 50 million homes,” says Strobach.
    “That wasted energy generates about 350 million tons of carbon dioxide — more than is emitted by 76 million cars.” Super-insulating windows could help home and building owners reduce carbon dioxide emissions by gigatons while saving billions in heating and cooling costs.

    The AeroShield story

    In 2019, Strobach and her MIT colleagues — Aaron Baskerville-Bridges MBA ’20, SM ’20 and Kyle Wilke PhD ’19 — co-founded AeroShield to further develop and commercialize their aerogel-based technology for windows and other applications. And in the subsequent five years, their hard work has attracted attention, recently leading to two major accomplishments.

    In spring 2024, the company announced the opening of its new pilot manufacturing facility in Waltham, Massachusetts, where the team will be producing, testing, and certifying their first full-size windows and patio doors for initial product launch. The 12,000-square-foot facility will significantly expand the company’s capabilities, with cutting-edge aerogel R&D labs, manufacturing equipment, assembly lines, and testing equipment. Says Strobach, “Our pilot facility will supply window and door manufacturers as we launch our first products and will also serve as our R&D headquarters as we develop the next generation of energy-efficient products using transparent aerogels.”

    Also in spring 2024, AeroShield received a $14.5 million award from ARPA-E’s “Seeding Critical Advances for Leading Energy technologies with Untapped Potential” (SCALEUP) program, which provides new funding to previous ARPA-E awardees that have “demonstrated a viable path to market.” That funding will enable the company to expand its production capacity to tens of thousands, or even hundreds of thousands, of units per year.

    Strobach also cites two less-obvious benefits of the SCALEUP award.

    First, the funding is enabling the company to move more quickly on the scale-up phase of their technology development.
    “We know from our fundamental studies and lab experiments that we can make large-area aerogel sheets that could go in an entry or patio door,” says Strobach. “The SCALEUP award allows us to go straight for that vision. We don’t have to do all the incremental sizes of aerogels to prove that we can make a big one. The award provides capital for us to buy the big equipment to make the big aerogel.”

    Second, the SCALEUP award confirms the viability of the company to other potential investors and collaborators. Indeed, AeroShield recently announced $5 million of additional funding from existing investors Massachusetts Clean Energy Center and MassVentures, as well as new investor MassMutual Ventures. Strobach notes that the company now has investor, engineering, and customer partners.

    She stresses the importance of partners in achieving AeroShield’s mission. “We know that what we’ve got from a fundamental perspective can change the industry,” she says. “Now we want to go out and do it. With the right partners and at the right pace, we may actually be able to increase the energy efficiency of our buildings early enough to help make a real dent in climate change.”
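    The scale of the savings from a better-insulated window can be estimated with a simple steady-state heat-loss calculation. The U-factors, window area, temperature difference, and season length below are hypothetical illustrative values, not AeroShield product specifications:

```python
# Seasonal conductive heat loss through one window: Q = U * A * deltaT * time.
AREA_M2 = 1.5        # window area in square meters (assumed)
DELTA_T_K = 20.0     # average indoor-outdoor temperature difference (assumed)
HOURS = 24 * 120     # a 120-day heating season (assumed)

U_DOUBLE_PANE = 2.8  # W/(m^2 K), typical ordinary double pane (textbook value)
U_WITH_AEROGEL = 1.0 # W/(m^2 K), assumed for a window with an aerogel insert

def season_loss_kwh(u_factor: float) -> float:
    """Heat conducted through the window over the season, in kilowatt-hours."""
    return u_factor * AREA_M2 * DELTA_T_K * HOURS / 1000.0

baseline = season_loss_kwh(U_DOUBLE_PANE)
improved = season_loss_kwh(U_WITH_AEROGEL)
print(f"double pane: {baseline:.0f} kWh, aerogel insert: {improved:.0f} kWh")
print(f"heat-loss reduction: {1 - improved / baseline:.0%}")
```

    Because the loss is linear in the U-factor, the percentage reduction depends only on the ratio of the two U-factors, so the assumed values above translate directly into the headline savings figure regardless of window size or climate.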