More stories

  • Lidar helps gas industry find methane leaks and avoid costly losses

Each year, the U.S. energy industry loses an estimated 3 percent of its natural gas production, valued at $1 billion in revenue, to leaky infrastructure. Escaping invisibly into the air, these methane gas plumes can now be detected, imaged, and measured using a specialized lidar flown on small aircraft.

This lidar is a product of Bridger Photonics, a leading methane-sensing company based in Bozeman, Montana. MIT Lincoln Laboratory developed the lidar’s optical-power amplifier, a key component of the system, by advancing its existing slab-coupled optical waveguide amplifier (SCOWA) technology. The methane-detecting lidar is 10 to 50 times more capable than other airborne remote sensors on the market.

“This drone-capable sensor for imaging methane is a great example of Lincoln Laboratory technology at work, matched with an impactful commercial application,” says Paul Juodawlkis, who pioneered the SCOWA technology with Jason Plant in the Advanced Technology Division and collaborated with Bridger Photonics to enable its commercial application.

Today, the product is being adopted widely, including by nine of the top 10 natural gas producers in the United States. “Keeping gas in the pipe is good for everyone — it helps companies bring the gas to market, improves safety, and protects the outdoors,” says Pete Roos, founder and chief innovation officer at Bridger. “The challenge with methane is that you can’t see it. We solved a fundamental problem with Lincoln Laboratory.”

A laser source “miracle”

In 2014, the Advanced Research Projects Agency-Energy (ARPA-E) was seeking a cost-effective and precise way to detect methane leaks. Highly flammable and a potent pollutant, methane gas (the primary constituent of natural gas) moves through the country via a vast and intricate pipeline network.
Bridger submitted a research proposal in response to ARPA-E’s call and was awarded funding to develop a small, sensitive aerial lidar.

Aerial lidar sends laser light down to the ground and measures the light that reflects back to the sensor. Such lidar is often used for producing detailed topography maps. Bridger’s idea was to merge topography mapping with gas measurements. Methane absorbs light at the infrared wavelength of 1.65 microns. Operating a laser at that wavelength could allow a lidar to sense the invisible plumes and measure leak rates.

“This laser source was one of the hardest parts to get right. It’s a key element,” Roos says. His team needed a laser source with specific characteristics to emit powerfully enough at a wavelength of 1.65 microns to work from useful altitudes. Roos recalled the ARPA-E program manager saying they needed a “miracle” to pull it off.

Through mutual connections, Bridger was introduced to a Lincoln Laboratory technology for optically amplifying laser signals: the SCOWA. When Bridger contacted Juodawlkis and Plant, they had been working on SCOWAs for a decade. Although they had never investigated SCOWAs at 1.65 microns, they thought that the fundamental technology could be extended to operate at that wavelength. Lincoln Laboratory received ARPA-E funding to develop 1.65-micron SCOWAs and provide prototype units to Bridger for incorporation into their gas-mapping lidar systems.

“That was the miracle we needed,” Roos says.

A legacy in laser innovation

Lincoln Laboratory has long been a leader in semiconductor laser and optical emitter technology. In 1962, the laboratory was among the first to demonstrate the diode laser, which is now the most widespread laser used globally.
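The differential-absorption principle behind this kind of methane sensing can be sketched with the Beer–Lambert law: comparing the lidar return at the absorbed 1.65-micron "on" wavelength with the return at a nearby "off" wavelength yields the methane column along the beam path. The cross section and return values below are illustrative assumptions for a minimal sketch, not Bridger's actual calibration or algorithm.

```python
import math

def path_integrated_methane(p_on, p_off, delta_sigma_cm2):
    """Differential-absorption lidar (DIAL) retrieval sketch.

    p_on, p_off      : received power at the "on" wavelength (absorbed
                       by methane, ~1.65 microns) and at a nearby
                       "off" wavelength that methane barely absorbs.
    delta_sigma_cm2  : differential absorption cross section per
                       molecule (cm^2); an assumed illustrative value.

    Returns the methane column density (molecules/cm^2) along the path.
    """
    # Beer-Lambert: p_on = p_off * exp(-2 * delta_sigma * column)
    # (the factor of 2 accounts for the down-and-back round trip)
    return math.log(p_off / p_on) / (2.0 * delta_sigma_cm2)

# Illustrative numbers only: a 5 percent dip in the on-line return
column = path_integrated_methane(p_on=0.95, p_off=1.00,
                                 delta_sigma_cm2=1.5e-20)
print(f"methane column: {column:.3e} molecules/cm^2")
```

Scanning the beam across the ground while flying turns many such single-path retrievals into an image of the plume, from which a leak rate can be estimated.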
Several spinout companies, such as Lasertron and TeraDiode, have commercialized innovations stemming from the laboratory’s laser research, including those for fiber-optic telecommunications and metal-cutting applications.

In the early 2000s, Juodawlkis, Plant, and others at the laboratory recognized a need for a stable, powerful, and bright single-mode semiconductor optical amplifier, which could enhance lidar and optical communications. They developed the SCOWA (slab-coupled optical waveguide amplifier) concept by extending earlier work on slab-coupled optical waveguide lasers (SCOWLs). The initial SCOWA was funded under the laboratory’s internal technology investment portfolio, a pool of R&D funding provided by the undersecretary of defense for research and engineering to seed new technology ideas. These ideas often mature into sponsored programs or lead to commercialized technology.

“Soon, we developed a semiconductor optical amplifier that was 10 times better than anything that had ever been demonstrated before,” Plant says. Like other semiconductor optical amplifiers, the SCOWA guides laser light through semiconductor material. This process increases optical power as the laser light interacts with electrons, causing them to shed photons at the same wavelength as the input laser. The SCOWA’s unique light-guiding design enables it to reach much higher output powers, creating a powerful and efficient beam. They demonstrated SCOWAs at various wavelengths and applied the technology to projects for the Department of Defense.

When Bridger Photonics reached out to Lincoln Laboratory, the most impactful application of the device yet emerged. Working iteratively through the ARPA-E funding and a Cooperative Research and Development Agreement (CRADA), the team increased Bridger’s laser power by more than tenfold.
This power boost enabled them to extend the range of the lidar to elevations over 1,000 feet.

“Lincoln Laboratory had the knowledge of what goes on inside the optical amplifier — they could take our input, adjust the recipe, and make a device that worked very well for us,” Roos says.

The Gas Mapping Lidar was commercially released in 2019. That same year, the product won an R&D 100 Award, recognizing it as a revolutionary advancement in the marketplace.

A technology transfer takes off

Today, the United States is the world’s largest natural gas supplier, driving growth in the methane-sensing market. Bridger Photonics deploys its Gas Mapping Lidar for customers nationwide, attaching the sensor to planes and drones and pinpointing leaks across the entire supply chain, from the wells where gas is extracted, through the pipelines that carry it across the country, to the businesses and homes where it is delivered. Customers buy the data from these scans to efficiently locate and repair leaks in their gas infrastructure. In January 2025, the Environmental Protection Agency provided regulatory approval for the technology.

According to Bruce Niemeyer, president of Chevron’s shale and tight operations, the lidar capability has been game-changing: “Our goal is simple — keep methane in the pipe. This technology helps us assure we are doing that … It can find leaks that are 10 times smaller than other commercial providers are capable of spotting.”

At Lincoln Laboratory, researchers continue to develop new devices in the national interest. The SCOWA is one of many technologies in the toolkit of the laboratory’s Microsystems Prototyping Foundry, which will soon be expanded to include a new Compound Semiconductor Laboratory – Microsystem Integration Facility. Government, industry, and academia can access these facilities through government-funded projects, CRADAs, test agreements, and other mechanisms.

At the direction of the U.S.
government, the laboratory is also seeking industry transfer partners for a technology that couples SCOWA with a photonic integrated circuit platform. Such a platform could advance quantum computing and sensing, among other applications.

“Lincoln Laboratory is a national resource for semiconductor optical emitter technology,” Juodawlkis says.

  • A new approach could fractionate crude oil using much less energy

Separating crude oil into products such as gasoline, diesel, and heating oil is an energy-intensive process that accounts for about 6 percent of the world’s CO2 emissions. Most of that energy goes into the heat needed to separate the components by their boiling point.

In an advance that could dramatically reduce the amount of energy needed for crude oil fractionation, MIT engineers have developed a membrane that filters the components of crude oil by their molecular size.

“This is a whole new way of envisioning a separation process. Instead of boiling mixtures to purify them, why not separate components based on shape and size? The key innovation is that the filters we developed can separate very small molecules at an atomistic length scale,” says Zachary P. Smith, an associate professor of chemical engineering at MIT and the senior author of the new study.

The new filtration membrane can efficiently separate heavy and light components from oil, and it is resistant to the swelling that tends to occur with other types of oil separation membranes. The membrane is a thin film that can be manufactured using a technique that is already widely used in industrial processes, potentially allowing it to be scaled up for widespread use.

Taehoon Lee, a former MIT postdoc who is now an assistant professor at Sungkyunkwan University in South Korea, is the lead author of the paper, which appears today in Science.

Oil fractionation

Conventional heat-driven processes for fractionating crude oil make up about 1 percent of global energy use, and it has been estimated that using membranes for crude oil separation could reduce the amount of energy needed by about 90 percent. For this to succeed, a separation membrane needs to allow hydrocarbons to pass through quickly, and to selectively filter compounds of different sizes.

Until now, most efforts to develop a filtration membrane for hydrocarbons have focused on polymers of intrinsic microporosity (PIMs), including one known as PIM-1.
Although this porous material allows the fast transport of hydrocarbons, it tends to excessively absorb some of the organic compounds as they pass through the membrane, leading the film to swell, which impairs its size-sieving ability.

To come up with a better alternative, the MIT team decided to try modifying polymers that are used for reverse osmosis water desalination. Since their adoption in the 1970s, reverse osmosis membranes have reduced the energy consumption of desalination by about 90 percent — a remarkable industrial success story.

The most commonly used membrane for water desalination is a polyamide that is manufactured using a method known as interfacial polymerization. During this process, a thin polymer film forms at the interface between water and an organic solvent such as hexane. Water and hexane do not normally mix, but at the interface between them, a small amount of the compounds dissolved in them can react with each other.

In this case, a hydrophilic monomer called MPD, which is dissolved in water, reacts with a hydrophobic monomer called TMC, which is dissolved in hexane. The two monomers are joined together by a connection known as an amide bond, forming a polyamide thin film (named MPD-TMC) at the water-hexane interface.

While highly effective for water desalination, MPD-TMC doesn’t have the right pore sizes and swelling resistance that would allow it to separate hydrocarbons.

To adapt the material to separate the hydrocarbons found in crude oil, the researchers first modified the film by changing the bond that connects the monomers from an amide bond to an imine bond. This bond is more rigid and hydrophobic, which allows hydrocarbons to quickly move through the membrane without causing noticeable swelling of the film compared to the polyamide counterpart.

“The polyimine material has porosity that forms at the interface, and because of the cross-linking chemistry that we have added in, you now have something that doesn’t swell,” Smith says.
“You make it in the oil phase, react it at the water interface, and with the crosslinks, it’s now immobilized. And so those pores, even when they’re exposed to hydrocarbons, no longer swell like other materials.”

The researchers also introduced a monomer called triptycene. This shape-persistent, molecularly selective molecule further helps the resultant polyimines to form pores that are the right size for hydrocarbons to fit through.

This approach represents “an important step toward reducing industrial energy consumption,” says Andrew Livingston, a professor of chemical engineering at Queen Mary University of London, who was not involved in the study.

“This work takes the workhorse technology of the membrane desalination industry, interfacial polymerization, and creates a new way to apply it to organic systems such as hydrocarbon feedstocks, which currently consume large chunks of global energy,” Livingston says. “The imaginative approach using an interfacial catalyst coupled to hydrophobic monomers leads to membranes with high permeance and excellent selectivity, and the work shows how these can be used in relevant separations.”

Efficient separation

When the researchers used the new membrane to filter a mixture of toluene and triisopropylbenzene (TIPB) as a benchmark for evaluating separation performance, it was able to achieve a concentration of toluene 20 times greater than its concentration in the original mixture. They also tested the membrane with an industrially relevant mixture consisting of naphtha, kerosene, and diesel, and found that it could efficiently separate the heavier and lighter compounds by their molecular size.

If adapted for industrial use, a series of these filters could be used to generate a higher concentration of the desired products at each step, the researchers say.

“You can imagine that with a membrane like this, you could have an initial stage that replaces a crude oil fractionation column.
You could partition heavy and light molecules and then you could use different membranes in a cascade to purify complex mixtures to isolate the chemicals that you need,” Smith says.

Interfacial polymerization is already widely used to create membranes for water desalination, and the researchers believe it should be possible to adapt those processes to mass produce the films they designed in this study.

“The main advantage of interfacial polymerization is it’s already a well-established method to prepare membranes for water purification, so you can imagine just adopting these chemistries into existing scale of manufacturing lines,” Lee says.

The research was funded, in part, by ExxonMobil through the MIT Energy Initiative.
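The staged-cascade idea Smith describes can be illustrated with a toy model: if each ideal membrane stage multiplies the light-to-heavy mole ratio by a constant separation factor, the enrichment compounds geometrically with the number of stages. The selectivity value below is a hypothetical round number for illustration, not a figure reported in the study.

```python
def cascade_ratio(feed_ratio, selectivity, stages):
    """Toluene:TIPB mole ratio in the permeate after n idealized stages.

    A deliberately simplified model: every stage multiplies the
    light/heavy ratio by the same separation factor, ignoring recovery
    losses and recycle streams that a real cascade would need.
    """
    ratio = feed_ratio
    for _ in range(stages):
        ratio *= selectivity
    return ratio

# A 1:1 feed enriched through up to three hypothetical stages
for n in range(1, 4):
    print(f"stages={n}: ratio={cascade_ratio(1.0, selectivity=20.0, stages=n):g}")
```

Under this idealization, two stages would already give a 400:1 ratio from a 1:1 feed, which is why a short cascade of such membranes could in principle replace a fractionation column for some cuts.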

  • Reducing carbon emissions from residential heating: A pathway forward

In the race to reduce climate-warming carbon emissions, the buildings sector is falling behind. While carbon dioxide (CO2) emissions in the U.S. electric power sector dropped by 34 percent between 2005 and 2021, emissions in the building sector declined by only 18 percent in that same time period. Moreover, in extremely cold locations, burning natural gas to heat houses can make up a substantial share of the emissions portfolio. Therefore, steps to electrify buildings in general, and residential heating in particular, are essential for decarbonizing the U.S. energy system.

But that change will increase demand for electricity and decrease demand for natural gas. What will be the net impact of those two changes on carbon emissions and on the cost of decarbonizing? And how will the electric power and natural gas sectors handle the new challenges involved in their long-term planning for future operations and infrastructure investments?

A new study by MIT researchers with support from the MIT Energy Initiative (MITEI) Future Energy Systems Center unravels the impacts of various levels of electrification of residential space heating on the joint power and natural gas systems. A specially devised modeling framework enabled them to estimate not only the added costs and emissions for the power sector to meet the new demand, but also any changes in costs and emissions that result for the natural gas sector.

The analyses brought some surprising outcomes. For example, they show that — under certain conditions — switching 80 percent of homes to heating by electricity could cut carbon emissions and at the same time significantly reduce costs over the combined natural gas and electric power sectors relative to the case in which there is only modest switching.
That outcome depends on two changes: Consumers must install high-efficiency heat pumps plus take steps to prevent heat losses from their homes, and planners in the power and the natural gas sectors must work together as they make long-term infrastructure and operations decisions. Based on their findings, the researchers stress the need for strong state, regional, and national policies that encourage and support the steps that homeowners and industry planners can take to help decarbonize today’s building sector.

A two-part modeling approach

To analyze the impacts of electrification of residential heating on costs and emissions in the combined power and gas sectors, a team of MIT experts in building technology, power systems modeling, optimization techniques, and more developed a two-part modeling framework. Team members included Rahman Khorramfar, a senior postdoc in MITEI and the Laboratory for Information and Decision Systems (LIDS); Morgan Santoni-Colvin SM ’23, a former MITEI graduate research assistant, now an associate at Energy and Environmental Economics, Inc.; Saurabh Amin, a professor in the Department of Civil and Environmental Engineering and principal investigator in LIDS; Audun Botterud, a principal research scientist in LIDS; Leslie Norford, a professor in the Department of Architecture; and Dharik Mallapragada, a former MITEI principal research scientist, now an assistant professor at New York University, who led the project. They describe their new methods and findings in a paper published in the journal Cell Reports Sustainability on Feb. 6.

The first model in the framework quantifies how various levels of electrification will change end-use demand for electricity and for natural gas, and the impacts of possible energy-saving measures that homeowners can take to help.
“To perform that analysis, we built a ‘bottom-up’ model — meaning that it looks at electricity and gas consumption of individual buildings and then aggregates their consumption to get an overall demand for power and for gas,” explains Khorramfar. By assuming a wide range of building “archetypes” — that is, groupings of buildings with similar physical characteristics and properties — coupled with trends in population growth, the team could explore how demand for electricity and for natural gas would change under each of five assumed electrification pathways: “business as usual” with modest electrification, medium electrification (about 60 percent of homes are electrified), high electrification (about 80 percent of homes make the change), and medium and high electrification with “envelope improvements,” such as sealing up heat leaks and adding insulation.

The second part of the framework consists of a model that takes the demand results from the first model as inputs and “co-optimizes” the overall electricity and natural gas system to minimize annual investment and operating costs while adhering to any constraints, such as limits on emissions or on resource availability. The modeling framework thus enables the researchers to explore the impact of each electrification pathway on the infrastructure and operating costs of the two interacting sectors.

The New England case study: A challenge for electrification

As a case study, the researchers chose New England, a region where the weather is sometimes extremely cold and where burning natural gas to heat houses contributes significantly to overall emissions. “Critics will say that electrification is never going to happen [in New England]. It’s just too expensive,” comments Santoni-Colvin. But he notes that most studies focus on the electricity sector in isolation. The new framework considers the joint operation of the two sectors and then quantifies their respective costs and emissions.
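The "bottom-up" archetype idea described above can be caricatured in a few lines: per-building loads for each archetype are scaled by building counts, with a chosen share of gas heating shifted to electricity at an assumed heat-pump efficiency advantage. All numbers, and the single-COP treatment of heat pumps, are hypothetical simplifications for illustration, not the study's actual model.

```python
from dataclasses import dataclass

@dataclass
class Archetype:
    """A group of buildings with similar characteristics (hypothetical)."""
    count: int        # buildings of this type in the region
    elec_kwh: float   # annual electricity use per building, kWh
    gas_kwh: float    # annual gas use per building, kWh-equivalent

def aggregate(archetypes, electrified_share, heat_pump_cop=3.0):
    """Sum regional electricity and gas demand after shifting a share
    of gas heating to heat pumps at an assumed coefficient of
    performance advantage over a gas furnace."""
    elec = gas = 0.0
    for a in archetypes:
        moved = a.gas_kwh * electrified_share
        elec += a.count * (a.elec_kwh + moved / heat_pump_cop)
        gas += a.count * (a.gas_kwh - moved)
    return elec, gas

# Two made-up archetypes under the "high electrification" share
stock = [Archetype(1000, 7000, 15000), Archetype(500, 9000, 22000)]
elec_demand, gas_demand = aggregate(stock, electrified_share=0.8)
print(f"regional electricity: {elec_demand:,.0f} kWh/yr")
print(f"regional gas:         {gas_demand:,.0f} kWh-eq/yr")
```

In the actual framework, demand profiles like these feed the second-stage model, which co-optimizes investment and operations across both networks.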
“We know that electrification will require large investments in the electricity infrastructure,” says Santoni-Colvin. “But what hasn’t been well quantified in the literature is the savings that we generate on the natural gas side by doing that — so, the system-level savings.”

Using their framework, the MIT team performed model runs aimed at an 80 percent reduction in building-sector emissions relative to 1990 levels — a target consistent with regional policy goals for 2050. The researchers defined parameters including details about building archetypes, the regional electric power system, existing and potential renewable generating systems, battery storage, availability of natural gas, and other key factors describing New England.

They then performed analyses assuming various scenarios with different mixes of home improvements. While most studies assume typical weather, they instead developed 20 projections of annual weather data based on historical weather patterns and adjusted for the effects of climate change through 2050. They then analyzed their five levels of electrification.

Relative to business-as-usual projections, results from the framework showed that high electrification of residential heating could more than double the demand for electricity during peak periods and increase overall electricity demand by close to 60 percent. Assuming that building-envelope improvements are deployed in parallel with electrification reduces the magnitude and weather sensitivity of peak loads and creates overall efficiency gains that reduce the combined demand for electricity plus natural gas for home heating by up to 30 percent relative to the present day. Notably, a combination of high electrification and envelope improvements resulted in the lowest average cost for the overall electric power-natural gas system in 2050.

Lessons learned

Replacing existing natural gas-burning furnaces and boilers with heat pumps reduces overall energy consumption.
Santoni-Colvin calls it “something of an intuitive result” that could be expected because heat pumps are “just that much more efficient than old, fossil fuel-burning systems. But even so, we were surprised by the gains.”

Other unexpected results include the importance of homeowners making more traditional energy efficiency improvements, such as adding insulation and sealing air leaks — steps supported by recent rebate policies. Those changes are critical to reducing costs that would otherwise be incurred for upgrading the electricity grid to accommodate the increased demand. “You can’t just go wild dropping heat pumps into everybody’s houses if you’re not also considering other ways to reduce peak loads. So it really requires an ‘all of the above’ approach to get to the most cost-effective outcome,” says Santoni-Colvin.

Testing a range of weather outcomes also provided important insights. Demand for heating fuel is very weather-dependent, yet most studies are based on a limited set of weather data — often a “typical year.” The researchers found that electrification can lead to extended peak electric load events that can last for a few days during cold winters. Accordingly, the researchers conclude that there will be a continuing need for a “firm, dispatchable” source of electricity; that is, a power-generating system that can be relied on to produce power any time it’s needed — unlike solar and wind systems. As examples, they modeled some possible technologies, including power plants fired by a low-carbon fuel or by natural gas equipped with carbon capture equipment. But they point out that there’s no way of knowing what types of firm generators will be available in 2050. It could be a system that’s not yet mature, or perhaps doesn’t even exist today.

In presenting their findings, the researchers note several caveats. For one thing, their analyses don’t include the estimated cost to homeowners of installing heat pumps.
While that cost is widely discussed and debated, that issue is outside the scope of their current project.

In addition, the study doesn’t specify what happens to existing natural gas pipelines. “Some homes are going to electrify and get off the gas system and not have to pay for it, leaving other homes with increasing rates because the gas system cost now has to be divided among fewer customers,” says Khorramfar. “That will inevitably raise equity questions that need to be addressed by policymakers.”

Finally, the researchers note that policies are needed to drive residential electrification. Current financial support for installation of heat pumps and steps to make homes more thermally efficient are a good start. But such incentives must be coupled with a new approach to planning energy infrastructure investments. Traditionally, electric power planning and natural gas planning are performed separately. However, to decarbonize residential heating, the two sectors should coordinate when planning future operations and infrastructure needs. Results from the MIT analysis indicate that such cooperation could significantly reduce both emissions and costs for residential heating — a change that would yield a much-needed step toward decarbonizing the buildings sector as a whole.

  • Turning automotive engines into modular chemical plants to make green fuels

Reducing methane emissions is a top priority in the fight against climate change because of its propensity to trap heat in the atmosphere: Methane’s warming effects are 84 times more potent than CO2 over a 20-year timescale.

And yet, as the main component of natural gas, methane is also a valuable fuel and a precursor to several important chemicals. The main barrier to using methane emissions to create carbon-negative materials is that human sources of methane gas — landfills, farms, and oil and gas wells — are relatively small and spread out across large areas, while traditional chemical processing facilities are huge and centralized. That makes it prohibitively expensive to capture, transport, and convert methane gas into anything useful. As a result, most companies burn or “flare” their methane at the site where it’s emitted, seeing it as a sunk cost and an environmental liability.

The MIT spinout Emvolon is taking a new approach to processing methane by repurposing automotive engines to serve as modular, cost-effective chemical plants. The company’s systems can take methane gas and produce liquid fuels like methanol and ammonia on-site; these fuels can then be used or transported in standard truck containers.

“We see this as a new way of chemical manufacturing,” Emvolon co-founder and CEO Emmanuel Kasseris SM ’07, PhD ’11 says. “We’re starting with methane because methane is an abundant emission that we can use as a resource. With methane, we can solve two problems at the same time: About 15 percent of global greenhouse gas emissions come from hard-to-abate sectors that need green fuel, like shipping, aviation, heavy-duty trucks, and rail. Then another 15 percent of emissions come from distributed methane emissions like landfills and oil wells.”

By using mass-produced engines and eliminating the need to invest in infrastructure like pipelines, the company says it’s making methane conversion economically attractive enough to be adopted at scale.
The system can also take green hydrogen produced by intermittent renewables and turn it into ammonia, a fuel that can also be used to decarbonize fertilizers.

“In the future, we’re going to need green fuels because you can’t electrify a large ship or plane — you have to use a high-energy-density, low-carbon-footprint, low-cost liquid fuel,” Kasseris says. “The energy resources to produce those green fuels are either distributed, as is the case with methane, or variable, like wind. So, you cannot have a massive plant [producing green fuels] that has its own zip code. You either have to be distributed or variable, and both of those approaches lend themselves to this modular design.”

From a “crazy idea” to a company

Kasseris first came to MIT to study mechanical engineering as a graduate student in 2004, when he worked in the Sloan Automotive Lab on a report on the future of transportation. For his PhD, he developed a novel technology for improving internal combustion engine fuel efficiency for a consortium of automotive and energy companies, which he then went to work for after graduation.

Around 2014, he was approached by Leslie Bromberg ’73, PhD ’77, a serial inventor with more than 100 patents, who has been a principal research engineer in MIT’s Plasma Science and Fusion Center for nearly 50 years.

“Leslie had this crazy idea of repurposing an internal combustion engine as a reactor,” Kasseris recalls. “I had looked at that while working in industry, and I liked it, but my company at the time thought the work needed more validation.”

Bromberg had done that validation through a U.S. Department of Energy-funded project in which he used a diesel engine to “reform” methane — a high-pressure chemical reaction in which methane is combined with steam and oxygen to produce hydrogen.
The work impressed Kasseris enough to bring him back to MIT as a research scientist in 2016.

“We worked on that idea in addition to some other projects, and eventually it had reached the point where we decided to license the work from MIT and go full throttle,” Kasseris recalls. “It’s very easy to work with MIT’s Technology Licensing Office when you are an MIT inventor. You can get a low-cost licensing option, and you can do a lot with that, which is important for a new company. Then, once you are ready, you can finalize the license, so MIT was instrumental.”

Emvolon continued working with MIT’s research community, sponsoring projects with Professor Emeritus John Heywood and participating in the MIT Venture Mentoring Service and the MIT Industrial Liaison Program.

An engine-powered chemical plant

At the core of Emvolon’s system is an off-the-shelf automotive engine that runs “fuel rich” — with a higher ratio of fuel to air than what is needed for complete combustion.

“That’s easy to say, but it takes a lot of [intellectual property], and that’s what was developed at MIT,” Kasseris says. “Instead of burning the methane in the gas to carbon dioxide and water, you partially burn it, or partially oxidize it, to carbon monoxide and hydrogen, which are the building blocks to synthesize a variety of chemicals.”

The hydrogen and carbon monoxide are intermediate products used to synthesize different chemicals through further reactions. Those processing steps take place right next to the engine, which makes its own power. Each of Emvolon’s standalone systems fits within a 40-foot shipping container and can produce about 8 tons of methanol per day from 300,000 standard cubic feet of methane gas.

The company is starting with green methanol because it’s an ideal fuel for hard-to-abate sectors such as shipping and heavy-duty transport, as well as an excellent feedstock for other high-value chemicals, such as sustainable aviation fuel.
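The quoted throughput (about 8 tons of methanol per day from 300,000 standard cubic feet of methane) can be sanity-checked with a back-of-envelope mass balance, assuming a roughly ideal-gas molar volume and one mole of methanol per mole of methane. The molar-volume figure is an approximation, since "standard" conditions are defined differently by different bodies.

```python
# Rough mass balance for the quoted figures: 300,000 scf of methane
# per day versus about 8 metric tons of methanol per day.
SCF_TO_L = 28.317      # liters per standard cubic foot
MOLAR_VOL_L = 23.7     # approx. ideal-gas molar volume, L/mol (assumed)
M_METHANOL = 32.04     # molar mass of CH3OH, g/mol

mol_ch4 = 300_000 * SCF_TO_L / MOLAR_VOL_L          # moles of CH4 per day
max_methanol_t = mol_ch4 * M_METHANOL / 1e6         # grams -> metric tons
print(f"theoretical ceiling: {max_methanol_t:.1f} t/day")
print(f"implied carbon yield at 8 t/day: {8 / max_methanol_t:.0%}")
```

Under these assumptions the stoichiometric ceiling is roughly 11 to 12 metric tons per day, so the quoted 8 tons corresponds to a carbon yield on the order of 70 percent, which is plausible given that part of the feed is oxidized to power the engine itself.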
Many shipping vessels have already converted to run on green methanol in an effort to meet decarbonization goals.

This summer, the company also received a grant from the Department of Energy to adapt its process to produce clean liquid fuels from power sources like solar and wind.

“We’d like to expand to other chemicals like ammonia, but also other feedstocks, such as biomass and hydrogen from renewable electricity, and we already have promising results in that direction,” Kasseris says. “We think we have a good solution for the energy transition and, in the later stages of the transition, for e-manufacturing.”

A scalable approach

Emvolon has already built a system capable of producing up to six barrels of green methanol a day in its 5,000-square-foot headquarters in Woburn, Massachusetts.

“For chemical technologies, people talk about scale-up risk, but with an engine, if it works in a single cylinder, we know it will work in a multicylinder engine,” Kasseris says. “It’s just engineering.”

Last month, Emvolon announced an agreement with Montauk Renewables to build a commercial-scale demonstration unit next to a Texas landfill that will initially produce up to 15,000 gallons of green methanol a year and later scale up to 2.5 million gallons. That project could be expanded tenfold by scaling across Montauk’s other sites.

“Our whole process was designed to be a very realistic approach to the energy transition,” Kasseris says. “Our solution is designed to produce green fuels and chemicals at prices that the markets are willing to pay today, without the need for subsidies. Using the engines as chemical plants, we can get the capital expenditure per unit output close to that of a large plant, but at a modular scale that enables us to be next to low-cost feedstock.
Furthermore, our modular systems require small investments — of $1 million to $10 million — that are quickly deployed, one at a time, within weeks, as opposed to massive chemical plants that require multiyear capital construction projects and cost hundreds of millions.”

  • in

    Reducing carbon emissions from long-haul trucks

    People around the world rely on trucks to deliver the goods they need, and so-called long-haul trucks play a critical role in those supply chains. In the United States, long-haul trucks moved 71 percent of all freight in 2022. But those long-haul trucks are heavy polluters, especially of the carbon emissions that threaten the global climate. According to U.S. Environmental Protection Agency estimates, in 2022 more than 3 percent of all carbon dioxide (CO2) emissions came from long-haul trucks.

    The problem is that long-haul trucks run almost exclusively on diesel fuel, and burning diesel releases high levels of CO2 and other carbon emissions. Global demand for freight transport is projected to as much as double by 2050, so it’s critical to find another source of energy that will meet the needs of long-haul trucks while also reducing their carbon emissions. And conversion to the new fuel must not be costly. “Trucks are an indispensable part of the modern supply chain, and any increase in the cost of trucking will be felt universally,” notes William H. Green, the Hoyt Hottel Professor in Chemical Engineering and director of the MIT Energy Initiative.

    For the past year, Green and his research team have been seeking a low-cost, cleaner alternative to diesel. Finding a replacement is difficult because diesel meets the needs of the trucking industry so well. For one thing, diesel has a high energy density — that is, energy content per pound of fuel. There’s a legal limit on the total weight of a truck and its contents, so using an energy source with a lower weight allows the truck to carry more payload — an important consideration, given the low profit margin of the freight industry. In addition, diesel fuel is readily available at retail refueling stations across the country — a critical resource for drivers, who may travel 600 miles in a day and sleep in their truck rather than returning to their home depot.
    Finally, diesel fuel is a liquid, so it’s easy to distribute to refueling stations and then pump into trucks.

    Past studies have examined numerous alternative technology options for powering long-haul trucks, but no clear winner has emerged. Now, Green and his team have evaluated the available options based on consistent and realistic assumptions about the technologies involved and the typical operation of a long-haul truck, and assuming no subsidies to tip the cost balance. Their in-depth analysis of converting long-haul trucks to battery electric — summarized below — found a high cost and negligible emissions gains in the near term. Studies of methanol and other liquid fuels from biomass are ongoing, but already a major concern is whether the world can plant and harvest enough biomass for biofuels without destroying the ecosystem. An analysis of hydrogen — also summarized below — highlights specific challenges with using that clean-burning fuel, which is a gas at normal temperatures.

    Finally, the team identified an approach that could make hydrogen a promising, low-cost option for long-haul trucks. And, says Green, “it’s an option that most people are probably unaware of.” It involves a novel way of using materials that can pick up hydrogen, store it, and then release it when and where it’s needed to serve as a clean-burning fuel.

    Defining the challenge: A realistic drive cycle, plus diesel values to beat

    The MIT researchers believe that the lack of consensus on the best way to clean up long-haul trucking may have a simple explanation: Different analyses are based on different assumptions about the driving behavior of long-haul trucks. Indeed, some of them don’t accurately represent actual long-haul operations. So the first task for the MIT team was to define a representative — and realistic — “drive cycle” for actual long-haul truck operations in the United States.
    Then the MIT researchers — and researchers elsewhere — can assess potential replacement fuels and engines based on a consistent set of assumptions in modeling and simulation analyses.

    To define the drive cycle for long-haul operations, the MIT team used a systematic approach to analyze many hours of real-world driving data covering 58,000 miles. They examined 10 features and identified three — daily range, vehicle speed, and road grade — that have the greatest impact on energy demand and thus on fuel consumption and carbon emissions. The representative drive cycle that emerged covers a distance of 600 miles, an average vehicle speed of 55 miles per hour, and a road grade ranging from negative 6 percent to positive 6 percent.

    The next step was to generate key values for the performance of the conventional diesel “powertrain,” that is, all the components involved in creating power in the engine and delivering it to the wheels on the ground. Based on their defined drive cycle, the researchers simulated the performance of a conventional diesel truck, generating “benchmarks” for fuel consumption, CO2 emissions, cost, and other performance parameters.

    Now they could perform parallel simulations — based on the same drive-cycle assumptions — of possible replacement fuels and powertrains to see how the cost, carbon emissions, and other performance parameters would compare to the diesel benchmarks.

    The battery electric option

    When considering how to decarbonize long-haul trucks, a natural first thought is battery power. After all, battery electric cars and pickup trucks are proving highly successful. Why not switch to battery electric long-haul trucks?
    “Again, the literature is very divided, with some studies saying that this is the best idea ever, and other studies saying that this makes no sense,” says Sayandeep Biswas, a graduate student in chemical engineering.

    To assess the battery electric option, the MIT researchers used a physics-based vehicle model plus well-documented estimates for the efficiencies of key components such as the battery pack, generators, motor, and so on. Assuming the previously described drive cycle, they determined operating parameters, including how much power the battery-electric system needs. From there they could calculate the size and weight of the battery required to satisfy the power needs of the battery electric truck.

    The outcome was disheartening. Providing enough energy to travel 600 miles without recharging would require a 2 megawatt-hour battery. “That’s a lot,” notes Kariana Moreno Sader, a graduate student in chemical engineering. “It’s the same as what two U.S. households consume per month on average.” And the weight of such a battery would significantly reduce the amount of payload that could be carried. An empty diesel truck typically weighs 20,000 pounds. With a legal limit of 80,000 pounds, there’s room for 60,000 pounds of payload. The 2 MWh battery would weigh roughly 27,000 pounds — significantly reducing the allowable capacity for carrying payload.

    Accounting for that “payload penalty,” the researchers calculated that roughly four electric trucks would be required to replace every three of today’s diesel-powered trucks. Furthermore, each added truck would require an additional driver. The impact on operating expenses would be significant.

    Analyzing the emissions reductions that might result from shifting to battery electric long-haul trucks also brought disappointing results. One might assume that using electricity would eliminate CO2 emissions.
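
The payload penalty described above reduces to simple weight arithmetic. The sketch below uses the article's round numbers; as noted in the comments, the naive tally ignores weight the electric truck saves by shedding its diesel powertrain, which is why the researchers' fuller accounting lands at roughly four trucks for three.

```python
# Payload-penalty arithmetic for a battery electric long-haul truck,
# using the approximate figures quoted above. This naive tally ignores
# the weight the electric truck sheds (diesel engine, transmission,
# fuel tanks), so it overstates the penalty; the article's fuller
# accounting lands at roughly four electric trucks per three diesel.

LEGAL_LIMIT_LB = 80_000
EMPTY_DIESEL_TRUCK_LB = 20_000
BATTERY_LB = 27_000          # ~2 MWh pack sized for the 600-mile cycle

diesel_payload = LEGAL_LIMIT_LB - EMPTY_DIESEL_TRUCK_LB   # 60,000 lb
naive_electric_payload = diesel_payload - BATTERY_LB      # 33,000 lb

print(f"diesel payload:         {diesel_payload:,} lb")
print(f"naive electric payload: {naive_electric_payload:,} lb")

# The article's 4-for-3 replacement ratio implies an effective
# electric payload of:
effective_payload = diesel_payload * 3 / 4                # 45,000 lb
print(f"implied effective electric payload: {effective_payload:,.0f} lb")
```
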
    But when the researchers included emissions associated with making that electricity, that wasn’t true.

    “Battery electric trucks are only as clean as the electricity used to charge them,” notes Moreno Sader. Most of the time, drivers of long-haul trucks will be charging from national grids rather than dedicated renewable energy plants. According to U.S. Energy Information Administration statistics, fossil fuels make up more than 60 percent of the current U.S. power grid, so electric trucks would still be responsible for significant levels of carbon emissions. Manufacturing batteries for the trucks would generate additional CO2 emissions.

    Building the charging infrastructure would require massive upfront capital investment, as would upgrading the existing grid to reliably meet additional energy demand from the long-haul sector. Accomplishing those changes would be costly and time-consuming, which raises further concern about electrification as a means of decarbonizing long-haul freight.

    In short, switching today’s long-haul diesel trucks to battery electric power would bring major increases in costs for the freight industry and negligible carbon emissions benefits in the near term. Analyses assuming various types of batteries as well as other drive cycles produced comparable results.

    However, the researchers are optimistic about where the grid is going in the future. “In the long term, say by around 2050, emissions from the grid are projected to be less than half what they are now,” says Moreno Sader.
    “When we do our calculations based on that prediction, we find that emissions from battery electric trucks would be around 40 percent lower than our calculated emissions based on today’s grid.”

    For Moreno Sader, the goal of the MIT research is to help “guide the sector on what would be the best option.” With that goal in mind, she and her colleagues are now examining the battery electric option under different scenarios — for example, assuming battery swapping (a depleted battery isn’t recharged but replaced by a fully charged one), short-haul trucking, and other applications that might produce a more cost-competitive outcome, even for the near term.

    A promising option: hydrogen

    As the world looks to end its reliance on fossil fuels for all uses, much attention is focusing on hydrogen. Could hydrogen be a good alternative for today’s diesel-burning long-haul trucks?

    To find out, the MIT team performed a detailed analysis of the hydrogen option. “We thought that hydrogen would solve a lot of the problems we had with battery electric,” says Biswas. It doesn’t have associated CO2 emissions. Its energy density is far higher, so it doesn’t create the weight problem posed by heavy batteries. In addition, existing compression technology can get enough hydrogen fuel into a regular-sized tank to cover the needed distance and range. “You can actually give drivers the range they want,” he says. “There’s no issue with ‘range anxiety.’”

    But while using hydrogen for long-haul trucking would reduce carbon emissions, it would cost far more than diesel. Based on their detailed analysis of hydrogen, the researchers concluded that the main source of incurred cost is in transporting it. Hydrogen can be made in a chemical facility, but then it needs to be distributed to refueling stations across the country. Conventionally, there have been two main ways of transporting hydrogen: as a compressed gas and as a cryogenic liquid.
    As Biswas notes, the former is “super high pressure,” and the latter is “super cold.” The researchers’ calculations show that as much as 80 percent of the cost of delivered hydrogen is due to transportation and refueling, plus there’s the need to build dedicated refueling stations that can meet new environmental and safety standards for handling hydrogen as a compressed gas or a cryogenic liquid.

    Having dismissed the conventional options for shipping hydrogen, they turned to a less-common approach: transporting hydrogen using “liquid organic hydrogen carriers” (LOHCs), special organic (carbon-containing) chemical compounds that can under certain conditions absorb hydrogen atoms and under other conditions release them.

    LOHCs are in use today to deliver small amounts of hydrogen for commercial use. Here’s how the process works: In a chemical plant, the carrier compound is brought into contact with hydrogen in the presence of a catalyst under elevated temperature and pressure, and the compound picks up the hydrogen. The “hydrogen-loaded” compound — still a liquid — is then transported under atmospheric conditions. When the hydrogen is needed, the compound is again exposed to a temperature increase and a different catalyst, and the hydrogen is released.

    LOHCs thus appear to be ideal hydrogen carriers for long-haul trucking. They’re liquid, so they can easily be delivered to existing refueling stations, where the hydrogen would be released; and they contain at least as much energy per gallon as hydrogen in a cryogenic liquid or compressed gas form. However, a detailed analysis of using hydrogen carriers showed that the approach would decrease emissions but at a considerable cost.

    The problem begins with the “dehydrogenation” step at the retail station. Releasing the hydrogen from the chemical carrier requires heat, which is generated by burning some of the hydrogen being carried by the LOHC.
    The researchers calculate that getting the needed heat takes 36 percent of that hydrogen. (In theory, the process would take only 27 percent — but in reality, that efficiency won’t be achieved.) So out of every 100 units of starting hydrogen, 36 units are now gone.

    But that’s not all. The hydrogen that comes out is at near-ambient pressure. So the facility dispensing the hydrogen will need to compress it — a process that the team calculates will use up 20 to 30 percent of the starting hydrogen.

    Because of the needed heat and compression, there’s now less than half of the starting hydrogen left to be delivered to the truck — and as a result, the hydrogen fuel becomes twice as expensive. The bottom line is that the technology works, but “when it comes to really beating diesel, the economics don’t work. It’s quite a bit more expensive,” says Biswas. In addition, the refueling stations would require expensive compressors and auxiliary units such as cooling systems. The capital investment and the operating and maintenance costs together imply that the market penetration of hydrogen refueling stations will be slow.

    A better strategy: onboard release of hydrogen from LOHCs

    Given the potential benefits of using LOHCs, the researchers focused on how to deal with both the heat needed to release the hydrogen and the energy needed to compress it. “That’s when we had the idea,” says Biswas. “Instead of doing the dehydrogenation [hydrogen release] at the refueling station and then loading the truck with hydrogen, why don’t we just take the LOHC and load that onto the truck?” Like diesel, LOHC is a liquid, so it’s easily transported and pumped into trucks at existing refueling stations.
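
The station-side losses that motivate this onboard approach can be tallied directly from the round numbers quoted above:

```python
# Tally of station-side hydrogen losses for conventional LOHC delivery,
# using the round numbers quoted above (all percentages of the hydrogen
# initially loaded onto the carrier).

start = 100.0        # units of hydrogen loaded onto the LOHC
heat_loss = 36.0     # burned at the station to supply dehydrogenation heat

delivered = {}
for compression_loss in (20.0, 30.0):   # quoted range for recompression
    remaining = start - heat_loss - compression_loss
    delivered[compression_loss] = remaining
    print(f"compression {compression_loss:.0f}%: {remaining:.0f} of "
          f"{start:.0f} units reach the truck "
          f"(~{start / remaining:.1f}x cost per delivered unit)")

# Both cases leave well under half the starting hydrogen, which is why
# the delivered fuel roughly doubles in price.
```
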
    “We’ll then make hydrogen as it’s needed based on the power demands of the truck — and we can capture waste heat from the engine exhaust and use it to power the dehydrogenation process,” says Biswas.

    In their proposed plan, hydrogen-loaded LOHC is created at a chemical “hydrogenation” plant and then delivered to a retail refueling station, where it’s pumped into a long-haul truck. Onboard the truck, the loaded LOHC flows into the fuel-storage tank. From there it moves to the “dehydrogenation unit” — the reactor where heat and a catalyst together promote chemical reactions that separate the hydrogen from the LOHC. The hydrogen is sent to the powertrain, where it burns, producing energy that propels the truck forward.

    Hot exhaust from the powertrain goes to a “heat-integration unit,” where its waste heat energy is captured and returned to the reactor to help encourage the reaction that releases hydrogen from the loaded LOHC. The unloaded LOHC is pumped back into the fuel-storage tank, where it’s kept in a separate compartment to keep it from mixing with the loaded LOHC. From there, it’s pumped back into the retail refueling station and then transported back to the hydrogenation plant to be loaded with more hydrogen.

    Switching to onboard dehydrogenation brings down costs by eliminating the need for extra hydrogen compression and by using waste heat in the engine exhaust to drive the hydrogen-release process. So how does their proposed strategy look compared to diesel? Based on a detailed analysis, the researchers determined that using their strategy would be 18 percent more expensive than using diesel, and emissions would drop by 71 percent.

    But those results need some clarification. The 18 percent cost premium of using LOHC with onboard hydrogen release is based on the price of diesel fuel in 2020. In spring of 2023 the price was about 30 percent higher.
    Assuming the 2023 diesel price, the LOHC option is actually cheaper than using diesel.

    Both the cost and emissions outcomes are affected by another assumption: the use of “blue hydrogen,” which is hydrogen produced from natural gas with carbon capture and storage. Another option is to assume the use of “green hydrogen,” which is hydrogen produced using electricity generated from renewable sources, such as wind and solar. Green hydrogen is much more expensive than blue hydrogen, so costs would increase dramatically.

    If in the future the price of green hydrogen drops, the researchers’ proposed plan would shift to green hydrogen — and then the decline in emissions would no longer be 71 percent but rather close to 100 percent. There would be almost no emissions associated with the researchers’ proposed plan for using LOHCs with onboard hydrogen release.

    Comparing the options on cost and emissions

    To compare the options, Moreno Sader prepared bar charts showing the per-mile cost of shipping by truck in the United States and the CO2 emissions that result using each of the fuels and approaches discussed above: diesel fuel, battery electric, hydrogen as a cryogenic liquid or compressed gas, and LOHC with onboard hydrogen release. The LOHC strategy with onboard dehydrogenation looked promising on both the cost and the emissions charts. In addition to such quantitative measures, the researchers believe that their strategy addresses two other, less-obvious challenges in finding a less-polluting fuel for long-haul trucks.

    First, the introduction of the new fuel and trucks to use it must not disrupt the current freight-delivery setup. “You have to keep the old trucks running while you’re introducing the new ones,” notes Green. “You cannot have even a day when the trucks aren’t running because it’d be like the end of the economy.
    Your supermarket shelves would all be empty; your factories wouldn’t be able to run.” The researchers’ plan would be completely compatible with the existing diesel supply infrastructure and would require relatively minor retrofits to today’s long-haul trucks, so the current supply chains would continue to operate while the new fuel and retrofitted trucks are introduced.

    Second, the strategy has the potential to be adopted globally. Long-haul trucking is important in other parts of the world, and Moreno Sader thinks that “making this approach a reality is going to have a lot of impact, not only in the United States but also in other countries,” including her own country of origin, Colombia. “This is something I think about all the time.” The approach is compatible with the current diesel infrastructure, so the only requirement for adoption is to build the chemical hydrogenation plant. “And I think the capital expenditure related to that will be less than the cost of building a new fuel-supply infrastructure throughout the country,” says Moreno Sader.

    Testing in the lab

    “We’ve done a lot of simulations and calculations to show that this is a great idea,” notes Biswas. “But there’s only so far that math can go to convince people.” The next step is to demonstrate their concept in the lab.

    To that end, the researchers are now assembling all the core components of the onboard hydrogen-release reactor as well as the heat-integration unit that’s key to transferring heat from the engine exhaust to the hydrogen-release reactor. They estimate that this spring they’ll be ready to demonstrate their ability to release hydrogen and confirm the rate at which it’s formed.
    And — guided by their modeling work — they’ll be able to fine-tune critical components for maximum efficiency and best performance.

    The next step will be to add an appropriate engine, specially equipped with sensors to provide the critical readings they need to optimize the performance of all their core components together. By the end of 2024, the researchers hope to achieve their goal: the first experimental demonstration of a power-dense, robust onboard hydrogen-release system with highly efficient heat integration.

    In the meantime, they believe that results from their work to date should help spread the word, bringing their novel approach to the attention of other researchers and experts in the trucking industry who are now searching for ways to decarbonize long-haul trucking.

    Financial support for development of the representative drive cycle and the diesel benchmarks as well as the analysis of the battery electric option was provided by the MIT Mobility Systems Center of the MIT Energy Initiative. Analysis of LOHC-powered trucks with onboard dehydrogenation was supported by the MIT Climate and Sustainability Consortium. Sayandeep Biswas is supported by a fellowship from the Martin Family Society of Fellows for Sustainability, and Kariana Moreno Sader received fellowship funding from MathWorks through the MIT School of Science.

  • in

    Shining a light on oil fields to make them more sustainable

    Operating an oil field is complex, and there is a staggeringly long list of things that can go wrong.

    One of the most common problems is spills of the salty brine that’s a toxic byproduct of pumping oil. Another is over- or under-pumping that can lead to machine failure and methane leaks. (The oil and gas industry is the largest industrial emitter of methane in the U.S.) Then there are extreme weather events, which range from winter frosts to blazing heat, that can put equipment out of commission for months. One of the wildest problems Sebastien Mannai SM ’14, PhD ’18 has encountered is hogs that pop open oil tanks with their snouts to enjoy on-demand oil baths.

    Mannai helps oil field owners detect and respond to these problems while optimizing the operation of their machinery to prevent the issues from occurring in the first place. He is the founder and CEO of Amplified Industries, a company selling oil field monitoring and control tools that help make the industry more efficient and sustainable.

    Amplified Industries’ sensors and analytics give oil well operators real-time alerts when things go wrong, allowing them to respond to issues before they become disasters.

    “We’re able to find 99 percent of the issues affecting these machines, from mechanical failures to human errors, including issues happening thousands of feet underground,” Mannai explains. “With our AI solution, operators can put the wells on autopilot, and the system automatically adjusts or shuts the well down as soon as there’s an issue.”

    Amplified currently works with private companies, in states spanning from Texas to Wyoming, that own and operate as many as 3,000 wells. Such companies make up the majority of oil well operators in the U.S. and operate both new and older, more failure-prone equipment that has been in the field for decades.

    Such operators also have a harder time responding to environmental regulations like the Environmental Protection Agency’s new methane guidelines, which seek to dramatically reduce emissions of the potent greenhouse gas in the industry over the next few years.

    “These operators don’t want to be releasing methane,” Mannai explains. “Additionally, when gas gets into the pumping equipment, it leads to premature failures. We can detect gas and slow the pump down to prevent it. It’s the best of both worlds: The operators benefit because their machines are working better, saving them money while also giving them a smaller environmental footprint with fewer spills and methane leaks.”

    Leveraging “every MIT resource I possibly could”

    Mannai learned about the cutting-edge technology used in the space and aviation industries as he pursued his master’s degree at the Gas Turbine Laboratory in MIT’s Department of Aeronautics and Astronautics. Then, during his PhD at MIT, he worked with an oil services company and discovered the oil and gas industry was still relying on decades-old technologies and equipment.

    “When I first traveled to the field, I could not believe how old-school the actual operations were,” says Mannai, who has previously worked in rocket engine and turbine factories. “A lot of oil wells have to be adjusted by feel and rules of thumb. The operators have been let down by industrial automation and data companies.”

    Monitoring oil wells for problems typically requires someone in a pickup truck to drive hundreds of miles between wells looking for obvious issues, Mannai says. The sensors that are deployed are expensive and difficult to replace. Over time, they’re also often damaged in the field to the point of being unusable, forcing technicians to make educated guesses about the status of each well.

    “We often see that equipment unplugged or programmed incorrectly because it is incredibly over-complicated and ill-designed for the reality of the field,” Mannai says. “Workers on the ground often have to rip it out and bypass the control system to pump by hand. That’s how you end up with so many spills and wells pumping at suboptimal levels.”

    To build a better oil field monitoring system, Mannai received support from the MIT Sandbox Innovation Fund and the Venture Mentoring Service (VMS). He also participated in the delta V summer accelerator at the Martin Trust Center for MIT Entrepreneurship, the fuse program during IAP, and the MIT I-Corps program, and took a number of classes at the MIT Sloan School of Management. In 2019, Amplified Industries — which operated under the name Acoustic Wells until recently — won the MIT $100K Entrepreneurship competition.

    “My approach was to sign up to every possible entrepreneurship related program and to leverage every MIT resource I possibly could,” Mannai says. “MIT was amazing for us.”

    Mannai officially launched the company after his postdoc at MIT, and Amplified raised its first round of funding in early 2020. That year, Amplified’s small team moved into the Greentown Labs startup incubator in Somerville.

    Mannai says building the company’s battery-powered, low-cost sensors was a huge challenge. The sensors run machine-learning inference models and their batteries last for 10 years. They also had to be able to handle extreme conditions, from the scorching hot New Mexico desert to the swamps of Louisiana and the freezing cold winters in North Dakota.

    “We build very rugged, resilient hardware; it’s a must in those environments,” Mannai says. “But it’s also very simple to deploy, so if a device does break, it’s like changing a lightbulb: We ship them a new one and it takes them a couple of minutes to swap it out.”

    Customers equip each well with four or five of Amplified’s sensors, which attach to the well’s cables and pipes to measure variables like tension, pressure, and amps. Vast amounts of data are then sent to Amplified’s cloud and processed by their analytics engine. Signal processing methods and AI models are used to diagnose problems and control the equipment in real time, while generating notifications for the operators when something goes wrong. Operators can then remotely adjust the well or shut it down.
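
At its core, this kind of alerting reduces to flagging readings that deviate from a rolling baseline. The sketch below is a hypothetical miniature, not Amplified's actual pipeline; the class name, window size, and threshold rule are invented for illustration.

```python
# Toy sketch of the kind of real-time alerting loop described above.
# Hypothetical illustration only: the class name, window size, and
# 3-sigma rule are invented here, not taken from Amplified's system.
from collections import deque
from statistics import mean, stdev

class WellMonitor:
    """Flags sensor readings that drift far from a rolling baseline."""

    def __init__(self, window: int = 20, n_sigma: float = 3.0):
        self.history = deque(maxlen=window)
        self.n_sigma = n_sigma

    def check(self, reading: float) -> str:
        verdict = "ok"
        if len(self.history) >= 5:      # wait for a minimal baseline
            mu, sigma = mean(self.history), stdev(self.history)
            if sigma > 0 and abs(reading - mu) > self.n_sigma * sigma:
                verdict = "alert"       # e.g., notify operator, slow pump
        self.history.append(reading)
        return verdict

# Steady pressure-like readings followed by a sudden spike:
monitor = WellMonitor()
stream = [50.1, 49.8, 50.3, 50.0, 49.9, 50.2, 50.1, 49.7, 88.0]
verdicts = [monitor.check(x) for x in stream]
print(verdicts)   # only the final spike is flagged
```

A production system would add per-sensor models, debouncing of intermittent readings, and a control path back to the pump, but the flag-then-act loop is the same shape.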

    “That’s where AI is important, because if you just record everything and put it in a giant dashboard, you create way more work for people,” Mannai says. “The critical part is the ability to process and understand this newly recorded data and make it readily usable in the real world.”

    Amplified’s dashboard is customized for different people in the company, so field technicians can quickly respond to problems and managers or owners can get a high-level view of how everything is running.

    Mannai says often when Amplified’s sensors are installed, they’ll immediately start detecting problems that were unknown to engineers and technicians in the field. To date, Amplified has prevented hundreds of thousands of gallons’ worth of brine water spills, which are particularly damaging to surrounding vegetation because of their high salt and sulfur content.

    Preventing those spills is only part of Amplified’s positive environmental impact; the company is now turning its attention toward the detection of methane leaks.

    Helping a changing industry

    The EPA’s proposed new Waste Emissions Charge for oil and gas companies would start at $900 per metric ton of reported methane emissions in 2024 and increase to $1,500 per metric ton in 2026 and beyond.
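
As a rough illustration of what such a charge implies, the helper below maps a year to the quoted per-ton rate and computes a bill. The 2025 rate is an assumption (the article quotes only the 2024 and 2026 figures), and the real rule's reporting thresholds and exemptions are not modeled.

```python
# Illustrative Waste Emissions Charge calculator based on the rates
# quoted above. Hypothetical sketch: the 2025 rate is an assumed middle
# step (the article quotes only 2024 and 2026), and the actual EPA rule
# has reporting thresholds and exemptions that are not modeled here.

def waste_emissions_charge(year: int, methane_metric_tons: float) -> float:
    """Return the charge in dollars for reported methane in a given year."""
    if year < 2024:
        return 0.0                      # charge begins in 2024
    if year == 2024:
        rate = 900
    elif year == 2025:
        rate = 1200                     # assumed middle step
    else:
        rate = 1500                     # 2026 and beyond
    return rate * methane_metric_tons

# A site reporting 50 metric tons of methane in 2026 would owe:
print(waste_emissions_charge(2026, 50))   # 75000
```
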

    Mannai says Amplified is well-positioned to help companies comply with the new rules. Its equipment has already shown it can detect various kinds of leaks across the field, purely based on analytics of existing data.

    “Detecting methane leaks typically requires someone to walk around every valve and piece of piping with a thermal camera or sniffer, but these operators often have thousands of valves and hundreds of miles of pipes,” Mannai says. “What we see in the field is that a lot of times people don’t know where the pipes are because oil wells change owners so frequently, or they will miss an intermittent leak.”

    Ultimately, Mannai believes a strong data backend and modernized sensing equipment will become the backbone of the industry and a necessary prerequisite to both improving efficiency and cleaning up the industry.

    “We’re selling a service that ensures your equipment is working optimally all the time,” Mannai says. “That means a lot fewer fines from the EPA, but it also means better-performing equipment. There’s a mindset change happening across the industry, and we’re helping make that transition as easy and affordable as possible.”

  • in

    A delicate dance

    In early 2022, economist Catherine Wolfram was at her desk in the U.S. Treasury building. She could see the east wing of the White House, just steps away.

    Russia had just invaded Ukraine, and Wolfram was thinking about Russia, oil, and sanctions. She and her colleagues had been tasked with figuring out how to restrict the revenues that Russia was using to fuel its brutal war while keeping Russian oil available and affordable to the countries that depended on it.

    Now the William F. Pounds Professor of Energy Economics at MIT, Wolfram was on leave from academia to serve as deputy assistant secretary for climate and energy economics.

    Working for Treasury Secretary Janet L. Yellen, Wolfram and her colleagues developed dozens of models, forecasts, and projections. It struck her, she said later, that “huge decisions [affecting the global economy] would be made on the basis of spreadsheets that I was helping create.” Wolfram composed a memo to the Biden administration and hoped her projections would pan out the way she believed they would.

    Tackling conundrums that weigh competing, sometimes contradictory, interests has defined much of Wolfram’s career.

    Wolfram specializes in the economics of energy markets. She looks at ways to decarbonize global energy systems while recognizing that energy drives economic development, especially in the developing world.

    “The way we’re currently making energy is contributing to climate change. There’s a delicate dance we have to do to make sure that we treat this important industry carefully, but also transform it rapidly to a cleaner, decarbonized system,” she says.

    Economists as influencers

    While Wolfram was growing up in a suburb of St. Paul, Minnesota, her father was a law professor and her mother taught English as a second language. Her mother helped spawn Wolfram’s interest in other cultures and her love of travel, but it was an experience closer to home that sparked her awareness of the effect of human activities on the state of the planet.

    Minnesota’s nickname is “Land of 10,000 Lakes.” Wolfram remembers swimming in a nearby lake sometimes covered by a thick sludge of algae. “Thinking back on it, it must’ve had to do with fertilizer runoff,” she says. “That was probably the first thing that made me think about the environment and policy.”

    In high school, Wolfram liked “the fact that you could use math to understand the world. I also was interested in the types of questions about human behavior that economists were thinking about.

    “I definitely think economics is good at sussing out how different actors are likely to react to a particular policy and then designing policies with that in mind.”

    After receiving a bachelor’s degree in economics from Harvard University in 1989, Wolfram worked with a Massachusetts agency that governed rate hikes for utilities. Seeing its reliance on research, she says, illuminated the role academics could play in policy setting. It made her think she could make a difference from within academia.

    While pursuing a PhD in economics from MIT, Wolfram counted Paul L. Joskow, the Elizabeth and James Killian Professor of Economics and former director of the MIT Center for Energy and Environmental Policy Research, and Nancy L. Rose, the Charles P. Kindleberger Professor of Applied Economics, among her mentors and influencers.

    After spending 1996 to 2000 as an assistant professor of economics at Harvard, she joined the faculty at the Haas School of Business at the University of California at Berkeley.

    At Berkeley, it struck Wolfram that while she labored over ways to marginally boost the energy efficiency of U.S. power plants, the economies of China and India were growing rapidly, with a corresponding growth in energy use and carbon dioxide emissions. “It hit home that to understand the climate issue, I needed to understand energy demand in the developing world,” she says.

    The problem was that the developing world didn’t always offer up the kind of neatly packaged, comprehensive data economists relied on. She wondered if, by relying on readily accessible data, the field was looking under the lamppost — while losing sight of what the rest of the street looked like.

    To make up for a lack of available data on the state of electrification in sub-Saharan Africa, for instance, Wolfram developed and administered surveys to individual, remote rural households using on-the-ground field teams.

    Her results suggested that in the world’s poorest countries, the challenges involved in expanding the grid in rural areas should be weighed against potentially greater economic and social returns on investments in the transportation, education, or health sectors.

    Taking the lead

    Within months of Wolfram’s memo to the Biden administration, leaders of the intergovernmental political forum Group of Seven (G7) agreed to the price cap. Tankers from coalition countries would only transport Russian crude sold at or below the price cap level, initially set at $60 per barrel.
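The cap works as a simple conditional rule on access to coalition services. A minimal sketch, purely illustrative (the function name and structure are my own, not an official Treasury model):

```python
# Toy illustration of the G7 price-cap mechanism described above:
# coalition tankers (and related services) may handle a cargo of
# Russian crude only if it is sold at or below the cap.

PRICE_CAP_USD_PER_BARREL = 60  # initial cap level cited in the article

def coalition_services_allowed(sale_price_usd: float) -> bool:
    """Return True if coalition shipping may transport the cargo."""
    return sale_price_usd <= PRICE_CAP_USD_PER_BARREL

# A cargo sold at $55/barrel qualifies; one sold at $75/barrel does not.
print(coalition_services_allowed(55))  # True
print(coalition_services_allowed(75))  # False
```

The design insight Wolfram describes is that the rule restricts Russia's price, not its volume: buyers in developing countries still get the oil, just at a discount.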

    “A price cap was not something that had ever been done before,” Wolfram says. “In some ways, we were making it up out of whole cloth. It was exciting to see that I wrote one of the original memos about it, and then literally three-and-a-half months later, the G7 was making an announcement.

    “As economists and as policymakers, we must set the parameters and get the incentives right. The price cap was basically asking developing countries to buy cheap oil, which was consistent with their incentives.”

    In May 2023, the U.S. Department of the Treasury reported that despite widespread initial skepticism about the price cap, market participants and geopolitical analysts believe it is accomplishing its goals of restricting Russia’s oil revenues while maintaining the supply of Russian oil and keeping energy costs in check for consumers and businesses around the world.

    Wolfram held the U.S. Treasury post from March 2021 to October 2022 while on leave from UC Berkeley. In July 2023, she joined MIT Sloan School of Management partly to be geographically closer to the policymakers of the nation’s capital. She’s also excited about the work taking place elsewhere at the Institute to stay ahead of climate change.

    Her time in D.C. was eye-opening, particularly in terms of the leadership power of the United States. She worries that the United States is falling prey to “lost opportunities” in terms of addressing climate change. “We were showing real leadership on the price cap, and if we could only do that on climate, I think we could make faster inroads on a global agreement,” she says.

    Now focused on structuring global agreements in energy policy among developed and developing countries, she’s considering how the United States can take advantage of its position as a world leader. “We need to be thinking about how what we do in the U.S. affects the rest of the world from a climate perspective. We can’t go it alone.

    “The U.S. needs to be more aligned with the European Union, Canada, and Japan to try to find areas where we’re taking a common approach to addressing climate change,” she says. She will touch on some of those areas in the class she will teach in spring 2024 titled “Climate and Energy in the Global Economy,” offered through MIT Sloan.

    Looking ahead, she says, “I’m a techno optimist. I believe in human innovation. I’m optimistic that we’ll find ways to live with climate change and, hopefully, ways to minimize it.”

    This article appears in the Winter 2024 issue of Energy Futures, the magazine of the MIT Energy Initiative.


    Making the clean energy transition work for everyone

    The clean energy transition is already underway, but how do we make sure it happens in a manner that is affordable, sustainable, and fair for everyone?

    That was the overarching question at this year’s MIT Energy Conference, which took place March 11 and 12 in Boston and was titled “Short and Long: A Balanced Approach to the Energy Transition.”

    Each year, the student-run conference brings together leaders in the energy sector to discuss the progress and challenges they see in their work toward a greener future. Participants come from research, industry, government, academia, and the investment community to network and exchange ideas over two whirlwind days of keynote talks, fireside chats, and panel discussions.

    Several participants noted that clean energy technologies are already cost-competitive with fossil fuels, but changing the way the world works requires more than just technology.

    “None of this is easy, but I think developing innovative new technologies is really easy compared to the things we’re talking about here, which is how to blend social justice, soft engineering, and systems thinking that puts people first,” Daniel Kammen, a distinguished professor of energy at the University of California at Berkeley, said in a keynote talk. “While clean energy has a long way to go, it is more than ready to transition us from fossil fuels.”

    The event also featured a keynote discussion between MIT President Sally Kornbluth and MIT’s Kyocera Professor of Ceramics Yet-Ming Chiang, in which Kornbluth discussed her first year at MIT as well as a recently announced, campus-wide effort to solve critical climate problems known as the Climate Project at MIT.

    “The reason I wanted to come to MIT was I saw that MIT has the potential to solve the world’s biggest problems, and first among those for me was the climate crisis,” Kornbluth said. “I’m excited about where we are, I’m excited about the enthusiasm of the community, and I think we’ll be able to make really impactful discoveries through this project.”

    Fostering new technologies

    Several panels convened experts in new or emerging technology fields to discuss what it will take for their solutions to contribute to deep decarbonization.

    “The fun thing and challenging thing about first-of-a-kind technologies is they’re all kind of different,” said Jonah Wagner, principal assistant director for industrial innovation and clean energy in the U.S. Office of Science and Technology Policy. “You can map their growth against specific challenges you expect to see, but every single technology is going to face their own challenges, and every single one will have to defy an engineering barrier to get off the ground.”

    Among the emerging technologies discussed was next-generation geothermal energy, which uses new techniques to extract heat from the Earth’s crust in new places.

    A promising aspect of the technology is that it can leverage existing infrastructure and expertise from the oil and gas industry. Many newly developed techniques for geothermal production, for instance, use the same drills and rigs as those used for hydraulic fracturing.

    “The fact that we have a robust ecosystem of oil and gas labor and technology in the U.S. makes innovation in geothermal much more accessible compared to some of the challenges we’re seeing in nuclear or direct-air capture, where some of the supply chains are disaggregated around the world,” said Gabriel Malek, chief of staff at the geothermal company Fervo Energy.

    Another technology generating excitement — if not net energy quite yet — is fusion, the process of combining, or fusing, light atoms together to form heavier ones for a net energy gain, in the same process that powers the sun. MIT spinout Commonwealth Fusion Systems (CFS) has already validated many aspects of its approach for achieving fusion power, and the company’s unique partnership with MIT was discussed in a panel on the industry’s progress.

    “We’re standing on the shoulders of decades of research from the scientific community, and we want to maintain those ties even as we continue developing our technology,” CFS Chief Science Officer Brandon Sorbom PhD ’17 said, noting that CFS is one of the largest company sponsors of research at MIT and collaborates with institutions around the world. “Engaging with the community is a really valuable lever to get new ideas and to sanity check our own ideas.”

    Sorbom said that as CFS advances fusion energy, the company is thinking about how it can replicate its processes to lower costs and maximize the technology’s impact around the planet.

    “For fusion to work, it has to work for everyone,” Sorbom said. “I think the affordability piece is really important. We can’t just build this technological jewel that only one class of nations can afford. It has to be a technology that can be deployed throughout the entire world.”

    The event also gave students — many from MIT — a chance to learn more about careers in energy and featured a startup showcase, in which dozens of companies displayed their energy and sustainability solutions.

    “More than 700 people are here from every corner of the energy industry, so there are so many folks to connect with and help me push my vision into reality,” says GreenLIB CEO Fred Rostami, whose company recycles lithium-ion batteries. “The good thing about the energy transition is that a lot of these technologies and industries overlap, so I think we can enable this transition by working together at events like this.”

    A focused climate strategy

    Kornbluth noted that when she came to MIT, a large percentage of students and faculty were already working on climate-related technologies. With the Climate Project at MIT, she wanted to help ensure the whole of those efforts is greater than the sum of its parts.

    The project is organized around six distinct missions, including decarbonizing energy and industry, empowering frontline communities, and building healthy, resilient cities. Kornbluth said the mission areas will help MIT community members collaborate around multidisciplinary challenges. Her team, which includes a committee of faculty advisors, has begun to search for the leads of each mission area, and Kornbluth said she is planning to appoint a vice president for climate at the Institute.

    “I want someone who has the purview of the whole Institute and will report directly to me to help make sure this project stays on track,” Kornbluth explained.

    In his conversation about the initiative with Kornbluth, Yet-Ming Chiang said projects will be funded based on their potential to reduce emissions and make the planet more sustainable at scale.

    “Projects should be very high risk, with very high impact,” Chiang explained. “They should have a chance to prove themselves, and those efforts should not be limited by resources, only by time.”

    In discussing her vision of the climate project, Kornbluth alluded to the “short and long” theme of the conference.

    “It’s about balancing research and commercialization,” Kornbluth said. “The climate project has a very variable timeframe, and I think universities are the sector that can think about the things that might be 30 years out. We have to think about the incentives across the entire innovation pipeline and how we can keep an eye on the long term while making sure the short-term things get out rapidly.”