More stories

  • A home where world-changing innovations take flight

    In a large, open space on the first floor of 750 Main Street in Cambridge, Massachusetts, a carbon-capture company is heating up molten salts to 600 degrees Celsius right next to a quantum computing company’s device for supercooling qubits. The difference is about 900 degrees across 15 feet.

    It doesn’t take long on a tour of The Engine Accelerator to realize this isn’t your typical co-working space. Companies here are working at the extremes to develop new technologies with world-changing impact — what The Engine Accelerator’s leaders call “tough tech.”

    Comprising four floors and 150,000 square feet next door to MIT’s campus, the new space offers startups specialized lab equipment, advanced machining, fabrication facilities, office space, and a range of startup support services.

    The goal is to give young companies merging science and engineering all of the resources they need to move ideas from the lab bench to their own mass manufacturing lines.

    “The infrastructure has always been a really important accelerant for getting these kinds of companies off and running,” The Engine Accelerator President Emily Knight says. “Now you can start a company and, on day one, start building. Real estate is such a big factor. Our thought was, let’s make this investment in the infrastructure for the founders. It’s an agile lease that enables them to be very flexible as they grow.”

    Since the new facility opened its doors in the summer of 2022, the Accelerator has welcomed around 100 companies that employ close to 1,000 people. In addition to the space, members enjoy educational workshops on topics like fundraising and hiring, events, and networking opportunities that the Accelerator team hopes will foster a sense of community among people working in tough tech more broadly.

    “We’re not just advocates for the startups in the space,” Knight says. “We’re advocates for tough tech as a whole. We think it’s important for the state of Massachusetts to create a tough tech hub here, and we think it’s important for national competitiveness.”

    Tough tech gets a home

    The Engine was spun out of MIT in 2016 as a public benefit corporation with the mission of bridging the gap between discovery and commercialization. Since its inception, it has featured an investment component, now known as Engine Ventures, and a shared services component.

    From the moment The Engine opened its doors to startups in its original headquarters on Massachusetts Avenue in Cambridge, the services team got a firsthand look at the unique challenges faced by tough tech startups. After speaking with founders, they realized their converted office space would need more power, stronger floors, and full lab accommodations.

    The team rose to the challenge. They turned a closet into a bio lab. They turned an unused wellness room into a laser lab. They managed to accommodate Commonwealth Fusion Systems when the founders informed them a 5,000-pound magnet would soon arrive for testing.

    But supporting ambitious founders in their quest to build world-changing companies was always going to require a bigger boat. As early as 2017, MIT’s leaders were considering turning the old Polaroid building, which had sat empty next to MIT’s campus for nearly 20 years, into the new home for tough tech.

    Speaking of tough, construction crews began the extensive building renovations for the Accelerator at the end of 2019, a few months before the Covid-19 pandemic. The team managed to avoid the worst of the supply chain disruptions, but they quickly learned the building has its quirks. Each floor has a different ceiling height, and massive pillars known as mushroom columns punctuate each floor.

    Based on conversations with founders, The Engine’s Accelerator team outfitted the renovated building with office and co-working space, a full machine shop, labs for biology and chemistry work, an array of 3D printers, bike storage, and, perhaps most important, cold brew on tap.

    “I think of the Accelerator as a really great Airbnb host rather than a landlord, where maybe you rented a bedroom in a large house, but you feel like you rented the whole thing because you have access to all kinds of amazing equipment,” says Bernardo Cervantes PhD ’20, co-founder of Concerto Biosciences, which is developing microbes for a variety of uses in human health and agriculture.

    The Engine Accelerator’s team credits MIT leadership with helping them manage the project, noting that the MIT Environment, Health and Safety office was particularly helpful.

    A week after the Accelerator opened its doors in August 2022, on a single sweltering day, 35 companies moved in. By 2023, the Accelerator was home to 55 companies. Since then, the Accelerator’s team has done everything they could to continue to grow.

    “At one point, one of our team members came to me with her tail between her legs and sheepishly said, ‘I gave our office space to a startup,’” Knight recalls. “I said, ‘Yes! That means you get it! We don’t need an office — we can sit anywhere.’”

    The first floor holds some of the largest machinery, including that molten salt device (developed by Mantel Capture) and the quantum computer (developed by Atlantic Quantum). On the next level, a machine shop and a fabrication space featuring every 3D printer imaginable offer ways for companies to quickly build prototype products or parts. Another floor is dubbed “the Avenue” and features a kitchen and tables for networking and serendipitous meetings. The Avenue is lined by huge garage doors that open to accommodate larger crowds for workshops and events.

    “Even though the founders are working in different spaces, we wanted to create an area where people can connect and run into each other and get help with 3D printing or hiring or anything else,” Knight says. “It fosters those casual interactions that are very important for startups.”

    An ecosystem to change the world

    Only about one-fifth of the companies in the Accelerator space are portfolio companies of Engine Ventures. The two entities operate separately, but they pool their shared learning about supporting tough tech, and Engine Ventures has an office in the Accelerator’s space.

    Engine Ventures CEO Katie Rae sees it as a symbiotic partnership.

    “We needed to have all these robust services for everyone in tough tech, not just the portfolio companies,” Rae says. “We’ll always work together and produce the Tough Tech Summit together because of our overarching missions. It’s very much like a rising tide lifts all boats. All of these companies are working to change the world in their own verticals, so we’re just focusing on the impact they’re trying to have and making that the story.”

    Rae says MIT has helped both of The Engine’s teams think through the best way to support tough tech startups.

    “Being a partner with MIT, which understands innovation and safety better than anyone, has allowed us to say yes to more things and have more flexibility,” Rae says. “If you’re going to go at breakneck speed to solve global problems, you better have a mentality of getting things done fast and safely, and I think that’s been a core tenet of The Engine.”

    Meanwhile, Knight says her team hasn’t stopped learning from the tough tech community and will continue to adapt.

    “There’s just a waterfall of information coming from these companies,” Knight says. “It’s about iterating on our services to best support them, so we can go to people on our team and ask, ‘Can you learn to run this type of program, because we just learned these five founders need it?’ Every founder we know in the area has a badge so they can come in. We want to create a hub for tough tech within this Kendall Square area that’s already a hub in so many ways.”

  • Has remote work changed how people travel in the U.S.?

    The prevalence of remote work since the start of the Covid-19 pandemic has significantly changed urban transportation patterns in the U.S., according to a new study led by MIT researchers.

    The research finds significant variation between the effects of remote work on vehicle miles driven and on mass-transit ridership across the U.S.

    “A 1 percent decrease in onsite workers leads to a roughly 1 percent reduction in [automobile] vehicle miles driven, but a 2.3 percent reduction in mass transit ridership,” says Yunhan Zheng SM ’21, PhD ’24, an MIT postdoc and co-author of the study.

    “This is one of the first studies that identifies the causal effect of remote work on vehicle miles traveled and transit ridership across the U.S.,” adds Jinhua Zhao, an MIT professor and another co-author of the paper.

    By accounting for many of the nuances of the issue, across the lower 48 states and the District of Columbia as well as 217 metropolitan areas, the scholars believe they have arrived at a robust conclusion demonstrating the effects of working from home on larger mobility patterns.

    The paper, “Impacts of remote work on vehicle miles traveled and transit ridership in the USA,” appears today in the journal Nature Cities. The authors are Zheng, a doctoral graduate of MIT’s Department of Civil and Environmental Engineering and a postdoc at the Singapore–MIT Alliance for Research and Technology (SMART); Shenhao Wang PhD ’20, an assistant professor at the University of Florida; Lun Liu, an assistant professor at Peking University; Jim Aloisi, a lecturer in MIT’s Department of Urban Studies and Planning (DUSP); and Zhao, the Professor of Cities and Transportation, founder of the MIT Mobility Initiative, and director of MIT’s JTL Urban Mobility Lab and Transit Lab.

    The researchers gathered data on the prevalence of remote work from multiple sources, including Google location data, travel data from the U.S. Federal Highway Administration and the National Transit Database, and the monthly U.S. Survey of Working Arrangements and Attitudes (run jointly by Stanford University, the University of Chicago, ITAM, and MIT).

    The study reveals significant variation among U.S. states when it comes to how much the rise of remote work has affected mileage driven.

    “The impact of a 1 percent change in remote work on the reduction of vehicle miles traveled in New York state is only about one-quarter of that in Texas,” Zheng observes. “There is real variation there.”

    At the same time, remote work has had the biggest effect on mass-transit revenues in places with widely used systems, with New York City, Chicago, San Francisco, Boston, and Philadelphia making up the top five hardest-hit metro areas.

    The overall effect is surprisingly consistent over time, from early 2020 through late 2022.

    “In terms of the temporal variation, we found that the effect is quite consistent across our whole study period,” Zheng says. “It’s not just significant in the early stage of the pandemic, when remote work was a necessity for many. The magnitude remains consistent into the later period, when many people have the flexibility to choose where they want to work. We think this may have long-term implications.”

    Additionally, the study estimates the impact that still larger numbers of remote workers could have on the environment and mass transit.

    “On a national basis, we estimate that a 10 percent decrease in the number of onsite workers compared to prepandemic levels will reduce the annual total vehicle-related CO2 emissions by 191.8 million metric tons,” Wang says.

    The study also projects that across the 217 metropolitan areas in the study, a 10 percent decrease in the number of onsite workers, compared to prepandemic levels, would lead to an annual loss of 2.4 billion transit trips and $3.7 billion in fare revenue — equal to roughly 27 percent of the annual transit ridership and fare revenue in 2019.
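    The elasticities quoted above lend themselves to simple back-of-envelope arithmetic. The sketch below is illustrative only: the two elasticity values come from the study as described, but the 2019-style baselines are round stand-in numbers, not the paper's data, so the outputs will not exactly match the paper's projections.

```python
# Back-of-envelope projection using the elasticities quoted in the study:
# a 1% drop in onsite workers -> ~1% less driving, ~2.3% less transit ridership.
# The baselines below are illustrative stand-ins, not the paper's data.

ELASTICITY_VMT = 1.0       # % reduction in vehicle miles per 1% drop in onsite workers
ELASTICITY_TRANSIT = 2.3   # % reduction in transit ridership per 1% drop

def project_reductions(onsite_drop_pct, baseline_vmt, baseline_transit_trips):
    """Return projected absolute reductions for a given % drop in onsite work."""
    vmt_cut = baseline_vmt * (ELASTICITY_VMT * onsite_drop_pct) / 100.0
    transit_cut = baseline_transit_trips * (ELASTICITY_TRANSIT * onsite_drop_pct) / 100.0
    return vmt_cut, transit_cut

# Example: a 10% drop in onsite workers against rough 2019-scale baselines
# (~3.2 trillion vehicle miles, ~9.9 billion transit trips -- stand-in figures).
vmt_cut, transit_cut = project_reductions(
    10, baseline_vmt=3.2e12, baseline_transit_trips=9.9e9
)
```

    Because the relationship is applied linearly here, larger drops simply scale the reductions; the paper's own estimates account for regional variation that a sketch like this ignores.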

    “The substantial influence of remote work on transit ridership highlights the need for transit agencies to adapt their services accordingly, investing in services tailored to noncommuting trips and implementing more flexible schedules to better accommodate the new demand patterns,” Zhao says.

    The research received support from the MIT Energy Initiative; the Barr Foundation; the National Research Foundation, Prime Minister’s Office, Singapore under its Campus for Research Excellence and Technological Enterprise program; the Research Opportunity Seed Fund 2023 from the University of Florida; and the Beijing Social Science Foundation.

  • Extracting hydrogen from rocks

    It’s commonly thought that the most abundant element in the universe, hydrogen, exists mainly alongside other elements — with oxygen in water, for example, and with carbon in methane. But naturally occurring underground pockets of pure hydrogen are punching holes in that notion — and generating attention as a potentially unlimited source of carbon-free power.

    One interested party is the U.S. Department of Energy, which last month awarded $20 million in research grants to 18 teams from laboratories, universities, and private companies to develop technologies that can lead to cheap, clean fuel from the subsurface. Geologic hydrogen, as it’s known, is produced when water reacts with iron-rich rocks, causing the iron to oxidize.

    One of the grant recipients, MIT Assistant Professor Iwnetim Abate’s research group, will use its $1.3 million grant to determine the ideal conditions for producing hydrogen underground — considering factors such as catalysts to initiate the chemical reaction, temperature, pressure, and pH levels. The goal is to improve efficiency for large-scale production, meeting global energy needs at a competitive cost.

    The U.S. Geological Survey estimates there are potentially billions of tons of geologic hydrogen buried in the Earth’s crust. Accumulations have been discovered worldwide, and a slew of startups are searching for extractable deposits. Abate is looking to jump-start the natural hydrogen production process, implementing “proactive” approaches that involve stimulating production and harvesting the gas.

    “We aim to optimize the reaction parameters to make the reaction faster and produce hydrogen in an economically feasible manner,” says Abate, the Chipman Development Professor in the Department of Materials Science and Engineering (DMSE). Abate’s research centers on designing materials and technologies for the renewable energy transition, including next-generation batteries and novel chemical methods for energy storage.

    Sparking innovation

    Interest in geologic hydrogen is growing at a time when governments worldwide are seeking carbon-free energy alternatives to oil and gas. In December, French President Emmanuel Macron said his government would provide funding to explore natural hydrogen. And in February, government and private sector witnesses briefed U.S. lawmakers on opportunities to extract hydrogen from the ground.

    Today commercial hydrogen is manufactured at $2 a kilogram, mostly for fertilizer and chemical and steel production, but most methods involve burning fossil fuels, which release Earth-heating carbon. “Green hydrogen,” produced with renewable energy, is promising, but at $7 per kilogram, it’s expensive. “If you get hydrogen at a dollar a kilo, it’s competitive with natural gas on an energy-price basis,” says Douglas Wicks, a program director at the Advanced Research Projects Agency – Energy (ARPA-E), the Department of Energy organization leading the geologic hydrogen grant program.

    Recipients of the ARPA-E grants include Colorado School of Mines, Texas Tech University, and Los Alamos National Laboratory, plus private companies including Koloma, a hydrogen production startup that has received funding from Amazon and Bill Gates. The projects themselves are diverse, ranging from applying industrial oil and gas methods for hydrogen production and extraction to developing models to understand hydrogen formation in rocks. The purpose: to address questions in what Wicks calls a “total white space.”

    “In geologic hydrogen, we don’t know how we can accelerate the production of it, because it’s a chemical reaction, nor do we really understand how to engineer the subsurface so that we can safely extract it,” Wicks says. “We’re trying to bring in the best skills of each of the different groups to work on this under the idea that the ensemble should be able to give us good answers in a fairly rapid timeframe.”

    Geochemist Viacheslav Zgonnik, one of the foremost experts in the natural hydrogen field, agrees that the list of unknowns is long, as is the road to the first commercial projects. But he says efforts to stimulate hydrogen production — to harness the natural reaction between water and rock — present “tremendous potential.”

    “The idea is to find ways we can accelerate that reaction and control it so we can produce hydrogen on demand in specific places,” says Zgonnik, CEO and founder of Natural Hydrogen Energy, a Denver-based startup that has mineral leases for exploratory drilling in the United States. “If we can achieve that goal, it means that we can potentially replace fossil fuels with stimulated hydrogen.”

    “A full-circle moment”

    For Abate, the connection to the project is personal. When he was a child in his hometown in Ethiopia, power outages were a common occurrence — the lights would be out three, maybe four days a week. Flickering candles or pollutant-emitting kerosene lamps were often the only source of light for doing homework at night. “And for the household, we had to use wood and charcoal for chores such as cooking,” says Abate. “That was my story all the way until the end of high school and before I came to the U.S. for college.”

    In 1987, well-diggers drilling for water in Mali, in West Africa, uncovered a natural hydrogen deposit, causing an explosion. Decades later, Malian entrepreneur Aliou Diallo and his Canadian oil and gas company tapped the well and used an engine to burn hydrogen and power electricity in the nearby village. Ditching oil and gas, Diallo launched Hydroma, the world’s first hydrogen exploration enterprise. The company is drilling wells near the original site that have yielded high concentrations of the gas.

    “So, what used to be known as an energy-poor continent now is generating hope for the future of the world,” Abate says. “Learning about that was a full-circle moment for me. Of course, the problem is global; the solution is global. But then the connection with my personal journey, plus the solution coming from my home continent, makes me personally connected to the problem and to the solution.”

    Experiments that scale

    Abate and researchers in his lab are formulating a recipe for a fluid that will induce the chemical reaction that triggers hydrogen production in rocks. The main ingredient is water, and the team is testing “simple” materials for catalysts that will speed up the reaction and in turn increase the amount of hydrogen produced, says postdoc Yifan Gao.

    “Some catalysts are very costly and hard to produce, requiring complex production or preparation,” Gao says. “A catalyst that’s inexpensive and abundant will allow us to enhance the production rate — that way, we produce it at an economically feasible rate, but also with an economically feasible yield.”

    The iron-rich rocks in which the chemical reaction happens can be found across the United States and the world. To optimize the reaction across a diversity of geological compositions and environments, Abate and Gao are developing what they call a high-throughput system, consisting of artificial intelligence software and robotics, to test different catalyst mixtures and simulate what would happen when applied to rocks from various regions, with different external conditions like temperature and pressure.

    “And from that we measure how much hydrogen we are producing for each possible combination,” Abate says. “Then the AI will learn from the experiments and suggest to us, ‘Based on what I’ve learned and based on the literature, I suggest you test this composition of catalyst material for this rock.’”

    The team is writing a paper on its project and aims to publish its findings in the coming months. The next milestone for the project, after developing the catalyst recipe, is designing a reactor that will serve two purposes. First, fitted with technologies such as Raman spectroscopy, it will allow researchers to identify and optimize the chemical conditions that lead to improved rates and yield of hydrogen production. The lab-scale device will also inform the design of a real-world reactor that can accelerate hydrogen production in the field. “That would be a plant-scale reactor that would be implanted into the subsurface,” Abate says.

    The cross-disciplinary project is also tapping the expertise of Yang Shao-Horn, of MIT’s Department of Mechanical Engineering and DMSE, for computational analysis of the catalyst, and Esteban Gazel, a Cornell University scientist who will lend his expertise in geology and geochemistry. He’ll focus on understanding the iron-rich ultramafic rock formations across the United States and the globe and how they react with water.

    For Wicks at ARPA-E, the questions Abate and the other grant recipients are asking are just the first, critical steps in uncharted energy territory. “If we can understand how to stimulate these rocks into generating hydrogen, safely getting it up, it really unleashes the potential energy source,” he says. Then the emerging industry will look to oil and gas for the drilling, piping, and gas extraction know-how. “As I like to say, this is enabling technology that we hope to, in a very short term, enable us to say, ‘Is there really something there?’”
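    The screening workflow described above — propose a catalyst mixture, measure hydrogen yield for a given rock and conditions, keep the best recipe — can be caricatured in a few lines. Everything in this sketch is hypothetical: the catalyst names, rock types, and yield model are invented for illustration and have no chemical meaning; the real system would drive robotic experiments and a learned model rather than a toy function.

```python
import random

# Toy high-throughput screening loop. The "measurement" is a made-up formula,
# standing in for a robotic experiment that the Abate lab would actually run.

random.seed(0)

CATALYSTS = ["Fe", "Ni", "none"]            # hypothetical candidate catalysts
ROCKS = ["olivine-rich", "serpentinite"]    # hypothetical rock types

def measure_yield(catalyst, rock, temp_c):
    """Stand-in for a lab measurement (arbitrary units, not real chemistry)."""
    base = {"Fe": 2.0, "Ni": 3.0, "none": 1.0}[catalyst]
    rock_factor = {"olivine-rich": 1.5, "serpentinite": 1.2}[rock]
    return base * rock_factor * (1 + temp_c / 300.0)

def screen(n_trials=50):
    """Randomly sample combinations and keep the highest-yield recipe seen."""
    best = None
    for _ in range(n_trials):
        combo = (random.choice(CATALYSTS), random.choice(ROCKS),
                 random.choice([100, 200, 300]))   # temperature in deg C
        y = measure_yield(*combo)
        if best is None or y > best[1]:
            best = (combo, y)
    return best

best_combo, best_yield = screen()
```

    In the lab version, the random sampler would be replaced by a model that learns from each measurement and proposes the next catalyst-rock-condition combination to try, as Abate describes.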

  • Shining a light on oil fields to make them more sustainable

    Operating an oil field is complex and there is a staggeringly long list of things that can go wrong.

    One of the most common problems is spills of the salty brine that’s a toxic byproduct of pumping oil. Another is over- or under-pumping that can lead to machine failure and methane leaks. (The oil and gas industry is the largest industrial emitter of methane in the U.S.) Then there are extreme weather events, which range from winter frosts to blazing heat, that can put equipment out of commission for months. One of the wildest problems Sebastien Mannai SM ’14, PhD ’18 has encountered is hogs that pop open oil tanks with their snouts to enjoy on-demand oil baths.

    Mannai helps oil field owners detect and respond to these problems while optimizing the operation of their machinery to prevent the issues from occurring in the first place. He is the founder and CEO of Amplified Industries, a company selling oil field monitoring and control tools that help make the industry more efficient and sustainable.

    Amplified Industries’ sensors and analytics give oil well operators real-time alerts when things go wrong, allowing them to respond to issues before they become disasters.

    “We’re able to find 99 percent of the issues affecting these machines, from mechanical failures to human errors, including issues happening thousands of feet underground,” Mannai explains. “With our AI solution, operators can put the wells on autopilot, and the system automatically adjusts or shuts the well down as soon as there’s an issue.”

    Amplified currently works with private companies in states spanning from Texas to Wyoming that own and operate as many as 3,000 wells. Such companies make up the majority of oil well operators in the U.S. and operate both new and older, more failure-prone equipment that has been in the field for decades.

    Such operators also have a harder time responding to environmental regulations like the Environmental Protection Agency’s new methane guidelines, which seek to dramatically reduce emissions of the potent greenhouse gas in the industry over the next few years.

    “These operators don’t want to be releasing methane,” Mannai explains. “Additionally, when gas gets into the pumping equipment, it leads to premature failures. We can detect gas and slow the pump down to prevent it. It’s the best of both worlds: The operators benefit because their machines are working better, saving them money while also giving them a smaller environmental footprint with fewer spills and methane leaks.”

    Leveraging “every MIT resource I possibly could”

    Mannai learned about the cutting-edge technology used in the space and aviation industries as he pursued his master’s degree at the Gas Turbine Laboratory in MIT’s Department of Aeronautics and Astronautics. Then, during his PhD at MIT, he worked with an oil services company and discovered the oil and gas industry was still relying on decades-old technologies and equipment.

    “When I first traveled to the field, I could not believe how old-school the actual operations were,” says Mannai, who has previously worked in rocket engine and turbine factories. “A lot of oil wells have to be adjusted by feel and rules of thumb. The operators have been let down by industrial automation and data companies.”

    Monitoring oil wells for problems typically requires someone in a pickup truck to drive hundreds of miles between wells looking for obvious issues, Mannai says. The sensors that are deployed are expensive and difficult to replace. Over time, they’re also often damaged in the field to the point of being unusable, forcing technicians to make educated guesses about the status of each well.

    “We often see that equipment unplugged or programmed incorrectly because it is incredibly over-complicated and ill-designed for the reality of the field,” Mannai says. “Workers on the ground often have to rip it out and bypass the control system to pump by hand. That’s how you end up with so many spills and wells pumping at suboptimal levels.”

    To build a better oil field monitoring system, Mannai received support from the MIT Sandbox Innovation Fund and the Venture Mentoring Service (VMS). He also participated in the delta V summer accelerator at the Martin Trust Center for MIT Entrepreneurship, the fuse program during IAP, and the MIT I-Corps program, and took a number of classes at the MIT Sloan School of Management. In 2019, Amplified Industries — which operated under the name Acoustic Wells until recently — won the MIT $100K Entrepreneurship competition.

    “My approach was to sign up to every possible entrepreneurship related program and to leverage every MIT resource I possibly could,” Mannai says. “MIT was amazing for us.”

    Mannai officially launched the company after his postdoc at MIT, and Amplified raised its first round of funding in early 2020. That year, Amplified’s small team moved into the Greentown Labs startup incubator in Somerville.

    Mannai says building the company’s battery-powered, low-cost sensors was a huge challenge. The sensors run machine-learning inference models and their batteries last for 10 years. They also had to be able to handle extreme conditions, from the scorching hot New Mexico desert to the swamps of Louisiana and the freezing cold winters in North Dakota.

    “We build very rugged, resilient hardware; it’s a must in those environments,” Mannai says. “But it’s also very simple to deploy, so if a device does break, it’s like changing a lightbulb: We ship them a new one and it takes them a couple of minutes to swap it out.”

    Customers equip each well with four or five of Amplified’s sensors, which attach to the well’s cables and pipes to measure variables like tension, pressure, and amps. Vast amounts of data are then sent to Amplified’s cloud and processed by their analytics engine. Signal processing methods and AI models are used to diagnose problems and control the equipment in real-time, while generating notifications for the operators when something goes wrong. Operators can then remotely adjust the well or shut it down.

    “That’s where AI is important, because if you just record everything and put it in a giant dashboard, you create way more work for people,” Mannai says. “The critical part is the ability to process and understand this newly recorded data and make it readily usable in the real world.”
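    As a rough illustration of the kind of real-time check such a system might run — the channel name, window size, and threshold below are assumptions for illustration, not Amplified's actual analytics — a simple rolling statistic can flag a reading that breaks sharply from a sensor's recent history and trigger an alert:

```python
from statistics import mean, stdev

# Minimal anomaly check on a single sensor channel (e.g. rod tension on a
# pump). A reading far outside the recent history triggers an alert; in the
# field this would notify an operator or automatically shut the well down.

WINDOW = 20        # number of recent samples to compare against
Z_THRESHOLD = 4.0  # deviations (in std devs) that count as an anomaly

def is_anomaly(history, reading):
    """Return True if `reading` is far outside the recent `history`."""
    if len(history) < WINDOW:
        return False                        # not enough context yet
    recent = history[-WINDOW:]
    sigma = stdev(recent)
    if sigma == 0:
        return reading != recent[-1]        # flat line: any change is suspect
    return abs(reading - mean(recent)) / sigma > Z_THRESHOLD

# Example: steady tension readings, then a sudden spike worth an alert.
stream = [100.0 + 0.5 * (i % 3) for i in range(30)] + [160.0]
history, alerts = [], []
for r in stream:
    if is_anomaly(history, r):
        alerts.append(r)                    # would page a technician here
    history.append(r)
```

    Real diagnostics combine many channels (tension, pressure, amps) and learned failure signatures, but the core idea — compare each reading against its own recent context rather than a fixed limit — is the same.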

    Amplified’s dashboard is customized for different people in the company, so field technicians can quickly respond to problems and managers or owners can get a high-level view of how everything is running.

    Mannai says often when Amplified’s sensors are installed, they’ll immediately start detecting problems that were unknown to engineers and technicians in the field. To date, Amplified has prevented hundreds of thousands of gallons worth of brine water spills, which are particularly damaging to surrounding vegetation because of their high salt and sulfur content.

    Preventing those spills is only part of Amplified’s positive environmental impact; the company is now turning its attention toward the detection of methane leaks.

    Helping a changing industry

    The EPA’s proposed new Waste Emissions Charge for oil and gas companies would start at $900 per metric ton of reported methane emissions in 2024 and increase to $1,500 per metric ton in 2026 and beyond.
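    The proposed fee schedule reduces to a small lookup. One caveat on the sketch below: the article gives only the 2024 rate and the 2026-and-beyond rate; the $1,200 rate for 2025 is taken from the Inflation Reduction Act's published schedule and is included here as an assumption.

```python
# Sketch of the proposed Waste Emissions Charge described above.
# $900/t (2024) and $1,500/t (2026+) are from the article; the $1,200/t rate
# for 2025 is assumed from the Inflation Reduction Act's schedule.

RATE_PER_TON = {2024: 900, 2025: 1200}   # 2026 and beyond: $1,500/t

def methane_charge(year, tons_reported):
    """Charge in dollars for `tons_reported` metric tons of reported methane."""
    if year < 2024:
        return 0                          # charge does not apply before 2024
    rate = RATE_PER_TON.get(year, 1500)
    return rate * tons_reported

# A site reporting 100 metric tons of methane:
charge_2024 = methane_charge(2024, 100)
charge_2026 = methane_charge(2026, 100)
```

    The jump from $90,000 to $150,000 for the same 100 tons shows why operators have a growing financial incentive to find leaks early.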

    Mannai says Amplified is well-positioned to help companies comply with the new rules. Its equipment has already shown it can detect various kinds of leaks across the field, purely based on analytics of existing data.

    “Detecting methane leaks typically requires someone to walk around every valve and piece of piping with a thermal camera or sniffer, but these operators often have thousands of valves and hundreds of miles of pipes,” Mannai says. “What we see in the field is that a lot of times people don’t know where the pipes are because oil wells change owners so frequently, or they will miss an intermittent leak.”

    Ultimately Mannai believes a strong data backend and modernized sensing equipment will become the backbone of the industry, and is a necessary prerequisite to both improving efficiency and cleaning up the industry.

    “We’re selling a service that ensures your equipment is working optimally all the time,” Mannai says. “That means a lot fewer fines from the EPA, but it also means better-performing equipment. There’s a mindset change happening across the industry, and we’re helping make that transition as easy and affordable as possible.”

  • Atmospheric observations in China show rise in emissions of a potent greenhouse gas

    To achieve the aspirational goal of the Paris Agreement on climate change — limiting the increase in global average surface temperature to 1.5 degrees Celsius above preindustrial levels — will require its 196 signatories to dramatically reduce their greenhouse gas (GHG) emissions. Those greenhouse gases differ widely in their global warming potential (GWP), or ability to absorb radiative energy and thereby warm the Earth’s surface. For example, measured over a 100-year period, the GWP of methane is about 28 times that of carbon dioxide (CO2), and the GWP of sulfur hexafluoride (SF6) is 24,300 times that of CO2, according to the Intergovernmental Panel on Climate Change (IPCC) Sixth Assessment Report. 

    Used primarily in high-voltage electrical switchgear in electric power grids, SF6 is one of the most potent greenhouse gases on Earth. In the 21st century, atmospheric concentrations of SF6 have risen sharply along with global electric power demand, threatening the world’s efforts to stabilize the climate. This heightened demand for electric power is particularly pronounced in China, which has dominated the expansion of the global power industry in the past decade. Quantifying China’s contribution to global SF6 emissions — and pinpointing its sources in the country — could lead that nation to implement new measures to reduce them, and thereby reduce, if not eliminate, an impediment to the Paris Agreement’s aspirational goal. 

    To that end, a new study by researchers at the MIT Joint Program on the Science and Policy of Global Change, Fudan University, Peking University, University of Bristol, and Meteorological Observation Center of China Meteorological Administration determined total SF6 emissions in China over 2011-21 from atmospheric observations collected from nine stations within a Chinese network, including one station from the Advanced Global Atmospheric Gases Experiment (AGAGE) network. For comparison, global total emissions were determined from five globally distributed, relatively unpolluted “background” AGAGE stations, involving additional researchers from the Scripps Institution of Oceanography and CSIRO, Australia’s National Science Agency.

    The researchers found that SF6 emissions in China almost doubled from 2.6 gigagrams (Gg) per year in 2011, when they accounted for 34 percent of global SF6 emissions, to 5.1 Gg per year in 2021, when they accounted for 57 percent of global total SF6 emissions. This increase from China over the 10-year period — some of it emerging from the country’s less-populated western regions — was larger than the global total SF6 emissions rise, highlighting the importance of lowering SF6 emissions from China in the future.
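    To put those totals in perspective, the reported SF6 figures can be converted to CO2-equivalent terms using the IPCC AR6 100-year GWP of 24,300 cited above. A back-of-the-envelope sketch (the inputs come from the figures quoted in this article, not the study's full uncertainty ranges):

```python
# Convert China's reported SF6 emissions to CO2-equivalent terms,
# using the IPCC AR6 100-year GWP for SF6 quoted in the article.
GWP_SF6 = 24_300  # 100-year global warming potential of SF6

def co2_equivalent_mt(sf6_gg_per_year: float) -> float:
    """CO2-equivalent emissions in megatonnes (Mt) per year.

    1 Gg of SF6 = 1,000 tonnes; multiplying by the GWP gives tonnes
    of CO2-equivalent, and dividing by 1e6 converts tonnes to Mt.
    """
    return sf6_gg_per_year * 1_000 * GWP_SF6 / 1e6

print(co2_equivalent_mt(2.6))  # 2011: ~63 Mt CO2-eq per year
print(co2_equivalent_mt(5.1))  # 2021: ~124 Mt CO2-eq per year
```

By this rough measure, China's 2021 SF6 emissions carry roughly the same century-scale warming effect as well over 100 Mt of CO2 per year.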

    The open-access study, which appears in the journal Nature Communications, explores prospects for future SF6 emissions reduction in China.

    “Adopting maintenance practices that minimize SF6 leakage rates or using SF6-free equipment or SF6 substitutes in the electric power grid will benefit greenhouse-gas mitigation in China,” says Minde An, a postdoc at the MIT Center for Global Change Science (CGCS) and the study’s lead author. “We see our findings as a first step in quantifying the problem and identifying how it can be addressed.”

    Once emitted, SF6 is expected to persist in the atmosphere for more than 1,000 years, raising the stakes for policymakers in China and around the world.
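    That persistence is often summarized as an e-folding lifetime. Under the simplest single-lifetime decay model (an idealization; actual atmospheric removal of SF6 is more complex), the fraction of an emitted pulse still airborne after t years is exp(-t/τ), with τ on the order of 1,000 years:

```python
import math

def fraction_remaining(years: float, lifetime: float = 1000.0) -> float:
    """Fraction of an emitted SF6 pulse still airborne after `years`,
    assuming simple exponential decay with the given e-folding lifetime."""
    return math.exp(-years / lifetime)

print(fraction_remaining(100))  # ~0.90: about 90% still present after a century
```

In other words, SF6 released today will still be warming the planet long after the multi-decadal horizons of current climate policy.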

    “Any increase in SF6 emissions this century will effectively alter our planet’s radiative budget — the balance between incoming energy from the sun and outgoing energy from the Earth — far beyond the multi-decadal time frame of current climate policies,” says MIT Joint Program and CGCS Director Ronald Prinn, a coauthor of the study. “So it’s imperative that China and all other nations take immediate action to reduce, and ultimately eliminate, their SF6 emissions.”

    The study was supported by the National Key Research and Development Program of China and Shanghai B&R Joint Laboratory Project, the U.S. National Aeronautics and Space Administration, and other funding agencies.

  • in

    A delicate dance

    In early 2022, economist Catherine Wolfram was at her desk in the U.S. Treasury building. She could see the east wing of the White House, just steps away.

    Russia had just invaded Ukraine, and Wolfram was thinking about Russia, oil, and sanctions. She and her colleagues had been tasked with figuring out how to restrict the revenues that Russia was using to fuel its brutal war while keeping Russian oil available and affordable to the countries that depended on it.

    Now the William F. Pounds Professor of Energy Economics at MIT, Wolfram was on leave from academia to serve as deputy assistant secretary for climate and energy economics.

    Working for Treasury Secretary Janet L. Yellen, Wolfram and her colleagues developed dozens of models, forecasts, and projections. It struck her, she said later, that “huge decisions [affecting the global economy] would be made on the basis of spreadsheets that I was helping create.” Wolfram composed a memo to the Biden administration and hoped her projections would pan out the way she believed they would.

    Tackling conundrums that weigh competing, sometimes contradictory, interests has defined much of Wolfram’s career.

    Wolfram specializes in the economics of energy markets. She looks at ways to decarbonize global energy systems while recognizing that energy drives economic development, especially in the developing world.

    “The way we’re currently making energy is contributing to climate change. There’s a delicate dance we have to do to make sure that we treat this important industry carefully, but also transform it rapidly to a cleaner, decarbonized system,” she says.

    Economists as influencers

    While Wolfram was growing up in a suburb of St. Paul, Minnesota, her father was a law professor and her mother taught English as a second language. Her mother helped spawn Wolfram’s interest in other cultures and her love of travel, but it was an experience closer to home that sparked her awareness of the effect of human activities on the state of the planet.

    Minnesota’s nickname is “Land of 10,000 Lakes.” Wolfram remembers swimming in a nearby lake sometimes covered by a thick sludge of algae. “Thinking back on it, it must’ve had to do with fertilizer runoff,” she says. “That was probably the first thing that made me think about the environment and policy.”

    In high school, Wolfram liked “the fact that you could use math to understand the world. I also was interested in the types of questions about human behavior that economists were thinking about.

    “I definitely think economics is good at sussing out how different actors are likely to react to a particular policy and then designing policies with that in mind.”

    After receiving a bachelor’s degree in economics from Harvard University in 1989, Wolfram worked with a Massachusetts agency that governed rate hikes for utilities. Seeing its reliance on research, she says, illuminated the role academics could play in policy setting. It made her think she could make a difference from within academia.

    While pursuing a PhD in economics from MIT, Wolfram counted Paul L. Joskow, the Elizabeth and James Killian Professor of Economics and former director of the MIT Center for Energy and Environmental Policy Research, and Nancy L. Rose, the Charles P. Kindleberger Professor of Applied Economics, among her mentors and influencers.

    After spending 1996 to 2000 as an assistant professor of economics at Harvard, she joined the faculty at the Haas School of Business at the University of California at Berkeley.

    At Berkeley, it struck Wolfram that while she labored over ways to marginally boost the energy efficiency of U.S. power plants, the economies of China and India were growing rapidly, with a corresponding growth in energy use and carbon dioxide emissions. “It hit home that to understand the climate issue, I needed to understand energy demand in the developing world,” she says.

    The problem was that the developing world didn’t always offer up the kind of neatly packaged, comprehensive data economists relied on. She wondered if, by relying on readily accessible data, the field was looking under the lamppost — while losing sight of what the rest of the street looked like.

    To make up for a lack of available data on the state of electrification in sub-Saharan Africa, for instance, Wolfram developed and administered surveys to individual, remote rural households using on-the-ground field teams.

    Her results suggested that in the world’s poorest countries, the challenges involved in expanding the grid in rural areas should be weighed against potentially greater economic and social returns on investments in the transportation, education, or health sectors.

    Taking the lead

    Within months of Wolfram’s memo to the Biden administration, leaders of the intergovernmental political forum Group of Seven (G7) agreed to a price cap on Russian oil. Tankers from coalition countries would only transport Russian crude sold at or below the cap level, initially set at $60 per barrel.

    “A price cap was not something that had ever been done before,” Wolfram says. “In some ways, we were making it up out of whole cloth. It was exciting to see that I wrote one of the original memos about it, and then literally three-and-a-half months later, the G7 was making an announcement.

    “As economists and as policymakers, we must set the parameters and get the incentives right. The price cap was basically asking developing countries to buy cheap oil, which was consistent with their incentives.”

    In May 2023, the U.S. Department of the Treasury reported that despite widespread initial skepticism about the price cap, market participants and geopolitical analysts believe it is accomplishing its goals of restricting Russia’s oil revenues while maintaining the supply of Russian oil and keeping energy costs in check for consumers and businesses around the world.

    Wolfram held the U.S. Treasury post from March 2021 to October 2022 while on leave from UC Berkeley. In July 2023, she joined MIT Sloan School of Management partly to be geographically closer to the policymakers of the nation’s capital. She’s also excited about the work taking place elsewhere at the Institute to stay ahead of climate change.

    Her time in D.C. was eye-opening, particularly in terms of the leadership power of the United States. She worries that the United States is falling prey to “lost opportunities” in terms of addressing climate change. “We were showing real leadership on the price cap, and if we could only do that on climate, I think we could make faster inroads on a global agreement,” she says.

    Now focused on structuring global agreements in energy policy among developed and developing countries, she’s considering how the United States can take advantage of its position as a world leader. “We need to be thinking about how what we do in the U.S. affects the rest of the world from a climate perspective. We can’t go it alone.

    “The U.S. needs to be more aligned with the European Union, Canada, and Japan to try to find areas where we’re taking a common approach to addressing climate change,” she says. She will touch on some of those areas in the class she will teach in spring 2024 titled “Climate and Energy in the Global Economy,” offered through MIT Sloan.

    Looking ahead, she says, “I’m a techno optimist. I believe in human innovation. I’m optimistic that we’ll find ways to live with climate change and, hopefully, ways to minimize it.”

    This article appears in the Winter 2024 issue of Energy Futures, the magazine of the MIT Energy Initiative.

  • in

    Engineers find a new way to convert carbon dioxide into useful products

    MIT chemical engineers have devised an efficient way to convert carbon dioxide to carbon monoxide, a chemical precursor that can be used to generate useful compounds such as ethanol and other fuels.

    If scaled up for industrial use, this process could help to remove carbon dioxide from power plants and other sources, reducing the amount of greenhouse gases that are released into the atmosphere.

    “This would allow you to take carbon dioxide from emissions or dissolved in the ocean, and convert it into profitable chemicals. It’s really a path forward for decarbonization because we can take CO2, which is a greenhouse gas, and turn it into things that are useful for chemical manufacture,” says Ariel Furst, the Paul M. Cook Career Development Assistant Professor of Chemical Engineering and the senior author of the study.

    The new approach uses electricity to perform the chemical conversion, with help from a catalyst that is tethered to the electrode surface by strands of DNA. This DNA acts like Velcro to keep all the reaction components in close proximity, making the reaction much more efficient than if all the components were floating in solution.

    Furst has started a company called Helix Carbon to further develop the technology. Former MIT postdoc Gang Fan is the lead author of the paper, which appears in the Journal of the American Chemical Society Au. Other authors include Nathan Corbin PhD ’21, Minju Chung PhD ’23, former MIT postdocs Thomas Gill and Amruta Karbelkar, and Evan Moore ’23.

    Breaking down CO2

    Converting carbon dioxide into useful products requires first turning it into carbon monoxide. One way to do this is with electricity, but the amount of energy required makes that kind of electrochemical conversion prohibitively expensive.

    To try to bring down those costs, researchers have tried using electrocatalysts, which can speed up the reaction and reduce the amount of energy that needs to be added to the system. One type of catalyst used for this reaction is a class of molecules known as porphyrins, which contain metals such as iron or cobalt and are similar in structure to the heme molecules that carry oxygen in blood. 

    During this type of electrochemical reaction, carbon dioxide is dissolved in water within an electrochemical device, which contains an electrode that drives the reaction. The catalysts are also suspended in the solution. However, this setup isn’t very efficient because the carbon dioxide and the catalysts need to encounter each other at the electrode surface, which doesn’t happen very often.

    To make the reaction occur more frequently, which would boost the efficiency of the electrochemical conversion, Furst began working on ways to attach the catalysts to the surface of the electrode. DNA seemed to be the ideal choice for this application.

    “DNA is relatively inexpensive, you can modify it chemically, and you can control the interaction between two strands by changing the sequences,” she says. “It’s like a sequence-specific Velcro that has very strong but reversible interactions that you can control.”

    To attach single strands of DNA to a carbon electrode, the researchers used two “chemical handles,” one on the DNA and one on the electrode. These handles can be snapped together, forming a permanent bond. A complementary DNA sequence is then attached to the porphyrin catalyst, so that when the catalyst is added to the solution, it will bind reversibly to the DNA that’s already attached to the electrode — just like Velcro.

    Once this system is set up, the researchers apply a potential (or bias) to the electrode, and the catalyst uses this energy to convert carbon dioxide in the solution into carbon monoxide. The reaction also generates a small amount of hydrogen gas, from the water. After the catalysts wear out, they can be released from the surface by heating the system to break the reversible bonds between the two DNA strands, and replaced with new ones.

    An efficient reaction

    Using this approach, the researchers were able to boost the Faradaic efficiency of the reaction to 100 percent, meaning that essentially all of the electrical charge passed through the system goes directly into the desired chemical reaction, with none wasted on side reactions. When the catalysts are not tethered by DNA, the Faradaic efficiency is only about 40 percent.
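    Faradaic efficiency compares the charge that ends up in the desired product against the total charge passed through the electrode. A minimal sketch of the calculation for CO2-to-CO conversion, which consumes two electrons per CO molecule (the specific numbers below are illustrative, not taken from the paper):

```python
FARADAY = 96_485.0  # Faraday constant: coulombs per mole of electrons

def faradaic_efficiency(moles_product: float, electrons_per_molecule: int,
                        total_charge_c: float) -> float:
    """Fraction of the total charge that went into the desired product."""
    charge_to_product = moles_product * electrons_per_molecule * FARADAY
    return charge_to_product / total_charge_c

# Illustrative numbers: 1 mmol of CO detected after passing ~193 C.
# CO2 + 2 e- + 2 H+ -> CO + H2O, so 2 electrons per CO molecule.
fe = faradaic_efficiency(1e-3, 2, 192.97)
print(f"{fe:.0%}")  # ~100%
```

A value near 100 percent means virtually no electrons were diverted to competing reactions such as hydrogen evolution.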

    This technology could be scaled up for industrial use fairly easily, Furst says, because the carbon electrodes the researchers used are much less expensive than conventional metal electrodes. The catalysts are also inexpensive, as they don’t contain any precious metals, and only a small concentration of the catalyst is needed on the electrode surface.

    By swapping in different catalysts, the researchers plan to try making other products such as methanol and ethanol using this approach. Helix Carbon, the company started by Furst, is also working on further developing the technology for potential commercial use.

    The research was funded by the U.S. Army Research Office, the CIFAR Azrieli Global Scholars Program, the MIT Energy Initiative, and the MIT Deshpande Center.

  • in

    MIT-derived algorithm helps forecast the frequency of extreme weather

    To assess a community’s risk of extreme weather, policymakers rely first on global climate models that can be run decades, and even centuries, forward in time, but only at a coarse resolution. These models might be used to gauge, for instance, future climate conditions for the northeastern U.S., but not specifically for Boston.

    To estimate Boston’s future risk of extreme weather such as flooding, policymakers can combine a coarse model’s large-scale predictions with a finer-resolution model, tuned to estimate how often Boston is likely to experience damaging floods as the climate warms. But this risk analysis is only as accurate as the predictions from that first, coarser climate model.

    “If you get those wrong for large-scale environments, then you miss everything in terms of what extreme events will look like at smaller scales, such as over individual cities,” says Themistoklis Sapsis, the William I. Koch Professor and director of the Center for Ocean Engineering in MIT’s Department of Mechanical Engineering.

    Sapsis and his colleagues have now developed a method to “correct” the predictions from coarse climate models. By combining machine learning with dynamical systems theory, the team’s approach “nudges” a climate model’s simulations into more realistic patterns over large scales. When paired with smaller-scale models to predict specific weather events such as tropical cyclones or floods, the team’s approach produced more accurate predictions for how often specific locations will experience those events over the next few decades, compared to predictions made without the correction scheme.


    This animation shows the evolution of storms around the northern hemisphere, as a result of a high-resolution storm model, combined with the MIT team’s corrected global climate model. The simulation improves the modeling of extreme values for wind, temperature, and humidity, which typically have significant errors in coarse scale models. Credit: Courtesy of Ruby Leung and Shixuan Zhang, PNNL

    Sapsis says the new correction scheme is general in form and can be applied to any global climate model. Once corrected, the models can help to determine where and how often extreme weather will strike as global temperatures rise over the coming years. 

    “Climate change will have an effect on every aspect of human life, and every type of life on the planet, from biodiversity to food security to the economy,” Sapsis says. “If we have capabilities to know accurately how extreme weather will change, especially over specific locations, it can make a lot of difference in terms of preparation and doing the right engineering to come up with solutions. This is the method that can open the way to do that.”

    The team’s results appear today in the Journal of Advances in Modeling Earth Systems. The study’s MIT co-authors include postdoc Benedikt Barthel Sorensen and Alexis-Tzianni Charalampopoulos SM ’19, PhD ’23, with Shixuan Zhang, Bryce Harrop, and Ruby Leung of the Pacific Northwest National Laboratory in Washington state.

    Over the hood

    Today’s large-scale climate models simulate weather features such as the average temperature, humidity, and precipitation around the world, on a grid-by-grid basis. Running simulations of these models takes enormous computing power, and in order to simulate how weather features will interact and evolve over periods of decades or longer, models average out features every 100 kilometers or so.

    “It’s a very heavy computation requiring supercomputers,” Sapsis notes. “But these models still do not resolve very important processes like clouds or storms, which occur over smaller scales of a kilometer or less.”

    To improve the resolution of these coarse climate models, scientists typically have gone under the hood to try to fix a model’s underlying dynamical equations, which describe how phenomena in the atmosphere and oceans should physically interact.

    “People have tried to dissect into climate model codes that have been developed over the last 20 to 30 years, which is a nightmare, because you can lose a lot of stability in your simulation,” Sapsis explains. “What we’re doing is a completely different approach, in that we’re not trying to correct the equations but instead correct the model’s output.”

    The team’s new approach takes a model’s output, or simulation, and overlays an algorithm that nudges the simulation toward something that more closely represents real-world conditions. The algorithm is based on a machine-learning scheme that takes in data, such as past information for temperature and humidity around the world, and learns associations within the data that represent fundamental dynamics among weather features. The algorithm then uses these learned associations to correct a model’s predictions.

    “What we’re doing is trying to correct dynamics, as in how an extreme weather feature, such as the windspeeds during a Hurricane Sandy event, will look in the coarse model versus in reality,” Sapsis says. “The method learns dynamics, and dynamics are universal. Having the correct dynamics eventually leads to correct statistics, for example, frequency of rare extreme events.”

    Climate correction

    As a first test of their new approach, the team used the machine-learning scheme to correct simulations produced by the Energy Exascale Earth System Model (E3SM), a climate model run by the U.S. Department of Energy that simulates climate patterns around the world at a resolution of 110 kilometers. The researchers used eight years of past data for temperature, humidity, and wind speed to train their new algorithm, which learned dynamical associations between the measured weather features and the E3SM model. They then ran the climate model forward in time for about 36 years and applied the trained algorithm to the model’s simulations. They found that the corrected version produced climate patterns that more closely matched real-world observations from the past 36 years, which were not used for training.
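    The train-then-correct procedure above can be caricatured in a few lines: fit a map from the coarse model's output to observations over a historical window, then apply that learned correction to new simulations. This toy version uses a linear least-squares fit on synthetic one-dimensional data; the actual study uses a far richer machine-learning scheme over global fields.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins: "observations" and a coarse model that is
# biased and damped relative to them (toy 1-D time series).
obs_train = rng.normal(size=500)
model_train = 0.7 * obs_train + 1.5 + 0.1 * rng.normal(size=500)

# Learn a linear correction model_output -> observation by least squares.
A = np.column_stack([model_train, np.ones_like(model_train)])
coef, intercept = np.linalg.lstsq(A, obs_train, rcond=None)[0]

def correct(model_output):
    """Nudge raw model output toward the observed distribution."""
    return coef * model_output + intercept

# Apply the trained correction to a new, held-out simulation.
obs_test = rng.normal(size=200)
model_test = 0.7 * obs_test + 1.5 + 0.1 * rng.normal(size=200)
raw_err = np.mean((model_test - obs_test) ** 2)
corrected_err = np.mean((correct(model_test) - obs_test) ** 2)
assert corrected_err < raw_err  # correction reduces error on held-out data
```

The key design choice mirrored here is that the correction is learned from the model's output rather than by altering its internal equations, so it can be bolted onto an existing model without destabilizing the simulation.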

    “We’re not talking about huge differences in absolute terms,” Sapsis says. “An extreme event in the uncorrected simulation might be 105 degrees Fahrenheit, versus 115 degrees with our corrections. But for humans experiencing this, that is a big difference.”

    When the team then paired the corrected coarse model with a specific, finer-resolution model of tropical cyclones, they found the approach accurately reproduced the frequency of extreme storms in specific locations around the world.

    “We now have a coarse model that can get you the right frequency of events, for the present climate. It’s much more improved,” Sapsis says. “Once we correct the dynamics, this is a relevant correction, even when you have a different average global temperature, and it can be used for understanding how forest fires, flooding events, and heat waves will look in a future climate. Our ongoing work is focusing on analyzing future climate scenarios.”

    “The results are particularly impressive as the method shows promising results on E3SM, a state-of-the-art climate model,” says Pedram Hassanzadeh, an associate professor who leads the Climate Extremes Theory and Data group at the University of Chicago and was not involved with the study. “It would be interesting to see what climate change projections this framework yields once future greenhouse-gas emission scenarios are incorporated.”

    This work was supported, in part, by the U.S. Defense Advanced Research Projects Agency.