More stories

  • InEnTec: Turning trash into valuable chemical products and clean fuels

    Anyone who has ever hesitated in front of a trash bin knows the problem: It’s hard to determine what can be recycled. Consider the average potato chip bag. It’s got film plastic, metal, dyes, and food residue; it’s complicated. Today’s recycling doesn’t handle complexity well, so the typical chip bag is destined for the landfill.
    Landfills take up space, of course, but there is a much more serious problem associated with them — one that was underscored for Daniel R. Cohn, currently an MIT Energy Initiative (MITEI) research scientist, when he was the executive director of MITEI’s Future of Natural Gas study. That problem is greenhouse gas emissions.
    “About 130 million tons of waste per year go into landfills in the U.S., and that produces at least 130 million tons of CO2-equivalent emissions,” Cohn says, noting that most of these emissions come in the form of methane, a naturally occurring gas that is much worse for the climate than carbon dioxide (CO2).
    For Cohn, working on the MITEI study made it clear that the time was ripe for InEnTec — a company he co-founded — to expand its business. Spun out of MIT in 1995, InEnTec uses a process called plasma gasification to turn any kind of trash — even biological, radioactive, and other hazardous waste — into valuable chemical products and clean fuels. (The company’s name originally stood for Integrated Environmental Technologies.)
    The process is more expensive than throwing trash in a landfill, however, and climate change considerations weren’t a major driver of investment 25 years ago. “Back in the early ’90s, global warming was more of an academic pursuit,” says InEnTec president, CEO, and co-founder Jeffrey E. Surma, adding that many people at the time didn’t even believe in the phenomenon.
    As a result, for many years the company concentrated on providing niche services to heavy industries and governments with serious toxic waste problems. Now, however, Surma says the company is expanding with projects that include plastics recycling and low-cost distributed hydrogen fuel production — using advanced versions of their core technologies to keep waste out of landfills and greenhouse gases out of the air.
    “People today understand that decarbonization of our energy and industrial system has to occur,” says Surma. Diverting one ton of municipal solid waste from landfills is equivalent — “at a minimum” — to preventing one ton of CO2 from reaching the atmosphere, he notes. “It’s very significant.”
    Roots at MIT
    The story of InEnTec begins at the MIT Plasma Science and Fusion Center (PSFC) in the early 1990s. Cohn, who was then head of the Plasma Technology Division at the PSFC, wanted to identify new ways to use technologies being developed for nuclear fusion. “Fusion is very long-term, so I wondered if we could find something that would be useful for societal benefit more near-term,” he says. “We decided to look into an environmental application.”
    He teamed up with Surma, who was working on nuclear waste cleanup at the Pacific Northwest National Laboratory (PNNL), and they obtained U.S. Department of Energy funding to build and operate an experimental waste treatment furnace facility at MIT using plasma — a superheated, highly ionized gas. Plasma is at the core of fusion research, which aims to replicate the energy-producing power of the sun, itself essentially a ball of plasma. MIT provided the critical large-scale space and facilities support for building the plasma furnace.
    After the MIT project ended, Cohn and Surma teamed up with an engineer from General Electric, Charles H. Titus, to combine the plasma technology with a joule-heating melter, a device Surma had been developing to trap hazardous wastes in molten glass. They filed for patents, and with business help from a fourth co-founder, Larry Dinkin, InEnTec was born; a facility was established in Richland, Washington, near PNNL.
    InEnTec’s technology, which the team developed and tested for years before opening the company’s first commercial-scale production facility in 2008, “allows waste to come into a chamber and be exposed to extreme temperatures — a controlled bolt of lightning of over 10,000 degrees Celsius,” Surma explains. “When waste material enters that zone, it breaks down into its elements.”
    Depending on the size of the unit, InEnTec processors can handle from 25 to 150 tons of waste a day — waste that might otherwise be landfilled, or even incinerated, Cohn points out. For example, in a project now under way in California, the company will produce ethanol using agricultural biomass waste that would typically have been burned and thus would have both generated CO2 and contributed to air pollution in the Central Valley, he says.
    Supporting the hydrogen economy
    Unlike incineration, which releases contaminants into the air, InEnTec’s process traps hazardous elements in molten glass while producing a useful feedstock fuel called synthesis gas, or “syngas,” which can be transformed into such fuels as ethanol, methanol, and hydrogen. “It’s an extremely clean process,” Surma says.
    Hydrogen is a key product focus for InEnTec, which hopes to produce inexpensive, fuel cell–grade hydrogen at sites across the country — work that could support the expanded use of electric vehicles powered by hydrogen fuel cells. “We see this as an enormous opportunity,” Surma says.
    While 99 percent of hydrogen today is produced from fossil fuels, InEnTec can generate hydrogen from any waste product. And its plants have a small footprint — typically one-half to two acres — allowing hydrogen to be produced almost anywhere. “You’re reducing the distance waste has to travel and converting it into a virtually zero-carbon fuel,” Surma adds, explaining that the InEnTec process itself produces no direct emissions.
    Already InEnTec has built a plant in Oregon that will make fuel cell-grade hydrogen for the Northwest market from waste material and biomass. The plant has the potential to make 1,500 kilograms of hydrogen a day, roughly enough to fuel 2,500 cars for the average daily commute.
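    Those figures imply a modest per-car hydrogen allotment, which a quick back-of-the-envelope calculation makes concrete (the fuel-economy number in the sketch is a rough outside assumption, not from the article):

```python
# Back-of-the-envelope check of the article's hydrogen figures.
daily_output_kg = 1500   # plant capacity: kg of hydrogen per day
cars_served = 2500       # commuter cars fueled per day

kg_per_car = daily_output_kg / cars_served
print(f"{kg_per_car:.2f} kg of H2 per car per day")  # 0.60 kg

# Rough assumption: a fuel-cell car travels on the order of 100 km
# per kg of hydrogen, so 0.6 kg covers a daily commute of roughly 60 km.
km_per_kg = 100
print(f"~{kg_per_car * km_per_kg:.0f} km of daily driving per car")
```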
    “We can generate hydrogen at very low cost, which is what’s needed to compete with gasoline,” Surma says.
    Recycling plastic
    Another initiative at InEnTec zeroes in on plastics recycling, which faces the kind of complexity illustrated by the chip bag. Different grades of plastic have different chemical compositions and cannot simply be melted down together to make new plastic — which is why less than 10 percent of plastic waste in the United States today is recycled, Cohn says.
    InEnTec solves this problem with what it calls “molecular recycling.” “We’ve partnered with chemical companies pursuing plastic circularity [making new plastics from old plastics], because our technology allows us to get back to molecules, the virgin form of plastics,” Surma explains.
    Recently, InEnTec teamed up with a major car-shredding company to process its plastic waste. “We can recycle the materials back into molecules that can be feedstock for new dashboards, seats, et cetera,” Surma says, noting that 40-45 percent of the material in the waste generated from recycling vehicles today is plastic. “We think this will be a very significant part of our business going forward.”
    InEnTec’s technology is also being used to recycle plastic for environmental cleanup. Notably, a small unit is being deployed on a boat to process ocean plastics. That project will likely require subsidies, Surma concedes, since InEnTec’s business model depends on waste disposal payments. However, it illustrates the range of projects InEnTec can address, and it shows that — in both large and small ways — InEnTec is keeping waste out of landfills.
    “We initially put a lot of effort into medical and hazardous waste because we got more money for disposing of those,” says Cohn, but he emphasizes that the team has always had broader ambitions. “We’re just arriving now at the point of getting more customers who believe that an environmentally superior product has more value. It’s taken a long time to get to this point.”
    This article appears in the Autumn 2020 issue of Energy Futures.

  • To boost emissions reductions from electric vehicles, know when to charge

    Transportation-related emissions are increasing globally. Currently, light-duty vehicles — namely passenger cars, such as sedans, SUVs, and minivans — contribute about 20 percent of net greenhouse gas emissions in the United States. But studies have shown that switching from a conventional gasoline-powered car to a vehicle powered by electricity can make a significant dent in these emissions.
    A recent study published in Environmental Science and Technology takes this a step further by examining how to reduce the emissions associated with the electricity source used to charge an electric vehicle (EV). Taking into account regional charging patterns and the effect of ambient temperature on car fuel economy, researchers at the MIT Energy Initiative (MITEI) find that the time of day when an EV is charged significantly impacts the vehicle’s emissions.
    “If you facilitate charging at particular times, you can really boost the emissions reductions that result from growth in renewables and EVs,” says Ian Miller, the lead author of the study and a research associate at MITEI. “So how do we do this? Time-of-use electricity rates are spreading, and can dramatically shift the time of day when EV drivers charge. If we inform policymakers of these large time-of-charging impacts, they can then design electricity rates to discount charging when our power grids are renewable-heavy. In solar-heavy regions, that’s midday. In wind-heavy regions, like the Midwest, it’s overnight.”
    According to their research, in solar-heavy California, charging an electric vehicle overnight produces 70 percent more emissions than if it were charged midday (when more solar energy powers the grid). Meanwhile, in New York, where nuclear and hydro power constitute a larger share of the electricity mix during the night, the best charging time is the opposite. In this region, charging a vehicle overnight actually reduces emissions by 20 percent relative to daytime charging.
    “Charging infrastructure is another big determinant when it comes to facilitating charging at specific times — during the day especially,” adds Emre Gençer, co-author and a research scientist at MITEI. “If you need to charge your EV midday, then you need to have enough charging stations at your workplace. Today, most people charge their vehicles in their garages overnight, which is going to produce higher emissions in places where it is best to charge during the day.”
    In the study, Miller, Gençer, and Maryam Arbabzadeh, a postdoc at MITEI, make these observations in part by calculating the error in two common EV emission modeling approaches, which ignore hourly variation in the grid and temperature-driven variation in fuel economy. They find that the combined error from these standard methods exceeds 10 percent in 30 percent of cases, and reaches 50 percent in California, which is home to half of the EVs in the United States.
    “If you don’t model time of charging, and instead assume charging with annual average power, you can mis-estimate EV emissions,” says Arbabzadeh. “To be sure, it’s great to get more solar on the grid and more electric vehicles using that grid. No matter when you charge your EV in the U.S., its emissions will be lower than a similar gasoline-powered car; but if EV charging occurs mainly when the sun is down, you won’t get as much benefit when it comes to reducing emissions as you think when using an annual average.”
    Seeking to lessen this margin of error, the researchers use hourly grid data from 2018 and 2019 — along with hourly charging, driving, and temperature data — to estimate emissions from EV use in 60 cases across the United States. They then introduce and validate a novel method (with less than 1 percent margin of error) to accurately estimate EV emissions. They call it the “average day” method.
    “We found that you can ignore seasonality in grid emissions and fuel economy, and still accurately estimate yearly EV emissions and charging-time impacts,” says Miller. “This was a pleasant surprise. In Kansas last year, daily grid emissions rose about 80 percent between seasons, while EV power demand rose about 50 percent due to temperature changes. Previous studies speculated that ignoring such seasonal swings would hurt accuracy in EV emissions estimates, but never actually quantified the error. We did — across diverse grid mixes and climates — and found the error to be negligible.”
    This finding has useful implications for modeling future EV emissions scenarios. “You can get accuracy without computational complexity,” says Arbabzadeh. “With the average-day method, you can accurately estimate EV emissions and charging impacts in a future year without needing to simulate 8,760 values of grid emissions for each hour of the year. All you need is one average-day profile, which means only 24 hourly values, for grid emissions and other key variables. You don’t need to capture the seasonal swings around those average-day profiles.”
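    The contrast between full hourly accounting and the average-day method can be sketched in a few lines of Python. The data below are synthetic stand-ins (the study uses measured 2018-19 grid, charging, and temperature data); the point is that collapsing 8,760 hourly values to one 24-value profile barely changes the emissions total when day-to-day fluctuations in the inputs are uncorrelated:

```python
import math
import random

random.seed(0)
hours = 8760  # one 365-day year of hourly data

# Synthetic, illustrative inputs: grid carbon intensity (kg CO2/kWh)
# and EV charging load (kWh) for every hour of the year.
intensity = [0.3 + 0.1 * math.sin(2 * math.pi * h / 24)
             + random.gauss(0, 0.02) for h in range(hours)]
charging = [max(0.0, 1.0 + math.cos(2 * math.pi * h / 24)
                + random.gauss(0, 0.05)) for h in range(hours)]

# Full hourly accounting: sum over all 8,760 hours of load x intensity.
hourly_emissions = sum(c * i for c, i in zip(charging, intensity))

# "Average day" method: collapse each input to a single 24-value
# average-day profile, compute one day's emissions, scale up to a year.
days = hours // 24
intensity_day = [sum(intensity[d * 24 + h] for d in range(days)) / days
                 for h in range(24)]
charging_day = [sum(charging[d * 24 + h] for d in range(days)) / days
                for h in range(24)]
avg_day_emissions = days * sum(c * i for c, i in zip(charging_day, intensity_day))

error = abs(avg_day_emissions - hourly_emissions) / hourly_emissions
print(f"average-day error: {error:.3%}")  # well under 1% for these inputs
```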
    The researchers demonstrate the utility of the average-day method by conducting a case study in the southeastern United States from 2018 to 2032 to examine how renewable growth in this region may impact future EV emissions. Assuming a conservative grid projection from the U.S. Energy Information Administration, the results show that EV emissions decline only 16 percent if charging occurs overnight, but more than 50 percent if charging occurs midday. In 2032, compared to a similar hybrid car, EV emissions per mile are 30 percent lower if charged overnight, and 65 percent lower if charged midday.
    The model used in this study is one module in a larger modeling program called the Sustainable Energy Systems Analysis Modeling Environment (SESAME). This tool, developed at MITEI, takes a systems-level approach to assess the complete carbon footprint of today’s evolving global energy system.
    “The idea behind SESAME is to make better decisions for decarbonization and to understand the energy transition from a systems perspective,” says Gençer. “One of the key elements of SESAME is how you can connect different sectors together — ‘sector coupling’ — and in this study, we are seeing a very interesting example from the transportation and electric power sectors. Right now, as we’ve been claiming, it’s impossible to treat these two sector systems independently, and this is a clear demonstration of why MITEI’s new modeling approach is really important, as well as how we can tackle some of these impending issues.”
    In ongoing and future research, the team is expanding their charging analysis from individual vehicles to whole fleets of passenger cars in order to develop fleet-level decarbonization strategies. Their work seeks to answer questions such as how California’s proposed ban on gasoline car sales in 2035 would impact transportation emissions. They are also exploring what fleet electrification could mean — not only for greenhouse gases, but also for the demand for natural resources such as cobalt — and whether EV batteries could provide significant grid energy storage.
    “To mitigate climate change, we need to decarbonize both the transportation and electric power sectors,” says Gençer. “We can electrify transportation, and it will significantly reduce emissions, but what this paper shows is how you can do it more effectively.”
    This research was sponsored by ExxonMobil Research and Engineering through the MIT Energy Initiative Low-Carbon Energy Centers.

  • Want cheaper nuclear energy? Turn the design process into a game

    Nuclear energy provides more carbon-free electricity in the United States than solar and wind combined, making it a key player in the fight against climate change. But the U.S. nuclear fleet is aging, and operators are under pressure to streamline their operations to compete with coal- and gas-fired plants.
    One of the key places to cut costs is deep in the reactor core, where energy is produced. If the fuel rods that drive reactions there are ideally placed, they burn less fuel and require less maintenance. Through decades of trial and error, nuclear engineers have learned to design better layouts to extend the life of pricey fuel rods. Now, artificial intelligence is poised to give them a boost.
    Researchers at MIT and Exelon show that by turning the design process into a game, an AI system can be trained to generate dozens of optimal configurations that make each rod last about 5 percent longer, saving a typical power plant an estimated $3 million a year. The AI system can also find optimal solutions faster than a human and quickly modify designs in a safe, simulated environment. Their results appear this month in the journal Nuclear Engineering and Design.
    “This technology can be applied to any nuclear reactor in the world,” says the study’s senior author, Koroush Shirvan, an assistant professor in MIT’s Department of Nuclear Science and Engineering. “By improving the economics of nuclear energy, which supplies 20 percent of the electricity generated in the U.S., we can help limit the growth of global carbon emissions and attract the best young talents to this important clean-energy sector.”
    In a typical reactor, fuel rods are arranged on a grid, or assembly, like chess pieces on a board, according to the levels of uranium and gadolinium oxide within each rod: radioactive uranium drives reactions, while rare-earth gadolinium slows them down. In an ideal layout, these competing effects balance out to drive efficient reactions. Engineers have tried using traditional algorithms to improve on human-devised layouts, but in a standard 100-rod assembly there can be an astronomical number of candidate configurations to evaluate. So far, they’ve had limited success.
    The researchers wondered if deep reinforcement learning, an AI technique that has achieved superhuman mastery at games like chess and Go, could make the screening process go faster. Deep reinforcement learning combines deep neural networks, which excel at picking out patterns in reams of data, with reinforcement learning, which ties learning to a reward signal like winning a game, as in Go, or reaching a high score, as in Super Mario Bros.
    Here, the researchers trained their agent to position the fuel rods under a set of constraints, earning more points with each favorable move. Each constraint, or rule, picked by the researchers reflects decades of expert knowledge rooted in the laws of physics. The agent might score points, for example, by positioning low-uranium rods on the edges of the assembly, to slow reactions there; by spreading out the gadolinium “poison” rods to maintain consistent burn levels; and by limiting the number of poison rods to between 16 and 18.
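    That scoring scheme can be sketched as a reward function over a candidate layout. Everything below (the grid encoding, the rule set, and the point values) is an illustrative stand-in, not the study's actual implementation:

```python
# Illustrative reward function for a candidate fuel-rod layout.
# A layout is a square grid; each cell holds "low_u", "high_u", or
# "poison" (a gadolinium rod). Rules and point values are invented.

def layout_reward(grid):
    size = len(grid)
    reward = 0.0

    # Rule 1: low-uranium rods on the assembly edge slow edge reactions.
    for i in range(size):
        for j in range(size):
            on_edge = i in (0, size - 1) or j in (0, size - 1)
            if on_edge and grid[i][j] == "low_u":
                reward += 1.0

    # Rule 2: keep the number of gadolinium "poison" rods in a target band.
    poison = sum(row.count("poison") for row in grid)
    if 16 <= poison <= 18:
        reward += 10.0

    # Rule 3: penalize adjacent poison rods (spread them for even burn).
    for i in range(size):
        for j in range(size - 1):
            if grid[i][j] == "poison" and grid[i][j + 1] == "poison":
                reward -= 2.0  # horizontal neighbors
            if grid[j][i] == "poison" and grid[j + 1][i] == "poison":
                reward -= 2.0  # vertical neighbors
    return reward

# A reinforcement-learning agent places rods one at a time and receives
# this reward (or its change after each move) as its training signal.
```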
    “After you wire in rules, the neural networks start to take very good actions,” says the study’s lead author Majdi Radaideh, a postdoc in Shirvan’s lab. “They’re not wasting time on random processes. It was fun to watch them learn to play the game like a human would.”
    Through reinforcement learning, AI has learned to play increasingly complex games as well as or better than humans. But its capabilities remain relatively untested in the real world. Here, the researchers show that reinforcement learning has potentially powerful applications.
    “This study is an exciting example of transferring an AI technique for playing board games and video games to helping us solve practical problems in the world,” says study co-author Joshua Joseph, a research scientist at the MIT Quest for Intelligence.
    Exelon is now testing a beta version of the AI system in a virtual environment that mimics an assembly within a boiling water reactor, and about 200 assemblies within a pressurized water reactor, which is globally the most common type of reactor. Based in Chicago, Illinois, Exelon owns and operates 21 nuclear reactors across the United States. It could be ready to implement the system in a year or two, a company spokesperson says.
    The study’s other authors are Isaac Wolverton, an MIT senior who joined the project through the Undergraduate Research Opportunities Program; Nicholas Roy and Benoit Forget of MIT; and James Tusar and Ugi Otgonbaatar of Exelon.

  • Fikile Brushett is looking for new ways to store energy

    Fikile Brushett, an MIT associate professor of chemical engineering, had an unusual source of inspiration for his career in the chemical sciences: the character played by Nicolas Cage in the 1996 movie “The Rock.” In the film, Cage portrays an FBI chemist who hunts down a group of rogue U.S. soldiers who have commandeered chemical weapons and taken over the island of Alcatraz.
    “For a really long time, I really wanted to be a chemist and work for the FBI with chemical warfare agents. That was the goal: to be Nick Cage,” recalls Brushett, who first saw the movie as a high school student living in Silver Spring, Maryland, a suburb of Washington.
    Though he did not end up joining the FBI or working with chemical weapons — which he says is probably for the best — Brushett did pursue his love of chemistry. In his lab at MIT, Brushett leads a group dedicated to developing more efficient and sustainable ways to store energy, including batteries that could be used to store the electricity generated by wind and solar power. He is also exploring new ways to convert carbon dioxide to useful fuels.
    “The backbone of our global energy economy is based upon liquid fossil fuels right now, and energy demand is increasing,” he says. “The challenge we’re facing is that carbon emissions are tied very tightly to this increasing energy demand, and carbon emissions are linked to climate volatility, as well as pollution and health effects. To me, this is an incredibly urgent, important, and inspiring problem to go after.”
    “A body of knowledge”
    Brushett’s parents immigrated to the United States in the early 1980s, before he was born. His mother, an English as a second language teacher, is from South Africa, and his father, an economist, is from the United Kingdom. Brushett grew up mostly in the Washington area, with the exception of four years spent living in Zimbabwe, due to his father’s work at the World Bank.
    Brushett remembers this as an idyllic time, saying, “School ended at 1 p.m., so you almost had the whole afternoon to do sports at school, or you could go home and just play in the garden.”
    His family returned to the Washington area while he was in sixth grade, and in high school, he started to get interested in chemistry, as well as other scientific subjects and math.
    At the University of Pennsylvania, he decided to major in chemical engineering because someone had advised him that if he liked chemistry and math, chemical engineering would be a good fit. While he enjoyed some of his chemical engineering classes, he struggled with others at first.
    “I remember really having a hard time with chemE for a while, and I was fortunate enough to have a really good academic advisor who said, ‘Listen, chemE is hard for some people. Some people get it immediately, for some people it takes a little while for it to sink in,’” he says. Around his junior year, concepts started to fall into place, he recalls. “Rather than looking at courses as self-contained units, the units started coming together and flowing into a body of knowledge. I was able to see the interconnections between courses.”
    While he was originally most interested in molecular biotechnology — the field of engineering proteins and other biological molecules — he ended up working in a reaction engineering lab with his academic advisor, John Vohs. There, he studied how catalytic surfaces influence chemical reactions. At Vohs’ recommendation, he applied to the University of Illinois at Urbana-Champaign for graduate school, where he worked on electrochemistry projects. With his PhD advisor, Paul Kenis, he developed microfluidic fuel cells that could run on a variety of different fuels as portable power sources.
    During his third year of graduate school, he began applying for faculty positions and was offered a job at MIT, which he accepted but deferred for two years so he could do a postdoc at Argonne National Laboratory. There, he worked with scientists and engineers doing a wide range of research on electrochemical energy storage, and became interested in flow batteries, which is now one of the major focus areas of his lab at MIT.
    Modeling new technology
    Unlike the rechargeable lithium-ion batteries that power our cell phones and laptops, flow batteries use large tanks of liquid to store energy. Such batteries have traditionally been prohibitively expensive because they rely on pricey electroactive metal salts. Brushett is working on alternative approaches that use less expensive electroactive materials derived from organic compounds.
    Such batteries could be used to store the power intermittently produced by wind turbines and solar panels, making them a more reliable, efficient, and cost-effective source of energy. His lab also works on new processes for converting carbon dioxide, a waste product and greenhouse gas, into useful fuels.
    In a related area of research, Brushett’s lab performs “techno-economic” modeling of potential new technologies, to help them assess what aspects of the technology need the most improvement to make them economically feasible.
    “With techno-economic modeling, we can devise targets for basic science,” he says. “We’re always looking for the rate-limiting step. What is it that’s preventing us from moving forward? In some cases it could be a catalyst, in other cases it could be a membrane. In other cases it could be the architecture for the device.”
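    A toy version of that kind of analysis breaks a hypothetical system's cost into components and flags the dominant term as the first target for basic research. All numbers below are invented for the example:

```python
# Toy techno-economic breakdown for a hypothetical flow battery,
# in dollars per kWh of storage capacity. All values are invented.
cost_components = {
    "electroactive material": 45.0,
    "membrane": 30.0,
    "tanks and pumps": 15.0,
    "power electronics": 10.0,
}

total = sum(cost_components.values())
bottleneck = max(cost_components, key=cost_components.get)

print(f"total: ${total:.0f}/kWh")
print(f"rate-limiting component: {bottleneck}")
for name, cost in cost_components.items():
    print(f"  {name}: {cost / total:.0%} of total")
```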
    Once those targets are identified, researchers working in those areas have a better idea of what they need to focus on to make a particular technology work, Brushett says.
    “That’s the thing I’ve been most proud of from our research — hopefully opening up or demystifying the field and allowing a more diverse set of researchers to enter and to add value, which I think is important in terms of growing the science and developing new ideas,” he says.

  • Researchers decipher structure of promising battery materials

    A class of materials called metal organic frameworks, or MOFs, has attracted considerable interest over the last several years for a variety of potential energy-related applications — especially since researchers discovered that these typically insulating materials could also be made electrically conductive.
    Thanks to MOFs’ extraordinary combination of porosity and conductivity, this finding opened the possibility of new applications in batteries, fuel cells, supercapacitors, electrocatalysts, and specialized chemical sensors. But the process of developing specific MOF materials that possess the desired characteristics has been slow. That’s largely because it’s been hard to figure out their exact molecular structure and how it influences the material’s properties.
    Now, researchers at MIT and other institutions have found a way to control the growth of crystals of several kinds of MOFs. This made it possible to produce crystals large enough to be probed by a battery of tests, enabling the team to finally decode the structure of these materials, which resemble the two-dimensional hexagonal lattices of materials like graphene.
    The findings are described today in the journal Nature Materials, in a paper by a team of 20 at MIT and other universities in the U.S., China, and Sweden, led by W. M. Keck Professor of Energy Mircea Dincă from MIT’s Department of Chemistry.
    Since conductive MOFs were first discovered a few years ago, Dincă says, many teams have been working to develop versions for many different applications, “but nobody had been able to get a structure of the material with so much detail.” Knowing those structures in detail, he says, “helps you design better materials, and much faster. And that’s what we’ve done here: We provided the first detailed crystal structure at atomic resolution.”
    The difficulty in growing crystals that were large enough for such studies, he says, lies in the chemical bonds within the MOFs. These materials consist of a lattice of metal atoms and organic molecules that tend to form into crooked needle- or thread-like crystals, because the chemical bonds that connect the atoms in the plane of their hexagonal lattice are harder to form and harder to break. In contrast, the bonds in the vertical direction are much weaker and so keep breaking and reforming at a faster rate, causing the structures to rise faster than they can spread out. The resulting spindly crystals were far too small to be characterized by most available tools.
    The team solved that problem by changing the molecular structure of one of the organic compounds in the MOF so that it changed the balance of electron density and the way it interacts with the metal. This reversed the imbalance in the bond strengths and growth rates, thus allowing much larger crystal sheets to form. These larger crystals were then analyzed using a battery of high-resolution diffraction-based imaging techniques.
    As was the case with graphene, finding ways to produce larger sheets of the material could be a key to unlocking the potential of this type of MOF, Dincă says. Initially, graphene could only be produced by using sticky tape to peel off single-atom-thick layers from a block of graphite, but over time methods have been developed to directly produce sheets large enough to be useful. The hope is that the techniques developed in this study could help pave the way to similar advances for MOFs, Dincă says.
    “This is basically providing a basis and a blueprint for making large crystals of two-dimensional MOFs,” he says.
    As with graphene, but unlike most other conductive materials, the conductive MOFs have a strong directionality to their electrical conductivity: They conduct much more freely along the plane of the sheet of material than in the perpendicular direction.
    This property, combined with the material’s very high porosity, could make it a strong candidate to be used as an electrode material for batteries, fuel cells, or supercapacitors. And when its organic components have certain groups of atoms attached to them that bond to particular other compounds, they could be used as very sensitive chemical detectors.
    Graphene and the handful of other known 2D materials have opened up a wide swath of research into potential applications in electronics and other fields, but those materials have essentially fixed properties. Because MOFs share many of those materials’ characteristics but form a broad family of possible variations with differing properties, they should allow researchers to design the specific kinds of materials needed for a particular use, Dincă says.
    For fuel cells, for example, “you want something that has a lot of active sites” for reactivity on the large surface area provided by the structure with its open latticework, he says. Or for a sensor to monitor levels of a particular gas such as carbon dioxide, “you want something that is specific and doesn’t give false positives.” These kinds of properties can be engineered in through the selection of the organic compounds used to make the MOFs, he says.
    The team included researchers from MIT’s departments of Chemistry, Biology, and Electrical Engineering and Computer Science; Peking University and the Shanghai Advanced Research University in China; Stockholm University in Sweden; the University of Oregon; and Purdue University. The work was supported by the U.S. Army Research Office.