More stories

  • A new way to assess radiation damage in reactors

    A new method could greatly reduce the time and expense needed for certain important safety checks in nuclear power reactors. The approach could save money and increase total power output in the short run, and it might increase plants’ safe operating lifetimes in the long run.

    One of the most effective ways to control greenhouse gas emissions, many analysts argue, is to prolong the lifetimes of existing nuclear power plants. But extending these plants beyond their originally permitted operating lifetimes requires monitoring the condition of many of their critical components to ensure that damage from heat and radiation has not led, and will not lead, to unsafe cracking or embrittlement.

    Today, testing of a reactor’s stainless steel components — which make up much of the plumbing systems that prevent heat buildup, as well as many other parts — requires removing test pieces, known as coupons, of the same kind of steel that are left adjacent to the actual components so they experience the same conditions. Or, it requires the removal of a tiny piece of the actual operating component. Both approaches are done during costly shutdowns of the reactor, prolonging these scheduled outages and costing millions of dollars per day.

    Now, researchers at MIT and elsewhere have come up with a new, inexpensive, hands-off test that can produce similar information about the condition of these reactor components, with far less time required during a shutdown. The findings are reported today in the journal Acta Materialia in a paper by MIT professor of nuclear science and engineering Michael Short; Saleem Al Dajani ’19, SM ’20, who did his master’s work at MIT on this project and is now a doctoral student at the King Abdullah University of Science and Technology (KAUST) in Saudi Arabia; and 13 others at MIT and other institutions.

    The test involves aiming laser beams at the stainless steel, generating surface acoustic waves (SAWs) that travel along its surface. Another set of laser beams then detects and measures the frequencies of these SAWs. Tests on material aged to match conditions inside nuclear power plants showed that the waves produced a distinctive double-peaked spectral signature when the material was degraded.

    Short and Al Dajani embarked on the process in 2018, looking for a more rapid way to detect a specific kind of degradation, called spinodal decomposition, that can take place in austenitic stainless steel, which is used for components such as the 2- to 3-foot-wide pipes that carry coolant water to and from the reactor core. This process can lead to embrittlement, cracking, and potential failure in the event of an emergency.

    While spinodal decomposition is not the only type of degradation that can occur in reactor components, it is a primary concern for the lifetime and sustainability of nuclear reactors, Short says.

    “We were looking for a signal that can link material embrittlement with properties we can measure, that can be used to estimate lifetimes of structural materials,” Al Dajani says.

    They decided to try a technique Short and his students and collaborators had expanded upon, called transient grating spectroscopy, or TGS, on samples of reactor materials known to have experienced spinodal decomposition as a result of their reactor-like thermal aging history. The method uses laser beams to stimulate, and then measure, SAWs on a material. The idea was that the decomposition should slow down the rate of heat flow through the material, and that this slowdown would be detectable by the TGS method.

    However, it turns out there was no such slowdown. “We went in with a hypothesis about what we would see, and we were wrong,” Short says.

    That’s often the way things work out in science, he says. “You go in guns blazing, looking for a certain thing, for a great reason, and you turn out to be wrong. But if you look carefully, you find other patterns in the data that reveal what nature actually has to say.”

    Instead, what showed up in the data was that, while a material would usually produce a single frequency peak for its SAWs, in the degraded samples that peak split into two.

    “It was a very clear pattern in the data,” Short recalls. “We just didn’t expect it, but it was right there screaming at us in the measurements.”

    Cast austenitic stainless steels like those used in reactor components are what’s known as duplex steels, actually a mixture of two different crystal structures in the same material by design. But while one of the two types is quite impervious to spinodal decomposition, the other is quite vulnerable to it. When the material starts to degrade, the difference shows up in the different frequency responses of the material, which is what the team found in their data.
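
    To make the reported signature concrete, here is a minimal Python sketch that counts prominent peaks in a sampled spectrum. The spectra, frequency scale, and threshold below are all invented for illustration; the actual study analyzed measured TGS data, not a simple maxima count.

```python
import math

def count_spectral_peaks(spectrum, threshold=0.1):
    """Count strict local maxima above `threshold` in a sampled spectrum.

    Toy criterion for the single- vs. double-peak distinction; the
    real analysis fit full SAW spectra rather than counting maxima.
    """
    peaks = 0
    for i in range(1, len(spectrum) - 1):
        if spectrum[i] > threshold and spectrum[i - 1] < spectrum[i] > spectrum[i + 1]:
            peaks += 1
    return peaks

# Synthetic spectra (arbitrary frequency units): pristine material gives
# one Gaussian peak; a degraded duplex sample gives two shifted peaks.
freqs = [400 + 0.2 * i for i in range(1001)]
pristine = [math.exp(-((f - 500) ** 2) / 50) for f in freqs]
degraded = [math.exp(-((f - 480) ** 2) / 50) + 0.8 * math.exp(-((f - 520) ** 2) / 50)
            for f in freqs]

print(count_spectral_peaks(pristine))  # 1
print(count_spectral_peaks(degraded))  # 2
```

    The pristine sample yields one peak and the degraded duplex sample yields two, mirroring the splitting the team observed.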

    That finding was a total surprise, though. “Some of my current and former students didn’t believe it was happening,” Short says. “We were unable to convince our own team this was happening, with the initial statistics we had.” So, they went back and carried out further tests, which continued to strengthen the significance of the results. They reached a point where the confidence level was 99.9 percent that spinodal decomposition was indeed coincident with the wave peak separation.

    “Our discussions with those who opposed our initial hypotheses ended up taking our work to the next level,” Al Dajani says.

    The tests they did used large lab-based lasers and optical systems, so the next step, which the researchers are hard at work on, is miniaturizing the whole system into an easily portable test kit that can be used to check reactor components on-site, reducing the length of shutdowns. “We’re making great strides, but we still have some way to go,” he says.

    But when they achieve that next step, he says, it could make a significant difference. “Every day that your nuclear plant goes down, for a typical gigawatt-scale reactor, you lose about $2 million a day in lost electricity,” Al Dajani says, “so shortening outages is a huge thing in the industry right now.”

    He adds that the team’s goal was to find ways to enable existing plants to operate longer: “Let them be down for less time and be as safe or safer than they are right now — not cutting corners, but using smart science to get us the same information with far less effort.” And that’s what this new technique seems to offer.

    Short hopes the technique could help extend power plant operating licenses for additional decades without compromising safety, by enabling frequent, simple, and inexpensive testing of key components. Existing, large-scale plants “generate just shy of a billion dollars in carbon-free electricity per plant each year,” he says, whereas bringing a new plant online can take more than a decade. “To bridge that gap, keeping our current nukes online is the single biggest thing we can do to fight climate change.”

    The team included researchers at MIT, Idaho National Laboratory, Manchester University and Imperial College London in the UK, Oak Ridge National Laboratory, the Electric Power Research Institute, Northeastern University, the University of California at Berkeley, and KAUST. The work was supported by the International Design Center at MIT and the Singapore University of Technology and Design, the U.S. Nuclear Regulatory Commission, and the U.S. National Science Foundation.

  • Sustainable supply chains put the customer first

    When we consider the supply chain, we typically think of factories, ships, trucks, and warehouses. Yet the customer side is equally important, especially in efforts to make our distribution networks more sustainable. Customers are an untapped resource in building sustainability, says Josué C. Velázquez Martínez, a research scientist at the MIT Center for Transportation and Logistics.

    Velázquez Martínez, who is director of MIT’s Sustainable Supply Chain Lab, investigates how customer-facing supply chains can be made more environmentally and socially sustainable. One way is the Green Button project, which explores how to optimize e-commerce delivery schedules to reduce carbon emissions and persuade customers to choose less carbon-intensive four- or five-day shipping options instead of one- or two-day delivery. Velázquez Martínez has also launched the MIT Low Income Firms Transformation (LIFT) Lab, which is researching ways to improve micro-retailer supply chains in the developing world and provide owners with the tools they need to survive.

    “The definition of sustainable supply chain keeps evolving because things that were sustainable 20 to 30 years ago are not as sustainable now,” says Velázquez Martínez. “Today, there are more companies that are capturing information to build strategies for environmental, economic, and social sustainability. They are investing in alternative energy and other solutions to make the supply chain more environmentally friendly and are tracking their suppliers and identifying key vulnerabilities. A big part of this is an attempt to create fairer conditions for people who work in supply chains or are dependent on them.”

    The move toward sustainable supply chains is being driven as much by people as by companies, whether those people act as selective consumers or as voting citizens. The consumer aspect is often overlooked, says Velázquez Martínez. “Consumers are the ones who move the supply chain. We are looking at how companies can provide transparency to involve customers in their sustainability strategy.”

    Proposed solutions for sustainability are not always as effective as promised. Some fashion rental schemes fall into this category, says Velázquez Martínez. “There are many new rental companies that are trying to get more use out of clothes to offset the emissions associated with production. We recently researched the environmental impact of monthly subscription models where consumers pay a fee to receive clothes for a month before returning them, as well as peer-to-peer sharing models.” 

    The researchers found that while rental services generally have a lower carbon footprint than retail sales, hidden emissions from logistics played a surprisingly large role. “First, you need to deliver the clothes and pick them up, and there are high return rates,” says Velázquez Martínez. “When you factor in dry cleaning and packaging emissions, the rental models in some cases have a worse carbon footprint than buying new clothes.” Peer-to-peer sharing could be better, he adds, but that depends on how far the consumers travel to meet-up points. 

    Typically, says Velázquez Martínez, garment types that are frequently used are not well suited to rental models. “But for specialty clothes such as wedding dresses or prom dresses, it is better to rent.” 

    Waiting a few days to save the planet 

    Even before the pandemic, online retailing gained a second wind due to low-cost same- and next-day delivery options. While e-commerce may have its drawbacks as a contributor to social isolation and reduced competition, it has proven itself to be far more eco-friendly than brick-and-mortar shopping, not to mention a lot more convenient. Yet rapid deliveries are cutting into online shopping’s carbon-cutting advantage.

    In 2019, MIT’s Sustainable Supply Chain Lab launched the Green Button project to study the rapid-delivery phenomenon. The project has been “testing whether consumers would be willing to delay their e-commerce deliveries to reduce the environmental impact of fast shipping,” says Velázquez Martínez. “Many companies such as Walmart and Target have followed Amazon’s 2019 strategy of moving from two-day to same-day delivery. Instead of sending a fully loaded truck to a neighborhood every few days, they now send multiple trucks to that neighborhood every day, and there are more days when trucks are targeting each neighborhood. All this increases carbon emissions and makes it hard for shippers to consolidate.”

    Working with Coppel, one of Mexico’s largest retailers, the Green Button project inspired a related Consolidation Ecommerce Project, which built a large-scale mathematical model to guide consolidation. The model determined the delivery time window each neighborhood demands, then calculated the best day to deliver to each neighborhood to meet that window while minimizing carbon emissions.
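
    A toy model makes the consolidation arithmetic visible. Every number below is a made-up assumption (per-trip and per-parcel emissions, order volume, and the rule that one trip flushes all pending orders); it is not Coppel’s data or the lab’s actual model.

```python
import math

def weekly_trips(max_wait_days):
    """Trips per week if each trip flushes all pending orders and no
    customer waits longer than max_wait_days (toy assumption)."""
    return math.ceil(7 / max_wait_days)

def weekly_emissions_kg(daily_orders, max_wait_days,
                        kg_per_trip=10.0, kg_per_parcel=0.2):
    """Truck trips dominate emissions; parcel handling adds a small per-item cost."""
    return weekly_trips(max_wait_days) * kg_per_trip + 7 * daily_orders * kg_per_parcel

same_day = weekly_emissions_kg(20, 1)   # a truck visits the neighborhood daily
four_day = weekly_emissions_kg(20, 4)   # orders consolidated up to four days
print(same_day, four_day)               # 98.0 48.0
```

    Under these invented numbers, stretching the delivery window from one day to four cuts weekly emissions from 98 to 48 kilograms, a saving of roughly half, which is in the spirit of the large fuel savings the team reports.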

    No matter what mixture of delivery times was used, the consolidation model helped retailers schedule deliveries more efficiently. Yet, the biggest cuts in emissions emerged when customers were willing to wait several days.

    “When we ran a month-long simulation comparing our model for four-to-five-day delivery with Coppel’s existing model for one- or two-day delivery, we saw savings in fuel consumption of over 50 percent on certain routes,” says Velázquez Martínez. “This is huge compared to other strategies for squeezing more efficiency from the last-mile supply chain, such as routing optimization, where savings are close to 5 percent. The optimal solution depends on factors such as the capacity for consolidation, the frequency of delivery, the store capacity, and the impact on inbound operations.”

    The researchers next set out to determine if customers could be persuaded to wait longer for deliveries. Considering that the price differential is low or nonexistent, this was a considerable challenge. Yet the same-day habit is only a few years old, and some consumers have come to realize they don’t always need rapid deliveries. “Some consumers who order by rapid delivery find they are too busy to open the packages right away,” says Velázquez Martínez.

    Trees beat kilograms of CO2

    The researchers set out to find if consumers would be willing to sacrifice a bit of convenience if they knew they were helping to reduce climate change. The Green Button project tested different public outreach strategies. For one test group, they reported the carbon impact of delivery times in kilograms of carbon dioxide (CO2). Another group received the information expressed in terms of the energy required to recycle a certain amount of garbage. A third group learned about emissions in terms of the number of trees required to trap the carbon. “Explaining the impact in terms of trees led to almost 90 percent of participants being willing to wait another day or two,” says Velázquez Martínez. “This is compared to less than 40 percent for the group that received the data in kilograms of CO2.”

    Another surprise was that there was no difference in response based on income, gender, or age. “Most studies of green consumers suggest they are predominantly high income, female, highly educated, or younger,” says Velázquez Martínez. “However, our results show that responses were the same between low- and high-income people, women and men, and younger and older people. We have shown that disclosing emissions transparently and making the consumer a part of the strategy can be a new opportunity for more consumer-driven logistics sustainability.”

    The researchers are now developing similar models for business-to-business (B2B) e-commerce. “We found that B2B supply chain emissions are often high because many shipping companies require strict delivery windows,” says Velázquez Martínez.  

    The B2B models drill down to examine the Corporate Value Chain (Scope 3) emissions of suppliers. “Although some shipping companies are now asking their suppliers to review emissions, it is a challenge to create a transparent supply chain,” says Velázquez Martínez.  “Technological innovations have made it easier, starting with RFID [radio frequency identification], and then real-time GPS mapping and blockchain. But these technologies need to be more accessible and affordable, and we need more companies willing to use them.” 

    Some companies have been hesitant to dig too deeply into their supply chains, fearing they might uncover a scandal that could damage their reputation, says Velázquez Martínez. Other organizations are forced to confront the issue when nongovernmental organizations investigate sustainability problems such as social injustice in sweatshops and conflict-mineral mines.

    One challenge to building a transparent supply chain is that “in many companies, the sustainability teams are separate from the rest of the company,” says Velázquez Martínez. “Even if the CEOs receive information on sustainability issues, it often doesn’t filter down because the information does not belong to the planners or managers. We are pushing companies to not only account for sustainability factors in supply chain network design but also examine daily operations that affect sustainability. This is a big topic now: How can we translate sustainability information into something that everybody can understand and use?” 

    LIFT Lab lifts micro-retailers  

    In 2016, Velázquez Martínez launched the MIT GeneSys project to gain insights into micro and small enterprises (MSEs) in developing countries. The project released a GeneSys mobile app, which was used by more than 500 students throughout Latin America to collect data on more than 800 microfirms. In 2022, he launched the LIFT Lab, which focuses more specifically on studying and improving the supply chain for MSEs.  

    Worldwide, some 90 percent of companies have fewer than 10 employees. In Latin America and the Caribbean, companies with fewer than 50 employees represent 99 percent of all companies and 47 percent of employment. 

    Although MSEs represent much of the world’s economy, they are poorly understood, notes Velázquez Martínez. “Those tiny businesses are driving a lot of the economy and serve as important customers for the large companies working in developing countries. They range from small businesses down to people trying to get some money to eat by selling cakes or tacos through their windows.”  

    The MIT LIFT Lab researchers investigated whether MSE supply chain issues could help shed light on why many Latin American countries have been limited to marginal increases in gross domestic product. “Large companies from the developed world that are operating in Latin America, such as Unilever, Walmart, and Coca-Cola, have huge growth there, in some cases higher than they have in the developed world,” says Velázquez Martínez. “Yet, the countries are not developing as fast as we would expect.” 

    The LIFT Lab data showed that while the multinationals are thriving in Latin America, the local MSEs are decreasing in productivity. The study also found the trend has worsened with Covid-19.  

    The LIFT Lab’s first big project, which is sponsored by Mexican beverage and retail company FEMSA, is studying supply chains in Mexico. The study spans 200,000 micro-retailers and 300,000 consumers. In a collaboration with Tecnológico de Monterrey, hundreds of students are helping with a field study.  

    “We are looking at supply chain management and business capabilities and identifying the challenges to adoption of technology and digitalization,” says Velázquez Martínez. “We want to find the best ways for micro-firms to work with suppliers and consumers by identifying the consumers who access this market, as well as the products and services that can best help the micro-firms drive growth.” 

    Based on the earlier GeneSys research, Velázquez Martínez has developed some hypotheses for potential improvements to micro-retailer supply chains, starting with payment terms. “We found that the micro-firms often get the worst purchasing deals. Owners without credit cards and with limited cash often buy in smaller amounts at much higher prices than retailers like Walmart. The big suppliers are squeezing them.”

    While large retailers usually get 60 to 120 days to pay, micro-retailers “either pay at the moment of the transaction or in advance,” says Velázquez Martínez. “In a study of 500 micro-retailers in five countries in Latin America, we found the average payment time was minus seven days; that is, payment a week in advance. These terms reduce cash availability and often lead to bankruptcy.”

    LIFT Lab is working with suppliers to persuade them to offer a minimum payment time of two weeks. “We can show the suppliers that the change in terms will let them move more product and increase sales,” says Velázquez Martínez. “Meanwhile, the micro-retailers gain higher profits and become more stable, even if they may pay a bit more.” 
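
    The cash effect of those payment terms can be expressed with the standard cash-conversion-cycle formula. The inventory and receivables figures below are hypothetical; only the minus-seven-day and 14-day payment terms come from the passage above.

```python
def cash_conversion_cycle(inventory_days, receivable_days, payable_days):
    """Days of cash tied up in operations. Negative payable_days means
    the firm pays its suppliers before delivery."""
    return inventory_days + receivable_days - payable_days

# Hypothetical micro-retailer: 10 days of inventory, all sales in cash.
paying_in_advance = cash_conversion_cycle(10, 0, -7)  # pays 7 days early
two_week_terms = cash_conversion_cycle(10, 0, 14)     # proposed 14-day terms
print(paying_in_advance, two_week_terms)  # 17 -4
```

    In this toy example, moving from paying a week in advance to 14-day terms frees three weeks of cash.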

    LIFT Lab is also looking at ways that micro-retailers can leverage smartphones for digitalization and planning. “Some of these companies are keeping records on napkins,” says Velázquez Martínez. “By using a cellphone, they can charge orders to suppliers and communicate with consumers. We are testing different dashboards for mobile apps to help with planning and financial performance. We are also recommending services the stores can provide, such as paying electricity or water bills. The idea is to build more capabilities and knowledge and increase business competencies for the supply chain that are tailored for micro-retailers.” 

    From a financial perspective, micro-retailers are not always the most efficient way to move products. Yet they also play an important role in building social cohesion within neighborhoods. By offering more services, the corner bodega can bring people together in ways that are impossible with e-commerce and big-box stores.  

    Whether the consumers are micro-firms buying from suppliers or e-commerce customers waiting for packages, “transparency is key to building a sustainable supply chain,” says Velázquez Martínez. “To change consumer habits, consumers need to be better educated on the impacts of their behaviors. With consumer-facing logistics, ‘The last shall be first, and the first last.’”

  • Manufacturing a cleaner future

    Manufacturing had a big summer. The CHIPS and Science Act, signed into law in August, represents a massive investment in U.S. domestic manufacturing. The act aims to drastically expand the U.S. semiconductor industry, strengthen supply chains, and invest in R&D for new technological breakthroughs. According to John Hart, professor of mechanical engineering and director of the Laboratory for Manufacturing and Productivity at MIT, the CHIPS Act is just the latest example of significantly increased interest in manufacturing in recent years.

    “You have multiple forces working together: reflections from the pandemic’s impact on supply chains, the geopolitical situation around the world, and the urgency and importance of sustainability,” says Hart. “This has now aligned incentives among government, industry, and the investment community to accelerate innovation in manufacturing and industrial technology.”

    Hand-in-hand with this increased focus on manufacturing is a need to prioritize sustainability.

    Roughly one-quarter of greenhouse gas emissions came from industry and manufacturing in 2020. Factories and plants can also deplete local water reserves and generate vast amounts of waste, some of which can be toxic.

    To address these issues and drive the transition to a low-carbon economy, new products and industrial processes must be developed alongside sustainable manufacturing technologies. Hart sees mechanical engineers as playing a crucial role in this transition.

    “Mechanical engineers can uniquely solve critical problems that require next-generation hardware technologies, and know how to bring their solutions to scale,” says Hart.

    Several fast-growing companies founded by faculty and alumni from MIT’s Department of Mechanical Engineering offer solutions for manufacturing’s environmental problem, paving the path for a more sustainable future.

    Gradiant: Cleantech water solutions

    Manufacturing requires water, and lots of it. A medium-sized semiconductor fabrication plant uses upward of 10 million gallons of water a day. In a world increasingly plagued by droughts, this dependence on water poses a major challenge.

    Gradiant offers a solution to this water problem. Co-founded by Anurag Bajpayee SM ’08, PhD ’12 and Prakash Govindan PhD ’12, the company is a pioneer in sustainable — or “cleantech” — water projects.

    As doctoral students in the Rohsenow Kendall Heat Transfer Laboratory, Bajpayee and Govindan shared a pragmatism and penchant for action. They both worked on desalination research — Bajpayee with Professor Gang Chen and Govindan with Professor John Lienhard.

    Inspired by a childhood spent during a severe drought in Chennai, India, Govindan developed for his PhD a humidification-dehumidification technology that mimicked natural rainfall cycles. It was with this piece of technology, which they named Carrier Gas Extraction (CGE), that the duo founded Gradiant in 2013.

    The key to CGE lies in a proprietary algorithm that accounts for variability in the quality and quantity of the wastewater feed. At the heart of the algorithm is a nondimensional number, which Govindan proposes one day be called the “Lienhard Number,” after his doctoral advisor.

    “When the water quality varies in the system, our technology automatically sends a signal to motors within the plant to adjust the flow rates to bring back the nondimensional number to a value of one. Once it’s brought back to a value of one, you’re running in optimal condition,” explains Govindan, who serves as chief operating officer of Gradiant.
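
    The feedback idea Govindan describes can be sketched as a simple proportional controller. The plant model (N simply proportional to flow) and the gain are invented stand-ins, since the actual algorithm is proprietary.

```python
def steer_to_unity(n_of_flow, flow, gain=0.5, tol=1e-3, max_iter=200):
    """Proportional feedback: adjust the flow rate until the plant's
    nondimensional number N returns to 1 (the stated optimal point)."""
    for _ in range(max_iter):
        n = n_of_flow(flow)
        if abs(n - 1.0) < tol:
            break
        flow -= gain * (n - 1.0)  # cut flow when N > 1, raise it when N < 1
    return flow, n_of_flow(flow)

# Toy plant: N is just the flow divided by a design flow of 2.0 units.
flow, n = steer_to_unity(lambda q: q / 2.0, flow=3.0)
print(round(flow, 2), round(n, 2))  # 2.0 1.0
```

    From a starting flow of 3.0, the loop settles at the design flow of 2.0, where N is back at 1 and, in the analogy, the plant is running in optimal condition.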

    This system can treat and clean the wastewater produced by a manufacturing plant for reuse, ultimately conserving millions of gallons of water each year.

    As the company has grown, the Gradiant team has added new technologies to their arsenal, including Selective Contaminant Extraction, a cost-efficient method that removes only specific contaminants, and a brine-concentration method called Counter-Flow Reverse Osmosis. They now offer a full technology stack of water and wastewater treatment solutions to clients in industries including pharmaceuticals, energy, mining, food and beverage, and the ever-growing semiconductor industry.

    “We are an end-to-end water solutions provider. We have a portfolio of proprietary technologies and will pick and choose from our ‘quiver’ depending on a customer’s needs,” says Bajpayee, who serves as CEO of Gradiant. “Customers look at us as their water partner. We can take care of their water problem end-to-end so they can focus on their core business.”

    Gradiant has seen explosive growth over the past decade. With 450 water and wastewater treatment plants built to date, they treat the equivalent of 5 million households’ worth of water each day. Recent acquisitions saw their total employees rise to above 500.

    The diversity of Gradiant’s solutions is reflected in their clients, who include Pfizer, AB InBev, and Coca-Cola. They also count semiconductor giants like Micron Technology, GlobalFoundries, Intel, and TSMC among their customers.

    “Over the last few years, we have really developed our capabilities and reputation serving semiconductor wastewater and semiconductor ultrapure water,” says Bajpayee.

    Semiconductor manufacturers require ultrapure water for fabrication. Unlike drinking water, which has a total dissolved solids range in the parts per million, water used to manufacture microchips has a range in the parts per billion or quadrillion.

    Currently, the average recycling rate at semiconductor fabrication plants — or fabs — in Singapore is only 43 percent. Using Gradiant’s technologies, these fabs can recycle 98-99 percent of the 10 million gallons of water they require daily. This reused water is pure enough to be put back into the manufacturing process.

    “What we’ve done is eliminated the discharge of this contaminated water and nearly eliminated the dependence of the semiconductor fab on the public water supply,” adds Bajpayee.

    With new regulations being introduced, pressure is increasing for fabs to improve their water use, making sustainability even more important to brand owners and their stakeholders.

    As the domestic semiconductor industry expands in light of the CHIPS and Science Act, Gradiant sees an opportunity to bring their semiconductor water treatment technologies to more factories in the United States.

    Via Separations: Efficient chemical filtration

    Like Bajpayee and Govindan, Shreya Dave ’09, SM ’12, PhD ’16 focused on desalination for her doctoral thesis. Under the guidance of her advisor Jeffrey Grossman, professor of materials science and engineering, Dave built a membrane that could enable more efficient and cheaper desalination.

    A thorough cost and market analysis brought Dave to the conclusion that the desalination membrane she developed would not make it to commercialization.

    “The current technologies are just really good at what they do. They’re low-cost, mass-produced, and they work. There was no room in the market for our technology,” says Dave.

    Shortly after defending her thesis, she read a commentary article in the journal Nature that changed everything. The article outlined a problem. Chemical separations that are central to many manufacturing processes require a huge amount of energy. Industry needed more efficient and cheaper membranes. Dave thought she might have a solution.

    After determining there was an economic opportunity, Dave, Grossman, and Brent Keller PhD ’16 founded Via Separations in 2017. Shortly thereafter, they were chosen as one of the first companies to receive funding from MIT’s venture firm, The Engine.

    Currently, industrial filtration is done by heating chemicals at very high temperatures to separate compounds. Dave likens it to making pasta by boiling off all of the water until only the noodles are left. In manufacturing, this method of chemical separation is extremely energy-intensive and inefficient.

    Via Separations has created the chemical equivalent of a “pasta strainer.” Rather than using heat to separate, their membranes “strain” chemical compounds. This method of chemical filtration uses 90 percent less energy than standard methods.

    While most membranes are made of polymers, Via Separations’ membranes are made with graphene oxide, which can withstand high temperatures and harsh conditions. The membrane is calibrated to the customer’s needs by altering the pore size and tuning the surface chemistry.

    Currently, Dave and her team are focusing on the pulp and paper industry as their beachhead market. They have developed a system that makes the recovery of a substance known as “black liquor” more energy efficient.

    “When a tree becomes paper, only one-third of the biomass is used for the paper. Currently, the most valuable use for the remaining two-thirds not needed for paper is to take it from a pretty dilute stream to a pretty concentrated stream using evaporators, by boiling off the water,” says Dave.

    This black liquor is then burned. Most of the resulting energy is used to power the filtration process.

    “This closed-loop system accounts for an enormous amount of energy consumption in the U.S. We can make that process 84 percent more efficient by putting the ‘pasta strainer’ in front of the boiler,” adds Dave.

    VulcanForms: Additive manufacturing at industrial scale

    The first semester John Hart taught at MIT was a fruitful one. He taught a course on 3D printing, broadly known as additive manufacturing (AM). While it wasn’t his main research focus at the time, he found the topic fascinating. So did many of the students in the class, including Martin Feldmann MEng ’14.

    After graduating with his MEng in advanced manufacturing, Feldmann joined Hart’s research group full time. There, they bonded over their shared interest in AM. They saw an opportunity to innovate with an established metal AM technology, known as laser powder bed fusion, and came up with a concept to realize metal AM at an industrial scale.

    The pair co-founded VulcanForms in 2015.

    “We have developed a machine architecture for metal AM that can build parts with exceptional quality and productivity,” says Hart. “And, we have integrated our machines in a fully digital production system, combining AM, postprocessing, and precision machining.”

    Unlike other companies that sell 3D printers for others to produce parts, VulcanForms makes and sells parts for their customers using their fleet of industrial machines. VulcanForms has grown to nearly 400 employees. Last year, the team opened their first production factory, known as “VulcanOne,” in Devens, Massachusetts.

    The quality and precision with which VulcanForms produces parts is critical for products like medical implants, heat exchangers, and aircraft engines. Their machines can print layers of metal thinner than a human hair.

    “We’re producing components that are difficult, or in some cases impossible to manufacture otherwise,” adds Hart, who sits on the company’s board of directors.

    The technologies developed at VulcanForms may help lead to a more sustainable way to manufacture parts and products, both directly through the additive process and indirectly through more efficient, agile supply chains.

    One way that VulcanForms, and AM in general, promotes sustainability is through material savings.

    Many of the materials VulcanForms uses, such as titanium alloys, require a great deal of energy to produce. When titanium parts are 3D-printed, substantially less of the material is used than in a traditional machining process. This material efficiency is where Hart sees AM making a large impact in terms of energy savings.

    Hart also points out that AM can accelerate innovation in clean energy technologies, ranging from more efficient jet engines to future fusion reactors.

    “Companies seeking to de-risk and scale clean energy technologies require know-how and access to advanced manufacturing capability, and industrial additive manufacturing is transformative in this regard,” Hart adds.

    LiquiGlide: Reducing waste by removing friction

    There is an unlikely culprit when it comes to waste in manufacturing and consumer products: friction. Kripa Varanasi, professor of mechanical engineering, and the team at LiquiGlide are on a mission to create a frictionless future, and substantially reduce waste in the process.

    Founded in 2012 by Varanasi and alum David Smith SM ’11, LiquiGlide designs custom coatings that enable liquids to “glide” on surfaces. Every last drop of a product can be used, whether it’s being squeezed out of a tube of toothpaste or drained from a 500-liter tank at a manufacturing plant. Making containers frictionless substantially minimizes wasted product, and eliminates the need to clean a container before recycling or reusing.

    Since launching, the company has found great success in consumer products. Customer Colgate utilized LiquiGlide’s technologies in the design of the Colgate Elixir toothpaste bottle, which has been honored with several industry awards for design. In a collaboration with world-renowned designer Yves Béhar, LiquiGlide is applying their technology to beauty and personal care product packaging. Meanwhile, the U.S. Food and Drug Administration has granted them a Device Master Filing, opening up opportunities for the technology to be used in medical devices, drug delivery, and biopharmaceuticals.

    In 2016, the company developed a system to make manufacturing containers frictionless. Called CleanTanX, the technology is used to treat the surfaces of tanks, funnels, and hoppers, preventing materials from sticking to the side. The system can reduce material waste by up to 99 percent.

    “This could really change the game. It saves wasted product, reduces wastewater generated from cleaning tanks, and can help make the manufacturing process zero-waste,” says Varanasi, who serves as chair at LiquiGlide.

    LiquiGlide works by creating a coating made of a textured solid and liquid lubricant on the container surface. When applied to a container, the lubricant remains infused within the texture. Capillary forces stabilize the lubricant and allow it to spread across the surface, creating a continuously lubricated surface that any viscous material can slide right down. The company uses a thermodynamic algorithm to determine the combinations of safe solids and liquids depending on the product, whether it’s toothpaste or paint.

    The company has built a robotic spraying system that can treat large vats and tanks at manufacturing plants on site. In addition to saving companies millions of dollars in wasted product, LiquiGlide drastically reduces the amount of water needed to regularly clean these containers, which normally have product stuck to the sides.

    “Normally when you empty everything out of a tank, you still have residue that needs to be cleaned with a tremendous amount of water. In agrochemicals, for example, there are strict regulations about how to deal with the resulting wastewater, which is toxic. All of that can be eliminated with LiquiGlide,” says Varanasi.

    While the closure of many manufacturing facilities early in the pandemic slowed down the rollout of CleanTanX pilots at plants, things have picked up in recent months. As manufacturing ramps up both globally and domestically, Varanasi sees a growing need for LiquiGlide’s technologies, especially for liquids like semiconductor slurry.

    Companies like Gradiant, Via Separations, VulcanForms, and LiquiGlide demonstrate that an expansion in manufacturing industries does not need to come at a steep environmental cost. It is possible for manufacturing to be scaled up in a sustainable way.

    “Manufacturing has always been the backbone of what we do as mechanical engineers. At MIT in particular, there is always a drive to make manufacturing sustainable,” says Evelyn Wang, Ford Professor of Engineering and former head of the Department of Mechanical Engineering. “It’s amazing to see how startups that have an origin in our department are looking at every aspect of the manufacturing process and figuring out how to improve it for the health of our planet.”

    As legislation like the CHIPS and Science Act fuels growth in manufacturing, there will be an increased need for startups and companies that develop solutions to mitigate the environmental impact, bringing us closer to a more sustainable future.

  • in

    MIT scientists contribute to National Ignition Facility fusion milestone

    On Monday, Dec. 5, at around 1 a.m., a tiny sphere of deuterium-tritium fuel surrounded by a cylindrical can of gold called a hohlraum was targeted by 192 lasers at the National Ignition Facility (NIF) at Lawrence Livermore National Laboratory (LLNL) in California. Over the course of billionths of a second, the lasers fired, generating X-rays inside the gold can, and imploding the sphere of fuel.

    On that morning, for the first time ever, the lasers delivered 2.1 megajoules of energy and yielded 3.15 megajoules in return, achieving a historic fusion energy gain well above 1 — a result verified by diagnostic tools developed by the MIT Plasma Science and Fusion Center (PSFC). The use and importance of these tools were referenced by Arthur Pak, an LLNL staff scientist who spoke at a U.S. Department of Energy press event on Dec. 13 announcing the NIF’s success.
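The reported gain is simply the ratio of fusion energy released to laser energy delivered; a quick check of the numbers above (a sketch of the arithmetic only, not NIF's actual analysis pipeline):

```python
def fusion_gain(energy_in_mj: float, energy_out_mj: float) -> float:
    """Target gain Q: fusion energy released per unit of laser energy delivered."""
    return energy_out_mj / energy_in_mj

# Dec. 5 shot: 2.1 MJ of laser energy in, 3.15 MJ of fusion energy out
q = fusion_gain(2.1, 3.15)
print(round(q, 2))  # → 1.5
```

Any value above 1.0 means the target released more fusion energy than the lasers delivered to it, which is what makes this shot a milestone.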

    Johan Frenje, head of the PSFC High-Energy-Density Physics division, notes that this milestone “will have profound implications for laboratory fusion research in general.”

    Since the late 1950s, researchers worldwide have pursued fusion ignition and energy gain in a laboratory, considering it one of the grand challenges of the 21st century. Ignition can only be reached when the internal fusion heating power is high enough to overcome the physical processes that cool the fusion plasma, creating a positive thermodynamic feedback loop that very rapidly increases the plasma temperature. In the case of inertial confinement fusion, the method used at the NIF, ignition can initiate a “fuel burn propagation” into the surrounding dense and cold fuel, and when done correctly, enable fusion-energy gain.

    Frenje and his PSFC division initially designed dozens of diagnostic systems that were implemented at the NIF, including the vitally important magnetic recoil neutron spectrometer (MRS), which measures the neutron energy spectrum, the data from which fusion yield, plasma ion temperature, and spherical fuel pellet compression (“fuel areal density”) can be determined. Overseen by PSFC Research Scientist Maria Gatu Johnson since 2013, the MRS is one of two systems at the NIF relied upon to measure the absolute neutron yield from the Dec. 5 experiment because of its unique ability to accurately interpret an implosion’s neutron signals.

    “Before the announcement of this historic achievement could be made, the LLNL team wanted to wait until Maria had analyzed the MRS data to an adequate level for a fusion yield to be determined,” says Frenje.

    Response around MIT to NIF’s announcement has been enthusiastic and hopeful. “This is the kind of breakthrough that ignites the imagination,” says Vice President for Research Maria Zuber, “reminding us of the wonder of discovery and the possibilities of human ingenuity. Although we have a long, hard path ahead of us before fusion can deliver clean energy to the electrical grid, we should find much reason for optimism in today’s announcement. Innovation in science and technology holds great power and promise to address some of the world’s biggest challenges, including climate change.”

    Frenje also credits the rest of the team at the PSFC’s High-Energy-Density Physics division, the Laboratory for Laser Energetics at the University of Rochester, LLNL, and other collaborators for their support and involvement in this research, as well as the National Nuclear Security Administration of the Department of Energy, which has funded much of their work since the early 1990s. He is also proud of the number of MIT PhDs that have been generated by the High-Energy-Density Physics Division and subsequently hired by LLNL, including the experimental lead for this experiment, Alex Zylstra PhD ’15.

    “This is really a team effort,” says Frenje. “Without the scientific dialogue and the extensive know-how at the HEDP Division, the critical contributions made by the MRS system would not have happened.”

  • in

    Pursuing a practical approach to research

    Koroush Shirvan, the John Clark Hardwick Career Development Professor in the Department of Nuclear Science and Engineering (NSE), knows that the nuclear industry has traditionally been wary of innovations until they are shown to have proven utility. As a result, he has relentlessly focused on practical applications in his research, work that has netted him the 2022 Reactor Technology Award from the American Nuclear Society. “The award has usually recognized practical contributions to the field of reactor design and has not often gone to academia,” Shirvan says.

    One of these “practical contributions” is in the field of accident-tolerant fuels, a program launched by the U.S. Nuclear Regulatory Commission in the wake of the 2011 Fukushima Daiichi incident. The goal within this program, says Shirvan, is to develop new forms of nuclear fuels that can better withstand the extreme heat of accident conditions. His team, with students from over 16 countries, is working on numerous possibilities that range in composition and method of production.

    Another aspect of Shirvan’s research focuses on how radiation impacts heat transfer mechanisms in the reactor. The team found fuel corrosion to be the driving force. “[The research] informs how nuclear fuels perform in the reactor, from a practical point of view,” Shirvan says.

    Optimizing nuclear reactor design

    A summer internship when Shirvan was an undergraduate at the University of Florida at Gainesville seeded his drive to focus on practical applications in his studies. A nearby nuclear utility was losing millions because of crud accumulating on fuel rods. The company had been working around the problem by loading fresh fuel before it had extracted all the life from earlier batches.

    Placement of fuel rods in nuclear reactors is a complex problem with many factors — the life of the fuel, location of hot spots — affecting outcomes. Nuclear reactors reshuffle their fuel every 18-24 months, optimizing against close to 15-20 constraints across the roughly 200-800 assemblies in a core. The mind-boggling scale of the problem means that plants have to rely on experienced engineers.

    During his internship, Shirvan optimized the program used to place fuel rods in the reactor. He found that certain rods in assemblies were more prone to the crud deposits, and reworked their configurations, optimizing for these rods’ performance instead of adding assemblies.

    In recent years, Shirvan has applied a branch of artificial intelligence — reinforcement learning — to the configuration problem and created a software program used by the largest U.S. nuclear utility. “This program gives even a layperson the ability to reconfigure the fuels and the reactor without having expert knowledge,” Shirvan says.
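The reshuffling task can be pictured as a combinatorial search: score a candidate layout against the constraints, then iteratively improve it. The deliberately tiny sketch below uses a 1-D "core" and a single invented adjacency constraint, with plain local search standing in for the reinforcement learning used in the production tool; every name and number here is hypothetical.

```python
import random

def penalty(layout):
    """Count adjacent pairs of hot (1) assemblies; lower is better.

    A stand-in for the real hot-spot and fuel-life constraints.
    """
    return sum(1 for a, b in zip(layout, layout[1:]) if a == 1 and b == 1)

def local_search(layout, steps=200, seed=0):
    """Randomly swap two assembly positions; keep swaps that don't worsen the penalty."""
    rng = random.Random(seed)
    layout = list(layout)
    best = penalty(layout)
    for _ in range(steps):
        i, j = rng.randrange(len(layout)), rng.randrange(len(layout))
        layout[i], layout[j] = layout[j], layout[i]
        p = penalty(layout)
        if p <= best:
            best = p                                  # keep the improvement
        else:
            layout[i], layout[j] = layout[j], layout[i]  # undo a bad swap
    return layout, best

# Start with three hot assemblies clustered together (penalty 2)
layout, score = local_search([1, 1, 1, 0, 0, 0, 1, 0])
print(layout, score)
```

A reinforcement-learning agent replaces the blind random swaps with a learned policy over which moves to try, which matters when the constraint set is as rich as the 15-20 mentioned above.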

    From advanced math to counting jelly beans

    Shirvan’s own expertise in nuclear science and engineering developed quite organically. He grew up in Tehran, Iran, and when he was 14 the family moved to Gainesville, where Shirvan’s aunt and family live. He remembers an awkward couple of years at the new high school where he was grouped in with newly arrived international students, and placed in entry-level classes. “I went from doing advanced mathematics in Iran to counting jelly beans,” he laughs.

    Shirvan applied to the University of Florida for his undergraduate studies since it made economic sense; the school gave full scholarships to Floridian students who received a certain minimum SAT score. Shirvan qualified. His uncle, who was a professor in the nuclear engineering department then, encouraged Shirvan to take classes in the department. Under his uncle’s mentorship, the courses Shirvan took, and his internship, cemented his love of the interdisciplinary approach that the field demanded.

    Having always known that he wanted to teach — he remembers finishing his math tests early in Tehran so he could earn the reward of being class monitor — Shirvan knew graduate school was next. His uncle encouraged him to apply to MIT and to the University of Michigan, home to reputable programs in the field. Shirvan chose MIT because “only at MIT was there a program on nuclear design. There were faculty dedicated to designing new reactors, looking at multiple disciplines, and putting all of that together.” He went on to pursue his master’s and doctoral studies at NSE under the supervision of Professor Mujid Kazimi, focusing on compact pressurized and boiling water reactor designs. When Kazimi passed away suddenly in 2015, Shirvan, then a research scientist, switched to the tenure track to guide the late professor’s team.

    Another project that Shirvan took on in 2015: leadership of MIT’s course on nuclear reactor technology for utility executives. Offered only by the Institute, the program is an introduction to nuclear engineering and safety for personnel who might not have much background in the area. “It’s a great course because you get to see what the real problems are in the energy sector … like grid stability,” Shirvan says.

    A multipronged approach to savings

    Another very real problem nuclear utilities face is cost. Contrary to what one hears on the news, one of the biggest stumbling blocks to building new nuclear facilities in the United States is cost, which today can be up to three times that of renewables, Shirvan says. While many approaches such as advanced manufacturing have been tried, Shirvan believes that the solution to decrease expenditures lies in designing more compact reactors.

    His team has developed an open-source advanced nuclear cost tool and has focused on two different designs: a small water reactor using compact steam technology and a horizontal gas reactor. Compactness also means making fuels more efficient, as Shirvan’s work does, and improving the heat exchange device. It’s all about getting back to basics and bringing “commercial[ly] viable arguments in with your research,” Shirvan explains.

    Shirvan is excited about the future of the U.S. nuclear industry, and about the fact that the 2022 Inflation Reduction Act grants the same subsidies to nuclear as it does to renewables. On this newly leveled playing field, advanced nuclear still has a long way to go in terms of affordability, he admits. “It’s time to push forward with cost-effective design,” Shirvan says. “I look forward to supporting this by continuing to guide these efforts with research from my team.”

  • in

    Decarbonization amid global crises

    A global pandemic. Russia’s invasion of Ukraine. Inflation. The first-ever serious challenge to the peaceful transfer of power in the United States.

    Forced to face a seemingly unending series of once-in-a-generation crises, how can the world continue to focus attention on goals around carbon emissions and climate change? That was the question posed by Philip R. Sharp, the former president of Resources for the Future and a former 10-term member of the U.S. House of Representatives from Indiana, during his MIT Energy Initiative Fall Colloquium address, entitled “The prospects for decarbonization in America: Will global and domestic crises disrupt our plans?”

    Perhaps surprisingly, Sharp sounded an optimistic note in his answer. Despite deep political divisions in the United States, he noted, Congress has passed five major pieces of legislation — under both presidents Donald Trump and Joseph Biden — aimed at accelerating decarbonization efforts. Rather than hampering movement to combat climate change, Sharp said, domestic and global crises have seemed to galvanize support, create new incentives for action, and even unify political rivals around the cause.

    “Almost everybody is dealing with, to some degree, the absolutely profound, churning events that we are amidst now. Most of them are unexpected, and therefore [we’re] not prepared for [them], and they have had a profound shaking of our thinking,” Sharp said. “The conventional wisdom has not held up in almost all of these areas, and therefore it makes it much more difficult for us to think we know how to predict an uncertain future, and [it causes us to] question our own ability as a nation — or anywhere — to actually take on these challenges. And obviously, climate change is one of the most important.”

    However, Sharp continued, these challenges have, in some instances, spurred action. The war in Ukraine, he noted, has upset European energy markets, but it has also highlighted the importance of countries achieving a more energy-independent posture through renewables. “In America,” he added, “we’ve actually seen absolutely stunning … behavior by the United States Congress, of all places.”

    “What we’ve witnessed is, [Congress] put out incredible … sums of money under the previous administration, and then under this administration, to deal with the Covid crisis,” Sharp added later in his talk. “And then the United States government came together — red and blue — to support the Ukrainians against Russia. It saddens me to say, it seems to take a Russian invasion or the Chinese probing us economically to get us moving. But we are moving, and these things are happening.”

    Congressional action

    Sharp cautioned against getting “caught up” in the familiar viewpoint that Congress, in its current incarnation, is fundamentally incapable of passing meaningful legislation. He pointed, in particular, to the passage of five laws over the previous two years:

    The 2020 Energy Act, which has been characterized as a “down payment on fighting climate change”;
    The Infrastructure Investment and Jobs Act (sometimes called the “bipartisan infrastructure bill”), which calls for investments in passenger rail, electric vehicle infrastructure, electric school buses, and other clean-energy measures;
    The CHIPS and Science Act, a $280 billion effort to revitalize the American semiconductor industry, which some analysts say could direct roughly one-quarter of its funding toward accelerating zero-carbon industries and conducting climate research;
    The Inflation Reduction Act (called by some “the largest climate legislation in U.S. history”), which includes tax credits, incentives, and other provisions to help private companies tackle climate change, increase investments in renewable energy, and enhance energy efficiency; and
    The Kigali Amendment to the Montreal Protocol, ratified by the Senate to little fanfare in September, under which the United States agreed to reduce the consumption and production of hydrofluorocarbons (HFCs).

    “It is a big deal,” Sharp said of the dramatic increase in federal climate action. “It is very significant actions that are being taken — more than what we would expect, or I would expect, out of the Congress at any one time.”

    Along with the many billions of dollars of climate-related investments included in the legislation, Sharp said, these new laws will have a number of positive “spillover” effects.

    “This enables state governments, in their policies, to be more aggressive,” Sharp said. “Why? Because it makes it cheaper for some of the investments that they will try to force within their state.” Another “pretty obvious” spillover effect, Sharp said, is that the new laws will enhance U.S. credibility in international negotiations. Finally, he said, these public investments will make the U.S. economy more competitive in international markets for clean-energy technology — particularly as the United States seeks to compete against China in the space.

    “[Competition with China] has become a motivator in American politics, like it or not,” Sharp said. “There is no question that it is causing and bringing together [politicians] across blue [states] and red [states].”

    Holding onto progress

    Even in an uncertain political climate in which Democrats and Republicans seem unable to agree on basic facts, recent funding commitments are likely to survive, no matter which party controls Congress and the presidency, Sharp said. That’s because most of the legislation relies on broadly popular “carrots” that reward investments in decarbonization, rather than less popular “sticks” that create new restrictions or punishments for companies that fail to decarbonize.

    “Politically, the impact of this is very significant,” Sharp said. “It is so much easier in politics to give away tax [credits] than it is to penalize or put requirements onto people. The fact is that these tax credits are more likely to be politically sustained than other forms of government intervention. That, at least, has been the history.”

    Sharp stressed the importance of what he called “civil society” — institutions such as universities, nonprofits, churches, and other organizations that are apart from government and business — in promoting decarbonization efforts. “[Those groups] can act highly independently, and therefore, they can drive for things that others are not willing to do. Now this does not always work to good purposes. Partly, this diversity and this decentralization in civil society … led to deniers and others being able to stop some climate action. But now my view is, this is starting to all move in the right direction, in a very dynamic and a very important way. What we have seen over the last few years is a big uptick in philanthropy related to climate.”

    Looking ahead

    Sharp’s optimism even extended to the role of social media. He suggested that the “Wild West” era of social platforms may be ending, pointing to the celebrities who have recently lost valuable business partnerships for spreading hate speech and disinformation. “We’re now a lot more alert to the dangers,” he said.

    Some in the audience questioned Sharp about specific paths toward decarbonization, but Sharp said that progress will require a number of disparate approaches — some of which will inevitably have a greater impact than others. “The current policy, and the policy embedded in this [new] legislation … is all about doing both,” he said. “It’s all about advancing [current] technologies into the marketplace, and at the same time driving for breakthroughs.”

    Above all, Sharp stressed the need for continued collective action around climate change. “The fact is, we’re all contributors to some degree,” he said. “But we also all can do something. In my view, this is clearly not a time for hand-wringing. This is a time for action. People have to roll up their sleeves, and go to work, and not roll them down anytime soon.”

  • in

    A healthy wind

    Nearly 10 percent of today’s electricity in the United States comes from wind power. The renewable energy source benefits climate, air quality, and public health by displacing emissions of greenhouse gases and air pollutants that would otherwise be produced by fossil-fuel-based power plants.

    A new MIT study finds that the health benefits associated with wind power could more than quadruple if operators prioritized turning down output from the most polluting fossil-fuel-based power plants when energy from wind is available.

    In the study, published today in Science Advances, researchers analyzed the hourly activity of wind turbines, as well as the reported emissions from every fossil-fuel-based power plant in the country, between the years 2011 and 2017. They traced emissions across the country and mapped the pollutants to affected demographic populations. They then calculated the regional air quality and associated health costs to each community.

    The researchers found that in 2014, wind power that was associated with state-level policies improved air quality overall, resulting in $2 billion in health benefits across the country. However, only roughly 30 percent of these health benefits reached disadvantaged communities.

    The team further found that if the electricity industry were to reduce the output of the most polluting fossil-fuel-based power plants, rather than the most cost-saving plants, in times of wind-generated power, the overall health benefits could more than quadruple to $8.4 billion nationwide. However, the results would have a similar demographic breakdown.

    “We found that prioritizing health is a great way to maximize benefits in a widespread way across the U.S., which is a very positive thing. But it suggests it’s not going to address disparities,” says study co-author Noelle Selin, a professor in the Institute for Data, Systems, and Society and the Department of Earth, Atmospheric and Planetary Sciences at MIT. “In order to address air pollution disparities, you can’t just focus on the electricity sector or renewables and count on the overall air pollution benefits addressing these real and persistent racial and ethnic disparities. You’ll need to look at other air pollution sources, as well as the underlying systemic factors that determine where plants are sited and where people live.”

    Selin’s co-authors are lead author and former MIT graduate student Minghao Qiu PhD ’21, now at Stanford University, and Corwin Zigler at the University of Texas at Austin.

    Turn-down service

    In their new study, the team looked for patterns between periods of wind power generation and the activity of fossil-fuel-based power plants, to see how regional electricity markets adjusted the output of power plants in response to influxes of renewable energy.

    “One of the technical challenges, and the contribution of this work, is trying to identify which are the power plants that respond to this increasing wind power,” Qiu notes.

    To do so, the researchers compared two historical datasets from the period between 2011 and 2017: an hour-by-hour record of energy output of wind turbines across the country, and a detailed record of emissions measurements from every fossil-fuel-based power plant in the U.S. The datasets covered seven major regional electricity markets, each providing energy to one or more states.

    “California and New York are each their own market, whereas the New England market covers around seven states, and the Midwest covers more,” Qiu explains. “We also cover about 95 percent of all the wind power in the U.S.”

    In general, they observed that, in times when wind power was available, markets adjusted by essentially scaling back the power output of natural gas and sub-bituminous coal-fired power plants. They noted that the plants that were turned down were likely chosen for cost-saving reasons, as certain plants were less costly to turn down than others.

    The team then used a sophisticated atmospheric chemistry model to simulate the wind patterns and chemical transport of emissions across the country, and determined where and at what concentrations the emissions generated fine particulates and ozone — two pollutants that are known to damage air quality and human health. Finally, the researchers mapped the general demographic populations across the country, based on U.S. census data, and applied a standard epidemiological approach to calculate a population’s health cost as a result of their pollution exposure.

    This analysis revealed that, in the year 2014, a general cost-saving approach to displacing fossil-fuel-based energy in times of wind energy resulted in $2 billion in health benefits, or savings, across the country. A smaller share of these benefits went to disadvantaged populations, such as communities of color and low-income communities, though this disparity varied by state.

    “It’s a more complex story than we initially thought,” Qiu says. “Certain population groups are exposed to a higher level of air pollution, and those would be low-income people and racial minority groups. What we see is, developing wind power could reduce this gap in certain states but further increase it in other states, depending on which fossil-fuel plants are displaced.”

    Tweaking power

    The researchers then examined how the pattern of emissions and the associated health benefits would change if they prioritized turning down different fossil-fuel-based plants in times of wind-generated power. They tweaked the emissions data to reflect several alternative scenarios: one in which the most health-damaging, polluting power plants are turned down first; and two others in which the plants producing the most sulfur dioxide or the most carbon dioxide, respectively, are the first to reduce their output.

    They found that while each scenario increased health benefits overall, and the first scenario in particular could quadruple health benefits, the original disparity persisted: Communities of color and low-income communities still experienced smaller health benefits than more well-off communities.

    “We got to the end of the road and said, there’s no way we can address this disparity by being smarter in deciding which plants to displace,” Selin says.
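The difference between the dispatch orderings discussed above can be illustrated with a toy calculation. All plant names, capacities, and dollar figures below are invented for illustration; the actual study used plant-level emissions records and atmospheric chemistry modeling rather than fixed per-MWh damage values.

```python
plants = [
    # (name, turn-down cost $/MWh, health damage $/MWh, capacity MWh)
    ("gas_A",  10,  5, 100),
    ("coal_B", 15, 40, 100),
    ("coal_C", 12, 25, 100),
]

def displaced_benefit(plants, wind_mwh, key):
    """Turn plants down in the given order; return total avoided health damage."""
    benefit, remaining = 0.0, wind_mwh
    for name, cost, damage, capacity in sorted(plants, key=key):
        take = min(capacity, remaining)   # displace as much as this plant allows
        benefit += take * damage          # avoided damage from the displaced MWh
        remaining -= take
        if remaining <= 0:
            break
    return benefit

# 150 MWh of wind displaces fossil generation under two orderings
cheapest_first = displaced_benefit(plants, 150, key=lambda p: p[1])
dirtiest_first = displaced_benefit(plants, 150, key=lambda p: -p[2])
print(cheapest_first, dirtiest_first)  # → 1750.0 5250.0
```

In this made-up example, turning down the dirtiest plants first triples the avoided health damage for the same amount of wind energy, mirroring the study's finding that a health-prioritized ordering multiplies benefits without changing who receives them.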

    Nevertheless, the study can help identify ways to improve the health of the general population, says Julian Marshall, a professor of environmental engineering at the University of Washington.

    “The detailed information provided by the scenarios in this paper can offer a roadmap to electricity-grid operators and to state air-quality regulators regarding which power plants are highly damaging to human health and also are likely to noticeably reduce emissions if wind-generated electricity increases,” says Marshall, who was not involved in the study.

    “One of the things that makes me optimistic about this area is, there’s a lot more attention to environmental justice and equity issues,” Selin concludes. “Our role is to figure out the strategies that are most impactful in addressing those challenges.”

    This work was supported, in part, by the U.S. Environmental Protection Agency, and by the National Institutes of Health.

  • in

    Mining for the clean energy transition

    In a world powered increasingly by clean energy, drilling for oil and gas will gradually give way to digging for metals and minerals. Today, the “critical minerals” used to make electric cars, solar panels, wind turbines, and grid-scale battery storage are facing soaring demand — and some acute bottlenecks as miners race to catch up.

    According to a report from the International Energy Agency, by 2040, the worldwide demand for copper is expected to roughly double; demand for nickel and cobalt will grow at least sixfold; and the world’s hunger for lithium could reach 40 times what we use today.

    “Society is looking to the clean energy transition as a way to solve the environmental and social harms of climate change,” says Scott Odell, a visiting scientist at the MIT Environmental Solutions Initiative (ESI), where he helps run the ESI Mining, Environment, and Society Program, and a visiting assistant professor at George Washington University. “Yet mining the materials needed for that transition would also cause social and environmental impacts. So we need to look for ways to reduce our demand for minerals, while also improving current mining practices to minimize social and environmental impacts.”

    ESI recently hosted the inaugural MIT Conference on Mining, Environment, and Society to discuss how the clean energy transition may affect mining and the people and environments in mining areas. The conference convened representatives of mining companies, environmental and human rights groups, policymakers, and social and natural scientists to identify key concerns and possible collaborative solutions.

    “We can’t replace an abusive fossil fuel industry with an abusive mining industry that expands as we move through the energy transition,” said Jim Wormington, a senior researcher at Human Rights Watch, in a panel on the first day of the conference. “There’s a recognition from governments, civil society, and companies that this transition potentially has a really significant human rights and social cost, both in terms of emissions […] but also for communities and workers who are on the front lines of mining.”

    That focus on communities and workers was consistent throughout the three-day conference, as participants outlined the economic and social dimensions of standing up large numbers of new mines. Corporate mines can bring large influxes of government revenue and local investment, but that income is volatile and can leave policymakers and communities stranded when production declines or mineral prices fall. “Artisanal” mining operations, meanwhile, are an important source of critical minerals, but they are hard to regulate and their workers are subject to abuses from brokers. And large mineral reserves lie in conservation areas, in regions with fragile ecosystems or water shortages that mining can exacerbate, and on Indigenous-controlled lands and other places where opening a mine is deeply fraught.

    “One of the real triggers of conflict is a dissatisfaction with the current model of resource extraction,” said Jocelyn Fraser of the University of British Columbia in a panel discussion. “One that’s failed to support the long-term sustainable development of regions that host mining operations, and yet imposes significant local social and environmental impacts.”

    All these challenges point toward solutions in policy and in mining companies’ relationships with the communities where they work. Participants highlighted newer models of mining governance that can create better incentives for the ways mines operate — from full community ownership of mines to recognizing community rights to the benefits of mining to end-of-life planning for mines at the time they open.

    Many of the conference speakers also shared technological innovations that may help reduce mining challenges. Some operations are investing in desalination as alternative water sources in water-scarce regions; low-carbon alternatives are emerging to many of the fossil fuel-powered heavy machines that are mainstays of the industry; and work is being done to reclaim valuable minerals from mine tailings, helping to minimize both waste and the need to open new extraction sites.

    Increasingly, the mining industry itself is recognizing that reforms will allow it to thrive in a rapid clean-energy transition. “Decarbonization is really a profitability imperative,” said Kareemah Mohammed, managing director for sustainability services at the technology consultancy Accenture, on the conference’s second day. “It’s about securing a low-cost and steady supply of either minerals or metals, but it’s also doing so in an optimal way.”

    The three-day conference attracted over 350 attendees, from large mining companies, industry groups, consultancies, multilateral institutions, universities, nongovernmental organizations (NGOs), government, and more. It was held entirely virtually, a choice that helped make the conference not only truly international — participants joined from over 27 countries on six continents — but also accessible to members of nonprofits and professionals in the developing world.

    “Many people are concerned about the environmental and social challenges of supplying the clean energy revolution, and we’d heard repeatedly that there wasn’t a forum for government, industry, academia, NGOs, and communities to all sit at the same table and explore collaborative solutions,” says Christopher Noble, ESI’s director of corporate engagement. “Convening, and researching best practices, are roles that universities can play. The conversations at this conference have generated valuable ideas and consensus to pursue three parallel programs: best-in-class models for community engagement, improving ESG metrics and their use, and civil-society contributions to government/industry relations. We are developing these programs to keep the momentum going.”

    The MIT Conference on Mining, Environment, and Society was funded, in part, by Accenture, as part of the MIT/Accenture Convergence Initiative. Additional funding was provided by the Inter-American Development Bank.