More stories

  • Engaging enterprises with the climate crisis

    Almost every large corporation is committed to achieving net zero carbon emissions by 2050 but lacks a roadmap to get there, says John Sterman, professor of management at MIT’s Sloan School of Management, co-director of the MIT Sloan Sustainability Initiative, and leader of its Climate Pathways Project. Sterman and colleagues offer a suite of well-honed strategies to smooth this journey, including a free global climate policy simulator called En-ROADS deployed in workshops that have educated more than 230,000 people, including thousands of senior elected officials and leaders in business and civil society around the world. 

    Running on ordinary laptops, En-ROADS examines how we can reduce carbon emissions to keep global warming under 2 degrees Celsius, Sterman says. Users, expert or not, can easily explore how dozens of policies, such as pricing carbon and electrifying vehicles, can affect hundreds of factors such as temperature, energy prices, and sea level rise. 

    En-ROADS and related work on climate change are just one thread in Sterman’s decades of research to integrate environmental sustainability with business decisions. 

    “There’s a fundamental alignment between a healthy environment, a healthy society, and a healthy economy,” he says. “Destroy the environment and you destroy the economy and society. Likewise, hungry, ill-housed, insecure people, lacking decent jobs and equity in opportunity, will catch the last fish and cut the last tree, destroying the environment and society. Unfortunately, a lot of businesses still see the issue as a trade-off — if we focus on the environment, it will hurt our bottom line; if we improve working conditions, it will raise our labor costs. That turns out not to be true in many, many cases. But how can we help people understand that fundamental alignment? That’s where simulation models can play a big role.”

    Learning with management flight simulators 

    “My original field is system dynamics, a method for understanding the complex systems in which we’re embedded — whether those are organizations, companies, markets, society as a whole, or the climate system,” Sterman says. “You can build these wonderful, complex simulation models that offer important insights into high-leverage policies so that organizations can make significant improvements.” 

    “But those models don’t do any good at all unless the folks in those organizations can learn for themselves about what those high-leverage opportunities are,” he emphasizes. “You can show people the best scientific evidence, the best data, and it’s not necessarily going to change their minds about what they ought to be doing. You’ve got to create a process that helps smart but busy people learn how they can improve their organizations.” 

    Sterman and his colleagues pioneered management flight simulators — which, like aircraft flight simulators, offer an environment in which you can make decisions, seeing what works and what doesn’t, at low cost with no risk. 

    “People learn best from experience and experiment,” he points out. “But in many of the most important settings that we face today, experience comes too late to be useful, and experiments are impossible. In such settings, simulation becomes the only way people can learn for themselves and gain the confidence to change their behavior in the real world.” 

    “You can’t learn to fly a new jetliner by watching someone else; to learn, one must be at the controls,” Sterman emphasizes. “People don’t change deeply embedded beliefs and behaviors just because somebody tells them that what they’re doing is harmful and there are better options. People have to learn for themselves.”

    Learning the business of sustainability 

    His longstanding “laboratory for sustainable business” course lets MIT Sloan School students learn the state of the art in sustainability challenges — not just climate change but microplastics, water shortages, toxins in our food and air, and other crises. As part of the course, students work in teams with organizations on real sustainability challenges. “We’ve had a very wide range of companies and other organizations participate, and many of them come back year after year,” Sterman says. 

    MIT Sloan also offers executive education in sustainability, in both open enrollment and customized programs. “We’ve had all kinds of folks, from all over the world and every industry,” he says. 

    In his opening class for executive MBAs, he polls attendees to ask if sustainability is a material issue for their companies, and how actively those companies are addressing that issue. Almost all of the attendees agree that sustainability is a key issue, but nearly all say their companies are not doing enough, with many saying they “comply with all applicable laws and regulations.” 

    “So there’s a huge disconnect,” Sterman points out. “How do you close that gap? How do you take action? How do you break the idea that if you take action to be more sustainable it will hurt your business, when in fact it’s almost always the other way around? And then how can you make the change happen, so that what you’re doing will get implemented and stick?” 

    Simulating policies for sustainability 

    Management flight simulators that offer active learning can provide crucial guidance. In the case of climate change, En-ROADS presents a straightforward interface that lets users adjust sliders to experiment with actions to try to bring down carbon emissions. “Should we have a price on carbon?” Sterman asks. “Should we promote renewables? Should we work on methane? Stop deforestation? You can try anything you want. You get immediate feedback on the likely consequences of your decisions. Often people are surprised when favorite policies — say, planting trees — have only a minor impact on global warming (in the case of trees, because they take so long to grow).”
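
    To give a flavor of the kind of model behind such a simulator, here is a minimal sketch: a handful of policy sliders scale down an assumed emissions baseline, and cumulative emissions map to warming through a transient-response coefficient. Every number in it is an invented placeholder, not an En-ROADS parameter; the real model is a far richer system dynamics simulation.

```python
# Illustrative toy only, in the spirit of a climate policy simulator.
# All parameter values below are invented placeholders.

TCRE = 0.45  # rough deg C of warming per 1000 Gt of cumulative CO2 emissions

def project_warming(carbon_price=0.0, ev_share=0.0, deforestation_cut=0.0,
                    start=2025, end=2100, baseline_gt=40.0):
    """End-of-century warming (deg C) for a given policy mix."""
    n_years = end - start + 1
    cumulative = 0.0
    for i in range(n_years):
        ramp = (i + 1) / n_years                          # policies phase in linearly
        e = baseline_gt
        e *= 1.0 - min(0.003 * carbon_price, 0.5) * ramp  # assumed carbon-price response
        e -= 6.0 * ev_share * ramp                        # assumed transport electrification effect
        e -= 5.0 * deforestation_cut * ramp               # assumed avoided land-use emissions
        cumulative += max(e, 0.0)
    return 1.3 + TCRE * cumulative / 1000.0               # ~1.3 C of warming already realized

print(f"no action:  {project_warming():.2f} C by 2100")
print(f"policy mix: {project_warming(carbon_price=100, ev_share=0.8, deforestation_cut=0.9):.2f} C by 2100")
```

    Even in this toy, moving one slider at a time shows why single policies disappoint: no single lever bends the curve much on its own, which mirrors the surprise workshop participants often report.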

    One En-ROADS alumnus works for a pharmaceutical company that set a target of net-zero emissions by mid-century. But, as so often happens, measures proposed at the senior corporate level were resisted by the operating units. The alumnus attacked the problem by bringing workshops with simulations and other sustainability tools to front-line employees in a manufacturing plant he knew well. He asked these employees how they thought they could reduce carbon emissions and what they needed to do so. 

    “It turns out that they had a long list of opportunities to reduce the emissions from this plant,” Sterman says. “But they didn’t have any support to get it done. He helped their ideas get that support, get the resources, come up with ways to monitor their progress, and ways to look for quick wins. It’s been highly successful.” 

    En-ROADS helps people understand that process improvement activity takes resources; you might need to take some equipment offline temporarily, for example, to upgrade or improve it. “There’s a little bit of a worse-before-better trade-off,” he says. “You need to be prepared. The active learning, the use of the simulators, helps people prepare for that journey and overcome the barriers that they will face.” 

    Interactive workshops with En-ROADS and other sustainability tools also brought change to another large corporation, HSBC Bank U.S.A. Like many other financial institutions, HSBC has committed to significantly cut its emissions, but many employees and executives didn’t understand why or what that would entail. For instance, would the bank give up potential business in carbon-intensive industries? 

    Delivered to more than 1,000 employees, the En-ROADS workshops let participants surface concerns about remaining successful while addressing climate goals. “It turns out in many cases, there isn’t that much of a trade-off,” Sterman remarks. “Fossil energy projects, for example, are extremely risky. And there are opportunities to improve margins in other businesses where you can help cut their carbon footprint.” 

    The free version of En-ROADS generally satisfies the needs of most organizations, but Sterman and his partners also can augment the model or develop customized workshops to address specific concerns. 

    People who take the workshops emerge with a greater understanding of climate change and its effects, and a deeper knowledge of the high-leverage opportunities to cut emissions. “Even more importantly, they come out with a greater sense of urgency,” he says. “But they also come out with an understanding that it’s not too late. Time is short, but what we do can still make a difference.”

  • Improving health outcomes by targeting climate and air pollution simultaneously

    Climate policies are typically designed to reduce the greenhouse gas emissions that result from human activities and drive climate change. The largest source of these emissions is the combustion of fossil fuels, which increases atmospheric concentrations of ozone, fine particulate matter (PM2.5), and other air pollutants that pose public health risks. While climate policies may lower concentrations of health-damaging air pollutants as a “co-benefit” of reducing emissions-intensive activities, they are most effective at improving health outcomes when deployed in tandem with geographically targeted air-quality regulations.

    Yet the computer models typically used to assess the likely air quality and health impacts of proposed climate/air-quality policy combinations come with drawbacks for decision-makers. Atmospheric chemistry/climate models can produce high-resolution results, but they are expensive and time-consuming to run. Integrated assessment models produce results in far less time and at far lower cost, but only at global and regional scales, rendering them too coarse to accurately assess air quality and health impacts at the subnational level.

    To overcome these drawbacks, a team of researchers at MIT and the University of California at Davis has developed a climate/air-quality policy assessment tool that is both computationally efficient and location-specific. Described in a new study in the journal ACS Environmental Au, the tool could enable users to obtain rapid estimates of combined policy impacts on air quality/health at more than 1,500 locations around the globe — estimates precise enough to reveal the equity implications of proposed policy combinations within a particular region.
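
    One generic way to make a tool that fast is to precompute, with a full chemistry model, how pollution at each receptor location responds to each emitting sector, and then reduce every policy query to a matrix multiply. The sketch below shows only that reduced-form pattern; the sensitivities, sectors, and locations are invented, and the study’s actual method may differ.

```python
# Generic reduced-form pattern for rapid air-quality policy screening.
# The sensitivity matrix would come from many runs of a full chemistry/climate
# model; the numbers here are invented placeholders.
import numpy as np

# d(PM2.5)/d(emissions): rows are receptor locations, columns are emitting
# sectors (units: ug/m3 of PM2.5 per Mt of sector emissions).
sensitivity = np.array([
    [0.8, 0.3, 0.1],   # city center
    [0.4, 0.5, 0.2],   # suburb
    [0.1, 0.2, 0.6],   # downwind rural area
])

baseline_emissions = np.array([10.0, 8.0, 5.0])  # Mt/yr by sector
policy_emissions   = np.array([ 6.0, 7.0, 2.0])  # Mt/yr after a policy package

delta_pm25 = sensitivity @ (policy_emissions - baseline_emissions)
for place, d in zip(["city center", "suburb", "rural"], delta_pm25):
    print(f"{place}: {d:+.2f} ug/m3 PM2.5")
```

    Because the expensive chemistry is baked into the matrix ahead of time, location-by-location results arrive in milliseconds, which is what makes equity comparisons within a region practical.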

    “The modeling approach described in this study may ultimately allow decision-makers to assess the efficacy of multiple combinations of climate and air-quality policies in reducing the health impacts of air pollution, and to design more effective policies,” says Sebastian Eastham, the study’s lead author and a principal research scientist at the MIT Joint Program on the Science and Policy of Global Change. “It may also be used to determine if a given policy combination would result in equitable health outcomes across a geographical area of interest.”

    To demonstrate the efficiency and accuracy of their policy assessment tool, the researchers showed that outcomes projected by the tool within seconds were consistent with region-specific results from detailed chemistry/climate models that took days or even months to run. While continuing to refine and develop their approaches, they are now working to embed the new tool into integrated assessment models for direct use by policymakers.

    “As decision-makers implement climate policies in the context of other sustainability challenges like air pollution, efficient modeling tools are important for assessment — and new computational techniques allow us to build faster and more accurate tools to provide credible, relevant information to a broader range of users,” says Noelle Selin, a professor at MIT’s Institute for Data, Systems and Society and Department of Earth, Atmospheric and Planetary Sciences, and supervising author of the study. “We are looking forward to further developing such approaches, and to working with stakeholders to ensure that they provide timely, targeted and useful assessments.”

    The study was funded, in part, by the U.S. Environmental Protection Agency and the Biogen Foundation.

  • Study: Carbon-neutral pavements are possible by 2050, but rapid policy and industry action are needed

    Almost 2.8 million lane-miles, or about 4.6 million lane-kilometers, of roads in the United States are paved.

    Roads and streets form the backbone of our built environment. They take us to work or school, take goods to their destinations, and much more.

    However, a new study by MIT Concrete Sustainability Hub (CSHub) researchers shows that the annual greenhouse gas (GHG) emissions of all construction materials used in the U.S. pavement network are 11.9 to 13.3 megatons. This is equivalent to the emissions of a gasoline-powered passenger vehicle driving about 30 billion miles in a year.

    As roads are built, repaved, and expanded, new approaches and thoughtful material choices are necessary to dampen their carbon footprint. 

    The CSHub researchers found that, by 2050, mixtures for pavements can be made carbon-neutral if industry and governmental actors help to apply a range of solutions — like carbon capture — to reduce, avoid, and neutralize embodied impacts. (A neutralization solution is any compensation mechanism in the value chain of a product that permanently removes the global warming impact of the processes after avoiding and reducing the emissions.) Furthermore, nearly half of pavement-related greenhouse gas (GHG) savings can be achieved in the short term with a negative or nearly net-zero cost.

    The research team, led by Hessam AzariJafari, MIT CSHub’s deputy director, closed gaps in our understanding of the impacts of pavement decisions by developing a dynamic model quantifying the embodied impact of future pavement materials demand for the U.S. road network. 

    The team first split the U.S. road network into 10-mile (about 16 kilometer) segments, forecasting the condition and performance of each. They then developed a pavement management system model to create benchmarks helping to understand the current level of emissions and the efficacy of different decarbonization strategies. 

    This model considered factors such as annual traffic volume and surface conditions, budget constraints, regional variation in pavement treatment choices, and pavement deterioration. The researchers also used a life-cycle assessment to calculate annual state-level emissions from acquiring pavement construction materials, considering future energy supply and materials procurement.
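
    In miniature, that accounting attaches an embodied-emissions factor to each road segment’s treatment cycle and discounts it by a scenario’s improvement rate, as in the toy sketch below. The factors and example network are invented placeholders, not CSHub data; only the scenario reduction percentages echo figures reported later in this article.

```python
# Toy rollup of embodied pavement emissions under decarbonization scenarios.
# Emissions factors and the example network are invented placeholders.
from dataclasses import dataclass

@dataclass
class Segment:
    lane_miles: float
    material: str          # "asphalt" or "concrete"
    repave_interval: int   # years between resurfacing treatments

# Placeholder embodied emissions, tons CO2e per lane-mile repaved.
FACTORS = {"asphalt": 60.0, "concrete": 90.0}

# Fractional reduction of embodied impact by 2050 under each scenario
# (the projected values loosely pattern the study's reported 14%/38%).
SCENARIOS = {
    "business_as_usual": {"asphalt": 0.00, "concrete": 0.00},
    "projected":         {"asphalt": 0.14, "concrete": 0.38},
}

def annual_emissions(network, scenario):
    total = 0.0
    for seg in network:
        per_repave = FACTORS[seg.material] * seg.lane_miles
        reduced = per_repave * (1 - SCENARIOS[scenario][seg.material])
        total += reduced / seg.repave_interval  # annualize over the treatment cycle
    return total

network = [Segment(10, "asphalt", 12), Segment(10, "concrete", 25)]
for name in SCENARIOS:
    print(f"{name}: {annual_emissions(network, name):.1f} t CO2e/yr")
```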

    The team considered three scenarios for the U.S. pavement network: A business-as-usual scenario in which technology remains static, a projected improvement scenario aligned with stated industry and national goals, and an ambitious improvement scenario that intensifies or accelerates projected strategies to achieve carbon neutrality. 

    If no steps are taken to decarbonize pavement mixtures, the team projected that GHG emissions of construction materials used in the U.S. pavement network would increase by 19.5 percent by 2050. Under the projected scenario, there was an estimated 38 percent embodied impact reduction for concrete and 14 percent embodied impact reduction for asphalt by 2050.

    The keys to making the pavement network carbon neutral by 2050 lie in multiple places. Fully renewable energy sources should be used for pavement materials production, transportation, and other processes. The federal government must contribute to the development of these low-carbon energy sources and carbon capture technologies, as it would be nearly impossible to achieve carbon neutrality for pavements without them. 

    Additionally, increasing pavements’ recycled content and improving their design and production efficiency can lower GHG emissions to an extent. Still, neutralization is needed to achieve carbon neutrality.

    Making the right pavement construction and repair choices would also contribute to the carbon neutrality of the network. For instance, concrete pavements can offer GHG savings across the whole life cycle as they are stiffer and stay smoother for longer, meaning they require less maintenance and have a lesser impact on the fuel efficiency of vehicles. 

    Concrete pavements have other use-phase benefits, including a cooling effect through an intrinsically high albedo, meaning they reflect more sunlight than regular pavements. They can therefore help combat extreme heat and, by sending more solar energy back to space, exert a cooling influence on the earth’s energy balance, making albedo a potential neutralization mechanism.

    At the same time, a mix of fixes, including using concrete and asphalt in different contexts and proportions, could produce significant GHG savings for the pavement network; decision-makers must consider scenarios on a case-by-case basis to identify optimal solutions. 

    In addition, it may appear as though the GHG emissions of materials used in local roads are dwarfed by the emissions of interstate highway materials. However, the study found that the two road types have a similar impact. In fact, all road types contribute heavily to the total GHG emissions of pavement materials in general. Therefore, stakeholders at the federal, state, and local levels must be involved if our roads are to become carbon neutral. 

    The path to pavement network carbon-neutrality is, therefore, somewhat of a winding road. It demands regionally specific policies and widespread investment to help implement decarbonization solutions, just as renewable energy initiatives have been supported. Providing subsidies and covering the costs of premiums, too, are vital to avoid shifts in the market that would derail environmental savings.

    When planning for these shifts, we must recall that pavements have impacts not just in their production, but across their entire life cycle. As pavements are used, maintained, and eventually decommissioned, they have significant impacts on the surrounding environment.

    If we are to meet climate goals such as the Paris Agreement, which demands that we reach carbon-neutrality by 2050 to avoid the worst impacts of climate change, we — as well as industry and governmental stakeholders — must come together to take a hard look at the roads we use every day and work to reduce their life cycle emissions. 

    The study was published in the International Journal of Life Cycle Assessment. In addition to AzariJafari, the authors include Fengdi Guo of the MIT Department of Civil and Environmental Engineering; Jeremy Gregory, executive director of the MIT Climate and Sustainability Consortium; and Randolph Kirchain, director of the MIT CSHub.

  • Sustainable supply chains put the customer first

    When we consider the supply chain, we typically think of factories, ships, trucks, and warehouses. Yet the customer side is equally important, especially in efforts to make our distribution networks more sustainable. Customers are an untapped resource in building sustainability, says Josué C. Velázquez Martínez, a research scientist at the MIT Center for Transportation and Logistics. 

    Velázquez Martínez, who is director of MIT’s Sustainable Supply Chain Lab, investigates how customer-facing supply chains can be made more environmentally and socially sustainable. One effort is the Green Button project, which explores how to optimize e-commerce delivery schedules to reduce carbon emissions and how to persuade customers to choose less carbon-intensive four- or five-day shipping options over one- or two-day delivery. Velázquez Martínez has also launched the MIT Low Income Firms Transformation (LIFT) Lab, which is researching ways to improve micro-retailer supply chains in the developing world and provide owners with the tools they need to survive.

    “The definition of sustainable supply chain keeps evolving because things that were sustainable 20 to 30 years ago are not as sustainable now,” says Velázquez Martínez. “Today, there are more companies that are capturing information to build strategies for environmental, economic, and social sustainability. They are investing in alternative energy and other solutions to make the supply chain more environmentally friendly and are tracking their suppliers and identifying key vulnerabilities. A big part of this is an attempt to create fairer conditions for people who work in supply chains or are dependent on them.”

    The move toward sustainable supply chains is being driven as much by people as by companies, whether those people are acting as selective consumers or voting citizens. The consumer aspect is often overlooked, says Velázquez Martínez. “Consumers are the ones who move the supply chain. We are looking at how companies can provide transparency to involve customers in their sustainability strategy.” 

    Proposed solutions for sustainability are not always as effective as promised. Some fashion rental schemes fall into this category, says Velázquez Martínez. “There are many new rental companies that are trying to get more use out of clothes to offset the emissions associated with production. We recently researched the environmental impact of monthly subscription models where consumers pay a fee to receive clothes for a month before returning them, as well as peer-to-peer sharing models.” 

    The researchers found that while rental services generally have a lower carbon footprint than retail sales, hidden emissions from logistics played a surprisingly large role. “First, you need to deliver the clothes and pick them up, and there are high return rates,” says Velázquez Martínez. “When you factor in dry cleaning and packaging emissions, the rental models in some cases have a worse carbon footprint than buying new clothes.” Peer-to-peer sharing could be better, he adds, but that depends on how far the consumers travel to meet-up points. 

    Typically, says Velázquez Martínez, garment types that are frequently used are not well suited to rental models. “But for specialty clothes such as wedding dresses or prom dresses, it is better to rent.” 

    Waiting a few days to save the planet 

    Even before the pandemic, online retailing gained a second wind due to low-cost same- and next-day delivery options. While e-commerce may have its drawbacks as a contributor to social isolation and reduced competition, it has proven itself to be far more eco-friendly than brick-and-mortar shopping, not to mention a lot more convenient. Yet rapid deliveries are cutting into online-shopping’s carbon-cutting advantage.

    In 2019, MIT’s Sustainable Supply Chain Lab launched the Green Button project to study the rapid delivery phenomenon. The project has been “testing whether consumers would be willing to delay their e-commerce deliveries to reduce the environmental impact of fast shipping,” says Velázquez Martínez. “Many companies such as Walmart and Target have followed Amazon’s 2019 strategy of moving from two-day to same-day delivery. Instead of sending a fully loaded truck to a neighborhood every few days, they now send multiple trucks to that neighborhood every day, and there are more days when trucks are targeting each neighborhood. All this increases carbon emissions and makes it hard for shippers to consolidate.”

    Working with Coppel, one of Mexico’s largest retailers, the Green Button project inspired a related Consolidation Ecommerce Project that built a large-scale mathematical model to provide a strategy for consolidation. The model determined what delivery time window each neighborhood demands and then calculated the best day to deliver to each neighborhood to meet the desired window while minimizing carbon emissions. 
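
    A drastically simplified version of that consolidation logic appears below: hold each neighborhood’s pending orders until the earliest one is about to miss its promised window, then dispatch a single consolidated trip. The orders and the greedy rule are invented for illustration; the actual model is a large-scale mathematical optimization.

```python
# Toy delivery consolidation: batch orders per neighborhood, shipping only when
# the tightest promised window forces a trip. Data are invented placeholders.
from collections import defaultdict

# (neighborhood, day the order was placed, promised wait in days)
orders = [("north", 0, 4), ("north", 1, 4), ("north", 2, 4),
          ("south", 0, 1), ("south", 2, 4), ("south", 3, 4)]

def schedule(orders, horizon=8):
    pending = defaultdict(list)   # neighborhood -> list of delivery deadlines
    trips = []
    for day in range(horizon):
        for hood, placed, wait in orders:
            if placed == day:
                pending[hood].append(placed + wait)
        for hood, deadlines in pending.items():
            if deadlines and min(deadlines) == day:   # must ship today
                trips.append((day, hood, len(deadlines)))
                pending[hood] = []
    return trips

for day, hood, n in schedule(orders):
    print(f"day {day}: one truck to {hood} carrying {n} orders")
# Six orders go out on three consolidated trips instead of six separate ones.
```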

    No matter what mixture of delivery times was used, the consolidation model helped retailers schedule deliveries more efficiently. Yet, the biggest cuts in emissions emerged when customers were willing to wait several days.

    “When we ran a month-long simulation comparing our model for four-to-five-day delivery with Coppel’s existing model for one- or two-day delivery, we saw savings in fuel consumption of over 50 percent on certain routes,” says Velázquez Martínez. “This is huge compared to other strategies for squeezing more efficiency from the last-mile supply chain, such as routing optimization, where savings are close to 5 percent. The optimal solution depends on factors such as the capacity for consolidation, the frequency of delivery, the store capacity, and the impact on inbound operations.” 

    The researchers next set out to determine if customers could be persuaded to wait longer for deliveries. Considering that the price differential is low or nonexistent, this was a considerable challenge. Yet the same-day habit is only a few years old, and some consumers have come to realize they don’t always need rapid deliveries. “Some consumers who order by rapid delivery find they are too busy to open the packages right away,” says Velázquez Martínez.  

    Trees beat kilograms of CO2

    The researchers set out to find if consumers would be willing to sacrifice a bit of convenience if they knew they were helping to reduce climate change. The Green Button project tested different public outreach strategies. For one test group, they reported the carbon impact of delivery times in kilograms of carbon dioxide (CO2). Another group received the information expressed in terms of the energy required to recycle a certain amount of garbage. A third group learned about emissions in terms of the number of trees required to trap the carbon. “Explaining the impact in terms of trees led to almost 90 percent willing to wait another day or two,” says Velázquez Martínez. “This is compared to less than 40 percent for the group that received the data in kilograms of CO2.” 
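
    The intervention itself is just a re-framing: the same emissions number expressed in a more tangible unit, as in the sketch below. The figure of roughly 21 kilograms of CO2 absorbed per tree per year is a commonly cited ballpark used here only for illustration, not necessarily the study’s exact conversion factor.

```python
# Re-expressing a delivery's emissions savings in different frames.
# The per-tree absorption figure is a rough, commonly cited estimate.
KG_CO2_PER_TREE_YEAR = 21.0

def framings(kg_co2: float) -> dict:
    return {
        "kilograms": f"{kg_co2:.1f} kg of CO2",
        "trees": f"what {kg_co2 / KG_CO2_PER_TREE_YEAR:.2f} trees absorb in a year",
    }

# e.g., the savings from choosing five-day over next-day delivery on one order
print(framings(10.5))
```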

    Another surprise was that there was no difference in response based on income, gender, or age. “Most studies of green consumers suggest they are predominantly high income, female, highly educated, or younger,” says Velázquez Martínez. “However, our results show the same willingness across low and high incomes, women and men, and younger and older people. We have shown that disclosing emissions transparently and making the consumer a part of the strategy can be a new opportunity for more consumer-driven logistics sustainability.” 

    The researchers are now developing similar models for business-to-business (B2B) e-commerce. “We found that B2B supply chain emissions are often high because many shipping companies require strict delivery windows,” says Velázquez Martínez.  

    The B2B models drill down to examine the Corporate Value Chain (Scope 3) emissions of suppliers. “Although some shipping companies are now asking their suppliers to review emissions, it is a challenge to create a transparent supply chain,” says Velázquez Martínez.  “Technological innovations have made it easier, starting with RFID [radio frequency identification], and then real-time GPS mapping and blockchain. But these technologies need to be more accessible and affordable, and we need more companies willing to use them.” 

    Some companies have been hesitant to dig too deeply into their supply chains, fearing they might uncover a scandal that could risk their reputation, says Velázquez Martínez. Other organizations are forced to look at the issue when nongovernmental organizations research sustainability issues such as social injustice in sweatshops and conflict mineral mines. 

    One challenge to building a transparent supply chain is that “in many companies, the sustainability teams are separate from the rest of the company,” says Velázquez Martínez. “Even if the CEOs receive information on sustainability issues, it often doesn’t filter down because the information does not belong to the planners or managers. We are pushing companies to not only account for sustainability factors in supply chain network design but also examine daily operations that affect sustainability. This is a big topic now: How can we translate sustainability information into something that everybody can understand and use?” 

    LIFT Lab lifts micro-retailers  

    In 2016, Velázquez Martínez launched the MIT GeneSys project to gain insights into micro and small enterprises (MSEs) in developing countries. The project released a GeneSys mobile app, which was used by more than 500 students throughout Latin America to collect data on more than 800 microfirms. In 2022, he launched the LIFT Lab, which focuses more specifically on studying and improving the supply chain for MSEs.  

    Worldwide, some 90 percent of companies have fewer than 10 employees. In Latin America and the Caribbean, companies with fewer than 50 employees represent 99 percent of all companies and 47 percent of employment. 

    Although MSEs represent much of the world’s economy, they are poorly understood, notes Velázquez Martínez. “Those tiny businesses are driving a lot of the economy and serve as important customers for the large companies working in developing countries. They range from small businesses down to people trying to get some money to eat by selling cakes or tacos through their windows.”  

    The MIT LIFT Lab researchers investigated whether MSE supply chain issues could help shed light on why many Latin American countries have been limited to marginal increases in gross domestic product. “Large companies from the developed world that are operating in Latin America, such as Unilever, Walmart, and Coca-Cola, have huge growth there, in some cases higher than they have in the developed world,” says Velázquez Martínez. “Yet, the countries are not developing as fast as we would expect.” 

    The LIFT Lab data showed that while the multinationals are thriving in Latin America, the local MSEs are decreasing in productivity. The study also found the trend has worsened with Covid-19.  

    The LIFT Lab’s first big project, which is sponsored by Mexican beverage and retail company FEMSA, is studying supply chains in Mexico. The study spans 200,000 micro-retailers and 300,000 consumers. In a collaboration with Tecnológico de Monterrey, hundreds of students are helping with a field study.  

    “We are looking at supply chain management and business capabilities and identifying the challenges to adoption of technology and digitalization,” says Velázquez Martínez. “We want to find the best ways for micro-firms to work with suppliers and consumers by identifying the consumers who access this market, as well as the products and services that can best help the micro-firms drive growth.” 

    Based on the earlier research by GeneSys, Velázquez Martínez has developed some hypotheses for potential improvements to micro-retailer supply chains, starting with payment terms. “We found that the micro-firms often get the worst purchasing deals. Owners without credit cards and with limited cash often buy in smaller amounts at much higher prices than retailers like Walmart. The big suppliers are squeezing them.” 

    While large retailers usually get 60 to 120 days to pay, micro-retailers “either pay at the moment of the transaction or in advance,” says Velázquez Martínez. “In a study of 500 micro-retailers in five countries in Latin America, we found the average payment time was minus seven days, meaning payment a week in advance. These terms reduce cash availability and often lead to bankruptcy.” 

    LIFT Lab is working with suppliers to persuade them to offer a minimum payment time of two weeks. “We can show the suppliers that the change in terms will let them move more product and increase sales,” says Velázquez Martínez. “Meanwhile, the micro-retailers gain higher profits and become more stable, even if they may pay a bit more.” 

    LIFT Lab is also looking at ways that micro-retailers can leverage smartphones for digitalization and planning. “Some of these companies are keeping records on napkins,” says Velázquez Martínez. “By using a cellphone, they can place orders with suppliers and communicate with consumers. We are testing different dashboards for mobile apps to help with planning and financial performance. We are also recommending services the stores can provide, such as paying electricity or water bills. The idea is to build more capabilities and knowledge and increase business competencies for the supply chain that are tailored for micro-retailers.” 

    From a financial perspective, micro-retailers are not always the most efficient way to move products. Yet they also play an important role in building social cohesion within neighborhoods. By offering more services, the corner bodega can bring people together in ways that are impossible with e-commerce and big-box stores.  

    Whether the consumers are micro-firms buying from suppliers or e-commerce customers waiting for packages, “transparency is key to building a sustainable supply chain,” says Velázquez Martínez. “To change consumer habits, consumers need to be better educated on the impacts of their behaviors. With consumer-facing logistics, ‘The last shall be first, and the first last.’”

  • Studying floods to better predict their dangers

    “My job is basically flooding Cambridge,” says Katerina “Katya” Boukin, a graduate student in civil and environmental engineering at MIT and the MIT Concrete Sustainability Hub’s resident expert on flood simulations. 

    You can often find her fine-tuning high-resolution flood risk models for the City of Cambridge, Massachusetts, or talking about hurricanes with fellow researcher Ipek Bensu Manav.

    Flooding represents one of the world’s gravest natural hazards. Extreme climate events inducing flooding, like severe storms, winter storms, and tropical cyclones, caused an estimated $128.1 billion of damages in 2021 alone. 

    Climate simulation models suggest that severe storms will become more frequent in the coming years, necessitating a better understanding of which parts of cities are most vulnerable — an understanding that can be improved through modeling.

    A problem with current flood models is that they struggle to account for an oft-misunderstood type of flooding known as pluvial flooding. 

    “You might think of flooding as the overflowing of a body of water, like a river. This is fluvial flooding. This can be somewhat predictable, as you can think of proximity to water as a risk factor,” Boukin explains.

    However, the “flash flooding” that causes many deaths each year can happen even in places nowhere near a body of water. This is an example of pluvial flooding, which is affected by terrain, urban infrastructure, and the dynamic nature of storm loads.

    “If we don’t know how a flood is propagating, we don’t know the risk it poses to the urban environment. And if we don’t understand the risk, we can’t really discuss mitigation strategies,” says Boukin. “That’s why I pursue improving flood propagation models.”

    Boukin is leading development of a new flood prediction method that seeks to address these shortcomings. By better representing the complex morphology of cities, Boukin’s approach may provide a clearer forecast of future urban flooding.

    Katya Boukin developed this model of the City of Cambridge, Massachusetts. The base model was provided through a collaboration between MIT, the City of Cambridge, and Dewberry Engineering.

    Image: Katya Boukin

    “In contrast to the more typical traditional catchment model, our method has rainwater spread around the urban environment based on the city’s topography, below-the-surface features like sewer pipes, and the characteristics of local soils,” notes Boukin.

    “We can simulate the flooding of regions with local rain forecasts. Our results can show how flooding propagates by the foot and by the second,” she adds.
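
    The flavor of such a model can be conveyed with a toy cellular routine: rain falls on an elevation grid, and each cell passes water to neighbors with a lower water surface. The sketch below is a heavy simplification for illustration only; unlike the research model, it ignores sewer networks, soil infiltration, and real hydraulics.

```python
# Toy pluvial flood routing on a tiny elevation grid (illustration only).
import numpy as np

elevation = np.array([[3.0, 2.5, 2.0],
                      [2.8, 1.5, 1.8],
                      [2.6, 1.2, 1.0]])   # meters; low spots collect water
water = np.zeros_like(elevation)

def step(water, rain_m=0.01, flow_frac=0.25):
    water = water + rain_m                   # uniform rainfall this time step
    head = elevation + water                 # water-surface elevation
    out = np.zeros_like(water)
    inflow = np.zeros_like(water)
    rows, cols = water.shape
    for r in range(rows):
        for c in range(cols):
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols and head[r, c] > head[rr, cc]:
                    q = flow_frac * min(water[r, c], head[r, c] - head[rr, cc])
                    out[r, c] += q
                    inflow[rr, cc] += q
    return water - out + inflow

for _ in range(50):
    water = step(water)
print(np.round(water, 3))   # depth accumulates in the topographic low
```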

    While Boukin’s current focus is flood simulation, her unconventional academic career has taken her research in many directions, like examining structural bottlenecks in dense urban rail systems and forecasting ground displacement due to tunneling. 

    “I’ve always been interested in the messy side of problem-solving. I think that difficult problems present a real chance to gain a deeper understanding,” says Boukin.

    Boukin credits her upbringing for giving her this perspective. A native of Israel, Boukin says that civil engineering is the family business. “My parents are civil engineers, my mom’s parents are, too, her grandfather was a professor in civil engineering, and so on. Civil engineering is my bloodline.”

    However, the decision to follow the family tradition did not come so easily. “After I took the Israeli equivalent of the SAT, I was at a decision point: Should I go to engineering school or medical school?” she recalls.

    “I decided to go on a backpacking trip to help make up my mind. It’s sort of an Israeli rite to explore internationally, so I spent six months in South America. I think backpacking is something everyone should do.”

    After this soul searching, Boukin landed on engineering school, where she fell in love with structural engineering. “It was the option that felt most familiar and interesting. I grew up playing with AutoCAD on the family computer, and now I use AutoCAD professionally!” she notes.

    “For my master’s degree, I was looking to study in a department that would help me integrate knowledge from fields like climatology and civil engineering. I found the MIT Department of Civil and Environmental Engineering to be an excellent fit,” she says.

    “I am lucky that MIT has so many people that work together as well as they do. I ended up at the Concrete Sustainability Hub, where I’m working on projects which are the perfect fit between what I wanted to do and what the department wanted to do.” 

    Boukin’s move to Cambridge has given her a new perspective on her family and childhood. 

    “My parents brought me to Israel when I was just 1 year old. In moving here as a second-time immigrant, I have a new perspective on what my parents went through during the move to Israel. I moved when I was 27 years old, the same age as they were. They didn’t have a support network and worked any job they could find,” she explains.

    “I am incredibly grateful to them for the morals they instilled in me and my sister, who recently graduated from medical school. I know I can call my parents if I ever need something, and they will do whatever they can to help.”

    Boukin hopes to honor her parents’ efforts through her research.

    “Not only do I want to help stakeholders understand flood risks, I want to make awareness of flooding more accessible. Each community needs different things to be resilient, and different cultures have different ways of delivering and receiving information,” she says.

    “Everyone should understand that they, in addition to the buildings and infrastructure around them, are part of a complex ecosystem. Any change to a city can affect the rest of it. If designers and residents are aware of this when considering flood mitigation strategies, we can better design cities and understand the consequences of damage.”

  • Simulating neutron behavior in nuclear reactors

    Amelia Trainer applied to MIT because she lost a bet.

    As part of what the fourth-year nuclear science and engineering (NSE) doctoral student labels her “teenage rebellious phase,” Trainer was quite convinced she would just be wasting the application fee were she to submit an application. She wasn’t even “super sure” she wanted to go to college. But a high-school friend was convinced Trainer would get into a “top school” if she only applied. A bet followed: If Trainer lost, she would have to apply to MIT. Trainer lost — and is glad she did.

    Growing up in Daytona Beach, Florida, good grades were Trainer’s thing. Seeing friends participate in interschool math competitions, Trainer decided she would tag along and soon found she loved them. She remembers being adept at reading the room: If teams were especially struggling over a problem, Trainer figured the answer had to be something easy, like zero or one. “The hardest problems would usually have the most goofball answers,” she laughs.

    Simulating neutron behavior

    As a doctoral student, hard problems in math, specifically computational reactor physics, continue to be Trainer’s forte.

    Her research, under the guidance of Professor Benoit Forget in MIT NSE’s Computational Reactor Physics Group (CRPG), focuses on modeling complicated neutron behavior in reactors. Simulation helps forecast the behavior of reactors before millions of dollars are sunk into the development of a potentially uneconomical unit. Using simulations, Trainer can see “where the neutrons are going, how much heat is being produced, and how much power the reactor can generate.” Her research helps form the foundation for the next generation of nuclear power plants.

    To simulate neutron behavior inside of a nuclear reactor, you first need to know how neutrons will interact with the various materials inside the system. These neutrons can have wildly different energies, thereby making them susceptible to different physical phenomena. For the entirety of her graduate studies, Trainer has been primarily interested in the physics regarding slow-moving neutrons and their scattering behavior.

    When a slow neutron scatters off of a material, it can induce or cancel out molecular vibrations between the material’s atoms. The effect that material vibrations can have on neutron energies, and thereby on reactor behavior, has been heavily approximated over the years. Trainer is primarily interested in chipping away at these approximations by creating scattering data for materials that have historically been misrepresented and by exploring new techniques for preparing slow-neutron scattering data.

    Trainer remembers waiting for a simulation to complete in the early days of the Covid-19 pandemic, when she discovered a way to predict neutron behavior with limited input data. Traditionally, “people have to store large tables of what neutrons will do under specific circumstances,” she says. “I’m really happy about it because it’s this really cool method of sampling what your neutron does from very little information.”
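
    The table-driven approach she describes can be illustrated with a few lines of inverse-CDF sampling; the energy grid and probabilities below are invented stand-ins, and Trainer’s own reduced-data method is not reproduced here.

```python
# Illustrative inverse-CDF sampling of a secondary neutron energy from a
# coarse tabulated distribution (invented numbers).
import bisect
import random

e_grid = [0.001, 0.01, 0.05, 0.1, 0.5, 1.0]     # outgoing energies, eV
cdf    = [0.00,  0.15, 0.45, 0.70, 0.95, 1.00]  # cumulative probabilities

def sample_outgoing_energy():
    u = random.random()
    i = max(bisect.bisect_left(cdf, u), 1)
    frac = (u - cdf[i - 1]) / (cdf[i] - cdf[i - 1])  # linear interpolation
    return e_grid[i - 1] + frac * (e_grid[i] - e_grid[i - 1])

print([f"{sample_outgoing_energy():.4f} eV" for _ in range(5)])
```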

    Amelia Trainer — Modeling complicated neutron behavior in nuclear reactors

    As part of her research, Trainer often works closely with two software packages: OpenMC and NJOY. OpenMC is a Monte Carlo neutron transport simulation code that was developed in the CRPG and is used to simulate neutron behavior in reactor systems. NJOY is a nuclear data processing tool, and is used to create, augment, and prepare material data that is fed into tools like OpenMC. By editing both these codes to her specifications, Trainer is able to observe the effect that “upstream” material data has on the “downstream” reactor calculations. Through this, she hopes to identify additional problems: approximations that could lead to a noticeable misrepresentation of the physics.

    A love of geometry and poetry

    Trainer discovered the coolness of science as a child. Her mother, who cares for indoor plants and runs multiple greenhouses, and her father, a blacksmith and farrier who explored materials science through his craft, were self-taught inspirations.

    Trainer’s father urged his daughter to learn and pursue any topics that she found exciting and encouraged her to read poems from “Calvin and Hobbes” out loud when she struggled with a speech impediment in early childhood. Reading the same passages every day helped her memorize them. “The natural manifestation of that extended into [a love of] poetry,” Trainer says.

    A love of poetry, combined with Trainer’s propensity for fun, led her to compose an ode to pi as part of an MIT-sponsored event for alumni. “I was really only in it for the cupcake,” she laughs. (Participants received an indulgent treat).

    Video: “MIT Matters: A Love Poem to Pi”

    Computations and nuclear science

    After being accepted at MIT, Trainer knew she wanted to study in a field that would meet her skills where they were — “my math skills were pretty underdeveloped in the grand scheme of things,” she says. An open-house weekend at MIT, where she met with faculty from the NSE department, and the opportunity to contribute to a discipline working toward clean energy cemented Trainer’s decision to join NSE.

    As a high schooler, Trainer won a scholarship to Embry-Riddle Aeronautical University to learn computer coding and knew computational physics might be more aligned with her interests. After she joined MIT as an undergraduate student in 2014, she realized that the CRPG, with its focus on coding and modeling, might be a good fit. Fortunately, a graduate student from Forget’s team welcomed Trainer’s enthusiasm for research even as an undergraduate first-year. She has stayed with the lab ever since. 

    Research internships at Los Alamos National Laboratory, the creators of NJOY, have furthered Trainer’s enthusiasm for modeling and computational physics. She met a Los Alamos scientist after he presented a talk at MIT, and the connection snowballed into a collaboration in which she could work on parts of the NJOY code. “It became a really cool collaboration which led me into a deep dive into physics and data preparation techniques, which was just so fulfilling,” Trainer says. As for what’s next, Trainer was awarded the Rickover fellowship in nuclear engineering by the Department of Energy’s Naval Reactors Division and will join the program in Pittsburgh after she graduates.

    For many years, Trainer’s cats, Jacques and Monster, have been constant companions. “Neutrons, computers, and cats, that’s my personality,” she laughs. Work continues to fuel her passion. To borrow a favorite phrase from Spaceman Spiff, Trainer’s favorite “Calvin” avatar, her approach to research has invariably been: “Another day, another mind-boggling adventure.”

  • Taking a magnifying glass to data center operations

    When the MIT Lincoln Laboratory Supercomputing Center (LLSC) unveiled its TX-GAIA supercomputer in 2019, it provided the MIT community a powerful new resource for applying artificial intelligence to their research. Anyone at MIT can submit a job to the system, which churns through trillions of operations per second to train models for diverse applications, such as spotting tumors in medical images, discovering new drugs, or modeling climate effects. But with this great power comes the great responsibility of managing and operating it in a sustainable manner — and the team is looking for ways to improve.

    “We have these powerful computational tools that let researchers build intricate models to solve problems, but they can essentially be used as black boxes. What gets lost in there is whether we are actually using the hardware as effectively as we can,” says Siddharth Samsi, a research scientist in the LLSC. 

    To gain insight into this challenge, the LLSC has been collecting detailed data on TX-GAIA usage over the past year. More than a million user jobs later, the team has released the dataset to the computing community as open source.

    Their goal is to empower computer scientists and data center operators to better understand avenues for data center optimization — an important task as processing needs continue to grow. They also see potential for leveraging AI in the data center itself, by using the data to develop models for predicting failure points, optimizing job scheduling, and improving energy efficiency. While cloud providers are actively working on optimizing their data centers, they do not often make their data or models available for the broader high-performance computing (HPC) community to leverage. The release of this dataset and associated code seeks to fill this space.

    “Data centers are changing. We have an explosion of hardware platforms, the types of workloads are evolving, and the types of people who are using data centers are changing,” says Vijay Gadepally, a senior researcher at the LLSC. “Until now, there hasn’t been a great way to analyze the impact to data centers. We see this research and dataset as a big step toward coming up with a principled approach to understanding how these variables interact with each other and then applying AI for insights and improvements.”

    Papers describing the dataset and potential applications have been accepted to a number of venues, including the IEEE International Symposium on High-Performance Computer Architecture, the IEEE International Parallel and Distributed Processing Symposium, the Annual Conference of the North American Chapter of the Association for Computational Linguistics, the IEEE High-Performance and Embedded Computing Conference, and International Conference for High Performance Computing, Networking, Storage and Analysis. 

    Workload classification

    TX-GAIA, which ranks among the world’s TOP500 supercomputers, combines traditional computing hardware (central processing units, or CPUs) with nearly 900 graphics processing unit (GPU) accelerators. These NVIDIA GPUs are specialized for deep learning, the class of AI that has given rise to speech recognition and computer vision.

    The dataset covers CPU, GPU, and memory usage by job; scheduling logs; and physical monitoring data. Compared to similar datasets, such as those from Google and Microsoft, the LLSC dataset offers “labeled data, a variety of known AI workloads, and more detailed time series data compared with prior datasets. To our knowledge, it’s one of the most comprehensive and fine-grained datasets available,” Gadepally says. 

    Notably, the team collected time-series data at an unprecedented level of detail: 100-millisecond intervals on every GPU and 10-second intervals on every CPU, as the machines processed more than 3,000 known deep-learning jobs. One of the first goals is to use this labeled dataset to characterize the workloads that different types of deep-learning jobs place on the system. This process would extract features that reveal differences in how the hardware processes natural language models versus image classification or materials design models, for example.   

    The team has now launched the MIT Datacenter Challenge to mobilize this research. The challenge invites researchers to use AI techniques to identify, with 95 percent accuracy, the type of job that was run, using the lab’s labeled time-series data as ground truth.
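
    One plausible first attempt at the task, offered here as an assumption about approach rather than the challenge’s official baseline, is to collapse each job’s utilization trace into summary features and fit an off-the-shelf classifier. The sketch below does exactly that on synthetic stand-in traces.

```python
# Sketch of workload classification from utilization time series.
# The traces are synthetic stand-ins, not the LLSC dataset.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def featurize(trace):
    """Collapse a utilization time series into summary statistics."""
    return [trace.mean(), trace.std(), trace.max(),
            np.percentile(trace, 90), np.abs(np.diff(trace)).mean()]

X, y = [], []
for _ in range(200):
    bursty = np.clip(rng.normal(0.5, 0.30, 300), 0, 1)  # e.g., vision-like jobs
    steady = np.clip(rng.normal(0.8, 0.05, 300), 0, 1)  # e.g., language-like jobs
    X += [featurize(bursty), featurize(steady)]
    y += [0, 1]

scores = cross_val_score(RandomForestClassifier(n_estimators=100), X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```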

    Such insights could enable data centers to better match a user’s job request with the hardware best suited for it, potentially conserving energy and improving system performance. Classifying workloads could also allow operators to quickly notice discrepancies resulting from hardware failures, inefficient data access patterns, or unauthorized usage.

    Too many choices

    Today, the LLSC offers tools that let users submit their job and select the processors they want to use, “but it’s a lot of guesswork on the part of users,” Samsi says. “Somebody might want to use the latest GPU, but maybe their computation doesn’t actually need it and they could get just as impressive results on CPUs, or lower-powered machines.”

    Professor Devesh Tiwari at Northeastern University is working with the LLSC team to develop techniques that can help users match their workloads to appropriate hardware. Tiwari explains that the emergence of different types of AI accelerators, GPUs, and CPUs has left users suffering from too many choices. Without the right tools to take advantage of this heterogeneity, they are missing out on the benefits: better performance, lower costs, and greater productivity.

    “We are fixing this very capability gap — making users more productive and helping users do science better and faster without worrying about managing heterogeneous hardware,” says Tiwari. “My PhD student, Baolin Li, is building new capabilities and tools to help HPC users leverage heterogeneity near-optimally without user intervention, using techniques grounded in Bayesian optimization and other learning-based optimization methods. But, this is just the beginning. We are looking into ways to introduce heterogeneity in our data centers in a principled approach to help our users achieve the maximum advantage of heterogeneity autonomously and cost-effectively.”

    Workload classification is the first of many problems to be posed through the Datacenter Challenge. Others include developing AI techniques to predict job failures, conserve energy, or create job scheduling approaches that improve data center cooling efficiencies.

    Energy conservation 

    To mobilize research into greener computing, the team is also planning to release an environmental dataset of TX-GAIA operations, containing rack temperature, power consumption, and other relevant data.

    According to the researchers, huge opportunities exist to improve the power efficiency of HPC systems being used for AI processing. As one example, recent work in the LLSC determined that simple hardware tuning, such as limiting the amount of power an individual GPU can draw, could reduce the energy cost of training an AI model by 20 percent, with only modest increases in computing time. “This reduction translates to approximately an entire week’s worth of household energy for a mere three-hour time increase,” Gadepally says.
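
    In practice, a cap like the one described can be applied through NVIDIA’s NVML management library (or the equivalent `nvidia-smi -pl` command). The sketch below uses the pynvml bindings; the 150-watt target is an arbitrary example rather than the value used in the LLSC work, and changing the limit typically requires administrator privileges.

```python
# Set a GPU power cap via NVML (pip install nvidia-ml-py).
# 150 W is an arbitrary example; setting limits usually requires root.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

lo, hi = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)  # milliwatts
target_mw = max(lo, min(150_000, hi))          # clamp the target to the allowed range
print(f"allowed range {lo / 1000:.0f}-{hi / 1000:.0f} W; setting {target_mw / 1000:.0f} W")

pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)
pynvml.nvmlShutdown()
```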

    They have also been developing techniques to predict model accuracy, so that users can quickly terminate experiments that are unlikely to yield meaningful results, saving energy. The Datacenter Challenge will share relevant data to enable researchers to explore other opportunities to conserve energy.

    The team expects that lessons learned from this research can be applied to the thousands of data centers operated by the U.S. Department of Defense. The U.S. Air Force is a sponsor of this work, which is being conducted under the USAF-MIT AI Accelerator.

    Other collaborators include researchers at MIT Computer Science and Artificial Intelligence Laboratory (CSAIL). Professor Charles Leiserson’s Supertech Research Group is investigating performance-enhancing techniques for parallel computing, and research scientist Neil Thompson is designing studies on ways to nudge data center users toward climate-friendly behavior.

    Samsi presented this work at the inaugural AI for Datacenter Optimization (ADOPT’22) workshop last spring as part of the IEEE International Parallel and Distributed Processing Symposium. The workshop officially introduced their Datacenter Challenge to the HPC community.

    “We hope this research will allow us and others who run supercomputing centers to be more responsive to user needs while also reducing the energy consumption at the center level,” Samsi says.

  • New J-WAFS-led project combats food insecurity

    Today the Abdul Latif Jameel Water and Food Systems Lab (J-WAFS) at MIT announced a new research project, supported by Community Jameel, to tackle one of the most urgent crises facing the planet: food insecurity. Approximately 276 million people worldwide are severely food insecure, and more than half a million face famine conditions.

    To better understand and analyze food security, this three-year research project will develop a comprehensive index assessing countries’ food security vulnerability, called the Jameel Index for Food Trade and Vulnerability. Global changes spurred by social and economic transitions, energy and environmental policy, regional geopolitics, conflict, and of course climate change can impact food demand and supply. The Jameel Index will measure countries’ dependence on global food trade and imports, and how these regional-scale threats might affect the ability to trade food goods across diverse geographic regions. A main outcome of the research will be a model to project global food demand, supply balance, and bilateral trade under different likely future scenarios, with a focus on climate change. The work will help guide policymakers over the next 25 years, as the global population is expected to grow and the climate crisis is predicted to worsen.

    The work will be the foundational project for the J-WAFS-led Food and Climate Systems Transformation Alliance, or FACT Alliance. Formally launched at the COP26 climate conference last November, the FACT Alliance is a global network of 20 leading research institutions and stakeholder organizations that are driving research and innovation and informing better decision-making for healthy, resilient, equitable, and sustainable food systems in a rapidly changing climate. The initiative is co-directed by Greg Sixt, research manager for climate and food systems at J-WAFS, and Professor Kenneth Strzepek, climate, water, and food specialist at J-WAFS.

    The dire state of our food systems

    The need for this project is evidenced by the hundreds of millions of people around the globe currently experiencing food shortages. While several factors contribute to food insecurity, climate change is one of the most notable. Devastating extreme weather events are increasingly crippling crop and livestock production around the globe. From Southwest Asia to the Arabian Peninsula to the Horn of Africa, communities are migrating in search of food. In the United States, extreme heat and lack of rainfall in the Southwest have drastically lowered Lake Mead’s water levels, restricting water access and drying out farmlands. 

    Social, political, and economic issues also disrupt food systems. The effects of the Covid-19 pandemic, supply chain disruptions, and inflation continue to exacerbate food insecurity. Russia’s invasion of Ukraine is dramatically worsening the situation, disrupting agricultural exports from both Russia and Ukraine — two of the world’s largest producers of wheat, sunflower seed oil, and corn. Other countries like Lebanon, Sri Lanka, and Cuba are confronting food insecurity due to domestic financial crises.

    Few countries are immune to threats to food security from sudden disruptions in food production or trade. When an enormous container ship became lodged in the Suez Canal in March 2021, the vital international trade route was blocked for six days, and the resulting backlog delayed international shipping for months, affecting food supplies around the world. These situations demonstrate the importance of food trade in achieving food security: a disaster in one part of the world can drastically affect the availability of food in another. This puts into perspective just how interconnected the earth’s food systems are and how vulnerable they remain to external shocks. 

    An index to prepare for the future of food

    Despite the need for more secure food systems, significant knowledge gaps exist when it comes to understanding how different climate scenarios may affect both agricultural productivity and global food supply chains and security. The Global Trade Analysis Project database from Purdue University, and the current IMPACT modeling system from the International Food Policy Research Institute (IFPRI), enable assessments of existing conditions but cannot project or model changes in the future.

    In 2021, Strzepek and Sixt developed an initial Food Import Vulnerability Index (FIVI) as part of a regional assessment of the threat of climate change to food security in the Gulf Cooperation Council states and West Asia. FIVI is also limited in that it can only assess current trade conditions and climate change threats to food production. Additionally, FIVI is a national aggregate index and does not address issues of hunger, poverty, or equity that stem from regional variations within a country.

    “Current models are really good at showing global food trade flows, but we don’t have systems for looking at food trade between individual countries and how different food systems stressors such as climate change and conflict disrupt that trade,” says Greg Sixt of J-WAFS and the FACT Alliance. “This timely index will be a valuable tool for policymakers to understand the vulnerabilities to their food security from different shocks in the countries they import their food from. The project will also illustrate the stakeholder-guided, transdisciplinary approach that is central to the FACT Alliance,” Sixt adds.

    Phase 1 of the project will support a collaboration between four FACT Alliance members: MIT J-WAFS, Ethiopian Institute of Agricultural Research, IFPRI (which is also part of the CGIAR network), and the Martin School at the University of Oxford. An external partner, United Arab Emirates University, will also assist with the project work. This first phase will build on Strzepek and Sixt’s previous work on FIVI by developing a comprehensive Global Food System Modeling Framework that takes into consideration climate and global changes projected out to 2050, and assesses their impacts on domestic production, world market prices, and national balance of payments and bilateral trade. The framework will also utilize a mixed-modeling approach that includes the assessment of bilateral trade and macroeconomic data associated with varying agricultural productivity under the different climate and economic policy scenarios. In this way, consistent and harmonized projections of global food demand and supply balance, and bilateral trade under climate and global change can be achieved. 
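
    At its simplest, an index of this kind is a weighted combination of import dependence, supplier concentration, and the climate exposure of trading partners. The toy sketch below conveys only that shape; the actual Jameel Index methodology is far more detailed, and every weight and input here is an invented placeholder.

```python
# Toy food-import vulnerability score (invented weights and inputs).
import_share  = {"country_a": 0.65, "country_b": 0.20}  # imported share of food supply
concentration = {"country_a": 0.80, "country_b": 0.30}  # reliance on a few trade partners
partner_risk  = {"country_a": 0.70, "country_b": 0.40}  # climate exposure of those partners

def vulnerability(c, w_import=0.4, w_conc=0.3, w_risk=0.3):
    """Weighted 0-1 score; higher means more vulnerable to trade shocks."""
    return (w_import * import_share[c]
            + w_conc * concentration[c]
            + w_risk * partner_risk[c])

for c in import_share:
    print(c, f"{vulnerability(c):.2f}")
```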

    “Just like in the global response to Covid-19, using data and modeling are critical to understanding and tackling vulnerabilities in the global supply of food,” says George Richards, director of Community Jameel. “The Jameel Index for Food Trade and Vulnerability will help inform decision-making to manage shocks and long-term disruptions to food systems, with the aim of ensuring food security for all.”

    On a national level, the researchers will enrich the Jameel Index through country-level food security analyses of regions within countries and across various socioeconomic groups, allowing for a better understanding of specific impacts on key populations. The research will present vulnerability scores for a variety of food security metrics for 126 countries. Case studies of food security and food import vulnerability in Ethiopia and Sudan will help to refine the applicability of the Jameel Index with on-the-ground information. The case studies will use an IFPRI-developed tool called the Rural Investment and Policy Analysis model, which allows for analysis of urban and rural populations and different income groups. Local capacity building and stakeholder engagement will be critical to enable the use of the tools developed by this research for national-level planning in priority countries, and ultimately to inform policy.

    Phase 2 of the project will build on phase 1 and the lessons learned from the Ethiopian and Sudanese case studies. It will entail a number of deeper, country-level analyses to assess the role of food imports on future hunger, poverty, and equity across various regional and socioeconomic groups within the modeled countries. This work will link the geospatial national models with the global analysis. A scholarly paper is expected to be submitted to show findings from this work, and a website will be launched so that interested stakeholders and organizations can learn more.