More stories


    New tools are available to help reduce the energy that AI models devour

    When searching for flights on Google, you may have noticed that each flight’s carbon-emission estimate is now presented next to its cost. It’s a way to inform customers about their environmental impact, and to let them factor this information into their decision-making.

    A similar kind of transparency doesn’t yet exist for the computing industry, despite carbon emissions that exceed those of the entire airline industry. Artificial intelligence models are a major driver of this escalating energy demand. Huge, popular models like ChatGPT signal a trend of large-scale artificial intelligence, boosting forecasts that predict data centers will draw up to 21 percent of the world’s electricity supply by 2030.

    The MIT Lincoln Laboratory Supercomputing Center (LLSC) is developing techniques to help data centers rein in energy use. Their techniques range from simple but effective changes, like power-capping hardware, to adopting novel tools that can halt AI training early on. Crucially, they have found that these techniques have a minimal impact on model performance.

    In the wider picture, their work is mobilizing green-computing research and promoting a culture of transparency. “Energy-aware computing is not really a research area, because everyone’s been holding on to their data,” says Vijay Gadepally, senior staff in the LLSC who leads energy-aware research efforts. “Somebody has to start, and we’re hoping others will follow.”

    Curbing power and cooling down

    Like many data centers, the LLSC has seen a significant uptick in the number of AI jobs running on its hardware. Noticing an increase in energy usage, computer scientists at the LLSC were curious about ways to run jobs more efficiently. Green computing is a principle of the center, which is powered entirely by carbon-free energy.

    Training an AI model — the process by which it learns patterns from huge datasets — requires using graphics processing units (GPUs), which are power-hungry hardware. As one example, the GPUs that trained GPT-3 (the precursor to ChatGPT) are estimated to have consumed 1,300 megawatt-hours of electricity, roughly equal to that used by 1,450 average U.S. households per month.

    While most people seek out GPUs because of their computational power, manufacturers offer ways to limit the amount of power a GPU is allowed to draw. “We studied the effects of capping power and found that we could reduce energy consumption by about 12 percent to 15 percent, depending on the model,” Siddharth Samsi, a researcher within the LLSC, says.

    The trade-off for capping power is increased task time — GPUs will take about 3 percent longer to complete a task, an increase Gadepally says is “barely noticeable” considering that models are often trained over days or even months. In one experiment, in which they trained the popular BERT language model, limiting GPU power to 150 watts led to a two-hour increase in training time (from 80 to 82 hours) but saved the equivalent of a U.S. household’s week of energy.
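    The shape of this trade-off is simple to work through. The sketch below uses the reported BERT numbers (150-watt cap, 80 to 82 hours); the 250-watt uncapped draw is an assumed illustrative figure, not a measurement from the study.

```python
# Illustrative arithmetic only: the 250 W baseline draw is an assumption,
# not a figure from the study; the 150 W cap and the 80 -> 82 hour
# slowdown are the reported BERT training numbers.

def energy_kwh(power_watts: float, hours: float) -> float:
    """Energy consumed at a constant power draw, in kilowatt-hours."""
    return power_watts * hours / 1000.0

baseline = energy_kwh(250, 80)   # uncapped run (assumed 250 W draw)
capped = energy_kwh(150, 82)     # power-capped run (reported cap and slowdown)

saved_pct = 100 * (baseline - capped) / baseline
extra_time_pct = 100 * (82 - 80) / 80

print(f"energy saved: {saved_pct:.1f}%, extra time: {extra_time_pct:.1f}%")
```

    Under these assumed numbers the energy saving per GPU dwarfs the 2.5 percent time penalty, which is why a cap can pay off even when training stretches over weeks.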

    The team then built software that plugs this power-capping capability into the widely used scheduler system, Slurm. The software lets data center owners set limits across their system or on a job-by-job basis.

    “We can deploy this intervention today, and we’ve done so across all our systems,” Gadepally says.

    Side benefits have arisen, too. Since putting power constraints in place, the GPUs on LLSC supercomputers have been running about 30 degrees Fahrenheit cooler and at a more consistent temperature, reducing stress on the cooling system. Running the hardware cooler can potentially also increase reliability and service lifetime. They can now consider delaying the purchase of new hardware — reducing the center’s “embodied carbon,” or the emissions created through the manufacturing of equipment — until the efficiencies gained by using new hardware offset this aspect of the carbon footprint. They’re also finding ways to cut down on cooling needs by strategically scheduling jobs to run at night and during the winter months.

    “Data centers can use these easy-to-implement approaches today to increase efficiencies, without requiring modifications to code or infrastructure,” Gadepally says.

    Taking this holistic look at a data center’s operations to find opportunities to cut down can be time-intensive. To make this process easier for others, the team — in collaboration with Professor Devesh Tiwari and Baolin Li at Northeastern University — recently developed and published a comprehensive framework for analyzing the carbon footprint of high-performance computing systems. System practitioners can use this analysis framework to gain a better understanding of how sustainable their current system is and consider changes for next-generation systems.  

    Adjusting how models are trained and used

    On top of making adjustments to data center operations, the team is devising ways to make AI-model development more efficient.

    When training models, AI developers often focus on improving accuracy, and they build upon previous models as a starting point. To achieve the desired output, they have to figure out what parameters to use, and getting it right can take testing thousands of configurations. This process, called hyperparameter optimization, is one area LLSC researchers have found ripe for cutting down energy waste. 

    “We’ve developed a model that basically looks at the rate at which a given configuration is learning,” Gadepally says. Given that rate, their model predicts the likely performance. Underperforming models are stopped early. “We can give you a very accurate estimate early on that the best model will be in this top 10 of 100 models running,” he says.

    In their studies, this early stopping led to dramatic savings: an 80 percent reduction in the energy used for model training. They’ve applied this technique to models developed for computer vision, natural language processing, and material design applications.
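    The early-stopping idea above can be sketched generically. This is not the LLSC team’s actual learning-rate predictor — it is a simplified stand-in in the spirit of successive halving, with hypothetical scoring, that shows how probing every configuration briefly and keeping only the top fraction avoids fully training unpromising models.

```python
import random

# A generic sketch of learning-curve-based early stopping, not the LLSC
# team's actual predictor. Each "configuration" is scored after a short
# partial run; only the most promising fraction is trained to completion.

def partial_score(config: dict, epochs: int, rng: random.Random) -> float:
    """Stand-in for validation accuracy after a few epochs of training."""
    # Hypothetical: the score reflects the config's intrinsic quality
    # plus noise that shrinks as more probe epochs are run.
    return config["quality"] + rng.uniform(-0.05, 0.05) * (1 / epochs)

def early_stopping_search(configs, keep_fraction=0.1, probe_epochs=3, seed=0):
    """Probe every config briefly, then keep only the top fraction."""
    rng = random.Random(seed)
    scored = [(partial_score(c, probe_epochs, rng), c) for c in configs]
    scored.sort(key=lambda sc: sc[0], reverse=True)
    n_keep = max(1, int(len(scored) * keep_fraction))
    return [c for _, c in scored[:n_keep]]  # survivors get full training

configs = [{"id": i, "quality": random.Random(i).random()} for i in range(100)]
survivors = early_stopping_search(configs)
print(len(survivors))  # 10 of 100 configs continue; the other 90 stop early
```

    The energy saving comes from the 90 stopped runs: they consume only a few probe epochs each rather than a full training budget.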

    “In my opinion, this technique has the biggest potential for advancing the way AI models are trained,” Gadepally says.

    Training is just one part of an AI model’s emissions. The largest contributor to emissions over time is model inference, or the process of running the model live, like when a user chats with ChatGPT. To respond quickly, these models use redundant hardware, running all the time, waiting for a user to ask a question.

    One way to improve inference efficiency is to use the most appropriate hardware. Also with Northeastern University, the team created an optimizer that matches a model with the most carbon-efficient mix of hardware, such as high-power GPUs for the computationally intense parts of inference and low-power central processing units (CPUs) for the less-demanding aspects. This work recently won the best paper award at the International ACM Symposium on High-Performance Parallel and Distributed Computing.

    Using this optimizer can decrease energy use by 10-20 percent while still meeting the same “quality-of-service target” (how quickly the model can respond).
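    At its core, this kind of matching is a constrained selection: among the hardware options that meet the quality-of-service target, pick the one with the lowest energy cost. The sketch below illustrates the idea only; the hardware names and energy/latency numbers are hypothetical, not measurements from the paper.

```python
# A minimal sketch of the idea behind hardware matching for inference:
# pick the lowest-energy option that still meets the latency target.
# The hardware names and numbers below are hypothetical placeholders.

HARDWARE = [
    # (name, energy per request in joules, latency per request in ms)
    ("high-power-gpu", 30.0, 12.0),
    ("low-power-gpu", 18.0, 45.0),
    ("cpu-only", 9.0, 180.0),
]

def pick_hardware(latency_target_ms: float):
    """Return the lowest-energy option meeting the quality-of-service target."""
    feasible = [h for h in HARDWARE if h[2] <= latency_target_ms]
    if not feasible:
        raise ValueError("no hardware meets the latency target")
    return min(feasible, key=lambda h: h[1])

print(pick_hardware(50.0)[0])   # low-power-gpu: meets 50 ms at lower energy
print(pick_hardware(15.0)[0])   # high-power-gpu: only option under 15 ms
```

    A looser latency target widens the feasible set and lets cheaper-in-energy hardware win, which is where the reported 10 to 20 percent savings come from.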

    This tool is especially helpful for cloud customers, who lease systems from data centers and must select hardware from among thousands of options. “Most customers overestimate what they need; they choose over-capable hardware just because they don’t know any better,” Gadepally says.

    Growing green-computing awareness

    The energy saved by implementing these interventions also reduces the associated costs of developing AI, often by a one-to-one ratio. In fact, cost is usually used as a proxy for energy consumption. Given these savings, why aren’t more data centers investing in green techniques?

    “I think it’s a bit of an incentive-misalignment problem,” Samsi says. “There’s been such a race to build bigger and better models that almost every secondary consideration has been put aside.”

    They point out that while some data centers buy renewable-energy credits, these renewables aren’t enough to cover the growing energy demands. The majority of electricity powering data centers comes from fossil fuels, and water used for cooling is contributing to stressed watersheds. 

    Hesitancy may also exist because systematic studies on energy-saving techniques haven’t been conducted. That’s why the team has been pushing their research in peer-reviewed venues in addition to open-source repositories. Some big industry players, like Google DeepMind, have applied machine learning to increase data center efficiency but have not made their work available for others to deploy or replicate. 

    Top AI conferences are now pushing for ethics statements that consider how AI could be misused. The team sees the climate aspect as an AI ethics topic that has not yet been given much attention, but this also appears to be slowly changing. Some researchers are now disclosing the carbon footprint of training the latest models, and industry is showing a shift in energy transparency too, as in this recent report from Meta AI.

    They also acknowledge that transparency is difficult without tools that can show AI developers their consumption. Reporting is on the LLSC roadmap for this year. They want to be able to show every LLSC user, for every job, how much energy they consume and how this amount compares to others, similar to home energy reports.

    Part of this effort requires working more closely with hardware manufacturers to make getting these data off hardware easier and more accurate. If manufacturers can standardize the way the data are read out, then energy-saving and reporting tools can be applied across different hardware platforms. A collaboration is underway between the LLSC researchers and Intel to work on this very problem.

    Even AI developers who are aware of AI’s intense energy needs can’t do much on their own to curb it. The LLSC team wants to help other data centers apply these interventions and provide users with energy-aware options. Their first partnership is with the U.S. Air Force, a sponsor of this research, which operates thousands of data centers. Applying these techniques can make a significant dent in their energy consumption and cost.

    “We’re putting control into the hands of AI developers who want to lessen their footprint,” Gadepally says. “Do I really need to gratuitously train unpromising models? Am I willing to run my GPUs slower to save energy? To our knowledge, no other supercomputing center is letting you consider these options. Using our tools, today, you get to decide.”

    Visit this webpage to see the group’s publications related to energy-aware computing and findings described in this article.


    Desalination system could produce freshwater that is cheaper than tap water

    Engineers at MIT and in China are aiming to turn seawater into drinking water with a completely passive device that is inspired by the ocean, and powered by the sun.

    In a paper appearing today in the journal Joule, the team outlines the design for a new solar desalination system that takes in saltwater and heats it with natural sunlight.

    The configuration of the device allows water to circulate in swirling eddies, in a manner similar to the much larger “thermohaline” circulation of the ocean. This circulation, combined with the sun’s heat, drives water to evaporate, leaving salt behind. The resulting water vapor can then be condensed and collected as pure, drinkable water. In the meantime, the leftover salt continues to circulate through and out of the device, rather than accumulating and clogging the system.

    The new system has a higher water-production rate and a higher salt-rejection rate than any other passive solar desalination concept currently being tested.

    The researchers estimate that if the system is scaled up to the size of a small suitcase, it could produce about 4 to 6 liters of drinking water per hour and last several years before requiring replacement parts. At this scale and performance, the system could produce drinking water at a price cheaper than tap water.

    “For the first time, it is possible for water, produced by sunlight, to be even cheaper than tap water,” says Lenan Zhang, a research scientist in MIT’s Device Research Laboratory.

    The team envisions that a scaled-up device could passively produce enough drinking water to meet the daily requirements of a small family. The system could also supply off-grid coastal communities where seawater is easily accessible.

    Zhang’s study co-authors include MIT graduate student Yang Zhong and Evelyn Wang, the Ford Professor of Engineering, along with Jintong Gao, Jinfang You, Zhanyu Ye, Ruzhu Wang, and Zhenyuan Xu of Shanghai Jiao Tong University in China.

    A powerful convection

    The team’s new system improves on their previous design — a similar concept of multiple layers, called stages. Each stage contained an evaporator and a condenser that used heat from the sun to passively separate salt from incoming water. That design, which the team tested on the roof of an MIT building, efficiently converted the sun’s energy to evaporate water, which was then condensed into drinkable water. But the salt that was left over quickly accumulated as crystals that clogged the system after a few days. In a real-world setting, a user would have to replace stages frequently, which would significantly increase the system’s overall cost.

    In a follow-up effort, they devised a solution with a similar layered configuration, this time with an added feature that helped to circulate the incoming water as well as any leftover salt. While this design prevented salt from settling and accumulating on the device, it desalinated water at a relatively low rate.

    In the latest iteration, the team believes it has landed on a design that achieves both a high water-production rate and high salt rejection, meaning that the system can quickly and reliably produce drinking water for an extended period. The key to their new design is a combination of their two previous concepts: a multistage system of evaporators and condensers that is also configured to boost the circulation of water — and salt — within each stage.

    “We introduce now an even more powerful convection, that is similar to what we typically see in the ocean, at kilometer-long scales,” Xu says.

    The small circulations generated in the team’s new system are similar to the “thermohaline” convection in the ocean — a phenomenon that drives the movement of water around the world, based on differences in sea temperature (“thermo”) and salinity (“haline”).

    “When seawater is exposed to air, sunlight drives water to evaporate. Once water leaves the surface, salt remains. And the higher the salt concentration, the denser the liquid, and this heavier water wants to flow downward,” Zhang explains. “By mimicking this kilometer-wide [phenomenon] in [a] small box, we can take advantage of this feature to reject salt.”

    Tapping out

    The heart of the team’s new design is a single stage that resembles a thin box, topped with a dark material that efficiently absorbs the heat of the sun. Inside, the box is separated into a top and bottom section. Water can flow through the top half, where the ceiling is lined with an evaporator layer that uses the sun’s heat to warm up and evaporate any water in direct contact. The water vapor is then funneled to the bottom half of the box, where a condensing layer air-cools the vapor into salt-free, drinkable liquid. The researchers set the entire box at a tilt within a larger, empty vessel, then attached a tube from the top half of the box down through the bottom of the vessel, and floated the vessel in saltwater.

    In this configuration, water can naturally push up through the tube and into the box, where the tilt of the box, combined with the thermal energy from the sun, induces the water to swirl as it flows through. The small eddies help to bring water in contact with the upper evaporating layer while keeping salt circulating, rather than settling and clogging.

    The team built several prototypes, with one, three, and 10 stages, and tested their performance in water of varying salinity, including natural seawater and water that was seven times saltier.

    From these tests, the researchers calculated that if each stage were scaled up to a square meter, it would produce up to 5 liters of drinking water per hour, and that the system could desalinate water without accumulating salt for several years. Given this extended lifetime, and the fact that the system is entirely passive, requiring no electricity to run, the team estimates that the overall cost of running the system would be cheaper than what it costs to produce tap water in the United States.

    “We show that this device is capable of achieving a long lifetime,” Zhong says. “That means that, for the first time, it is possible for drinking water produced by sunlight to be cheaper than tap water. This opens up the possibility for solar desalination to address real-world problems.”

    “This is a very innovative approach that effectively mitigates key challenges in the field of desalination,” says Guihua Yu, who develops sustainable water and energy storage systems at the University of Texas at Austin, and was not involved in the research. “The design is particularly beneficial for regions struggling with high-salinity water. Its modular design makes it highly suitable for household water production, allowing for scalability and adaptability to meet individual needs.”

    Funding for the research at Shanghai Jiao Tong University was supported by the Natural Science Foundation of China.


    Improving US air quality, equitably

    Decarbonization of national economies will be key to achieving global net-zero emissions by 2050, a major stepping stone to the Paris Agreement’s long-term goal of keeping global warming well below 2 degrees Celsius (and ideally 1.5 C), and thereby averting the worst consequences of climate change. Toward that end, the United States has pledged to reduce its greenhouse gas emissions by 50-52 percent from 2005 levels by 2030, backed by its implementation of the 2022 Inflation Reduction Act. This strategy is consistent with a 50-percent reduction in carbon dioxide (CO2) by the end of the decade.

    If U.S. federal carbon policy is successful, the nation’s overall air quality will also improve. Cutting CO2 emissions reduces atmospheric concentrations of air pollutants that lead to the formation of fine particulate matter (PM2.5), which causes more than 200,000 premature deaths in the United States each year. But an average nationwide improvement in air quality will not be felt equally; air pollution exposure disproportionately harms people of color and lower-income populations.

    How effective are current federal decarbonization policies in reducing U.S. racial and economic disparities in PM2.5 exposure, and what changes will be needed to improve their performance? To answer that question, researchers at MIT and Stanford University recently evaluated a range of policies which, like current U.S. federal carbon policies, reduce economy-wide CO2 emissions by 40-60 percent from 2005 levels by 2030. Their findings appear in an open-access article in the journal Nature Communications.

    First, they show that a carbon-pricing policy, while effective in reducing PM2.5 exposure for all racial/ethnic groups, does not significantly mitigate relative disparities in exposure. On average, the white population undergoes far less exposure than Black, Hispanic, and Asian populations. This policy does little to reduce exposure disparities because the CO2 emissions reductions that it achieves primarily occur in the coal-fired electricity sector. Other sectors, such as industry and heavy-duty diesel transportation, contribute far more PM2.5-related emissions.

    The researchers then examine thousands of different reduction options through an optimization approach to identify whether any possible combination of carbon dioxide reductions in the range of 40-60 percent can mitigate disparities. They find that no policy scenario aligned with current U.S. carbon dioxide emissions targets is likely to significantly reduce current PM2.5 exposure disparities.

    “Policies that address only about 50 percent of CO2 emissions leave many polluting sources in place, and those that prioritize reductions for minorities tend to benefit the entire population,” says Noelle Selin, supervising author of the study and a professor at MIT’s Institute for Data, Systems and Society and Department of Earth, Atmospheric and Planetary Sciences. “This means that a large range of policies that reduce CO2 can improve air quality overall, but can’t address long-standing inequities in air pollution exposure.”

    So if climate policy alone cannot adequately achieve equitable air quality results, what viable options remain? The researchers suggest that more ambitious carbon policies could narrow racial and economic PM2.5 exposure disparities in the long term, but not within the next decade. To make a near-term difference, they recommend interventions designed to reduce PM2.5 emissions resulting from non-CO2 sources, ideally at the economic sector or community level.

    “Achieving improved PM2.5 exposure for populations that are disproportionately exposed across the United States will require thinking that goes beyond current CO2 policy strategies, most likely involving large-scale structural changes,” says Selin. “This could involve changes in local and regional transportation and housing planning, together with accelerated efforts towards decarbonization.”


    How to tackle the global deforestation crisis

    Imagine if France, Germany, and Spain were completely blanketed in forests — and then all those trees were quickly chopped down. That’s nearly the amount of deforestation that occurred globally between 2001 and 2020, with profound consequences.

    Deforestation is a major contributor to climate change, producing between 6 and 17 percent of global greenhouse gas emissions, according to a 2009 study. Meanwhile, because trees also absorb carbon dioxide, removing it from the atmosphere, they help keep the Earth cooler. And climate change aside, forests protect biodiversity.

    “Climate change and biodiversity make this a global problem, not a local problem,” says MIT economist Ben Olken. “Deciding to cut down trees or not has huge implications for the world.”

    But deforestation is often financially profitable, so it continues at a rapid rate. Researchers can now measure this trend closely: In the last quarter-century, satellite-based technology has led to a paradigm change in charting deforestation. New deforestation datasets, based on the Landsat satellites, for instance, track forest change since 2000 with resolution at 30 meters, while many other products now offer frequent imaging at close resolution.

    “Part of this revolution in measurement is accuracy, and the other part is coverage,” says Clare Balboni, an assistant professor of economics at the London School of Economics (LSE). “On-site observation is very expensive and logistically challenging, and you’re talking about case studies. These satellite-based data sets just open up opportunities to see deforestation at scale, systematically, across the globe.”

    Balboni and Olken have now helped write a new paper providing a road map for thinking about this crisis. The open-access article, “The Economics of Tropical Deforestation,” appears this month in the Annual Review of Economics. The co-authors are Balboni, a former MIT faculty member; Aaron Berman, a PhD candidate in MIT’s Department of Economics; Robin Burgess, an LSE professor; and Olken, MIT’s Jane Berkowitz Carlton and Dennis William Carlton Professor of Microeconomics. Balboni and Olken have also conducted primary research in this area, along with Burgess.

    So, how can the world tackle deforestation? It starts with understanding the problem.

    Replacing forests with farms

    Several decades ago, some thinkers, including the famous MIT economist Paul Samuelson in the 1970s, built models to study forests as a renewable resource; Samuelson calculated the “maximum sustained yield” at which a forest could be cleared while being regrown. These frameworks were designed to think about tree farms or the U.S. national forest system, where a fraction of trees would be cut each year, and then new trees would be grown over time to take their place.

    But deforestation today, particularly in tropical areas, often looks very different, and forest regeneration is not common.

    Indeed, as Balboni and Olken emphasize, deforestation is now rampant partly because the profits from chopping down trees come not just from timber, but from replacing forests with agriculture. In Brazil, deforestation has increased along with agricultural prices; in Indonesia, clearing trees accelerated as the global price of palm oil went up, leading companies to replace forests with oil palm plantations.

    All this tree-clearing creates a familiar situation: The globally shared costs of climate change from deforestation are “externalities,” as economists say, imposed on everyone else by the people removing forest land. It is akin to a company that pollutes a river, affecting the water quality of nearby residents.

    “Economics has changed the way it thinks about this over the last 50 years, and two things are central,” Olken says. “The relevance of global externalities is very important, and the conceptualization of alternate land uses is very important.” This also means traditional forest-management guidance about regrowth is not enough. With the economic dynamics in mind, which policies might work, and why?

    The search for solutions

    As Balboni and Olken note, economists often recommend “Pigouvian” taxes (named after the British economist Arthur Pigou) in these cases, levied against people imposing externalities on others. And yet, it can be hard to identify who is doing the deforesting.

    Instead of taxing people for clearing forests, governments can pay people to keep forests intact. The UN uses Payments for Environmental Services (PES) as part of its REDD+ (Reducing Emissions from Deforestation and forest Degradation) program. However, it is similarly tough to identify the optimal landowners to subsidize, and these payments may not match the quick cash-in of deforestation. A 2017 study in Uganda showed PES reduced deforestation somewhat; a 2022 study in Indonesia found no reduction; another 2022 study, in Brazil, showed again that some forest protection resulted.

    “There’s mixed evidence from many of these [studies],” Balboni says. These policies, she notes, must reach people who would otherwise clear forests, and a key question is, “How can we assess their success compared to what would have happened anyway?”

    Some places have tried cash transfer programs for larger populations. In Indonesia, a 2020 study found such subsidies reduced deforestation near villages by 30 percent. But in Mexico, a similar program meant more people could afford milk and meat, again creating demand for more agriculture and thus leading to more forest-clearing.

    At this point, it might seem that laws simply banning deforestation in key areas would work best — indeed, about 16 percent of the world’s land overall is protected in some way. Yet the dynamics of protection are tricky. Even with protected areas in place, there is still “leakage” of deforestation into other regions. 

    Still more approaches exist, including “nonstate agreements,” such as the Amazon Soy Moratorium in Brazil, in which grain traders pledged not to buy soy from deforested lands, and reduced deforestation without “leakage.”

    Also, intriguingly, a 2008 policy change in the Brazilian Amazon made agricultural credit harder to obtain by requiring recipients to comply with environmental and land registration rules. The result? Deforestation dropped by up to 60 percent over nearly a decade. 

    Politics and pulp

    Overall, Balboni and Olken observe, beyond “externalities,” two major challenges exist. One, it is often unclear who holds property rights in forests. In these circumstances, deforestation seems to increase. Two, deforestation is subject to political battles.

    For instance, as economist Bard Harstad of Stanford University has observed, environmental lobbying is asymmetric. Balboni and Olken write: “The conservationist lobby must pay the government in perpetuity … while the deforestation-oriented lobby need pay only once to deforest in the present.” And political instability leads to more deforestation because “the current administration places lower value on future conservation payments.”

    Even so, national political measures can work. In the Amazon from 2001 to 2005, Brazilian deforestation rates were three to four times higher than on similar land across the border, but that imbalance vanished once the country passed conservation measures in 2006. However, deforestation ramped up again after a 2014 change in government. Looking at particular monitoring approaches, a study of Brazil’s satellite-based Real-Time System for Detection of Deforestation (DETER), launched in 2004, suggests that a 50 percent annual increase in its use in municipalities created a 25 percent reduction in deforestation from 2006 to 2016.

    How precisely politics matters may depend on the context. In a 2021 paper, Balboni and Olken (with three colleagues) found that deforestation actually decreased around elections in Indonesia. Conversely, in Brazil, one study found that deforestation rates were 8 to 10 percent higher where mayors were running for re-election between 2002 and 2012, suggesting incumbents had deforestation industry support.

    “The research there is aiming to understand what the political economy drivers are,” Olken says, “with the idea that if you understand those things, reform in those countries is more likely.”

    Looking ahead, Balboni and Olken also suggest that new research estimating the value of intact forest land could influence public debates. And while many scholars have studied deforestation in Brazil and Indonesia, fewer have examined the Democratic Republic of Congo, another deforestation leader, and sub-Saharan Africa.

    Deforestation is an ongoing crisis. But thanks to satellites and many recent studies, experts know vastly more about the problem than they did a decade or two ago, and with an economics toolkit, can evaluate the incentives and dynamics at play.

    “To the extent that there’s ambiguity across different contexts with different findings, part of the point of our review piece is to draw out common themes — the important considerations in determining which policy levers can [work] in different circumstances,” Balboni says. “That’s a fast-evolving area. We don’t have all the answers, but part of the process is bringing together growing evidence about [everything] that affects how successful those choices can be.”


    Tracking US progress on the path to a decarbonized economy

    Investments in new technologies and infrastructure that help reduce greenhouse gas emissions — everything from electric vehicles to heat pumps — are growing rapidly in the United States. Now, a new database enables these investments to be comprehensively monitored in real time, thereby helping to assess the efficacy of policies designed to spur clean investments and address climate change.

    The Clean Investment Monitor (CIM), developed by a team at MIT’s Center for Energy and Environmental Policy Research (CEEPR) led by Institute Innovation Fellow Brian Deese and in collaboration with the Rhodium Group, an independent research firm, provides a timely and methodologically consistent tracking of all announced public and private investments in the manufacture and deployment of clean technologies and infrastructure in the U.S. The CIM offers a means of assessing the country’s progress in transitioning to a cleaner economy and reducing greenhouse gas emissions.

    In the year from July 1, 2022, to June 30, 2023, data from the CIM show, clean investments nationwide totaled $213 billion. To put that figure in perspective, 18 U.S. states each have a GDP below $213 billion.

    “As clean technology becomes a larger and larger sector in the United States, its growth will have far-reaching implications — for our economy, for our leadership in innovation, and for reducing our greenhouse gas emissions,” says Deese, who served as the director of the White House National Economic Council from January 2021 to February 2023. “The Clean Investment Monitor is a tool designed to help us understand and assess this growth in a real-time, comprehensive way. Our hope is that the CIM will enhance research and improve public policies designed to accelerate the clean energy transition.”

    Launched on Sept. 13, the CIM shows that the $213 billion invested over the last year reflects a 37 percent increase from the $155 billion invested in the previous 12-month period. According to CIM data, the fastest growth has been in the manufacturing sector, where investment grew 125 percent year-on-year, particularly in electric vehicle and solar manufacturing.

    Beyond manufacturing, the CIM also provides data on investment in clean energy production, such as solar, wind, and nuclear; industrial decarbonization, such as sustainable aviation fuels; and retail investments by households and businesses in technologies like heat pumps and zero-emission vehicles. The CIM’s data goes back to 2018, providing a baseline before the passage of the legislation in 2021 and 2022.

    “We’re really excited to bring MIT’s analytical rigor to bear to help develop the Clean Investment Monitor,” says Christopher Knittel, the George P. Shultz Professor of Energy Economics at the MIT Sloan School of Management and CEEPR’s faculty director. “Bolstered by Brian’s keen understanding of the policy world, this tool is poised to become the go-to reference for anyone looking to understand clean investment flows and what drives them.”

    In 2021 and 2022, the U.S. federal government enacted a series of new laws that together aimed to catalyze the largest-ever national investment in clean energy technologies and related infrastructure. The Clean Investment Monitor can also be used to track how well the legislation is living up to expectations.

    The three pieces of federal legislation — the Infrastructure Investment and Jobs Act, enacted in 2021, and the Inflation Reduction Act (IRA) and the CHIPS and Science Act, both enacted in 2022 — provide grants, loans, loan guarantees, and tax incentives to spur investments in technologies that reduce greenhouse gas emissions.

    The effectiveness of the legislation in hastening the U.S. transition to a clean economy will be crucial in determining whether the country reaches its goal of reducing greenhouse gas emissions by 50 percent to 52 percent below 2005 levels in 2030. An analysis earlier this year estimated that the IRA will lead to a 43 percent to 48 percent decline in economywide emissions below 2005 levels by 2035, compared with 27 percent to 35 percent in a reference scenario without the law’s provisions, helping to bring the U.S. goal within reach.

    The Clean Investment Monitor is available at cleaninvestmentmonitor.org.

  • in

    Desirée Plata appointed co-director of the MIT Climate and Sustainability Consortium

    Desirée Plata, associate professor of civil and environmental engineering at MIT, has been named co-director of the MIT Climate and Sustainability Consortium (MCSC), effective Sept. 1. Plata will serve on the MCSC’s leadership team alongside Anantha P. Chandrakasan, dean of the MIT School of Engineering, the Vannevar Bush Professor of Electrical Engineering and Computer Science, and MCSC chair; Elsa Olivetti, the Jerry McAfee Professor in Engineering, a professor of materials science and engineering, associate dean of engineering, and MCSC co-director; and Jeremy Gregory, MCSC executive director.

    Plata succeeds Jeffrey Grossman, the Morton and Claire Goulder and Family Professor in Environmental Systems, who has served as co-director since the MCSC’s launch in January 2021. Grossman, who played a central role in the ideation and launch of the MCSC, will continue his work with the consortium as a strategic advisor.

    “Professor Plata is a valued member of the MIT community. She brings a deep understanding of and commitment to climate and sustainability initiatives at MIT, as well as extensive experience working with industry, to her new role within the MCSC,” says Chandrakasan.

    The MIT Climate and Sustainability Consortium is an academia-industry collaboration working to accelerate the implementation of large-scale solutions across sectors of the global economy. It aims to lay the groundwork for one critical aspect of MIT’s continued and intensified commitment to climate action: helping large companies usher in, adapt to, and prosper in a decarbonized world.

    “We are thrilled to bring Professor Plata’s knowledge, vision, and passion to our leadership team,” says Olivetti. “Her experience developing sustainable technologies that have the potential to improve the environment and reduce the impacts of climate change will help move our work forward in meaningful ways. We have valued Professor Plata’s contributions to the consortium and look forward to continuing our work with her.”

    Plata played a pivotal role in the creation and launch of the MCSC’s Climate and Sustainability Scholars Program and its yearlong course for MIT rising juniors and seniors — an effort for which she and Olivetti were recently recognized with the Class of 1960 Innovation in Education Fellowship. She has also been a member of the MCSC’s Faculty Steering Committee since the consortium’s launch, helping to shape and guide its vision and work.

    Plata is a dedicated researcher, educator, and mentor. A member of MIT’s faculty since 2018, she and her team at the Plata Lab are helping to guide industry toward more environmentally sustainable practices and developing new ways to protect the health of the planet, using chemistry to understand the impact that industrial materials and processes have on the environment. By coupling devices that simulate industrial systems with computation, she helps industry develop more environmentally friendly practices.

    In recognition of her work in the lab, classroom, and community, Plata has received many awards and honors. In 2020, she won MIT’s prestigious Harold E. Edgerton Faculty Achievement Award, recognizing her innovative approach to environmentally sustainable industrial practices, her inspirational teaching and mentoring, and her service to MIT and the community. She is a two-time National Academy of Sciences Kavli Frontiers of Science Fellow, a two-time National Academy of Engineering Frontiers of Engineering Fellow, and a Caltech Young Investigator Sustainability Fellow. She has also won the ACS C. Ellen Gonter Environmental Chemistry Award, an NSF CAREER award, and the 2016 Odebrecht Award for Sustainable Development.

    Beyond her work in the academic space, Plata is co-founder of two climate- and energy-related startups, Nth Cycle and Moxair, illustrating her commitment to translating academic innovations into real-world implementation — a core value of the MCSC.

    Plata received her bachelor’s degree from Union College and her PhD from the MIT and Woods Hole Oceanographic Institution (MIT-WHOI) joint program in oceanography/applied ocean science and engineering. After receiving her doctorate, she held positions at Mount Holyoke College, Duke University, and Yale University.

  • in

    AI pilot programs look to reduce energy use and emissions on MIT campus

    Smart thermostats have changed the way many people heat and cool their homes by using machine learning to respond to occupancy patterns and preferences, resulting in a lower energy draw. This technology — which can collect and synthesize data — generally focuses on single-dwelling use, but what if this type of artificial intelligence could dynamically manage the heating and cooling of an entire campus? That’s the idea behind a cross-departmental effort working to reduce campus energy use through AI building controls that respond in real time to internal and external factors.

    Understanding the challenge

    Heating and cooling can be an energy challenge for campuses like MIT, where existing building management systems (BMS) can’t respond quickly to internal factors like occupancy fluctuations or external factors such as forecast weather or the carbon intensity of the grid. This results in using more energy than needed to heat and cool spaces, often to suboptimal levels. By engaging AI, researchers have begun to establish a framework to understand and predict optimal temperature set points (the target temperature a thermostat is set to maintain) at the individual room level while taking a host of factors into consideration, allowing the existing systems to heat and cool more efficiently, all without manual intervention.

    “It’s not that different from what folks are doing in houses,” explains Les Norford, a professor of architecture at MIT, whose work in energy studies, controls, and ventilation connected him with the effort. “Except we have to think about things like how long a classroom may be used in a day, weather predictions, time needed to heat and cool a room, the effect of the heat from the sun coming in the window, and how the classroom next door might impact all of this.” These factors are at the crux of the research and pilots that Norford and a team are focused on. That team includes Jeremy Gregory, executive director of the MIT Climate and Sustainability Consortium; Audun Botterud, principal research scientist for the Laboratory for Information and Decision Systems; Steve Lanou, project manager in the MIT Office of Sustainability (MITOS); Fran Selvaggio, Department of Facilities Senior Building Management Systems engineer; and Daisy Green and You Lin, both postdocs.

    The group is organized around the call to action to “explore possibilities to employ artificial intelligence to reduce on-campus energy consumption” outlined in Fast Forward: MIT’s Climate Action Plan for the Decade, but efforts extend back to 2019. “As we work to decarbonize our campus, we’re exploring all avenues,” says Vice President for Campus Services and Stewardship Joe Higgins, who originally pitched the idea to students at the 2019 MIT Energy Hack. “To me, it was a great opportunity to utilize MIT expertise and see how we can apply it to our campus and share what we learn with the building industry.” Research into the concept kicked off at the event and continued with undergraduate and graduate student researchers running differential equations and managing pilots to test the bounds of the idea. Soon, Gregory, who is also a MITOS faculty fellow, joined the project and helped identify other individuals to join the team. “My role as a faculty fellow is to find opportunities to connect the research community at MIT with challenges MIT itself is facing — so this was a perfect fit for that,” Gregory says. 

    Early pilots of the project focused on testing thermostat set points in NW23, home to the Department of Facilities and Office of Campus Planning, but Norford quickly realized that classrooms provide many more variables to test, and the pilot was expanded to Building 66, a mixed-use building that is home to classrooms, offices, and lab spaces. “We shifted our attention to study classrooms in part because of their complexity, but also the sheer scale — there are hundreds of them on campus, so [they offer] more opportunities to gather data and determine parameters of what we are testing,” says Norford. 

    Developing the technology

    The work to develop smarter building controls starts with a physics-based model using differential equations to understand how objects heat up or cool down, how they store heat, and how heat flows across a building façade. External data like weather, carbon intensity of the power grid, and classroom schedules are also inputs, with the AI responding to these conditions to deliver an optimal thermostat set point each hour — one that provides the best trade-off between the two objectives of thermal comfort for occupants and energy use. That set point then tells the existing BMS how much to heat up or cool down a space. Real-life testing follows, surveying building occupants about their comfort. Botterud, whose research focuses on the interactions between engineering, economics, and policy in electricity markets, works to ensure that the AI algorithms can then translate this learning into energy and carbon emission savings.
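    The hourly comfort-versus-energy trade-off described above can be illustrated with a deliberately simple sketch. Everything here is a hypothetical stand-in — the cost functions, the comfort weight, and the candidate range are illustrative assumptions, not the team’s actual physics-based model:

    ```python
    # Toy sketch (illustrative assumptions only): choose an hourly thermostat
    # set point that balances occupant comfort against the energy needed to
    # hold a room at that temperature, weighted by grid carbon intensity.

    def energy_cost(setpoint_c: float, outdoor_c: float, carbon_intensity: float) -> float:
        """Energy use roughly scales with the indoor-outdoor temperature gap,
        weighted here by the grid's carbon intensity (gCO2/kWh)."""
        return abs(setpoint_c - outdoor_c) * carbon_intensity

    def discomfort_cost(setpoint_c: float, preferred_c: float, occupancy: int) -> float:
        """Discomfort grows with distance from the preferred temperature and
        matters more when more people occupy the room."""
        return occupancy * (setpoint_c - preferred_c) ** 2

    def choose_setpoint(outdoor_c, carbon_intensity, preferred_c, occupancy,
                        candidates=range(16, 27), comfort_weight=5.0):
        """Return the candidate set point (in Celsius) with the lowest
        combined energy-plus-discomfort cost for this hour."""
        return min(
            candidates,
            key=lambda s: energy_cost(s, outdoor_c, carbon_intensity)
            + comfort_weight * discomfort_cost(s, preferred_c, occupancy),
        )
    ```

    In this toy model, an empty room drifts toward the outdoor temperature to save energy, while an occupied room is held near the preferred temperature; the comfort weight sets where the balance falls, much as the real controller trades comfort against energy use and the carbon intensity of the grid.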

    Currently the pilots are focused on six classrooms within Building 66, with the intent to move on to lab spaces before expanding to the entire building. “The goal here is energy savings, but that’s not something we can fully assess until we complete a whole building,” explains Norford. “We have to work classroom by classroom to gather the data, but are looking at a much bigger picture.” The research team used its data-driven simulations to estimate significant energy savings while maintaining thermal comfort in the six classrooms over two days, but further work is needed to implement the controls and measure savings across an entire year.

    With significant savings estimated across individual classrooms, the energy savings derived from an entire building could be substantial, and AI can help meet that goal, explains Botterud: “This whole concept of scalability is really at the heart of what we are doing. We’re spending a lot of time in Building 66 to figure out how it works and hoping that these algorithms can be scaled up with much less effort to other rooms and buildings so solutions we are developing can make a big impact at MIT,” he says.

    Part of that big impact involves operational staff, like Selvaggio, who are essential in connecting the research to current operations and putting it into practice across campus. “Much of the BMS team’s work is done in the pilot stage for a project like this,” he says. “We were able to get these AI systems up and running with our existing BMS within a matter of weeks, allowing the pilots to get off the ground quickly.” Selvaggio says that in preparation for the completion of the pilots, the BMS team has identified an additional 50 buildings on campus where the technology can easily be installed in the future to begin delivering energy savings. The BMS team also collaborates with building automation company Schneider Electric, which has implemented the new control algorithms in Building 66 classrooms and is ready to expand to new pilot locations.

    Expanding impact

    The successful completion of these programs will also open the possibility for even greater energy savings — bringing MIT closer to its decarbonization goals. “Beyond just energy savings, we can eventually turn our campus buildings into a virtual energy network, where thousands of thermostats are aggregated and coordinated to function as a unified virtual entity,” explains Higgins. These types of energy networks can accelerate power sector decarbonization by decreasing the need for carbon-intensive power plants at peak times and allowing for more efficient power grid energy use.

    As pilots continue, they fulfill another call to action in Fast Forward — for campus to be a “test bed for change.” Says Gregory: “This project is a great example of using our campus as a test bed — it brings in cutting-edge research to apply to decarbonizing our own campus. It’s a great project for its specific focus, but also for serving as a model for how to utilize the campus as a living lab.”

  • in

    Jackson Jewett wants to design buildings that use less concrete

    After three years leading biking tours through U.S. National Parks, Jackson Jewett decided it was time for a change.

    “It was a lot of fun, but I realized I missed buildings,” says Jewett. “I really wanted to be a part of that industry, learn more about it, and reconnect with my roots in the built environment.”

    Jewett grew up in California in what he describes as a “very creative household.”

    “I remember making very elaborate Halloween costumes with my parents, making fun dioramas for school projects, and building forts in the backyard, that kind of thing,” Jewett explains.

    Both of his parents have backgrounds in design; his mother studied art in college and his father is a practicing architect. From a young age, Jewett was interested in following in his father’s footsteps. But when he arrived at the University of California at Berkeley in the midst of the 2009 housing crash, it didn’t seem like the right time. Jewett graduated with a degree in cognitive science and a minor in history of architecture. And even as he led tours through Yellowstone, the Grand Canyon, and other parks, buildings were in the back of his mind.

    It wasn’t just the built environment that Jewett was missing. He also longed for the rigor and structure of an academic environment.

    Jewett arrived at MIT in 2017, initially only planning on completing the master’s program in civil and environmental engineering. It was then that he first met Josephine Carstensen, a newly hired lecturer in the department. Jewett was interested in Carstensen’s work on “topology optimization,” which uses algorithms to design structures that can achieve their performance requirements while using only a limited amount of material. He was particularly interested in applying this approach to concrete design, and he collaborated with Carstensen to help demonstrate its viability.
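    The core idea of topology optimization can be sketched in a deliberately reduced one-dimensional form. The example below is only an illustration of that idea — the load-carrying elements, the compliance objective, and the update rule are simplifying assumptions; practical methods such as SIMP operate on 2D or 3D finite-element meshes with filtering and penalization:

    ```python
    # Toy sketch (illustrative, 1D): distribute a fixed material budget across
    # load-carrying elements so that total compliance (flexibility) is minimized.

    def compliance(material, loads, E=1.0):
        """Total compliance: sum of F^2 / (E * x). More material in an
        element makes it stiffer, so it deflects less under its load."""
        return sum(f * f / (E * x) for f, x in zip(loads, material))

    def optimize(loads, budget, iters=50):
        """Optimality-criteria-style redistribution: shift material toward
        elements with high strain energy, then rescale to the fixed budget."""
        n = len(loads)
        x = [budget / n] * n  # start from a uniform material layout
        for _ in range(iters):
            # Sensitivity of compliance to adding material scales with (F/x)^2
            sens = [(f / xi) ** 2 for f, xi in zip(loads, x)]
            # Update each element, with a small floor to avoid zero material
            x = [max(xi * s ** 0.5, 1e-9) for xi, s in zip(x, sens)]
            scale = budget / sum(x)  # enforce the material budget exactly
            x = [xi * scale for xi in x]
        return x
    ```

    Starting from a uniform layout, the update concentrates material in the most heavily loaded elements, so the same material budget yields a stiffer structure — the performance-per-unit-material trade-off that motivates applying such methods to concrete design.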

    After earning his master’s, Jewett spent a year and a half as a structural engineer in New York City. But when Carstensen was hired as a professor, she reached out to Jewett about joining her lab as a PhD student. He was ready for another change.

    Now in the third year of his PhD program, Jewett’s dissertation work builds upon his master’s thesis to further refine algorithms that can design building-scale concrete structures that use less material, which would help lower carbon emissions from the construction industry. It is estimated that the concrete industry alone is responsible for 8 percent of global carbon emissions, so any efforts to reduce that number could help in the fight against climate change.

    Implementing new ideas

    Topology optimization is a small field, with the bulk of the prior work being computational without any experimental verification. The work Jewett completed for his master’s thesis was just the start of a long learning process.

    “I do feel like I’m just getting to the part where I can start implementing my own ideas without as much support as I’ve needed in the past,” says Jewett. “In the last couple of months, I’ve been working on a reinforced concrete optimization algorithm that I hope will be the cornerstone of my thesis.”

    The process of fine-tuning a generative algorithm is slow going, particularly when tackling a multifaceted problem.

    “It can take days or usually weeks to take a step toward making it work as an entire integrated system,” says Jewett. “The days when that breakthrough happens and I can see the algorithm converging on a solution that makes sense — those are really exciting moments.”

    By harnessing computational power, Jewett is searching for materially efficient components that can be used to make up structures such as bridges or buildings. There are other constraints to consider as well, particularly ensuring that the cost of manufacturing isn’t too high. Having worked in the industry before starting the PhD program, Jewett has an eye toward doing work that can be feasibly implemented.

    Inspiring others

    When Jewett first visited the MIT campus, he was drawn in by the collaborative environment of the Institute and the students’ drive to learn. Now, he’s a part of that process as a teaching assistant and a supervisor in the Undergraduate Research Opportunities Program.

    Working as a teaching assistant isn’t a requirement for Jewett’s program, but it’s been one of his favorite parts of his time at MIT.

    “The MIT undergrads are so gifted and just constantly impress me,” says Jewett. “Being able to teach, especially in the context of what MIT values, is a lot of fun. And I learn, too. My coding practices have gotten so much better since working with undergrads here.”

    Jewett’s experiences have inspired him to pursue a career in academia after the completion of his program, which he expects to complete in the spring of 2025. But he’s making sure to take care of himself along the way. He still finds time to plan cycling trips with his friends and has gotten into running ever since moving to Boston. So far, he’s completed two marathons.

    “It’s so inspiring to be in a place where so many good ideas are just bouncing back and forth all over campus,” says Jewett. “And on most days, I remember that and it inspires me. But it’s also the case that academics is hard, PhD programs are hard, and MIT — there’s pressure being here, and sometimes that pressure can feel like it’s working against you.”

    Jewett is grateful for the mental health resources that MIT provides students. While he says they can be imperfect, they’ve been a crucial part of his journey.

    “My PhD thesis will be done in 2025, but the work won’t be done. The time horizon of when these things need to be implemented is relatively short if we want to make an impact before global temperatures have already risen too high. My PhD research will be developing a framework for how that could be done with concrete construction, but I’d like to keep thinking about other materials and construction methods even after this project is finished.”