More stories

  • Solve Challenge Finals 2023: Action in service to the world

    In a celebratory convergence of innovation and global impact, the 2023 Solve Challenge Finals, hosted by MIT Solve, welcomed the 2023 Solver Class. These teams, resolute in their commitment to addressing Solve’s 2023 Global Challenges and rooted in advancing the United Nations’ Sustainable Development Goals, are prime examples of the impact technology can have when directed toward social good.

    To set the tone of the day, Cynthia Barnhart, MIT provost, called for bold action in service to the world, and Hala Hanna, MIT Solve executive director, urged the new Solver teams and attendees to harness the power of technology for benevolent purposes. “Humans have lived with the dichotomy of technology since the dawn of time. Today we find ourselves at another juncture with generative AI, and we have choices to make. So, what if we choose that every line of code heals, and every algorithm uplifts, and every device includes?” she said during the opening plenary, Tech-Powered and Locally-Led: Solutions for Global Progress.

    Global, intergenerational, and contextual change for good

    This year’s Solve Challenge Finals served as a global platform for reflection. Majid Al Suwaidi, director-general of COP28, shared the experiences that have shaped his approach to climate negotiation. He recounted a poignant visit to a refugee camp, facilitated by the United Nations High Commissioner for Refugees, that housed 300,000 climate migrants. There he met a mother and her nine children. In the sprawling camp, scarcity was evident, with just one toilet for every 100 residents. “There are people who contribute nothing to the problem but are impacted the most,” Al Suwaidi emphasized, stressing the need to prioritize those most affected by climate change when crafting solutions.

    Moderator Lysa John, secretary-general of CIVICUS, steered the conversation toward Africa’s growing influence during her fireside chat with David Sengeh SM ’12, PhD ’16, chief minister of Sierra Leone, and Toyin Saraki, president of the Wellbeing Foundation. The African Union was recently named a permanent member of the G20. Saraki passionately advocated for Africa to assert itself: “I would like this to be more than just the North recognizing the South. This is the time now for us to bring African intelligence to the forefront. We have to bring our own people, our own data, our own resources.” She also called for an intergenerational shift, recognizing the readiness of the younger generation to lead.

    Sengeh, himself 36, emphasized that young people are natural leaders, especially in a nation where youth make up 70 percent of the population. He challenged the status quo, urging society to entrust leadership roles to the younger generation.

    Saraki praised Solve as a vital incubation hub, satisfying the need for contextual innovation while contributing to global progress. She views Solve as a marketplace of solutions to systemic weaknesses, drawing upon the diverse approaches of innovators both young and old. “That is the generation of intelligence that needs to grow, not just in Africa. Solve is amazing for that, it’s an investor’s delight,” she said.

    Henrietta Fore, chair and CEO of Holsman International and managing partner of Radiate Capital, shared an example of entrepreneurship catalyzed by country-level leaders, referencing India’s Swachh Bharat program, which aims to promote cleaner environments. The government initiative led to a burst of entrepreneurial activity, with women opening shops selling toilets and bathroom commodities. Fore highlighted the potential for companies to collaborate with countries on such programs, creating momentum and innovation.

    Trust as capital

    Trust was a prevalent theme throughout the event, from personal to business levels.

    Johanna Mair, academic editor of the Stanford Social Innovation Review, asked Sarah Chandler, vice president of environment and supply chain innovation at Apple, what advice she had for corporations and startups thinking about their holistic climate goals. Chandler emphasized that businesses must trust that environmental goals can align with business goals, pointing to Apple’s 45 percent reduction in carbon footprint since 2015 alongside a 65 percent increase in revenue.

    Neela Montgomery, board partner at Greycroft, discussed her initial skepticism around collaborating with large entities, seeking advice from Ilan Goldfajn, president of the Inter-American Development Bank. “Don’t be shy to come … take advantage of a multilateral bank … think about multilateral organizations as the ones to make connections. We can be your support commercially and financially, we could be your clients, and we could be your promoters,” said Goldfajn.

    During a fireside chat between Janti Soeripto, president and CEO of Save the Children USA, and Imran Ahmed, founder and CEO of the Center for Countering Digital Hate, Soeripto shared her belief that the most effective change comes from the country and local community level. She pointed to an example in which Save the Children invested in scaling a small Australian ed-tech startup, Library for All. The partnership positively impacted literacy in communities around the world by making literature more accessible.

    Major hurdles still exist for small enterprises entering the global market. Ahmed pointed to institutional sclerosis and a hesitancy to trust small-scale innovation as roadblocks to meaningful change.

    The final discussion of the closing plenary, Funding the Future: Scaling up Inclusive Impact, featured Fore; Mohammed Nanabhay, managing partner of Mozilla Ventures; and Alfred Ironside, vice president of communications at MIT, who asked the two panelists, “What do you [look for] when thinking about putting money into leaders and organizations who are on this mission to create impact and achieve scale?”

    Beyond aligning principles with organizations, Nanabhay said that he looks for tenacity and, most importantly, trust in oneself. “Entrepreneurship is a long journey, it’s a hard journey — whether you’re on the for-profit side or the nonprofit side. It’s easy to say people should have grit, everyone says this. When the time comes and you’re struggling … you need to have the fundamental belief that what you’re working on is meaningful and that it’s going to make the world better.”

  • A reciprocal relationship with the land in Hawaiʻi

    Aja Grande grew up on the Hawaiian island of Oʻahu, between the Kona and ʻEwa districts, nurtured by her community and the natural environment. Her family has lived in Hawaiʻi for generations; while she is not “Kanaka ʻŌiwi,” of native Hawaiian descent, she is proud to trace her family’s history to the time of the Hawaiian Kingdom in the 19th century. Grande is now a PhD candidate in MIT’s HASTS (History, Anthropology, Science, Technology and Society) program, and part of her dissertation tracks how Hawaiian culture and people’s relationship with the land have evolved over time.

    “The fondest memories I have are camping on the north shore every summer with my friends,” says Grande. “I loved being in ‘ke kai’ (the sea) and ‘ma uka,’ (inland, in the mountains) with my friends when I was younger. It was just pure fun exploring ‘ʻāina’ like that.” “‘Āina” in the Hawaiian language is often defined as “land,” but is understood to the people of Hawaiʻi as “that which feeds.”

    “Now that I’m older,” Grande adds, “I’m connecting the dots and realizing how much knowledge about the complex systems of ‘ahupuaʻa’ [traditional Hawaiian divisions of land that extend from the mountains to the sea] I actually gained through these experiences.”

    Grande recently completed a year of fieldwork in Hawaiʻi, where she volunteered with land-based, or ʻāina-based, organizations. In the movement to restore ʻāina to “momona,” or “fertile and abundant lands,” the land and the people who serve as its stewards are of equal importance.

    “I’m looking at how people who are not Kanaka ‘Ōiwi, or native Hawaiian, by descent can participate in this kind of restoration, and what it means for both Kanaka ‘Ōiwi and non-Kanaka ‘Ōiwi to participate in it,” says Grande, who herself descends from immigrants of other island nations. “Some of my ancestors were born and raised in Hawaiʻi before the U.S. subjected Hawaiʻi as a state and territory, meaning that some of them were Hawaiian Kingdom subjects. While I am not Kanaka ʻŌiwi by lineage, some of my ‘ohana nui (extended family), from these same ancestors, are Kanaka ʻŌiwi. I’m writing about how being Hawaiian, from a Hawaiian sovereignty standpoint, is not just about race and ethnicity. When Hawaiʻi was a sovereign nation, Hawaiian citizenship was never afforded on the basis of race alone. It was also based on your lifelong commitment to ‘āina and the people of Hawaiʻi.”

    The project is personal to Grande, who describes both the content and the process of writing it as part of her healing journey. She hopes to lay the groundwork for others in Hawaiʻi who are “hoaʻāina,” or “those who actively care for ʻāina,” but not Kanaka ʻŌiwi, helping them better articulate their identities and foster a deeper connection with the ʻāina and the “kaiāulu,” or “community,” they love and actively care for.

    Returning home

    Grande has spent her academic career on the East Coast, first at Brown University, where she received a degree in science, technology, and society, and now at MIT in the HASTS program. She swam competitively through her second year of college, and had earlier represented Hawaiʻi at the 2012 Oceania Games in New Caledonia. Once she stopped swimming, Grande first used her newfound time to travel the world. Tired of this transient lifestyle, she realized she was more interested in exploring her connection to land in a more rooted way.

    “Moving around, especially as a college student, it’s very hard to grow things,” says Grande. “People are a lot like plants. You really just need to let plants do their thing in place. We do really well and we thrive when we can be connected to place.”

    Grande started by founding the Ethnobotany Society at Brown to explore the relationship people have to plants. With the group she organized nature walks, collaborated with local farms, and connected these experiences to the history she was learning in class.

    Still, the East Coast never quite felt like home to Grande. When she started planning for the fieldwork portion of her program, she envisioned spending half the year in New England and half in Hawaiʻi. But she soon realized how important it was for both her research and herself to dedicate everything to Hawaiʻi.

    “When I came back, it just felt so right to be back home,” says Grande. “The feeling in your naʻau — your ‘gut’ — of knowing that you have to contribute to Hawaiʻi is very powerful, and I think a lot of people here understand what that means. It’s kind of like a calling.”

    Hoaʻāina, community, family

    Once Grande made the decision to return home for her fieldwork, she says everything fell into place.

    “I knew that I wanted to do something close to my heart. It’s a huge privilege because I was able to come home and learn more about myself and my family and how we are connected to Hawaiʻi,” she says.

    During her year of fieldwork, Grande learned how hoaʻāina cultivate spaces where the community can work alongside one another to plant traditional food and medicinal crops, control invasive species, and more. She wasn’t just an observer, either. As much as Grande learned from an academic perspective, her personal growth has been intertwined with the entire process.

    “The most interesting part was that all the hoaʻāina I volunteered with helped me to understand my place back home,” says Grande. “They were my informants but also — this usually happens with anthropologists — people become your friends. The hoaʻāina I volunteered with treated me like family. They also got to know some of my family members, who joined me to volunteer at different sites. It’s sometimes hard to draw a hard line between what fieldwork is and what your personal life is because when you’re in the field, there’s so many events that are connected to your work. It was so fun and meaningful to write about the ʻāina and people I consider my community and family.”

    The movement doesn’t start or end with Grande’s dissertation. Pursuing this project has given her the language to articulate her own relationship with ‘āina, and she hopes it will empower others to reexamine how they exist in relation to land.

    After completing her program, Grande intends to stay in Hawaiʻi and continue her philanthropic work while contributing to the movement of ʻāina momona.

    “We want the land to live and to keep a relationship with the land. That’s the emotional part. I have a ‘kuleana,’ (duty and responsibility) to everything that I learned while growing up, including the ʻāina and ‘kaiāulu,’ (community) who raised me. The more you learn, there’s so much that you want to protect about the culture and this ‘āina.”

  • New tools are available to help reduce the energy that AI models devour

    When searching for flights on Google, you may have noticed that each flight’s carbon-emission estimate is now presented next to its cost. It’s a way to inform customers about their environmental impact, and to let them factor this information into their decision-making.

    A similar kind of transparency doesn’t yet exist for the computing industry, despite its carbon emissions exceeding those of the entire airline industry. Artificial intelligence models are escalating this energy demand. Huge, popular models like ChatGPT signal a trend of large-scale artificial intelligence, boosting forecasts that predict data centers will draw up to 21 percent of the world’s electricity supply by 2030.

    The MIT Lincoln Laboratory Supercomputing Center (LLSC) is developing techniques to help data centers rein in energy use. Their techniques range from simple but effective changes, like power-capping hardware, to adopting novel tools that can stop AI training early. Crucially, they have found that these techniques have a minimal impact on model performance.

    In the wider picture, their work is mobilizing green-computing research and promoting a culture of transparency. “Energy-aware computing is not really a research area, because everyone’s been holding on to their data,” says Vijay Gadepally, senior staff in the LLSC who leads energy-aware research efforts. “Somebody has to start, and we’re hoping others will follow.”

    Curbing power and cooling down

    Like many data centers, the LLSC has seen a significant uptick in the number of AI jobs running on its hardware. Noticing an increase in energy usage, computer scientists at the LLSC were curious about ways to run jobs more efficiently. Green computing is a principle of the center, which is powered entirely by carbon-free energy.

    Training an AI model — the process by which it learns patterns from huge datasets — requires using graphics processing units (GPUs), which are power-hungry hardware. As one example, the GPUs that trained GPT-3 (the precursor to ChatGPT) are estimated to have consumed 1,300 megawatt-hours of electricity, roughly equal to that used by 1,450 average U.S. households per month.

    While most people seek out GPUs because of their computational power, manufacturers offer ways to limit the amount of power a GPU is allowed to draw. “We studied the effects of capping power and found that we could reduce energy consumption by about 12 percent to 15 percent, depending on the model,” Siddharth Samsi, a researcher within the LLSC, says.

    The trade-off for capping power is increasing task time — GPUs will take about 3 percent longer to complete a task, an increase Gadepally says is “barely noticeable” considering that models are often trained over days or even months. In one of their experiments in which they trained the popular BERT language model, limiting GPU power to 150 watts saw a two-hour increase in training time (from 80 to 82 hours) but saved the equivalent of a U.S. household’s week of energy.

    The team then built software that plugs this power-capping capability into the widely used scheduler system, Slurm. The software lets data center owners set limits across their system or on a job-by-job basis.
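
    The LLSC’s Slurm integration has not been published as a stand-alone package, but the underlying mechanism is the driver’s power-limit control. Below is a minimal sketch in Python using NVIDIA’s NVML bindings (pynvml); the function name and the 150-watt default are illustrative assumptions, and setting limits requires administrator privileges:

    ```python
    # Sketch: cap every visible GPU's power draw before a job starts.
    # Assumes the `pynvml` package and root privileges; the 150 W default
    # mirrors the BERT experiment described above but is illustrative only.
    import pynvml

    def cap_gpu_power(limit_watts: float = 150.0) -> None:
        pynvml.nvmlInit()
        try:
            for i in range(pynvml.nvmlDeviceGetCount()):
                handle = pynvml.nvmlDeviceGetHandleByIndex(i)
                # NVML works in milliwatts; clamp to the card's allowed range.
                min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
                target_mw = int(min(max(limit_watts * 1000, min_mw), max_mw))
                pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)
        finally:
            pynvml.nvmlShutdown()

    if __name__ == "__main__":
        cap_gpu_power(150.0)
    ```

    A scheduler hook can invoke a script like this for each job, which is roughly what the LLSC’s Slurm integration automates.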

    “We can deploy this intervention today, and we’ve done so across all our systems,” Gadepally says.

    Side benefits have arisen, too. Since putting power constraints in place, the GPUs on LLSC supercomputers have been running about 30 degrees Fahrenheit cooler and at a more consistent temperature, reducing stress on the cooling system. Running the hardware cooler can potentially also increase reliability and service lifetime. They can now consider delaying the purchase of new hardware — reducing the center’s “embodied carbon,” or the emissions created through the manufacturing of equipment — until the efficiencies gained by using new hardware offset this aspect of the carbon footprint. They’re also finding ways to cut down on cooling needs by strategically scheduling jobs to run at night and during the winter months.

    “Data centers can use these easy-to-implement approaches today to increase efficiencies, without requiring modifications to code or infrastructure,” Gadepally says.

    Taking this holistic look at a data center’s operations to find opportunities to cut down can be time-intensive. To make this process easier for others, the team — in collaboration with Professor Devesh Tiwari and Baolin Li at Northeastern University — recently developed and published a comprehensive framework for analyzing the carbon footprint of high-performance computing systems. System practitioners can use this analysis framework to gain a better understanding of how sustainable their current system is and consider changes for next-generation systems.  

    Adjusting how models are trained and used

    On top of making adjustments to data center operations, the team is devising ways to make AI-model development more efficient.

    When training models, AI developers often focus on improving accuracy, and they build upon previous models as a starting point. To achieve the desired output, they have to figure out what parameters to use, and getting it right can take testing thousands of configurations. This process, called hyperparameter optimization, is one area LLSC researchers have found ripe for cutting down energy waste. 

    “We’ve developed a model that basically looks at the rate at which a given configuration is learning,” Gadepally says. Given that rate, their model predicts the likely performance. Underperforming models are stopped early. “We can give you a very accurate estimate early on that the best model will be in this top 10 of 100 models running,” he says.
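
    The LLSC’s predictor itself is not public, but the idea can be sketched. In the toy version below (a placeholder model of my own, assuming higher validation scores are better), each configuration’s learning curve is extrapolated with a simple saturating fit, and only runs projected to finish near the top survive:

    ```python
    # Sketch of learning-curve-based early stopping for hyperparameter search.
    # Each curve is extrapolated with a saturating fit y = a - b/(t + 1); runs
    # whose projected final score falls outside the top fraction are stopped.
    import numpy as np

    def projected_final_score(scores: list[float], horizon: int) -> float:
        """Fit y = a - b/(t+1) to observed scores and extrapolate to `horizon`."""
        t = np.arange(1, len(scores) + 1)
        X = np.column_stack([np.ones(len(t)), -1.0 / (t + 1)])
        a, b = np.linalg.lstsq(X, np.asarray(scores, dtype=float), rcond=None)[0]
        return a - b / (horizon + 1)

    def select_survivors(curves: dict[str, list[float]], horizon: int,
                         keep_fraction: float = 0.1) -> set[str]:
        """Keep only the configurations projected to land in the top fraction."""
        proj = {name: projected_final_score(s, horizon) for name, s in curves.items()}
        n_keep = max(1, int(len(proj) * keep_fraction))
        return set(sorted(proj, key=proj.get, reverse=True)[:n_keep])
    ```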

    In their studies, this early stopping led to dramatic savings: an 80 percent reduction in the energy used for model training. They’ve applied this technique to models developed for computer vision, natural language processing, and material design applications.

    “In my opinion, this technique has the biggest potential for advancing the way AI models are trained,” Gadepally says.

    Training is just one part of an AI model’s emissions. The largest contributor to emissions over time is model inference, or the process of running the model live, like when a user chats with ChatGPT. To respond quickly, these models use redundant hardware, running all the time, waiting for a user to ask a question.

    One way to improve inference efficiency is to use the most appropriate hardware. Again working with Northeastern University, the team created an optimizer that matches a model with the most carbon-efficient mix of hardware, such as high-power GPUs for the computationally intense parts of inference and low-power central processing units (CPUs) for the less-demanding aspects. This work recently won the best paper award at the International ACM Symposium on High-Performance Parallel and Distributed Computing.

    Using this optimizer can decrease energy use by 10-20 percent while still meeting the same “quality-of-service target” (how quickly the model can respond).
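
    With the caveat that the real optimizer also weighs carbon intensity and can split inference across device types — and that every number below is invented for illustration — the core decision can be sketched as picking the lowest-energy device that still meets the latency target:

    ```python
    # Toy hardware selection for inference: choose the lowest-energy device
    # that meets the quality-of-service latency target. All figures invented.
    from dataclasses import dataclass

    @dataclass
    class Device:
        name: str
        latency_ms: float   # time to answer one query
        power_watts: float  # average draw while serving

        @property
        def energy_per_query_j(self) -> float:
            return self.power_watts * self.latency_ms / 1000.0

    def pick_device(candidates: list[Device], qos_latency_ms: float) -> Device:
        feasible = [d for d in candidates if d.latency_ms <= qos_latency_ms]
        if not feasible:
            raise ValueError("no device meets the latency target")
        return min(feasible, key=lambda d: d.energy_per_query_j)

    if __name__ == "__main__":
        options = [
            Device("high-power GPU", latency_ms=20, power_watts=300),
            Device("low-power GPU", latency_ms=60, power_watts=70),
            Device("CPU", latency_ms=180, power_watts=40),
        ]
        best = pick_device(options, qos_latency_ms=100)
        print(best.name, round(best.energy_per_query_j, 2), "J/query")
    ```

    In this toy run the low-power GPU wins: it is slower than the high-power GPU but spends less energy per query, while the CPU misses the latency target.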

    This tool is especially helpful for cloud customers, who lease systems from data centers and must select hardware from among thousands of options. “Most customers overestimate what they need; they choose over-capable hardware just because they don’t know any better,” Gadepally says.

    Growing green-computing awareness

    The energy saved by implementing these interventions also reduces the associated costs of developing AI, often by a one-to-one ratio. In fact, cost is usually used as a proxy for energy consumption. Given these savings, why aren’t more data centers investing in green techniques?

    “I think it’s a bit of an incentive-misalignment problem,” Samsi says. “There’s been such a race to build bigger and better models that almost every secondary consideration has been put aside.”

    They point out that while some data centers buy renewable-energy credits, these renewables aren’t enough to cover the growing energy demands. The majority of electricity powering data centers comes from fossil fuels, and water used for cooling is contributing to stressed watersheds. 

    Hesitancy may also exist because systematic studies on energy-saving techniques haven’t been conducted. That’s why the team has been pushing their research in peer-reviewed venues in addition to open-source repositories. Some big industry players, like Google DeepMind, have applied machine learning to increase data center efficiency but have not made their work available for others to deploy or replicate. 

    Top AI conferences are now pushing for ethics statements that consider how AI could be misused. The team sees the climate aspect as an AI ethics topic that has not yet been given much attention, but this also appears to be slowly changing. Some researchers are now disclosing the carbon footprint of training the latest models, and industry is showing a shift in energy transparency too, as in this recent report from Meta AI.

    They also acknowledge that transparency is difficult without tools that can show AI developers their consumption. Reporting is on the LLSC roadmap for this year. They want to be able to show every LLSC user, for every job, how much energy they consume and how this amount compares to others, similar to home energy reports.
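
    On NVIDIA hardware from the Volta generation onward, NVML already exposes a cumulative energy counter, so a rough per-job report can be sketched as below; `run_training_job` is a hypothetical stand-in for the user’s actual workload:

    ```python
    # Sketch of per-job GPU energy accounting in the spirit of a "home energy
    # report": read each GPU's cumulative energy counter before and after the
    # job and report the difference.
    import time
    import pynvml

    def run_training_job():
        time.sleep(1.0)  # hypothetical placeholder for the real workload

    def gpu_energy_mj() -> int:
        total = 0
        for i in range(pynvml.nvmlDeviceGetCount()):
            handle = pynvml.nvmlDeviceGetHandleByIndex(i)
            # Millijoules consumed since the driver loaded (Volta and newer).
            total += pynvml.nvmlDeviceGetTotalEnergyConsumption(handle)
        return total

    if __name__ == "__main__":
        pynvml.nvmlInit()
        start = gpu_energy_mj()
        run_training_job()
        used_kwh = (gpu_energy_mj() - start) / 3.6e9  # mJ -> kWh
        print(f"This job drew {used_kwh:.4f} kWh on GPUs")
        pynvml.nvmlShutdown()
    ```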

    Part of this effort requires working more closely with hardware manufacturers to make getting these data off hardware easier and more accurate. If manufacturers can standardize the way the data are read out, then energy-saving and reporting tools can be applied across different hardware platforms. A collaboration is underway between the LLSC researchers and Intel to work on this very problem.

    Even AI developers who are aware of AI’s intense energy needs can’t do much on their own to curb that use. The LLSC team wants to help other data centers apply these interventions and provide users with energy-aware options. Their first partnership is with the U.S. Air Force, a sponsor of this research, which operates thousands of data centers. Applying these techniques can make a significant dent in their energy consumption and cost.

    “We’re putting control into the hands of AI developers who want to lessen their footprint,” Gadepally says. “Do I really need to gratuitously train unpromising models? Am I willing to run my GPUs slower to save energy? To our knowledge, no other supercomputing center is letting you consider these options. Using our tools, today, you get to decide.”

    Visit this webpage to see the group’s publications related to energy-aware computing and findings described in this article.

  • Desalination system could produce freshwater that is cheaper than tap water

    Engineers at MIT and in China are aiming to turn seawater into drinking water with a completely passive device that is inspired by the ocean, and powered by the sun.

    In a paper appearing today in the journal Joule, the team outlines the design for a new solar desalination system that takes in saltwater and heats it with natural sunlight.

    The configuration of the device allows water to circulate in swirling eddies, in a manner similar to the much larger “thermohaline” circulation of the ocean. This circulation, combined with the sun’s heat, drives water to evaporate, leaving salt behind. The resulting water vapor can then be condensed and collected as pure, drinkable water. In the meantime, the leftover salt continues to circulate through and out of the device, rather than accumulating and clogging the system.

    The new system has a higher water-production rate and a higher salt-rejection rate than all other passive solar desalination concepts currently being tested.

    The researchers estimate that if the system is scaled up to the size of a small suitcase, it could produce about 4 to 6 liters of drinking water per hour and last several years before requiring replacement parts. At this scale and performance, the system could produce drinking water at a rate and price that is cheaper than tap water.

    “For the first time, it is possible for water, produced by sunlight, to be even cheaper than tap water,” says Lenan Zhang, a research scientist in MIT’s Device Research Laboratory.

    The team envisions that a scaled-up device could passively produce enough drinking water to meet the daily requirements of a small family. The system could also supply off-grid, coastal communities where seawater is easily accessible.
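
    Rough arithmetic supports the family-scale claim. Assuming about six hours of strong sun per day and a common guideline of 2 to 3 liters of drinking water per person per day — both assumptions, not figures from the study:

    ```python
    # Back-of-envelope check that a suitcase-size unit could cover a small
    # family's drinking water. The production range is from the article;
    # sun hours and per-person intake are assumptions.
    rate_l_per_h = (4, 6)                # reported output range, liters/hour
    sun_hours = 6                        # assumed strong-sun hours per day
    daily_output = tuple(r * sun_hours for r in rate_l_per_h)  # (24, 36) L/day
    family_need = 4 * 3                  # assumed 4 people x 3 L/day
    print(daily_output, "liters/day vs.", family_need, "liters needed")
    ```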

    Zhang’s study co-authors include MIT graduate student Yang Zhong and Evelyn Wang, the Ford Professor of Engineering, along with Jintong Gao, Jinfang You, Zhanyu Ye, Ruzhu Wang, and Zhenyuan Xu of Shanghai Jiao Tong University in China.

    A powerful convection

    The team’s new system improves on their previous design — a similar concept of multiple layers, called stages. Each stage contained an evaporator and a condenser that used heat from the sun to passively separate salt from incoming water. That design, which the team tested on the roof of an MIT building, efficiently converted the sun’s energy to evaporate water, which was then condensed into drinkable water. But the salt that was left over quickly accumulated as crystals that clogged the system after a few days. In a real-world setting, a user would have to replace stages frequently, which would significantly increase the system’s overall cost.

    In a follow-up effort, they devised a solution with a similar layered configuration, this time with an added feature that helped to circulate the incoming water as well as any leftover salt. While this design prevented salt from settling and accumulating on the device, it desalinated water at a relatively low rate.

    In the latest iteration, the team believes it has landed on a design that achieves both a high water-production rate and high salt rejection, meaning that the system can quickly and reliably produce drinking water for an extended period. The key to their new design is a combination of their two previous concepts: a multistage system of evaporators and condensers that is also configured to boost the circulation of water — and salt — within each stage.

    “We introduce now an even more powerful convection, that is similar to what we typically see in the ocean, at kilometer-long scales,” Xu says.

    The small circulations generated in the team’s new system are similar to the “thermohaline” convection in the ocean — a phenomenon that drives the movement of water around the world, based on differences in sea temperature (“thermo”) and salinity (“haline”).

    “When seawater is exposed to air, sunlight drives water to evaporate. Once water leaves the surface, salt remains. And the higher the salt concentration, the denser the liquid, and this heavier water wants to flow downward,” Zhang explains. “By mimicking this kilometer-wide phenomenon in a small box, we can take advantage of this feature to reject salt.”

    Tapping out

    The heart of the team’s new design is a single stage that resembles a thin box, topped with a dark material that efficiently absorbs the heat of the sun. Inside, the box is separated into a top and bottom section. Water can flow through the top half, where the ceiling is lined with an evaporator layer that uses the sun’s heat to warm up and evaporate any water in direct contact. The water vapor is then funneled to the bottom half of the box, where a condensing layer air-cools the vapor into salt-free, drinkable liquid. The researchers set the entire box at a tilt within a larger, empty vessel, then attached a tube from the top half of the box down through the bottom of the vessel, and floated the vessel in saltwater.

    In this configuration, water can naturally push up through the tube and into the box, where the tilt of the box, combined with the thermal energy from the sun, induces the water to swirl as it flows through. The small eddies help to bring water in contact with the upper evaporating layer while keeping salt circulating, rather than settling and clogging.

    The team built several prototypes, with one, three, and 10 stages, and tested their performance in water of varying salinity, including natural seawater and water that was seven times saltier.

    From these tests, the researchers calculated that if each stage were scaled up to a square meter, it would produce up to 5 liters of drinking water per hour, and that the system could desalinate water without accumulating salt for several years. Given this extended lifetime, and the fact that the system is entirely passive, requiring no electricity to run, the team estimates that the overall cost of running the system would be cheaper than what it costs to produce tap water in the United States.

    “We show that this device is capable of achieving a long lifetime,” Zhong says. “That means that, for the first time, it is possible for drinking water produced by sunlight to be cheaper than tap water. This opens up the possibility for solar desalination to address real-world problems.”

    “This is a very innovative approach that effectively mitigates key challenges in the field of desalination,” says Guihua Yu, who develops sustainable water and energy storage systems at the University of Texas at Austin, and was not involved in the research. “The design is particularly beneficial for regions struggling with high-salinity water. Its modular design makes it highly suitable for household water production, allowing for scalability and adaptability to meet individual needs.”

    The research at Shanghai Jiao Tong University was supported by the Natural Science Foundation of China.

  • Improving US air quality, equitably

    Decarbonization of national economies will be key to achieving global net-zero emissions by 2050, a major stepping stone to the Paris Agreement’s long-term goal of keeping global warming well below 2 degrees Celsius (and ideally 1.5 C), and thereby averting the worst consequences of climate change. Toward that end, the United States has pledged to reduce its greenhouse gas emissions by 50-52 percent from 2005 levels by 2030, backed by its implementation of the 2022 Inflation Reduction Act. This strategy is consistent with a 50-percent reduction in carbon dioxide (CO2) by the end of the decade.

    If U.S. federal carbon policy is successful, the nation’s overall air quality will also improve. Cutting CO2 emissions reduces atmospheric concentrations of air pollutants that lead to the formation of fine particulate matter (PM2.5), which causes more than 200,000 premature deaths in the United States each year. But an average nationwide improvement in air quality will not be felt equally; air pollution exposure disproportionately harms people of color and lower-income populations.

    How effective are current federal decarbonization policies in reducing U.S. racial and economic disparities in PM2.5 exposure, and what changes will be needed to improve their performance? To answer that question, researchers at MIT and Stanford University recently evaluated a range of policies which, like current U.S. federal carbon policies, reduce economy-wide CO2 emissions by 40-60 percent from 2005 levels by 2030. Their findings appear in an open-access article in the journal Nature Communications.

    First, they show that a carbon-pricing policy, while effective in reducing PM2.5 exposure for all racial/ethnic groups, does not significantly mitigate relative disparities in exposure. On average, the white population undergoes far less exposure than Black, Hispanic, and Asian populations. This policy does little to reduce exposure disparities because the CO2 emissions reductions that it achieves primarily occur in the coal-fired electricity sector. Other sectors, such as industry and heavy-duty diesel transportation, contribute far more PM2.5-related emissions.

    The researchers then examine thousands of different reduction options through an optimization approach to identify whether any possible combination of carbon dioxide reductions in the range of 40-60 percent can mitigate disparities. They find that no policy scenario aligned with current U.S. carbon dioxide emissions targets is likely to significantly reduce current PM2.5 exposure disparities.
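
    The paper’s optimization is far richer than this, but a toy linear program — every coefficient below is invented — illustrates the style of question being asked: given each sector’s share of CO2 and its effect on exposure disparity, which mix of cuts meeting a fixed economy-wide target does the most for equity?

    ```python
    # Toy version of the sectoral-optimization question (coefficients invented):
    # choose cut fractions x in [0, 1] per sector that hit a 50 percent
    # economy-wide CO2 reduction while maximizing the disparity reduction.
    import numpy as np
    from scipy.optimize import linprog

    sectors = ["electricity", "industry", "light transport", "heavy diesel"]
    co2_share = np.array([0.35, 0.25, 0.25, 0.15])    # share of national CO2
    disparity_gain = np.array([0.1, 0.4, 0.2, 0.8])   # disparity cut per full cut

    res = linprog(
        c=-disparity_gain,                 # linprog minimizes, so negate
        A_eq=[co2_share], b_eq=[0.5],      # cuts must sum to 50 percent of CO2
        bounds=[(0.0, 1.0)] * len(sectors),
    )
    for name, frac in zip(sectors, res.x):
        print(f"{name}: cut {frac:.0%} of emissions")
    ```

    Even in this toy, the equity-optimal mix leans on industry and heavy diesel rather than electricity — echoing the finding that power-sector-heavy CO2 policy leaves exposure disparities largely intact.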

    “Policies that address only about 50 percent of CO2 emissions leave many polluting sources in place, and those that prioritize reductions for minorities tend to benefit the entire population,” says Noelle Selin, supervising author of the study and a professor at MIT’s Institute for Data, Systems and Society and Department of Earth, Atmospheric and Planetary Sciences. “This means that a large range of policies that reduce CO2 can improve air quality overall, but can’t address long-standing inequities in air pollution exposure.”

    So if climate policy alone cannot adequately achieve equitable air quality results, what viable options remain? The researchers suggest that more ambitious carbon policies could narrow racial and economic PM2.5 exposure disparities in the long term, but not within the next decade. To make a near-term difference, they recommend interventions designed to reduce PM2.5 emissions resulting from non-CO2 sources, ideally at the economic sector or community level.

    “Achieving improved PM2.5 exposure for populations that are disproportionately exposed across the United States will require thinking that goes beyond current CO2 policy strategies, most likely involving large-scale structural changes,” says Selin. “This could involve changes in local and regional transportation and housing planning, together with accelerated efforts towards decarbonization.”

  • How to tackle the global deforestation crisis

    Imagine if France, Germany, and Spain were completely blanketed in forests — and then all those trees were quickly chopped down. That’s nearly the amount of deforestation that occurred globally between 2001 and 2020, with profound consequences.

    Deforestation is a major contributor to climate change, producing between 6 and 17 percent of global greenhouse gas emissions, according to a 2009 study. Meanwhile, because trees also absorb carbon dioxide, removing it from the atmosphere, they help keep the Earth cooler. And climate change aside, forests protect biodiversity.

    “Climate change and biodiversity make this a global problem, not a local problem,” says MIT economist Ben Olken. “Deciding to cut down trees or not has huge implications for the world.”

    But deforestation is often financially profitable, so it continues at a rapid rate. Researchers can now measure this trend closely: In the last quarter-century, satellite-based technology has led to a paradigm change in charting deforestation. New deforestation datasets, based on the Landsat satellites, for instance, track forest change since 2000 with resolution at 30 meters, while many other products now offer frequent imaging at close resolution.

    “Part of this revolution in measurement is accuracy, and the other part is coverage,” says Clare Balboni, an assistant professor of economics at the London School of Economics (LSE). “On-site observation is very expensive and logistically challenging, and you’re talking about case studies. These satellite-based data sets just open up opportunities to see deforestation at scale, systematically, across the globe.”

    Balboni and Olken have now helped write a new paper providing a road map for thinking about this crisis. The open-access article, “The Economics of Tropical Deforestation,” appears this month in the Annual Review of Economics. The co-authors are Balboni, a former MIT faculty member; Aaron Berman, a PhD candidate in MIT’s Department of Economics; Robin Burgess, an LSE professor; and Olken, MIT’s Jane Berkowitz Carlton and Dennis William Carlton Professor of Microeconomics. Balboni and Olken have also conducted primary research in this area, along with Burgess.

    So, how can the world tackle deforestation? It starts with understanding the problem.

    Replacing forests with farms

    Several decades ago, some thinkers, including the famous MIT economist Paul Samuelson in the 1970s, built models to study forests as a renewable resource; Samuelson calculated the “maximum sustained yield” at which a forest could be cleared while being regrown. These frameworks were designed to think about tree farms or the U.S. national forest system, where a fraction of trees would be cut each year, and then new trees would be grown over time to take their place.
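
    Samuelson’s “maximum sustained yield” has a compact form under the textbook logistic-growth assumption (notation mine, not the review’s):

    ```latex
    % Forest stock S regrows logistically with intrinsic rate r and
    % carrying capacity K; a sustained harvest H must equal regrowth:
    \[
      \frac{dS}{dt} = rS\left(1 - \frac{S}{K}\right) - H = 0
      \quad\Longrightarrow\quad H = G(S) = rS\left(1 - \frac{S}{K}\right).
    \]
    % Maximizing G over S: G'(S) = r(1 - 2S/K) = 0 gives S* = K/2, so
    \[
      H_{\max} = G\!\left(\tfrac{K}{2}\right) = \frac{rK}{4}.
    \]
    ```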

    But deforestation today, particularly in tropical areas, often looks very different, and forest regeneration is not common.

    Indeed, as Balboni and Olken emphasize, deforestation is now rampant partly because the profits from chopping down trees come not just from timber, but from replacing forests with agriculture. In Brazil, deforestation has increased along with agricultural prices; in Indonesia, clearing trees accelerated as the global price of palm oil went up, leading companies to replace forests with palm tree orchards.

    All this tree-clearing creates a familiar situation: The globally shared costs of climate change from deforestation are “externalities,” as economists say, imposed on everyone else by the people removing forest land. It is akin to a company that pollutes into a river, affecting the water quality of residents.

    “Economics has changed the way it thinks about this over the last 50 years, and two things are central,” Olken says. “The relevance of global externalities is very important, and the conceptualization of alternate land uses is very important.” This also means traditional forest-management guidance about regrowth is not enough. With the economic dynamics in mind, which policies might work, and why?

    The search for solutions

    As Balboni and Olken note, economists often recommend “Pigouvian” taxes (named after the British economist Arthur Pigou) in these cases, levied against people imposing externalities on others. And yet, it can be hard to identify who is doing the deforesting.

    Instead of taxing people for clearing forests, governments can pay people to keep forests intact. The UN uses Payments for Environmental Services (PES) as part of its REDD+ (Reducing Emissions from Deforestation and forest Degradation) program. However, it is similarly tough to identify the optimal landowners to subsidize, and these payments may not match the quick cash-in of deforestation. A 2017 study in Uganda showed PES reduced deforestation somewhat; a 2022 study in Indonesia found no reduction; another 2022 study, in Brazil, showed again that some forest protection resulted.

    “There’s mixed evidence from many of these [studies],” Balboni says. These policies, she notes, must reach people who would otherwise clear forests, and a key question is, “How can we assess their success compared to what would have happened anyway?”

    Some places have tried cash transfer programs for larger populations. In Indonesia, a 2020 study found such subsidies reduced deforestation near villages by 30 percent. But in Mexico, a similar program meant more people could afford milk and meat, again creating demand for more agriculture and thus leading to more forest-clearing.

    At this point, it might seem that laws simply banning deforestation in key areas would work best — indeed, about 16 percent of the world’s land overall is protected in some way. Yet the dynamics of protection are tricky. Even with protected areas in place, there is still “leakage” of deforestation into other regions. 

    Still more approaches exist, including “nonstate agreements,” such as the Amazon Soy Moratorium in Brazil, in which grain traders pledged not to buy soy from deforested lands, and reduced deforestation without “leakage.”

    Also, intriguingly, a 2008 policy change in the Brazilian Amazon made agricultural credit harder to obtain by requiring recipients to comply with environmental and land registration rules. The result? Deforestation dropped by up to 60 percent over nearly a decade. 

    Politics and pulp

    Overall, Balboni and Olken observe, beyond “externalities,” two major challenges exist. One, it is often unclear who holds property rights in forests. In these circumstances, deforestation seems to increase. Two, deforestation is subject to political battles.

    For instance, as economist Bard Harstad of Stanford University has observed, environmental lobbying is asymmetric. Balboni and Olken write: “The conservationist lobby must pay the government in perpetuity … while the deforestation-oriented lobby need pay only once to deforest in the present.” And political instability leads to more deforestation because “the current administration places lower value on future conservation payments.”

    Even so, national political measures can work. In the Amazon from 2001 to 2005, Brazilian deforestation rates were three to four times higher than on similar land across the border, but that imbalance vanished once the country passed conservation measures in 2006. However, deforestation ramped up again after a 2014 change in government. Looking at particular monitoring approaches, a study of Brazil’s satellite-based Real-Time System for Detection of Deforestation (DETER), launched in 2004, suggests that a 50 percent annual increase in its use in municipalities created a 25 percent reduction in deforestation from 2006 to 2016.

    How precisely politics matters may depend on the context. In a 2021 paper, Balboni and Olken (with three colleagues) found that deforestation actually decreased around elections in Indonesia. Conversely, in Brazil, one study found that deforestation rates were 8 to 10 percent higher where mayors were running for re-election between 2002 and 2012, suggesting incumbents had deforestation industry support.

    “The research there is aiming to understand what the political economy drivers are,” Olken says, “with the idea that if you understand those things, reform in those countries is more likely.”

    Looking ahead, Balboni and Olken also suggest that new research estimating the value of intact forest land could influence public debates. And while many scholars have studied deforestation in Brazil and Indonesia, fewer have examined the Democratic Republic of Congo, another deforestation leader, and sub-Saharan Africa.

    Deforestation is an ongoing crisis. But thanks to satellites and many recent studies, experts know vastly more about the problem than they did a decade or two ago, and with an economics toolkit, can evaluate the incentives and dynamics at play.

    “To the extent that there’s ambiguity across different contexts with different findings, part of the point of our review piece is to draw out common themes — the important considerations in determining which policy levers can [work] in different circumstances,” Balboni says. “That’s a fast-evolving area. We don’t have all the answers, but part of the process is bringing together growing evidence about [everything] that affects how successful those choices can be.”

  • Tracking US progress on the path to a decarbonized economy

    Investments in new technologies and infrastructure that help reduce greenhouse gas emissions — everything from electric vehicles to heat pumps — are growing rapidly in the United States. Now, a new database enables these investments to be comprehensively monitored in real time, thereby helping to assess the efficacy of policies designed to spur clean investments and address climate change.

    The Clean Investment Monitor (CIM), developed by a team at MIT’s Center for Energy and Environmental Policy Research (CEEPR) led by Institute Innovation Fellow Brian Deese and in collaboration with the Rhodium Group, an independent research firm, provides a timely and methodologically consistent tracking of all announced public and private investments in the manufacture and deployment of clean technologies and infrastructure in the U.S. The CIM offers a means of assessing the country’s progress in transitioning to a cleaner economy and reducing greenhouse gas emissions.

    In the year from July 1, 2022, to June 30, 2023, data from the CIM show, clean investments nationwide totaled $213 billion. To put that figure in perspective, 18 states in the U.S. have GDPs each lower than $213 billion.

    “As clean technology becomes a larger and larger sector in the United States, its growth will have far-reaching implications — for our economy, for our leadership in innovation, and for reducing our greenhouse gas emissions,” says Deese, who served as the director of the White House National Economic Council from January 2021 to February 2023. “The Clean Investment Monitor is a tool designed to help us understand and assess this growth in a real-time, comprehensive way. Our hope is that the CIM will enhance research and improve public policies designed to accelerate the clean energy transition.”

    Launched on Sept. 13, the CIM shows that the $213 billion invested over the last year reflects a 37 percent increase from the $155 billion invested in the previous 12-month period. According to CIM data, the fastest growth has been in the manufacturing sector, where investment grew 125 percent year-on-year, particularly in electric vehicle and solar manufacturing.

    Beyond manufacturing, the CIM also provides data on investment in clean energy production, such as solar, wind, and nuclear; industrial decarbonization, such as sustainable aviation fuels; and retail investments by households and businesses in technologies like heat pumps and zero-emission vehicles. The CIM’s data goes back to 2018, providing a baseline before the passage of the legislation in 2021 and 2022.

    “We’re really excited to bring MIT’s analytical rigor to bear to help develop the Clean Investment Monitor,” says Christopher Knittel, the George P. Shultz Professor of Energy Economics at the MIT Sloan School of Management and CEEPR’s faculty director. “Bolstered by Brian’s keen understanding of the policy world, this tool is poised to become the go-to reference for anyone looking to understand clean investment flows and what drives them.”

    In 2021 and 2022, the U.S. federal government enacted a series of new laws that together aimed to catalyze the largest-ever national investment in clean energy technologies and related infrastructure. The Clean Investment Monitor can also be used to track how well the legislation is living up to expectations.

    The three pieces of federal legislation — the Infrastructure Investment and Jobs Act, enacted in 2021, and the Inflation Reduction Act (IRA) and the CHIPS and Science Act, both enacted in 2022 — provide grants, loans, loan guarantees, and tax incentives to spur investments in technologies that reduce greenhouse gas emissions.

    The effectiveness of the legislation in hastening the U.S. transition to a clean economy will be crucial in determining whether the country reaches its goal of reducing greenhouse gas emissions by 50 percent to 52 percent below 2005 levels in 2030. An analysis earlier this year estimated that the IRA will lead to a 43 percent to 48 percent decline in economywide emissions below 2005 levels by 2035, compared with 27 percent to 35 percent in a reference scenario without the law’s provisions, helping bring the U.S. goal closer in reach.

    The Clean Investment Monitor is available at cleaninvestmentmonitor.org.

  • Desirée Plata appointed co-director of the MIT Climate and Sustainability Consortium

    Desirée Plata, associate professor of civil and environmental engineering at MIT, has been named co-director of the MIT Climate and Sustainability Consortium (MCSC), effective Sept. 1. Plata will serve on the MCSC’s leadership team alongside Anantha P. Chandrakasan, dean of the MIT School of Engineering, the Vannevar Bush Professor of Electrical Engineering and Computer Science, and MCSC chair; Elsa Olivetti, the Jerry McAfee Professor in Engineering, a professor of materials science and engineering, associate dean of engineering, and MCSC co-director; and Jeremy Gregory, MCSC executive director.

    Plata succeeds Jeffrey Grossman, the Morton and Claire Goulder and Family Professor in Environmental Systems, who has served as co-director since the MCSC’s launch in January 2021. Grossman, who played a central role in the ideation and launch of the MCSC, will continue his work with the MCSC as strategic advisor.

    “Professor Plata is a valued member of the MIT community. She brings a deep understanding of and commitment to climate and sustainability initiatives at MIT, as well as extensive experience working with industry, to her new role within the MCSC,” says Chandrakasan.

    The MIT Climate and Sustainability Consortium is an academia-industry collaboration working to accelerate the implementation of large-scale solutions across sectors of the global economy. It aims to lay the groundwork for one critical aspect of MIT’s continued and intensified commitment to climate: helping large companies usher in, adapt to, and prosper in a decarbonized world.

    “We are thrilled to bring Professor Plata’s knowledge, vision, and passion to our leadership team,” says Olivetti. “Her experience developing sustainable technologies that have the potential to improve the environment and reduce the impacts of climate change will help move our work forward in meaningful ways. We have valued Professor Plata’s contributions to the consortium and look forward to continuing our work with her.”

    Plata played a pivotal role in the creation and launch of the MCSC’s Climate and Sustainability Scholars Program and its yearlong course for MIT rising juniors and seniors — an effort for which she and Olivetti were recently recognized with the Class of 1960 Innovation in Education Fellowship. She has also been a member of the MCSC’s Faculty Steering Committee since the consortium’s launch, helping to shape and guide its vision and work.

    Plata is a dedicated researcher, educator, and mentor. A member of MIT’s faculty since 2018, she and her team at the Plata Lab are helping to guide industry toward more environmentally sustainable practices and to develop new ways to protect the health of the planet — using chemistry to understand the impact that industrial materials and processes have on the environment. By coupling devices that simulate industrial systems with computation, she helps industry develop more environmentally friendly practices.

    In recognition of her work in the lab, classroom, and community, Plata has received many awards and honors. In 2020, she won MIT’s prestigious Harold E. Edgerton Faculty Achievement Award, recognizing her innovative approach to environmentally sustainable industrial practices, her inspirational teaching and mentoring, and her service to MIT and the community. She is a two-time National Academy of Sciences Kavli Frontiers of Science Fellow, a two-time National Academy of Engineering Frontiers of Engineering Fellow, and a Caltech Young Investigator Sustainability Fellow. She has also won the ACS C. Ellen Gonter Environmental Chemistry Award, an NSF CAREER award, and the 2016 Odebrecht Award for Sustainable Development.

    Beyond her work in the academic space, Plata is co-founder of two climate- and energy-related startups, Nth Cycle and Moxair, illustrating her commitment to translating academic innovations for real-world implementation — a core value of the MCSC.

    Plata received her bachelor’s degree from Union College and her PhD from the MIT and Woods Hole Oceanographic Institution (MIT-WHOI) joint program in oceanography/applied ocean science and engineering. After receiving her doctorate, Plata held positions at Mount Holyoke College, Duke University, and Yale University.