More stories

  •

    MIT delegation mainstreams biodiversity conservation at the UN Biodiversity Convention, COP16

    For the first time, MIT sent an organized engagement to the global Conference of the Parties for the Convention on Biological Diversity, which this year was held Oct. 21 to Nov. 1 in Cali, Colombia.The 10 delegates to COP16 included faculty, researchers, and students from the MIT Environmental Solutions Initiative (ESI), the Department of Electrical Engineering and Computer Science (EECS), the Computer Science and Artificial Intelligence Laboratory (CSAIL), the Department of Urban Studies and Planning (DUSP), the Institute for Data, Systems, and Society (IDSS), and the Center for Sustainability Science and Strategy.In previous years, MIT faculty had participated sporadically in the discussions. This organized engagement, led by the ESI, is significant because it brought representatives from many of the groups working on biodiversity across the Institute; showcased the breadth of MIT’s research in more than 15 events including panels, roundtables, and keynote presentations across the Blue and Green Zones of the conference (with the Blue Zone representing the primary venue for the official negotiations and discussions and the Green Zone representing public events); and created an experiential learning opportunity for students who followed specific topics in the negotiations and throughout side events.The conference also gathered attendees from governments, nongovernmental organizations, businesses, other academic institutions, and practitioners focused on stopping global biodiversity loss and advancing the 23 goals of the Kunming-Montreal Global Biodiversity Framework (KMGBF), an international agreement adopted in 2022 to guide global efforts to protect and restore biodiversity through 2030.MIT’s involvement was particularly pronounced when addressing goals related to building coalitions of sub-national governments (targets 11, 12, 14); technology and AI for biodiversity conservation (targets 20 and 21); shaping equitable markets (targets 3, 11, and 19); and informing an action plan for Afro-descendant communities (targets 3, 10, and 22).Building coalitions of sub-national governmentsThe ESI’s Natural Climate Solutions (NCS) Program was able to support two separate coalitions of Latin American cities, namely the Coalition of Cities Against Illicit Economies in the Biogeographic Chocó Region and the Colombian Amazonian Cities coalition, who successfully signed declarations to advance specific targets of the KMGBF (the aforementioned targets 11, 12, 14).This was accomplished through roundtables and discussions where team members — including Marcela Angel, research program director at the MIT ESI; Angelica Mayolo, ESI Martin Luther King Fellow 2023-25; and Silvia Duque and Hannah Leung, MIT Master’s in City Planning students — presented a set of multi-scale actions including transnational strategies, recommendations to strengthen local and regional institutions, and community-based actions to promote the conservation of the Biogeographic Chocó as an ecological corridor.“There is an urgent need to deepen the relationship between academia and local governments of cities located in biodiversity hotspots,” said Angel. “Given the scale and unique conditions of Amazonian cities, pilot research projects present an opportunity to test and generate a proof of concept. 
These could generate catalytic information needed to scale up climate adaptation and conservation efforts in socially and ecologically sensitive contexts.”ESI’s research also provided key inputs for the creation of the Fund for the Biogeographic Chocó Region, a multi-donor fund launched within the framework of COP16 by a coalition composed of Colombia, Ecuador, Panamá, and Costa Rica. The fund aims to support biodiversity conservation, ecosystem restoration, climate change mitigation and adaptation, and sustainable development efforts across the region.Technology and AI for biodiversity conservationData, technology, and artificial intelligence are playing an increasing role in how we understand biodiversity and ecosystem change globally. Professor Sara Beery’s research group at MIT focuses on this intersection, developing AI methods that enable species and environmental monitoring at previously unprecedented spatial, temporal, and taxonomic scales.During the International Union of Biological Diversity Science-Policy Forum, the high-level COP16 segment focused on outlining recommendations from scientific and academic community, Beery spoke on a panel alongside María Cecilia Londoño, scientific information manager of the Humboldt Institute and co-chair of the Global Biodiversity Observations Network, and Josh Tewksbury, director of the Smithsonian Tropical Research Institute, among others, about how these technological advancements will help humanity achieve our biodiversity targets. The panel emphasized that AI innovation was needed, but with emphasis on direct human-AI partnership, AI capacity building, and the need for data and AI policy to ensure equity of access and benefit from these technologies.As a direct outcome of the session, for the first time, AI was emphasized in the statement on behalf of science and academia delivered by Hernando Garcia, director of the Humboldt Institute, and David Skorton, secretary general of the Smithsonian Institute, to the high-level segment of the COP16.That statement read, “To effectively address current and future challenges, urgent action is required in equity, governance, valuation, infrastructure, decolonization and policy frameworks around biodiversity data and artificial intelligence.”Beery also organized a panel at the GEOBON pavilion in the Blue Zone on Scaling Biodiversity Monitoring with AI, which brought together global leaders from AI research, infrastructure development, capacity and community building, and policy and regulation. The panel was initiated and experts selected from the participants at the recent Aspen Global Change Institute Workshop on Overcoming Barriers to Impact in AI for Biodiversity, co-organized by Beery.Shaping equitable marketsIn a side event co-hosted by the ESI with CAF-Development Bank of Latin America, researchers from ESI’s Natural Climate Solutions Program — including Marcela Angel; Angelica Mayolo; Jimena Muzio, ESI research associate; and Martin Perez Lara, ESI research affiliate and director for Forest Climate Solutions Impact and Monitoring at World Wide Fund for Nature of the U.S. 
— presented results of a study titled “Voluntary Carbon Markets for Social Impact: Comprehensive Assessment of the Role of Indigenous Peoples and Local Communities (IPLC) in Carbon Forestry Projects in Colombia.” The report highlighted the structural barriers that hinder effective participation of IPLC, and proposed a conceptual framework to assess IPLC engagement in voluntary carbon markets.Communicating these findings is important because the global carbon market has experienced a credibility crisis since 2023, influenced by critical assessments in academic literature, journalism questioning the quality of mitigation results, and persistent concerns about the engagement of private actors with IPLC. Nonetheless, carbon forestry projects have expanded rapidly in Indigenous, Afro-descendant, and local communities’ territories, and there is a need to assess the relationships between private actors and IPLC and to propose pathways for equitable participation. 

    Panelists pose at the equitable markets side event at the Latin American Pavilion in the Blue Zone.


    The research presentation and subsequent panel — with representatives of Asocarbono (the association of carbon project developers in Colombia), Fondo Acción, and CAF — further discussed recommendations for all actors in the value chain of carbon certificates, including those focused on promoting equitable benefit-sharing and safeguarding compliance, increased accountability, enhanced governance structures, strengthened institutional capacity, and regulatory frameworks, all necessary to create an inclusive and transparent market.

    Informing an action plan for Afro-descendant communities

    The Afro-Interamerican Forum on Climate Change (AIFCC), an international network working to highlight the critical role of Afro-descendant peoples in global climate action, was also present at COP16.

    At the Afro Summit, Mayolo presented key recommendations prepared collectively by the members of AIFCC to the technical secretariat of the Convention on Biological Diversity (CBD). The recommendations emphasize:

    • creating financial tools for conservation and supporting Afro-descendant land rights;
    • including a credit guarantee fund for countries that recognize Afro-descendant collective land titling and research on their contributions to biodiversity conservation;
    • calling for increased representation of Afro-descendant communities in international policy forums;
    • capacity-building for local governments; and
    • strategies for inclusive growth in green business and the energy transition.

    These actions aim to promote inclusive and sustainable development for Afro-descendant populations.

    “Attending COP16 with a large group from MIT contributing knowledge and informed perspectives at 15 separate events was a privilege and honor,” says MIT ESI Director John E. Fernández. “This demonstrates the value of the ESI as a powerful research and convening body at MIT. Science is telling us unequivocally that climate change and biodiversity loss are the two greatest challenges that we face as a species and a planet. MIT has the capacity, expertise, and passion to address not only the former, but also the latter, and the ESI is committed to facilitating the very best contributions across the Institute for the critical years that are ahead of us.”

    A fuller overview of the conference is available via the MIT Environmental Solutions Initiative’s Primer of COP16.

  •

    Is there enough land on Earth to fight climate change and feed the world?

    Capping global warming at 1.5 degrees Celsius is a tall order. Achieving that goal will not only require a massive reduction in greenhouse gas emissions from human activities, but also a substantial reallocation of land to support that effort and sustain the biosphere, including humans. More land will be needed to accommodate a growing demand for bioenergy and nature-based carbon sequestration while ensuring sufficient acreage for food production and ecological sustainability.

    The expanding role of land in a 1.5 C world will be twofold — to remove carbon dioxide from the atmosphere and to produce clean energy. Land-based carbon dioxide removal strategies include bioenergy with carbon capture and storage; direct air capture; and afforestation/reforestation and other nature-based solutions (NBS). Land-based clean energy production includes wind and solar farms and sustainable bioenergy cropland. Any decision to allocate more land for climate mitigation must also address competing needs for long-term food security and ecosystem health.

    Land-based climate mitigation choices vary in terms of costs — amount of land required, implications for food security, impact on biodiversity and other ecosystem services — and benefits — potential for sequestering greenhouse gases and producing clean energy.

    Now a study in the journal Frontiers in Environmental Science provides the most comprehensive analysis to date of competing land-use and technology options to limit global warming to 1.5 C. Led by researchers at the MIT Center for Sustainability Science and Strategy (CS3), the study applies the MIT Integrated Global System Modeling (IGSM) framework to evaluate costs and benefits of different land-based climate mitigation options in Sky2050, a 1.5 C climate-stabilization scenario developed by Shell.

    Under this scenario, demand for bioenergy and natural carbon sinks increases along with the need for sustainable farming and food production. To determine if there’s enough land to meet all these growing demands, the research team uses the giga-hectare (Gha) — 1 billion hectares, each hectare being 10,000 square meters, or 2.471 acres — as the standard unit of measurement, along with current estimates of the Earth’s total habitable land area (about 10 Gha) and the land area currently used for food production and bioenergy (about 5 Gha).

    The team finds that with transformative changes in policy, land management practices, and consumption patterns, global land is sufficient to provide a sustainable supply of food and ecosystem services throughout this century while also reducing greenhouse gas emissions in alignment with the 1.5 C goal. These transformative changes include policies to protect natural ecosystems; stop deforestation and accelerate reforestation and afforestation; promote advances in sustainable agriculture technology and practice; reduce agricultural and food waste; and incentivize consumers to purchase sustainably produced goods.

    If such changes are implemented, 2.5–3.5 Gha of land would be used for NBS practices to sequester 3–6 gigatonnes (Gt) of CO2 per year, and 0.4–0.6 Gha of land would be allocated for energy production — 0.2–0.3 Gha for bioenergy and 0.2–0.35 Gha for wind and solar power generation.

    “Our scenario shows that there is enough land to support a 1.5 degree C future as long as effective policies at national and global levels are in place,” says CS3 Principal Research Scientist Angelo Gurgel, the study’s lead author. “These policies must not only promote efficient use of land for food, energy, and nature, but also be supported by long-term commitments from government and industry decision-makers.”
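
    The land budget above is easy to sanity-check. The short Python sketch below sums the article’s reported allocation ranges against the roughly 10 Gha of habitable land; treating the categories as simply additive (for example, ignoring any overlap between today’s bioenergy cropland and the energy allocation) is an illustrative simplification on our part, not part of the study’s IGSM analysis.

```python
# Back-of-the-envelope land-budget check based on the figures quoted above.
# Units are Gha (billions of hectares); ranges are (low, high) as reported.
# Categories are treated as additive, which is an illustrative simplification.

HABITABLE_LAND_GHA = 10.0  # approximate global habitable land area

allocations_gha = {
    "food and bioenergy today":  (5.0, 5.0),   # current use cited in the article
    "nature-based solutions":    (2.5, 3.5),   # NBS land sequestering 3-6 Gt CO2/yr
    "bioenergy cropland":        (0.2, 0.3),
    "wind and solar generation": (0.2, 0.35),
}

low = sum(lo for lo, _ in allocations_gha.values())
high = sum(hi for _, hi in allocations_gha.values())
print(f"Total allocated land: {low:.1f}-{high:.2f} Gha of ~{HABITABLE_LAND_GHA:.0f} Gha habitable")
# -> roughly 7.9-9.15 Gha: the demands fit, but with limited slack, which is
#    why the study stresses policy, efficiency, and consumption changes.
```

    Even at the high end, this rough tally leaves only about 1 Gha of headroom, consistent with the study’s conclusion that the budget works only alongside transformative policy and land-management changes.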

  •

    Reality check on technologies to remove carbon dioxide from the air

    In 2015, 195 nations plus the European Union signed the Paris Agreement and pledged to undertake plans designed to limit the global temperature increase to 1.5 degrees Celsius. Yet in 2023, the world exceeded that target for most, if not all of, the year — calling into question the long-term feasibility of achieving that target.

    To do so, the world must reduce the levels of greenhouse gases in the atmosphere, and strategies for achieving levels that will “stabilize the climate” have been both proposed and adopted. Many of those strategies combine dramatic cuts in carbon dioxide (CO2) emissions with the use of direct air capture (DAC), a technology that removes CO2 from the ambient air. As a reality check, a team of researchers in the MIT Energy Initiative (MITEI) examined those strategies, and what they found was alarming: The strategies rely on overly optimistic — indeed, unrealistic — assumptions about how much CO2 could be removed by DAC. As a result, the strategies won’t perform as predicted. Nevertheless, the MITEI team recommends that work to develop the DAC technology continue so that it’s ready to help with the energy transition — even if it’s not the silver bullet that solves the world’s decarbonization challenge.

    DAC: The promise and the reality

    Including DAC in plans to stabilize the climate makes sense. Much work is now under way to develop DAC systems, and the technology looks promising. While companies may never run their own DAC systems, they can already buy “carbon credits” based on DAC. Today, a multibillion-dollar market exists on which entities or individuals that face high costs or excessive disruptions to reduce their own carbon emissions can pay others to take emissions-reducing actions on their behalf. Those actions can involve undertaking new renewable energy projects or “carbon-removal” initiatives such as DAC or afforestation/reforestation (planting trees in areas that have never been forested or that were forested in the past).

    DAC-based credits are especially appealing for several reasons, explains Howard Herzog, a senior research engineer at MITEI. With DAC, measuring and verifying the amount of carbon removed is straightforward; the removal is immediate, unlike with planting forests, which may take decades to have an impact; and when DAC is coupled with CO2 storage in geologic formations, the CO2 is kept out of the atmosphere essentially permanently — in contrast to, for example, sequestering it in trees, which may one day burn and release the stored CO2.

    Will current plans that rely on DAC be effective in stabilizing the climate in the coming years? To find out, Herzog and his colleagues Jennifer Morris and Angelo Gurgel, both MITEI principal research scientists, and Sergey Paltsev, a MITEI senior research scientist — all affiliated with the MIT Center for Sustainability Science and Strategy (CS3) — took a close look at the modeling studies on which those plans are based.

    Their investigation identified three unavoidable engineering challenges that together lead to a fourth challenge — high costs for removing a single ton of CO2 from the atmosphere. The details of their findings are reported in a paper published in the journal One Earth on Sept. 20.

    Challenge 1: Scaling up

    When it comes to removing CO2 from the air, nature presents “a major, non-negotiable challenge,” notes the MITEI team: The concentration of CO2 in the air is extremely low — just 420 parts per million, or roughly 0.04 percent.
    In contrast, the CO2 concentration in flue gases emitted by power plants and industrial processes ranges from 3 percent to 20 percent. Companies now use various carbon capture and sequestration (CCS) technologies to capture CO2 from their flue gases, but capturing CO2 from the air is much more difficult. To explain, the researchers offer the following analogy: “The difference is akin to needing to find 10 red marbles in a jar of 25,000 marbles of which 24,990 are blue [the task representing DAC] versus needing to find about 10 red marbles in a jar of 100 marbles of which 90 are blue [the task for CCS].”

    Given that low concentration, removing a single metric ton (tonne) of CO2 from air requires processing about 1.8 million cubic meters of air, which is roughly equivalent to the volume of 720 Olympic-sized swimming pools. And all that air must be moved across a CO2-capturing sorbent — a feat requiring large equipment. For example, one recently proposed design for capturing 1 million tonnes of CO2 per year would require an “air contactor” equivalent in size to a structure about three stories high and three miles long.

    Recent modeling studies project DAC deployment on the scale of 5 to 40 gigatonnes of CO2 removed per year. (A gigatonne equals 1 billion metric tonnes.) But in their paper, the researchers conclude that the likelihood of deploying DAC at the gigatonne scale is “highly uncertain.”

    Challenge 2: Energy requirement

    Given the low concentration of CO2 in the air and the need to move large quantities of air to capture it, it’s no surprise that even the best DAC processes proposed today would consume large amounts of energy — energy that’s generally supplied by a combination of electricity and heat. Including the energy needed to compress the captured CO2 for transportation and storage, most proposed processes require an equivalent of at least 1.2 megawatt-hours of electricity for each tonne of CO2 removed.

    The source of that electricity is critical. For example, using coal-based electricity to drive an all-electric DAC process would generate 1.2 tonnes of CO2 for each tonne of CO2 captured. The result would be a net increase in emissions, defeating the whole purpose of the DAC. So clearly, the energy requirement must be satisfied using either low-carbon electricity or electricity generated using fossil fuels with CCS. All-electric DAC deployed at large scale — say, 10 gigatonnes of CO2 removed annually — would require 12,000 terawatt-hours of electricity, which is more than 40 percent of total global electricity generation today.

    Electricity consumption is expected to grow due to increasing overall electrification of the world economy, so low-carbon electricity will be in high demand for many competing uses — for example, in power generation, transportation, industry, and building operations. Using clean electricity for DAC instead of for reducing CO2 emissions in other critical areas raises concerns about the best uses of clean electricity.

    Many studies assume that a DAC unit could also get energy from “waste heat” generated by some industrial process or facility nearby. In the MITEI researchers’ opinion, “that may be more wishful thinking than reality.” The heat source would need to be within a few miles of the DAC plant for transporting the heat to be economical; given its high capital cost, the DAC plant would need to run nonstop, requiring constant heat delivery; and heat at the temperature required by the DAC plant would have competing uses, for example, for heating buildings.
    Finally, if DAC is deployed at the gigatonne per year scale, waste heat will likely be able to provide only a small fraction of the needed energy.

    Challenge 3: Siting

    Some analysts have asserted that, because air is everywhere, DAC units can be located anywhere. But in reality, siting a DAC plant involves many complex issues. As noted above, DAC plants require significant amounts of energy, so having access to enough low-carbon energy is critical. Likewise, having nearby options for storing the removed CO2 is also critical. If storage sites or pipelines to such sites don’t exist, major new infrastructure will need to be built, and building new infrastructure of any kind is expensive and complicated, involving issues related to permitting, environmental justice, and public acceptability — issues that are, in the words of the researchers, “commonly underestimated in the real world and neglected in models.”

    Two more siting needs must be considered. First, meteorological conditions must be acceptable. By definition, any DAC unit will be exposed to the elements, and factors like temperature and humidity will affect process performance and process availability. And second, a DAC plant will require some dedicated land — though how much is unclear, as the optimal spacing of units is as yet unresolved. Like wind turbines, DAC units need to be properly spaced to ensure maximum performance such that one unit is not sucking in CO2-depleted air from another unit.

    Challenge 4: Cost

    Considering the first three challenges, the final challenge is clear: the cost per tonne of CO2 removed is inevitably high. Recent modeling studies assume DAC costs as low as $100 to $200 per tonne of CO2 removed. But the researchers found evidence suggesting far higher costs.

    To start, they cite typical costs for power plants and industrial sites that now use CCS to remove CO2 from their flue gases. The cost of CCS in such applications is estimated to be in the range of $50 to $150 per tonne of CO2 removed. As explained above, the far lower concentration of CO2 in the air will lead to substantially higher costs.

    As explained under Challenge 1, the DAC units needed to capture the required amount of air are massive. The capital cost of building them will be high, given labor, materials, permitting costs, and so on. Some estimates in the literature exceed $5,000 per tonne captured per year.

    Then there are the ongoing costs of energy. As noted under Challenge 2, removing 1 tonne of CO2 requires the equivalent of 1.2 megawatt-hours of electricity. If that electricity costs $0.10 per kilowatt-hour, the cost of just the electricity needed to remove 1 tonne of CO2 is $120. The researchers point out that assuming such a low price is “questionable,” given the expected increase in electricity demand, future competition for clean energy, and higher costs on a system dominated by renewable — but intermittent — energy sources.

    Then there’s the cost of storage, which is ignored in many DAC cost estimates.

    Clearly, many considerations show that prices of $100 to $200 per tonne are unrealistic, and assuming such low prices will distort assessments of strategies, leading them to underperform going forward.

    The bottom line

    In their paper, the MITEI team calls DAC a “very seductive concept.” Using DAC to suck CO2 out of the air and generate high-quality carbon-removal credits can offset reduction requirements for industries that have hard-to-abate emissions.
    By doing so, DAC would minimize disruptions to key parts of the world’s economy, including air travel, certain carbon-intensive industries, and agriculture. However, the world would need to generate billions of tonnes of CO2 credits at an affordable price. That prospect doesn’t look likely. The largest DAC plant in operation today removes just 4,000 tonnes of CO2 per year, and the price to buy the company’s carbon-removal credits on the market today is $1,500 per tonne.

    The researchers recognize that there is room for energy efficiency improvements in the future, but DAC units will always be subject to higher work requirements than CCS applied to power plant or industrial flue gases, and there is not a clear pathway to reducing work requirements much below the levels of current DAC technologies.

    Nevertheless, the researchers recommend that work to develop DAC continue “because it may be needed for meeting net-zero emissions goals, especially given the current pace of emissions.” But their paper concludes with this warning: “Given the high stakes of climate change, it is foolhardy to rely on DAC to be the hero that comes to our rescue.”
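
    For readers who want to check the headline numbers in Challenges 1, 2, and 4, the short Python sketch below reproduces them from first principles. The assumed capture fraction (introduced here only to reconcile the 420 ppm concentration with the article’s figure of about 1.8 million cubic meters of air per tonne), the 20-year plant life, and the straight-line capital recovery are our illustrative assumptions, not values taken from the MITEI paper.

```python
# Back-of-the-envelope arithmetic for the DAC figures quoted above.
# The capture fraction, plant life, and capital-recovery method are
# illustrative assumptions, not values taken from the MITEI paper.

PPM_CO2 = 420e-6          # ambient CO2 mole fraction (~0.04 percent)
MOLAR_VOLUME_L = 24.05    # liters of air per mole at about 20 C and 1 atm
M_CO2_G = 44.0            # grams of CO2 per mole
CAPTURE_FRACTION = 0.72   # assumed; chosen so the result matches the ~1.8 million m3 figure

# Challenge 1: air volume per tonne of CO2 removed
co2_g_per_m3 = (1000.0 / MOLAR_VOLUME_L) * PPM_CO2 * M_CO2_G       # ~0.77 g CO2 per m3 of air
air_m3_per_tonne = 1e6 / (co2_g_per_m3 * CAPTURE_FRACTION)
print(f"Air processed per tonne CO2: ~{air_m3_per_tonne / 1e6:.1f} million m3")

# Challenge 2: electricity for all-electric DAC at 10 Gt removed per year
MWH_PER_TONNE = 1.2
removal_gt_per_year = 10
electricity_twh = removal_gt_per_year * 1e9 * MWH_PER_TONNE / 1e6
print(f"Electricity for {removal_gt_per_year} Gt/yr: ~{electricity_twh:,.0f} TWh/yr")

# Challenge 4: cost per tonne from electricity plus a simple capital charge
PRICE_PER_KWH = 0.10                   # the 'questionable' low electricity price
CAPEX_PER_TONNE_PER_YEAR = 5000.0      # high-end literature estimate cited above
PLANT_LIFE_YEARS = 20                  # assumed plant life; no discounting applied
electricity_cost = MWH_PER_TONNE * 1000 * PRICE_PER_KWH            # $120 per tonne
capital_charge = CAPEX_PER_TONNE_PER_YEAR / PLANT_LIFE_YEARS       # $250 per tonne
print(f"Electricity + capital: ~${electricity_cost + capital_charge:.0f} per tonne")
```

    Even this incomplete tally, which includes no process heat, operations and maintenance, or storage costs, lands well above the $100 to $200 per tonne assumed in the modeling studies the team reviewed.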

  •

    Preparing Taiwan for a decarbonized economy

    The operations of Taiwan’s electronics, manufacturing, and financial firms vary widely, but their leaders all have at least one thing in common: They recognize the role that a changing energy landscape will play in their future success, and they’re actively planning for that transition.“They’re all interested in how Taiwan can supply energy for its economy going forward — energy that meets global goals for decarbonization,” says Robert C. Armstrong, the Chevron Professor of Chemical Engineering Emeritus at MIT, as well as a principal investigator for the Taiwan Innovative Green Economy Roadmap (TIGER) program. “Each company is going to have its own particular needs. For example, financial companies have data centers that need energy 24/7, with no interruptions. But the need for a robust, reliable, resilient energy system is shared among all of them.”Ten Taiwanese companies are participating in TIGER, a two-year program with the MIT Energy Initiative (MITEI) to explore various ways that industry and government can promote and adopt technologies, practices, and policies that will keep Taiwan competitive amid a quickly changing energy landscape. MIT research teams are exploring a set of six topics during the first year of the program, with plans to tackle a second set of topics during the second year, eventually leading to a roadmap to green energy security for Taiwan.“We are helping them to understand green energy technologies, we are helping them to understand how policies around the world might affect supply chains, and we are helping them to understand different pathways for their domestic policies,” says Sergey Paltsev, a principal investigator for the TIGER program, as well as a deputy director of the MIT Center for Sustainability Science and Strategy and a senior research scientist at MITEI. “We are looking at how Taiwan will be affected in terms of the cost of doing business and how to preserve the competitive advantage of its export-oriented industries.”“The biggest question,” Paltsev adds, “is how Taiwanese companies can decarbonize their energy in a sustainable manner.”Why Taiwan?Paul Hsu, founding partner of the Taiwanese business consultancy Paul Hsu and Partners (one of the 10 participating TIGER companies), as well as founding chair and current board member of the Epoch Foundation, has been working for more than 30 years to forge collaborations between business leaders in Taiwan and MIT researchers. The energy challenges facing Taiwanese businesses, as well as their place in the global supply chain, make the TIGER program critical not only to improve environmental sustainability, but also to ensure future competitiveness, he says. “The energy field is facing revolution,” Hsu says. “Taiwanese companies are not operating in Taiwan alone, but also operating worldwide, and we are affected by the global supply chain. We need to diversify our businesses and our energy resources, and the first thing we’re looking for in this partnership is education — an understanding about how to orient Taiwanese industry toward the future of energy.”Wendy Duan, the program director of the Asia Pacific program at MITEI, notes that Taiwan has a number of similarities to places such as Singapore and Japan. The lessons learned through the TIGER program, she says, will likely be applicable — at least on some level — to other markets throughout Asia, and even around the world.“Taiwan is very much dependent on imported energy,” Duan notes. 
“Many countries in East Asia are facing similar challenges, and if Taiwan has a good roadmap for the future of energy, it can be a good role model.”“Taiwan is a great place for this sort of collaboration,” Armstrong says. “Their industry is very innovative, and it’s a place where businesses are willing to implement new, important ideas. At the same time, their economy is highly dependent on trade, and they import a lot of fossil fuels today. To compete in a decarbonized global economy, they’re going to have to find alternatives to that. If you can develop a path from today’s economy in Taiwan to a future manufacturing economy that is decarbonized, then that gives you a lot of interesting tools you could bring to bear in other economies.”Uncovering solutionsStakeholders from MIT and the participating companies meet for monthly webinars and biannual in-person workshops (alternating between Cambridge, Massachusetts, and Taipei) to discuss progress. The research addresses options for Taiwan to increase its supply of green energy, methods for storing and distributing that energy more efficiently, policy levers for implementing these changes, and Taiwan’s place in the global energy economy.“The project on the electric grid, the project on storage, and the project on hydrogen — all three of those are related to the issue of how to decarbonize power generation and delivery,” notes Paltsev. “But we also need to understand how things in other parts of the world are going to affect demand for the products that are produced in Taiwan. If there is a huge change in demand for certain products due to decarbonization, Taiwanese companies are going to feel it. Therefore, the companies want to understand where the demand is going to be coming from, and how to adjust their business strategies.”One of the research projects is looking closely at advanced nuclear power. There are significant political roadblocks standing in the way, but business leaders are intrigued by the prospect of nuclear energy in Taiwan, where available land for wind and solar power generation is sparse.“So far, Taiwan government policy is anti-nuclear,” Hsu says. “The current ruling party is against it. They are still thinking about what happened in the 1960s and 1970s, and they think nuclear is very dangerous. But if you look into it, nuclear generation technology has really improved.”Implementing a green economy roadmapTIGER participants’ interest in green energy solutions is, of course, not merely academic. Ultimately, the success of the program will be determined not only by the insights from the research produced over these two years, but by how these findings constructively inform both the private and public sectors.“MIT and TIGER participants are united in their commitment to advancing regional industrial and economic development, while championing decarbonization and sustainability efforts in Taiwan,” Duan says. “MIT researchers are informed by insights and domain expertise contributed by TIGER participants, believing that their collaborative efforts can help other nations facing similar geo-economic challenges.”“We are helping the companies understand how to stay leaders in this changing world,” says Paltsev. “We want to make sure that we are not painting an unrealistically rosy picture, or conveying that it will be easy to decarbonize. 
    On the contrary, we want to stay realistic and try to show them both where they can make advances and where we see challenges.”

    The goal, Armstrong says, is not energy independence for Taiwan, but rather energy security. “Energy security requires diversity of supply,” he says. “So, you have a diverse set of suppliers, who are trusted trading partners, but it doesn’t mean you’re on your own. That’s the goal for Taiwan.”

    What will that mean, more specifically? Well, that’s what TIGER researchers aim to learn. “It probably means a mix of energy sources,” Armstrong says. “It could be that nuclear fission provides a core of energy that companies need for their industrial operations, it could be that they can import hydrogen in the form of ammonia or another carrier, and it could be that they leverage the renewable resources they have, together with storage technologies, to provide some pretty inexpensive energy for their manufacturing sector.”

    “We don’t know,” Armstrong adds. “But that’s what we’re looking at, to see if we can figure out a pathway that gets them to their goals. We are optimistic that we can get there.”

    The companies participating in the TIGER program include AcBel Polytech Inc.; CDIB Capital Group / KGI Bank Co., Ltd.; Delta Electronics, Inc.; Fubon Financial Holding Co., Ltd.; Paul Hsu and Partners Co., Ltd.; Ta Ya Electric Wire & Cable Co., Ltd.; TCC Group Holdings Co. Ltd.; Walsin Lihwa Corporation; Wistron Corporation; and Zhen Ding Technology Holding, Ltd.

  •

    Study finds mercury pollution from human activities is declining

    MIT researchers have some good environmental news: Mercury emissions from human activity have been declining over the past two decades, despite global emissions inventories that indicate otherwise.In a new study, the researchers analyzed measurements from all available monitoring stations in the Northern Hemisphere and found that atmospheric concentrations of mercury declined by about 10 percent between 2005 and 2020.They used two separate modeling methods to determine what is driving that trend. Both techniques pointed to a decline in mercury emissions from human activity as the most likely cause.Global inventories, on the other hand, have reported opposite trends. These inventories estimate atmospheric emissions using models that incorporate average emission rates of polluting activities and the scale of these activities worldwide.“Our work shows that it is very important to learn from actual, on-the-ground data to try and improve our models and these emissions estimates. This is very relevant for policy because, if we are not able to accurately estimate past mercury emissions, how are we going to predict how mercury pollution will evolve in the future?” says Ari Feinberg, a former postdoc in the Institute for Data, Systems, and Society (IDSS) and lead author of the study.The new results could help inform scientists who are embarking on a collaborative, global effort to evaluate pollution models and develop a more in-depth understanding of what drives global atmospheric concentrations of mercury.However, due to a lack of data from global monitoring stations and limitations in the scientific understanding of mercury pollution, the researchers couldn’t pinpoint a definitive reason for the mismatch between the inventories and the recorded measurements.“It seems like mercury emissions are moving in the right direction, and could continue to do so, which is heartening to see. But this was as far as we could get with mercury. We need to keep measuring and advancing the science,” adds co-author Noelle Selin, an MIT professor in the IDSS and the Department of Earth, Atmospheric and Planetary Sciences (EAPS).Feinberg and Selin, his MIT postdoctoral advisor, are joined on the paper by an international team of researchers that contributed atmospheric mercury measurement data and statistical methods to the study. The research appears this week in the Proceedings of the National Academy of Sciences.Mercury mismatchThe Minamata Convention is a global treaty that aims to cut human-caused emissions of mercury, a potent neurotoxin that enters the atmosphere from sources like coal-fired power plants and small-scale gold mining.The treaty, which was signed in 2013 and went into force in 2017, is evaluated every five years. The first meeting of its conference of parties coincided with disheartening news reports that said global inventories of mercury emissions, compiled in part from information from national inventories, had increased despite international efforts to reduce them.This was puzzling news for environmental scientists like Selin. 
Data from monitoring stations showed atmospheric mercury concentrations declining during the same period.Bottom-up inventories combine emission factors, such as the amount of mercury that enters the atmosphere when coal mined in a certain region is burned, with estimates of pollution-causing activities, like how much of that coal is burned in power plants.“The big question we wanted to answer was: What is actually happening to mercury in the atmosphere and what does that say about anthropogenic emissions over time?” Selin says.Modeling mercury emissions is especially tricky. First, mercury is the only metal that is in liquid form at room temperature, so it has unique properties. Moreover, mercury that has been removed from the atmosphere by sinks like the ocean or land can be re-emitted later, making it hard to identify primary emission sources.At the same time, mercury is more difficult to study in laboratory settings than many other air pollutants, especially due to its toxicity, so scientists have limited understanding of all chemical reactions mercury can undergo. There is also a much smaller network of mercury monitoring stations, compared to other polluting gases like methane and nitrous oxide.“One of the challenges of our study was to come up with statistical methods that can address those data gaps, because available measurements come from different time periods and different measurement networks,” Feinberg says.Multifaceted modelsThe researchers compiled data from 51 stations in the Northern Hemisphere. They used statistical techniques to aggregate data from nearby stations, which helped them overcome data gaps and evaluate regional trends.By combining data from 11 regions, their analysis indicated that Northern Hemisphere atmospheric mercury concentrations declined by about 10 percent between 2005 and 2020.Then the researchers used two modeling methods — biogeochemical box modeling and chemical transport modeling — to explore possible causes of that decline.  Box modeling was used to run hundreds of thousands of simulations to evaluate a wide array of emission scenarios. Chemical transport modeling is more computationally expensive but enables researchers to assess the impacts of meteorology and spatial variations on trends in selected scenarios.For instance, they tested one hypothesis that there may be an additional environmental sink that is removing more mercury from the atmosphere than previously thought. The models would indicate the feasibility of an unknown sink of that magnitude.“As we went through each hypothesis systematically, we were pretty surprised that we could really point to declines in anthropogenic emissions as being the most likely cause,” Selin says.Their work underscores the importance of long-term mercury monitoring stations, Feinberg adds. Many stations the researchers evaluated are no longer operational because of a lack of funding.While their analysis couldn’t zero in on exactly why the emissions inventories didn’t match up with actual data, they have a few hypotheses.One possibility is that global inventories are missing key information from certain countries. For instance, the researchers resolved some discrepancies when they used a more detailed regional inventory from China. 
    But there was still a gap between observations and estimates.

    They also suspect the discrepancy might be the result of changes in two large sources of mercury that are particularly uncertain: emissions from small-scale gold mining and mercury-containing products.

    Small-scale gold mining involves using mercury to extract gold from soil and is often performed in remote parts of developing countries, making it hard to estimate. Yet small-scale gold mining contributes about 40 percent of human-made emissions.

    In addition, it’s difficult to determine how long it takes the pollutant to be released into the atmosphere from discarded products like thermometers or scientific equipment.

    “We’re not there yet where we can really pinpoint which source is responsible for this discrepancy,” Feinberg says.

    In the future, researchers from multiple countries, including MIT, will collaborate to study and improve the models they use to estimate and evaluate emissions. This research will be influential in helping that project move the needle on monitoring mercury, he says.

    This research was funded by the Swiss National Science Foundation, the U.S. National Science Foundation, and the U.S. Environmental Protection Agency.
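
    As an illustration of the regional trend estimate described above, the sketch below aggregates a handful of synthetic station records into one regional series and fits a linear trend. The station count, baseline concentration, noise level, and gap rate are stand-ins chosen for brevity; the study’s actual statistical treatment of 51 stations, differing measurement networks, and missing data is considerably more sophisticated.

```python
# Illustrative sketch: aggregate nearby stations into a regional series and
# estimate a 2005-2020 trend. Data are synthetic; the study's statistical
# methods for handling gaps and network differences are more sophisticated.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(2005, 2021)

# Synthetic stations drifting down ~10 percent from a ~1.6 ng/m3 baseline,
# with station-level noise and randomly missing years.
n_stations = 5
baseline = 1.6
trend_per_year = -0.10 * baseline / (years[-1] - years[0])
obs = (baseline
       + trend_per_year * (years - years[0])
       + rng.normal(0.0, 0.05, size=(n_stations, years.size)))
obs[rng.random(obs.shape) < 0.15] = np.nan       # simulate data gaps

regional = np.nanmedian(obs, axis=0)             # aggregate to one regional series
mask = ~np.isnan(regional)
slope, intercept = np.polyfit(years[mask], regional[mask], 1)

start = intercept + slope * years[0]
pct_change = 100.0 * slope * (years[-1] - years[0]) / start
print(f"Estimated 2005-2020 change: {pct_change:.1f} percent")   # roughly -10 percent
```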

  •

    MIT School of Science launches Center for Sustainability Science and Strategy

    The MIT School of Science is launching a center to advance knowledge and computational capabilities in the field of sustainability science, and support decision-makers in government, industry, and civil society to achieve sustainable development goals. Aligned with the Climate Project at MIT, researchers at the MIT Center for Sustainability Science and Strategy will develop and apply expertise from across the Institute to improve understanding of sustainability challenges, and thereby provide actionable knowledge and insight to inform strategies for improving human well-being for current and future generations.

    Noelle Selin, professor at MIT’s Institute for Data, Systems, and Society and the Department of Earth, Atmospheric and Planetary Sciences, will serve as the center’s inaugural faculty director. C. Adam Schlosser and Sergey Paltsev, senior research scientists at MIT, will serve as deputy directors, with Anne Slinn as executive director.

    Incorporating and succeeding both the Center for Global Change Science and the Joint Program on the Science and Policy of Global Change while adding new capabilities, the center aims to produce leading-edge research to help guide societal transitions toward a more sustainable future. Drawing on the long history of MIT’s efforts to address global change and its integrated environmental and human dimensions, the center is well-positioned to lead burgeoning global efforts to advance the field of sustainability science, which seeks to understand nature-society systems in their full complexity. This understanding is designed to be relevant and actionable for decision-makers in government, industry, and civil society in their efforts to develop viable pathways to improve quality of life for multiple stakeholders.

    “As critical challenges such as climate, health, energy, and food security increasingly affect people’s lives around the world, decision-makers need a better understanding of the earth in its full complexity — and that includes people, technologies, and institutions as well as environmental processes,” says Selin. “Better knowledge of these systems and how they interact can lead to more effective strategies that avoid unintended consequences and ensure an improved quality of life for all.”

    Advancing knowledge, computational capability, and decision support

    To produce more precise and comprehensive knowledge of sustainability challenges and guide decision-makers to formulate more effective strategies, the center has set the following goals:

    • Advance fundamental understanding of the complex interconnected physical and socio-economic systems that affect human well-being. As new policies and technologies are developed amid climate and other global changes, they interact with environmental processes and institutions in ways that can alter the earth’s critical life-support systems. Fundamental mechanisms that determine many of these systems’ behaviors, including those related to interacting climate, water, food, and socio-economic systems, remain largely unknown and poorly quantified. Better understanding can help society mitigate the risks of abrupt changes and “tipping points” in these systems.

    • Develop, establish, and disseminate new computational tools toward better understanding earth systems, including both environmental and human dimensions. The center’s work will integrate modeling and data analysis across disciplines in an era of increasing volumes of observational data. MIT multi-system models and data products will provide robust information to inform decision-making and shape the next generation of sustainability science and strategy.

    • Produce actionable science that supports equity and justice within and across generations. The center’s research will be designed to inform action associated with measurable outcomes aligned with supporting human well-being across generations. This requires engaging a broad range of stakeholders, including not only nations and companies, but also nongovernmental organizations and communities that take action to promote sustainable development — with special attention to those who have historically borne the brunt of environmental injustice.

    “The center’s work will advance fundamental understanding in sustainability science, leverage leading-edge computing and data, and promote engagement and impact,” says Selin. “Our researchers will help lead scientists and strategists across the globe who share MIT’s commitment to mobilizing knowledge to inform action toward a more sustainable world.”

    Building a better world at MIT

    Building on existing MIT capabilities in sustainability, science, and strategy, the center aims to:

    • focus research, education, and outreach under a theme that reflects a comprehensive state of the field and international research directions, fostering a dynamic community of students, researchers, and faculty;

    • raise the visibility of sustainability science at MIT, emphasizing links between science and action, in the context of existing Institute goals and other efforts on climate and sustainability, and in a way that reflects the vital contributions of a range of natural and social science disciplines to understanding human-environment systems; and

    • re-emphasize MIT’s long-standing expertise in integrated systems modeling while leveraging the Institute’s concurrent leading-edge strengths in data and computing, establishing leadership that harnesses recent innovations, including those in machine learning and artificial intelligence, toward addressing the science challenges of global change and sustainability.

    “The Center for Sustainability Science and Strategy will provide the necessary synergy for our MIT researchers to develop, deploy, and scale up serious solutions to climate change and other critical sustainability challenges,” says Nergis Mavalvala, the Curtis and Kathleen Marble Professor of Astrophysics and dean of the MIT School of Science. “With Professor Selin at its helm, the center will also ensure that these solutions are created in concert with the people who are directly affected now and in the future.”

    The center builds on more than three decades of achievements by the Center for Global Change Science and the Joint Program on the Science and Policy of Global Change, both of which were directed or co-directed by professor of atmospheric science Ronald Prinn.

  •

    Study finds health risks in switching ships from diesel to ammonia fuel

    As container ships the size of city blocks cross the oceans to deliver cargo, their huge diesel engines emit large quantities of air pollutants that drive climate change and have human health impacts. It has been estimated that maritime shipping accounts for almost 3 percent of global carbon dioxide emissions and the industry’s negative impacts on air quality cause about 100,000 premature deaths each year.Decarbonizing shipping to reduce these detrimental effects is a goal of the International Maritime Organization, a U.N. agency that regulates maritime transport. One potential solution is switching the global fleet from fossil fuels to sustainable fuels such as ammonia, which could be nearly carbon-free when considering its production and use.But in a new study, an interdisciplinary team of researchers from MIT and elsewhere caution that burning ammonia for maritime fuel could worsen air quality further and lead to devastating public health impacts, unless it is adopted alongside strengthened emissions regulations.Ammonia combustion generates nitrous oxide (N2O), a greenhouse gas that is about 300 times more potent than carbon dioxide. It also emits nitrogen in the form of nitrogen oxides (NO and NO2, referred to as NOx), and unburnt ammonia may slip out, which eventually forms fine particulate matter in the atmosphere. These tiny particles can be inhaled deep into the lungs, causing health problems like heart attacks, strokes, and asthma.The new study indicates that, under current legislation, switching the global fleet to ammonia fuel could cause up to about 600,000 additional premature deaths each year. However, with stronger regulations and cleaner engine technology, the switch could lead to about 66,000 fewer premature deaths than currently caused by maritime shipping emissions, with far less impact on global warming.“Not all climate solutions are created equal. There is almost always some price to pay. We have to take a more holistic approach and consider all the costs and benefits of different climate solutions, rather than just their potential to decarbonize,” says Anthony Wong, a postdoc in the MIT Center for Global Change Science and lead author of the study.His co-authors include Noelle Selin, an MIT professor in the Institute for Data, Systems, and Society and the Department of Earth, Atmospheric and Planetary Sciences (EAPS); Sebastian Eastham, a former principal research scientist who is now a senior lecturer at Imperial College London; Christine Mounaïm-Rouselle, a professor at the University of Orléans in France; Yiqi Zhang, a researcher at the Hong Kong University of Science and Technology; and Florian Allroggen, a research scientist in the MIT Department of Aeronautics and Astronautics. The research appears this week in Environmental Research Letters.Greener, cleaner ammoniaTraditionally, ammonia is made by stripping hydrogen from natural gas and then combining it with nitrogen at extremely high temperatures. This process is often associated with a large carbon footprint. The maritime shipping industry is betting on the development of “green ammonia,” which is produced by using renewable energy to make hydrogen via electrolysis and to generate heat.“In theory, if you are burning green ammonia in a ship engine, the carbon emissions are almost zero,” Wong says.But even the greenest ammonia generates nitrous oxide (N2O), nitrogen oxides (NOx) when combusted, and some of the ammonia may slip out, unburnt. 
This nitrous oxide would escape into the atmosphere, where the greenhouse gas would remain for more than 100 years. At the same time, the nitrogen emitted as NOx and ammonia would fall to Earth, damaging fragile ecosystems. As these emissions are digested by bacteria, additional N2O  is produced.NOx and ammonia also mix with gases in the air to form fine particulate matter. A primary contributor to air pollution, fine particulate matter kills an estimated 4 million people each year.“Saying that ammonia is a ‘clean’ fuel is a bit of an overstretch. Just because it is carbon-free doesn’t necessarily mean it is clean and good for public health,” Wong says.A multifaceted modelThe researchers wanted to paint the whole picture, capturing the environmental and public health impacts of switching the global fleet to ammonia fuel. To do so, they designed scenarios to measure how pollutant impacts change under certain technology and policy assumptions.From a technological point of view, they considered two ship engines. The first burns pure ammonia, which generates higher levels of unburnt ammonia but emits fewer nitrogen oxides. The second engine technology involves mixing ammonia with hydrogen to improve combustion and optimize the performance of a catalytic converter, which controls both nitrogen oxides and unburnt ammonia pollution.They also considered three policy scenarios: current regulations, which only limit NOx emissions in some parts of the world; a scenario that adds ammonia emission limits over North America and Western Europe; and a scenario that adds global limits on ammonia and NOx emissions.The researchers used a ship track model to calculate how pollutant emissions change under each scenario and then fed the results into an air quality model. The air quality model calculates the impact of ship emissions on particulate matter and ozone pollution. Finally, they estimated the effects on global public health.One of the biggest challenges came from a lack of real-world data, since no ammonia-powered ships are yet sailing the seas. Instead, the researchers relied on experimental ammonia combustion data from collaborators to build their model.“We had to come up with some clever ways to make that data useful and informative to both the technology and regulatory situations,” he says.A range of outcomesIn the end, they found that with no new regulations and ship engines that burn pure ammonia, switching the entire fleet would cause 681,000 additional premature deaths each year.“While a scenario with no new regulations is not very realistic, it serves as a good warning of how dangerous ammonia emissions could be. And unlike NOx, ammonia emissions from shipping are currently unregulated,” Wong says.However, even without new regulations, using cleaner engine technology would cut the number of premature deaths down to about 80,000, which is about 20,000 fewer than are currently attributed to maritime shipping emissions. With stronger global regulations and cleaner engine technology, the number of people killed by air pollution from shipping could be reduced by about 66,000.“The results of this study show the importance of developing policies alongside new technologies,” Selin says. 
    “There is a potential for ammonia in shipping to be beneficial for both climate and air quality, but that requires that regulations be designed to address the entire range of potential impacts, including both climate and air quality.”

    Ammonia’s air quality impacts would not be felt uniformly across the globe, and addressing them fully would require coordinated strategies across very different contexts. Most premature deaths would occur in East Asia, since air quality regulations are less stringent in this region. Higher levels of existing air pollution cause the formation of more particulate matter from ammonia emissions. In addition, shipping volume over East Asia is far greater than elsewhere on Earth, compounding these negative effects.

    In the future, the researchers want to continue refining their analysis. They hope to use these findings as a starting point to urge the marine industry to share engine data they can use to better evaluate air quality and climate impacts. They also hope to inform policymakers about the importance and urgency of updating shipping emission regulations.

    This research was funded by the MIT Climate and Sustainability Consortium.
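
    To keep the scenario outcomes straight, the short sketch below restates the article’s headline figures as changes relative to the roughly 100,000 premature deaths per year currently attributed to shipping emissions. The numbers are those reported above and the engine and policy labels are paraphrased; this is a bookkeeping aid, not output of the ship-track, air-quality, or health models.

```python
# Headline scenario outcomes from the study, expressed as changes in annual
# premature deaths relative to today's shipping-related air pollution burden.
# Figures are those quoted in the article; labels are paraphrased.

BASELINE_DEATHS = 100_000   # approximate current premature deaths/yr from shipping

# Change in premature deaths per year versus today under each scenario
scenario_delta = {
    "pure-ammonia engines, current regulations": +681_000,
    "ammonia-hydrogen engines + catalyst, current regulations": 80_000 - BASELINE_DEATHS,
    "ammonia-hydrogen engines + catalyst, global NOx and NH3 limits": -66_000,
}

for scenario, delta in scenario_delta.items():
    print(f"{scenario}: {delta:+,} premature deaths/yr relative to today")
```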

  •

    Sophia Chen: It’s our duty to make the world better through empathy, patience, and respect

    Sophia Chen, a fifth-year senior double majoring in mechanical engineering and art and design, learned about MIT D-Lab when she was a Florida middle schooler. She drove with her family from their home in Clearwater to Tampa to an MIT informational open house for prospective students. There, she heard about a moringa seed press that had been developed by D-Lab students. Those students, Kwami Williams ’12 and Emily Cunningham (a cross-registered Harvard University student), went on to found MoringaConnect with a goal of increasing Ghanaian farmer incomes. Over the past 12 years, the company has done just that, sometimes by a factor of 10 or more, by selling to wholesalers and establishing their own line of moringa skin and hair care products, as well as nutritional supplements and teas.“I remember getting chills,” says Sophia. “I was so in awe. MIT had always been my dream college growing up, but hearing this particular story truly cemented that dream. I even talked about D-Lab during my admissions interview. Once I came to MIT, I knew I had to take a D-Lab class — and now, at the end of my five years, I’ve taken four.”Taking four D-Lab classes during her undergraduate years may make Sophia exceptional, though not unusual. Of the nearly 4,000 enrollments in D-Lab classes over the past 22 years, as many as 20 percent took at least two classes, and many take three or more by the time the graduate. For Sophia, her D-Lab classes were a logical progression that both confirmed and expanded her career goals in global medicine.Centering the role of project community partnersSophia’s first D-Lab class was 2.722J / EC.720 (D-Lab: Design). Like all D-Lab classes, D-Lab: Design is project-based and centers the knowledge and contributions of each project’s community partner. Her team worked with a group in Uganda called Safe Water Harvesters on a project aimed at creating a solar-powered atmospheric water harvester using desiccants. They focused on early research and development for the desiccant technology by running tests for vapor absorption. Safe Water Harvesters designed the parameters and goals of the project and collaborated with the students remotely throughout the semester.Safe Water Harvesters’ role in the project was key to the project’s success. “At D-Lab, I learned the importance of understanding that solutions in international development must come from the voices and needs of people whom the intervention is trying to serve,” she says. “Some of the first questions we were taught to ask are ‘what materials and manufacturing processes are available?’ and ‘how is this technology going to be maintained by the community?’”The link between water access and gender inequityElecting to join the water harvesting project in Uganda was no accident. The previous summer, Sophia had interned with a startup targeting the spread of cholera in developing areas by engineering a new type of rapid detection technology that would sample from users’ local water sources. From there, she joined Professor Amos Winter’s Global Engineering and Research (GEAR) Lab as an Undergraduate Research Opportunities Program student and worked on a point-of-use desalination unit for households in India. Taking EC.715 (D-Lab: Water, Sanitation, and Hygiene) was a logical next step for Sophia. “This class was life-changing,” she says. 
“I was already passionate about clean water access and global resource equity, but I quickly discovered the complexity of WASH not just as an issue of poverty but as an issue of gender.” She joined a project spearheaded by a classmate from Nepal, which aimed to address the social taboos surrounding menstruation among Nepalese schoolgirls.“This class and project helped me realize that water insecurity and gender inequality — especially gender-based violence — ​are highly intertwined,” comments Sophia. This plays out in a variety of ways. Where there is poor sanitation infrastructure in schools, girls often miss classes or drop out altogether when menstruating. And where water is scarce, women and girls often walk miles to collect water to accommodate daily drinking, cooking, and hygiene needs. During this trek, they are vulnerable to assault and the pressure to engage in transactional sex at water access points.“It became clear to me that women are disproportionately affected by water insecurity, and that water is key to understanding women’s empowerment,” comments Sophia, “and that I wanted to keep learning about the field of development and how it intersects with gender!”So, in fall 2023, Sophia took both 11.025/EC.701 (D-Lab: Development) and WGS.277/EC.718 (D-Lab: Gender and Development). In D-Lab: Development, her team worked with Tatirano, a nongovernmental organization in Madagascar, to develop a vapor-condensing chamber for a water desalination system, a prototype they were able to test and iterate in Madagascar at the end of the semester.Getting out into the world through D-Lab fieldwork“Fieldwork with D-Lab is an eye-opening experience that anyone could benefit from,” says Sophia. “It’s easy to get lost in the MIT and tech bubble. But there’s a whole world out there with people who live such different lives than many of us, and we can learn even more from them than we can from our psets.”For Sophia’s D-Lab: Gender and Development class, she worked with the Society Empowerment Project in Kenya, ultimately traveling there during MIT’s Independent Activities Period last January. In Kenya, she worked with her team to run a workshop with teen parents to identify risk factors prior to pregnancy and postpartum challenges, in order to then ideate and develop solutions such as social programs. “Through my fieldwork in Kenya and Madagascar,” says Sophia, “it became clear how important it is to create community-based solutions that are led and maintained by community members. Solutions need community input, leadership, and trust. Ultimately, this is the only way to have long-lasting, high-impact, sustainable change. One of my D-Lab trip leaders said that you cannot import solutions. I hope all engineers recognize the significance of this statement. It is our duty as engineers and scientists to make the world a better place while carrying values of empathy, patience, and respect.”Pursuing passion and purpose at the intersection of medicine, technology, and policyAfter graduation in June, Sophia will be traveling to South Africa through MISTI Africa to help with a clinical trial and community outreach. She then intends to pursue a master’s in global health and apply to medical school, with the goal of working in global health at the intersection of medicine, technology, and policy.“It is no understatement to say that D-Lab has played a central role in helping me discover what I’m passionate about and what my purpose is in life,” she says. 
    “I hope to dedicate my career towards solving global health inequity and gender inequality.”