More stories

    Study reveals uncertainty in how much carbon the ocean absorbs over time

    The ocean’s “biological pump” describes the many marine processes that work to take up carbon dioxide from the atmosphere and transport it deep into the ocean, where it can remain sequestered for centuries. This ocean pump is a powerful regulator of atmospheric carbon dioxide and an essential ingredient in any global climate forecast.

    But a new MIT study points to a significant uncertainty in the way the biological pump is represented in climate models today. Researchers found that the “gold standard” equation used to calculate the pump’s strength has a larger margin of error than previously thought, and that predictions of how much atmospheric carbon the ocean will pump down to various depths could be off by 10 to 15 parts per million.

    Given that the world is currently emitting carbon dioxide into the atmosphere at an annual rate of about 2.5 parts per million, the team estimates that the new uncertainty translates to about a five-year error in climate target projections.
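
    That conversion from parts per million to years is simple arithmetic; here is the calculation using only the figures quoted above:

```python
# Back-of-envelope check of the ~5-year figure, using only the
# numbers quoted in the article.
emission_rate_ppm_per_year = 2.5        # current CO2 emission rate
uncertainty_ppm = (10 + 15) / 2         # midpoint of the 10-15 ppm range

timing_error_years = uncertainty_ppm / emission_rate_ppm_per_year
print(timing_error_years)  # -> 5.0
```

    That five-year figure is what feeds directly into the climate-target shift Lauderdale describes next.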

    “This larger error bar might be critical if we want to stay within 1.5 degrees of warming targeted by the Paris Agreement,” says Jonathan Lauderdale, a research scientist in MIT’s Department of Earth, Atmospheric and Planetary Sciences. “If current models predict we have until 2040 to cut carbon emissions, we’re expanding the uncertainty around that, to say maybe we now have until 2035, which could be quite a big deal.”

    Lauderdale and former MIT graduate student B.B. Cael, now at the National Oceanography Center in Southampton, U.K., have published their study today in the journal Geophysical Research Letters.

    Snow curve

    The marine processes that contribute to the ocean’s biological pump begin with phytoplankton, microscopic organisms that soak up carbon dioxide from the atmosphere as they grow. When they die, phytoplankton collectively sink through the water column as “marine snow,” carrying that carbon with them.

    “These particles rain down like white flaky snow that is all this dead stuff falling out of the surface ocean,” Lauderdale says.

    At various depths the particles are consumed by microbes, which break down the particles’ organic carbon and respire it into the deep ocean in an inorganic form, a process known as remineralization.

    In the 1980s, researchers collected marine snow at locations and depths throughout the tropical Pacific. From these observations they generated a simple power-law relationship — the Martin curve, named after team member John Martin — to describe the strength of the biological pump and how much carbon the ocean can remineralize and sequester at various depths.
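
    The Martin curve itself fits in a couple of lines. The reference depth of 100 meters and the exponent b ≈ 0.858 below are the values commonly quoted for the original fit; treat them here as illustrative:

```python
def martin_curve(z, flux_100m=1.0, b=0.858):
    """Fraction of the sinking carbon flux remaining at depth z (meters).

    The Martin curve: F(z) = F(100 m) * (z / 100) ** -b,
    with b ~ 0.858 as commonly quoted for the original fit.
    """
    return flux_100m * (z / 100.0) ** -b

# The flux attenuates steeply: most carbon is remineralized
# well above the ocean floor.
for depth_m in (100, 500, 1000, 4000):
    print(depth_m, round(martin_curve(depth_m), 3))
```

    Under this curve, only about 14 percent of the flux leaving 100 meters survives to 1,000 meters, which is why small changes in the curve’s shape matter so much for sequestration.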

    “The Martin curve is ubiquitous, and it’s really the gold standard [used in many climate models today],” Lauderdale says.

    But in 2018, Cael and co-author Kelsey Bisson showed that the power law derived to explain the Martin curve was not the only equation that could fit the observations. The power law is a simple mathematical relationship that assumes that particles fall faster with depth. But Cael found that several other mathematical relationships, each based on different mechanisms for how marine snow sinks and is remineralized, could also explain the data.

    For instance, one alternative assumes that particles fall at the same rate no matter the depth, while another assumes that particles with heavy, less-consumable phytoplankton shells fall faster than those without.
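
    A minimal sketch of how such alternatives behave, with functional forms matching the mechanisms just described (every parameter value here is invented for illustration):

```python
import math

def power_law(z, b=0.858):
    # Martin-style curve: particles effectively fall faster with depth.
    return (z / 100.0) ** -b

def exponential(z, z_star=300.0):
    # Constant sinking speed and remineralization rate give exponential
    # attenuation with e-folding depth z_star (value assumed).
    return math.exp(-(z - 100.0) / z_star)

def ballast(z, z_star=200.0, protected=0.05):
    # A small "ballasted" fraction (e.g., shielded by mineral shells)
    # sinks unconsumed; the rest decays exponentially (values assumed).
    return protected + (1 - protected) * math.exp(-(z - 100.0) / z_star)

# Identical at 100 m, very different by 2,000 m -- the depths where
# sequestration timescales are decided.
for z in (100, 500, 2000):
    print(z, round(power_law(z), 3), round(exponential(z), 3), round(ballast(z), 3))
```

    Curves that are nearly indistinguishable against shallow observations can disagree by an order of magnitude in the deep ocean, which is the crux of the identifiability problem Cael raised.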

    “He found that you can’t tell which curve is the right one, which is a bit troubling, because each curve has different mechanisms behind it,” Lauderdale says. “In other words, researchers might be using the ‘wrong’ function to predict the strength of the biological pump. These discrepancies could snowball and impact climate projections.”

    A curve, reconsidered

    In the new study, Lauderdale and Cael looked at how much difference it would make to estimates of carbon stored deep in the ocean if they changed the mathematical description of the biological pump.

    They started with the same six alternative equations, or remineralization curves, that Cael had previously studied. The team looked at how climate models’ predictions of atmospheric carbon dioxide would change if they were based on any of the six alternatives, versus the Martin curve’s power law.

    To make the comparison as statistically similar as possible, they first fit each alternative equation to the Martin curve, which describes how much marine snow reaches various depths in the ocean. The researchers entered the data points from the curve into each alternative equation. They then ran each equation through the MITgcm, a general circulation model that simulates, among other processes, the flux of carbon dioxide between the atmosphere and the ocean.
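
    That fitting step can be sketched as a small least-squares problem: sample points from the Martin curve, then search for, say, the exponential profile that matches them best. This is a crude grid-search stand-in for the paper’s actual fitting procedure; the sample depths and search range are arbitrary:

```python
import math

def martin(z, b=0.858):
    return (z / 100.0) ** -b

depths = [100, 200, 400, 700, 1000, 2000, 4000]   # meters, arbitrary sample
target = [martin(z) for z in depths]

def sse(z_star):
    # Sum of squared errors between an exponential profile with
    # e-folding depth z_star and the Martin-curve sample points.
    return sum((math.exp(-(z - 100.0) / z_star) - t) ** 2
               for z, t in zip(depths, target))

# Grid-search the e-folding depth that best matches the Martin curve.
best_z_star = min(range(50, 2001, 10), key=sse)
print(best_z_star, round(sse(best_z_star), 4))
```

    Repeating this for each functional form puts all six alternatives on an equal footing before they are handed to the circulation model.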

    The team ran the climate model forward in time to see how each alternative equation for the biological pump changed the model’s estimates of carbon dioxide in the atmosphere, compared with the Martin curve’s power law. They found that the amount of carbon that the ocean is able to draw down and sequester from the atmosphere varies widely, depending on which mathematical description for the biological pump they used.

    “The surprising part was that even small changes in the amount of remineralization or marine snow making it to different depths due to the different curves can lead to significant changes in atmospheric carbon dioxide,” Lauderdale says.

    The results suggest that the ocean’s pumping strength, and the processes that govern how fast marine snow falls, remain open questions.

    “We definitely need to make many more measurements of marine snow to break down the mechanisms behind what’s going on,” Lauderdale adds. “Because probably all these processes are relevant, but we really want to know which are driving carbon sequestration.”

    This research was supported, in part, by the National Science Foundation, the Simons Collaboration on Computational Biogeochemical Modeling of Marine Ecosystems, and the U.K. Natural Environment Research Council.

    Accounting for firms’ positive impacts on the environment

    Gregory Norris is an expert on quantifying firms’ impacts on the environment over the life cycles of their products and processes. His analyses help decision-makers opt for more sustainable, Earth-friendly outputs.

    He and others in this field of life-cycle assessment (LCA) have largely gone about their work by determining firms’ negative impacts on the environment, or footprints, a term most people are familiar with. But Norris felt something was missing. What about the positive impacts firms can have by, for example, changing behaviors or creating greener manufacturing processes that become available to competitors? Could they be added to the overall LCA tally?

    Introducing handprints, the term Norris coined for those positive impacts and the focus of MIT’s Sustainability and Health Initiative for NetPositive Enterprise (SHINE). SHINE is co-led by Norris and Randolph Kirchain, who both have appointments through MIT’s Materials Research Laboratory (MRL).

    Positive impacts

    “If you ask LCA practitioners what they track to determine a product’s sustainability, 99 out of 100 will talk about footprints, these negative impacts,” Norris says. “We’re about expanding that to include handprints, or positive impacts.”

    Says Kirchain, “We’re trying to make the [LCA] metrics more encompassing so firms are motivated to make positive changes as well.” And that could ultimately “increase the scope of activities that firms engage in for environmental benefits.”

    In a February 2021 paper in the International Journal of Life Cycle Assessment, Norris, Kirchain, and colleagues lay out the methodology for not only estimating handprints but also combining them with footprints. Additional authors of the paper are Jasmina Burek, Elizabeth A. Moore, and Jeremy Gregory, who are also affiliated with the MRL.

    “By giving handprints a defendable methodology, we get closer to the ideal place where everything that counts can be counted,” says Jeff Zeman, principal of TrueNorth Collective, a consulting firm for sustainability. Zeman was not involved in the work.

    As a result, Zeman continues, “designers can see the positive impact of their work show up in an organization’s messaging, as progress toward its sustainability goals, and bridge their work with other good actors to create shared benefits. Handprints have been a powerful influence on me and my team — and continue to be.”

    How it works

    Handprints are measured with the same metrics used for quantifying different footprints. For example, a classic metric for determining a product’s water footprint is the liters of water used to create that product. The same product’s water handprint would be calculated by determining the liters of water saved through a positive change such as instituting a new manufacturing process involving recycled materials. Both footprints and handprints are measured using existing life-cycle inventory databases, software, and calculation methods.
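
    A toy example of that bookkeeping, with every quantity invented for illustration:

```python
# Footprints and handprints use the same metric -- here, liters of
# water for a production run -- but point in opposite directions.
baseline_l_per_unit = 120.0    # business-as-usual water use (assumed)
improved_l_per_unit = 85.0     # after switching to recycled materials (assumed)
units_made = 10_000

footprint_liters = improved_l_per_unit * units_made                           # harm still caused
handprint_liters = (baseline_l_per_unit - improved_l_per_unit) * units_made   # harm avoided

print(footprint_liters, handprint_liters)   # -> 850000.0 350000.0
print(handprint_liters > footprint_liters)  # net positive? -> False
```

    In this hypothetical case the change shrinks the footprint and creates a handprint, but the firm is not yet net positive; SHINE’s framework is aimed at tracking exactly that gap.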

    The SHINE team has demonstrated the impact of adding handprints to LCA analyses through case studies with several companies. One such study described in the paper involved Interface, a manufacturer of flooring materials. The SHINE team calculated the company’s handprints associated with the use of “recycled” gas to help heat its manufacturing facility. Specifically, Interface captured and burned methane gas from a landfill. That gas would otherwise have been released to the atmosphere, contributing to climate change.

    After calculating both the company’s handprints and footprints, the SHINE team found that Interface had a net positive impact. As the team wrote in their paper, “with the SHINE handprint framework, we can help actors to create handprints greater than, and commensurate with, their footprints.”

    Concludes Norris: “With this paper, we hope that work on sustainability will get stronger by making these tools available to more people.”

    This work was supported by the SHINE consortium.

    Study reveals plunge in lithium-ion battery costs

    The cost of the rechargeable lithium-ion batteries used for phones, laptops, and cars has fallen dramatically over the last three decades, and has been a major driver of the rapid growth of those technologies. But attempting to quantify that cost decline has produced ambiguous and conflicting results that have hampered attempts to project the technology’s future or devise useful policies and research priorities.

    Now, MIT researchers have carried out an exhaustive analysis of the studies that have looked at the decline in the prices of these batteries, which are the dominant rechargeable technology in today’s world. The new study looks back over three decades, including analyzing the original underlying datasets and documents whenever possible, to arrive at a clear picture of the technology’s trajectory.

    The researchers found that the cost of these batteries has dropped by 97 percent since they were first commercially introduced in 1991. This rate of improvement is much faster than many analysts had claimed and is comparable to that of solar photovoltaic panels, which some had considered to be an exceptional case. The new findings are reported today in the journal Energy & Environmental Science, in a paper by MIT postdoc Micah Ziegler and Associate Professor Jessika Trancik.
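
    A 97 percent drop means prices ended at 3 percent of their 1991 level; compounded over roughly three decades, that implies a double-digit decline every year (the end year below is an assumption for illustration):

```python
# Convert the cumulative 97 percent decline into an average annual rate.
start_year = 1991
end_year = 2018                 # assumed data horizon, for illustration
fraction_remaining = 1 - 0.97   # prices ended at 3% of the 1991 level

years = end_year - start_year
annual_decline = 1 - fraction_remaining ** (1 / years)
print(f"{annual_decline:.1%} average decline per year")
```

    A steady decline of roughly this size, sustained for decades, is the same broad pattern often cited for photovoltaic modules.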

    While it’s clear that there have been dramatic cost declines in some clean-energy technologies such as solar and wind, Trancik says, when they started to look into the decline in prices for lithium-ion batteries, “we saw that there was substantial disagreement as to how quickly the costs of these technologies had come down.” Similar disagreements showed up in tracing other important aspects of battery development, such as the ever-improving energy density (energy stored within a given volume) and specific energy (energy stored within a given mass).
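
    To make those two metrics concrete, here is the calculation for a single cylindrical cell, using representative round numbers that are assumed for illustration rather than taken from the study:

```python
# Energy density (per volume) vs. specific energy (per mass) for one cell.
capacity_ah = 3.4      # nominal capacity (assumed)
voltage_v = 3.6        # nominal voltage (assumed)
mass_kg = 0.047        # cell mass (assumed)
volume_l = 0.0165      # cell volume (assumed)

energy_wh = capacity_ah * voltage_v        # total stored energy, Wh
specific_energy = energy_wh / mass_kg      # Wh per kilogram
energy_density = energy_wh / volume_l      # Wh per liter
print(round(specific_energy), round(energy_density))
```

    The same cell thus gets two different figures of merit, and which one matters depends on whether mass or volume is the binding constraint, a point the study returns to below.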

    “These trends are so consequential for getting us to where we are right now, and also for thinking about what could happen in the future,” says Trancik, who is an associate professor in MIT’s Institute for Data, Systems and Society. While it was common knowledge that the decline in battery costs was an enabler of the recent growth in sales of electric vehicles, for example, it was unclear just how great that decline had been. Through this detailed analysis, she says, “we were able to confirm that yes, lithium-ion battery technologies have improved in terms of their costs, at rates that are comparable to solar energy technology, and specifically photovoltaic modules, which are often held up as kind of the gold standard in clean energy innovation.”

    It may seem odd that there was such great uncertainty and disagreement about how much lithium-ion battery costs had declined, and what factors accounted for it, but in fact much of the information is in the form of closely held corporate data that is difficult for researchers to access. Most lithium-ion batteries are not sold directly to consumers — you can’t run down to your typical corner drugstore to pick up a replacement battery for your iPhone, your PC, or your electric car. Instead, manufacturers buy lithium-ion batteries and build them into electronics and cars. Large companies like Apple or Tesla buy batteries by the millions, or manufacture them themselves, for prices that are negotiated or internally accounted for but never publicly disclosed.

    In addition to helping to boost the ongoing electrification of transportation, further declines in lithium-ion battery costs could potentially also increase the batteries’ usage in stationary applications as a way of compensating for the intermittent supply of clean energy sources such as solar and wind. Both applications could play a significant role in helping to curb the world’s emissions of climate-altering greenhouse gases. “I can’t overstate the importance of these trends in clean energy innovation for getting us to where we are right now, where it starts to look like we could see rapid electrification of vehicles and we are seeing the rapid growth of renewable energy technologies,” Trancik says. “Of course, there’s so much more to do to address climate change, but this has really been a game changer.”

    The new findings are not just a matter of retracing the history of battery development, but of helping to guide the future, Ziegler points out. Combing through all of the published literature on the subject of the cost reductions in lithium-ion cells, he found “very different measures of the historical improvement. And across a variety of different papers, researchers were using these trends to make suggestions about how to further reduce costs of lithium-ion technologies or when they might meet cost targets.” But because the underlying data varied so much, “the recommendations that the researchers were making could be quite different.” Some studies suggested that lithium-ion batteries would not fall in cost quickly enough for certain applications, while others were much more optimistic. Such differences in data can ultimately have a real impact on the setting of research priorities and government incentives.

    The researchers dug into the original sources of the published data, in some cases finding that certain primary data had been used in multiple studies that were later cited as separate sources, or that the original data sources had been lost along the way. And while most studies have focused only on the cost, Ziegler says it became clear that such a one-dimensional analysis might underestimate how quickly lithium-ion technologies improved; in addition to cost, weight and volume are also key factors for both vehicles and portable electronics. So, the team added a second track to the study, analyzing the improvements in these parameters as well.

    “Lithium-ion batteries were not adopted because they were the least expensive technology at the time,” Ziegler says. “There were less expensive battery technologies available. Lithium-ion technology was adopted because it allows you to put portable electronics into your hand, because it allows you to make power tools that last longer and have more power, and it allows us to build cars” that can provide adequate driving range. “It felt like just looking at dollars per kilowatt-hour was only telling part of the story,” he says.

    That broader analysis helps to define what may be possible in the future, he adds: “We’re saying that lithium-ion technologies might improve more quickly for certain applications than would be projected by just looking at one measure of performance. By looking at multiple measures, you get essentially a clearer picture of the improvement rate, and this suggests that they could maybe improve more rapidly for applications where the restrictions on mass and volume are relaxed.”

    Trancik adds the new study can play an important role in energy-related policymaking. “Published data trends on the few clean technologies that have seen major cost reductions over time, wind, solar, and now lithium-ion batteries, tend to be referenced over and over again, and not only in academic papers but in policy documents and industry reports,” she says. “Many important climate policy conclusions are based on these few trends. For this reason, it is important to get them right. There’s a real need to treat the data with care, and to raise our game overall in dealing with technology data and tracking these trends.”

    “Battery costs determine price parity of electric vehicles with internal combustion engine vehicles,” says Venkat Viswanathan, an associate professor of mechanical engineering at Carnegie Mellon University, who was not associated with this work. “Thus, projecting battery cost declines is probably one of the most critical challenges in ensuring an accurate understanding of adoption of electric vehicles.”

    Viswanathan adds that “the finding that cost declines may occur faster than previously thought will enable broader adoption, increasing volumes, and leading to further cost declines. … The datasets curated, analyzed and released with this paper will have a lasting impact on the community.”

    The work was supported by the Alfred P. Sloan Foundation.

    Study: One enzyme dictates cells’ response to a probable carcinogen

    In the past few years, several medications have been found to be contaminated with NDMA, a probable carcinogen. This chemical, which has also been found at Superfund sites and in some cases has spread to drinking water supplies, causes DNA damage that can lead to cancer.

    MIT researchers have now discovered a mechanism that helps explain whether this damage will lead to cancer in mice: The key is the way cellular DNA repair systems respond. The team found that too little activity of one enzyme necessary for DNA repair leads to much higher cancer rates, while too much activity can produce tissue damage, especially in the liver, which can be fatal.

    Activity levels of this enzyme, called AAG, can vary greatly among different people, and measuring those levels could allow doctors to predict how people might respond to NDMA exposure, says Bevin Engelward, a professor of biological engineering at MIT and the senior author of the study. “It may be that people who are low in this enzyme are more prone to cancer from environmental exposures,” she says.

    MIT postdoc Jennifer Kay is the lead author of the new study, which appears today in Cell Reports.

    Potential hazards

    For several years, Engelward’s lab, in collaboration with the lab of MIT Professor Leona Samson, has been working on a research project, funded by the National Institute of Environmental Health Sciences, to study the effects of exposure to NDMA. This chemical is found in Superfund sites including the contaminated Olin Chemical site in Wilmington, Massachusetts. In the early 2000s, municipal water wells near the site had to be shut down because the groundwater was contaminated with NDMA and other hazardous chemicals.

    More recently, it was discovered that several types of medicine, including Zantac and drugs used to treat type 2 diabetes and high blood pressure, had been contaminated with NDMA. This chemical causes specific types of DNA damage, one of which is a lesion of adenine, one of the bases found in DNA. These lesions are repaired by AAG, which snips out the damaged bases so that other enzymes can cleave the DNA backbone, enabling DNA polymerases to replace them with new ones.

    If AAG activity is very high and the polymerases (or other downstream enzymes) can’t keep up with the repair, then the DNA may end up with too many unrepaired strand breaks, which can be fatal to the cell. However, if AAG activity is too low, damaged adenines persist and can be read incorrectly by the polymerase, causing the wrong base to be paired with it. Incorrect insertion of a new base produces a mutation, and accumulated mutations are known to cause cancer.

    In the new study, the MIT team studied mice with high levels of AAG — six times the normal amount — and mice with AAG knocked out. After exposure to NDMA, the mice with no AAG had many more mutations and higher rates of cancer in the liver, where NDMA has its greatest effect. Mice with sixfold levels of AAG had fewer mutations and lower cancer rates, at first glance appearing to be beneficial. However, in those mice, the researchers found a great deal of tissue damage and cell death in the liver.

    Mice with normal amounts of AAG (“wild-type” mice) showed some mutations after NDMA exposure but overall were much better protected against both cancer and liver damage.

    “Nature did a really good job establishing the optimal levels of AAG, at least for our animal model,” Engelward says. “What is striking is that the levels of one gene out of 23,000 dictates disease outcome, yielding opposite effects depending on low or high expression.” If too low, there are too many mutations; if too high, there is too much cell death.

    Varying responses

    In humans, there is a great deal of variation in AAG levels between different people: Studies have found that some people can have up to 20 times more AAG activity than others. This suggests that people may respond very differently to damage caused by NDMA, Kay says. Measuring those levels could potentially allow doctors to predict how people may respond to NDMA exposure in the environment or in contaminated medicines, she says.

    The researchers next plan to study the effects of chronic, low-level exposure to NDMA in mice, which they hope will shed light on how such exposures might affect humans. “That’s one of the top priorities for us, to figure out what happens in a real world, everyday exposure scenario,” Kay says.

    Another population for which measuring AAG levels could be useful is cancer patients who take temozolomide, a chemotherapy drug that causes the same kind of DNA damage as NDMA. It’s possible that people with high levels of AAG could experience more severe toxic side effects from taking the drug, while people with lower levels of AAG could be susceptible to mutations that might lead to a recurrence of cancer later in life, Kay says, adding that more studies are needed to investigate these potential outcomes.

    The research was funded primarily by the National Institute of Environmental Health Sciences Superfund Basic Research Program, with additional support from the National Cancer Institute and the MIT Center for Environmental Health Sciences.

    Other authors of the paper include Joshua Corrigan, an MIT technical associate, who is second author; Amanda Armijo, an MIT postdoc; Ilana Nazari, an MIT undergraduate; Ishwar Kohale, an MIT graduate student; Robert Croy, an MIT research scientist; Sebastian Carrasco, an MIT comparative pathologist; Dushan Wadduwage, a fellow at the Center for Advanced Imaging at Harvard University; Dorothea Torous, Svetlana Avlasevich, and Stephen Dertinger of Litron Laboratories; Forest White, an MIT professor of biological engineering; John Essigmann, a professor of chemistry and biological engineering at MIT; and Samson, a professor emerita of biology and biological engineering at MIT.

    Study predicts the oceans will start emitting ozone-depleting CFCs

    The world’s oceans are a vast repository for gases including ozone-depleting chlorofluorocarbons, or CFCs. They absorb these gases from the atmosphere and draw them down to the deep, where they can remain sequestered for centuries and more.

    Marine CFCs have long been used as tracers to study ocean currents, but their impact on atmospheric concentrations was assumed to be negligible. Now, MIT researchers have found that the oceanic fluxes of at least one type of CFC, known as CFC-11, do in fact affect atmospheric concentrations. In a study appearing today in the Proceedings of the National Academy of Sciences, the team reports that the global ocean will reverse its longtime role as a sink for the potent ozone-depleting chemical.

    The researchers project that by the year 2075, the oceans will emit more CFC-11 back into the atmosphere than they absorb, emitting detectable amounts of the chemical by 2130. Further, with increasing climate change, this shift will occur 10 years earlier. The emissions of CFC-11 from the ocean will effectively extend the chemical’s average residence time, causing it to linger five years longer in the atmosphere than it otherwise would. This may impact future estimations of CFC-11 emissions.

    The new results may help scientists and policymakers better pinpoint future sources of the chemical, which is now banned worldwide under the Montreal Protocol.

    “By the time you get to the first half of the 22nd century, you’ll have enough of a flux coming out of the ocean that it might look like someone is cheating on the Montreal Protocol, but instead, it could just be what’s coming out of the ocean,” says study co-author Susan Solomon, the Lee and Geraldine Martin Professor of Environmental Studies in MIT’s Department of Earth, Atmospheric and Planetary Sciences. “It’s an interesting prediction and hopefully will help future researchers avoid getting confused about what’s going on.”

    Solomon’s co-authors include lead author Peidong Wang, Jeffery Scott, John Marshall, Andrew Babbin, Megan Lickley, and Ronald Prinn from MIT; David Thompson of Colorado State University; Timothy DeVries of the University of California at Santa Barbara; and Qing Liang of the NASA Goddard Space Flight Center.

    An ocean, oversaturated

    CFC-11 is a chlorofluorocarbon that was commonly used to make refrigerants and insulating foams. When emitted to the atmosphere, the chemical sets off a chain reaction that ultimately destroys ozone, the atmospheric layer that protects the Earth from harmful ultraviolet radiation. Since 2010, the production and use of the chemical have been phased out worldwide under the Montreal Protocol, a global treaty that aims to restore and protect the ozone layer.

    Since its phaseout, levels of CFC-11 in the atmosphere have been steadily declining, and scientists estimate that the ocean has absorbed about 5 to 10 percent of all manufactured CFC-11 emissions. As concentrations of the chemical continue to fall in the atmosphere, however, it’s predicted that CFC-11 will oversaturate in the ocean, pushing it to become a source rather than a sink.

    “For some time, human emissions were so large that what was going into the ocean was considered negligible,” Solomon says. “Now, as we try to get rid of human emissions, we find we can’t completely ignore what the ocean is doing anymore.”

    A weakening reservoir

    In their new paper, the MIT team looked to pinpoint when the ocean would become a source of the chemical, and to what extent the ocean would contribute to CFC-11 concentrations in the atmosphere. They also sought to understand how climate change would impact the ocean’s ability to absorb the chemical in the future.

    The researchers used a hierarchy of models to simulate the mixing within and between the ocean and atmosphere. They began with a simple model of the atmosphere and the upper and lower layers of the ocean, in both the northern and southern hemispheres. They added into this model anthropogenic emissions of CFC-11 that had previously been reported through the years, then ran the model forward in time, from 1930 to 2300, to observe changes in the chemical’s flux between the ocean and the atmosphere.
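
    The flavor of that simple setup can be sketched in a few lines: an atmosphere box that loses CFC-11 and exchanges it with an ocean box. Every rate constant and the emissions schedule below are invented for illustration; the study’s models are far more detailed:

```python
# Two-box sketch of the sink-to-source reversal: the ocean takes up
# CFC-11 while the atmosphere is loaded, then flips to net outgassing
# as atmospheric concentrations fall (all parameters assumed).
atm, ocean = 0.0, 0.0
k_uptake, k_outgas = 0.01, 0.002    # air-sea exchange rates, 1/yr (assumed)
k_loss = 1 / 55.0                   # atmospheric loss, ~CFC-11 lifetime

first_source_year = None
for year in range(1930, 2301):
    emission = 1.0 if 1950 <= year < 2010 else 0.0   # crude phaseout in 2010
    flux_to_ocean = k_uptake * atm - k_outgas * ocean
    atm += emission - k_loss * atm - flux_to_ocean
    ocean += flux_to_ocean
    if first_source_year is None and year > 2010 and flux_to_ocean < 0:
        first_source_year = year   # net outgassing begins

print("ocean becomes a net source around:", first_source_year)
```

    Even this toy model reproduces the qualitative behavior: once emissions stop, the atmosphere drains quickly while the ocean reservoir lingers, so the net flux eventually reverses.
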

    They then replaced the ocean layers of this simple model with the MIT general circulation model, or MITgcm, a more sophisticated representation of ocean dynamics, and ran similar simulations of CFC-11 over the same time period.

    Both models produced atmospheric levels of CFC-11 through the present day that matched recorded measurements, giving the team confidence in their approach. When they looked at the models’ future projections, they observed that the ocean began to emit more of the chemical than it absorbed, beginning around 2075. By 2145, the ocean would emit CFC-11 in amounts that would be detectable by current monitoring standards.

    This animation shows (at right) the CFC-11 stored in the ocean over time, and (at left) the corresponding change in the chemical’s total atmospheric lifetime.

    The ocean’s uptake in the 20th century and outgassing in the future also affect the chemical’s effective residence time in the atmosphere, decreasing it by several years during uptake and increasing it by up to five years by 2200.

    Climate change will speed up this process. The team used the models to simulate a future with global warming of about 5 degrees Celsius by the year 2100, and found that climate change will advance the ocean’s shift to a source by 10 years and produce detectable levels of CFC-11 by 2140.

    “Generally, a colder ocean will absorb more CFCs,” Wang explains. “When climate change warms the ocean, it becomes a weaker reservoir and will also outgas a little faster.”

    “Even if there were no climate change, as CFCs decay in the atmosphere, eventually the ocean has too much relative to the atmosphere, and it will come back out,” Solomon adds. “Climate change, we think, will make that happen even sooner. But the switch is not dependent on climate change.”

    Their simulations show that the ocean’s shift will occur slightly faster in the Northern Hemisphere, where large-scale ocean circulation patterns are expected to slow down, leaving more gases in the shallow ocean to escape back to the atmosphere. However, knowing the exact drivers of the ocean’s reversal will require more detailed models, which the researchers intend to explore.

    “Some of the next steps would be to do this with higher-resolution models and focus on patterns of change,” says Scott. “For now, we’ve opened up some great new questions and given an idea of what one might see.”

    This research was supported, in part, by the VoLo Foundation, the Simons Foundation, and the National Science Foundation.

    Visualizing a climate-resilient MIT

    The Sustainability DataPool, powered by the Office of Sustainability (MITOS), gives the MIT community the opportunity to understand data on important sustainability metrics like energy, water use, emissions, and recycling rates. While most visualizations share data from past events, the newest dashboard — the MIT Climate Resiliency Dashboard (MIT certificate required to view) — looks to potential future events in the form of flooding on campus. The dashboard is an essential planning tool for ongoing work to build a climate-resilient MIT, one that fulfills its mission in the face of impacts of climate change. It’s also a tool that highlights the importance of collaboration in devising sustainability solutions.

    Development of the dashboard began in 2017 when the City of Cambridge, Massachusetts, released the first version of its FloodViewer. The viewer allowed users to map climate change threats from flooding in Cambridge. Scanning the map in the viewer, one could see all of Cambridge — except for MIT. At the time, the City of Cambridge did not have a full account of MIT’s stormwater drainage system, so the viewer was launched without it. That unmapped area served as a call to action for MITOS.

    MITOS Assistant Director Brian Goldberg and MITOS Faculty Fellow and research scientist at the MIT Center for Global Change Science Ken Strzepek reached out to the Cambridge Community Development Department — with which MIT has long worked on climate action — and made a plan to populate the missing map information, working with the city to understand how to map MIT’s data to fit the FloodViewer model. Harmonizing MIT’s data with that of the city would complete the potential flooding picture for Cambridge, give MIT new insight on its own potential flooding threats, and enable a common climate change baseline for planning decisions about campus and city building and infrastructure projects. “We saw this as an opportunity to expand our understanding of our own threats on campus and to team with the city to explore how we could develop a common picture of climate change impacts,” says Strzepek.

    From there, Strzepek, Goldberg, and a number of MITOS student fellows began work on what would become the Climate Resiliency Dashboard. MITOS partnered with the Department of Facilities; Center for Global Change Science; Office of Emergency Management; Office of Campus Planning; Department of Earth, Atmospheric and Planetary Sciences; Urban Risk Lab; and other members of the MIT Climate Resiliency Committee for assistance on data, design, and user testing. These partnerships helped create the most accurate picture of potential flooding impacts on campus by looking at topography, stormwater management systems, and past trends.
    The beta version of the tool went live in November 2020 and functions much like the Cambridge FloodViewer: Projected flooding data is laid over a campus map of MIT, allowing users to zoom in on a portion of campus under a specific scenario — say, a 100-year storm occurring in 2030 — and see the projected potential peak rain or storm surge water depth at that location. The dashboard explains not only how these numbers were calculated, but what types of rain and storm surge events can cause them to happen. But the flood mapping is only part of the story. “Flooding itself isn’t necessarily a problem — it’s the potential of that flooding to interrupt MIT’s critical research, education, and campus operations,” explains Goldberg.
    The dashboard is already informing new building designs, such as the MIT Schwarzman College of Computing, which is designed to be resilient to a 100-year flood event anticipated under a changed climate 50 years from today. “By enabling MIT to understand flood risk for new buildings, we can respond holistically to that risk and integrate flood mitigation strategies as part of the overarching design,” explains Randa Ghattas, senior sustainability project manager in the Department of Facilities. “This could include intentionally elevating buildings or a combination of gray- and green-infrastructure site-level strategies to mitigate flooding and support multiple benefits like stormwater management, urban heat island mitigation, and enhanced outdoor comfort.”
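The scenario labels become easier to interpret with a bit of return-period arithmetic: a "100-year" storm is one with a 1 percent chance of occurring in any given year, so the cumulative odds over a building's lifetime are far higher than the name suggests. A minimal sketch of that standard textbook calculation (not code from the dashboard itself):

```python
def exceedance_probability(return_period_years: float, horizon_years: int) -> float:
    """Probability of at least one event of the given return period occurring
    within the horizon, assuming each year is independent (the standard
    textbook approximation)."""
    annual_p = 1.0 / return_period_years
    return 1.0 - (1.0 - annual_p) ** horizon_years

# A "100-year" flood has roughly a 39.5 percent chance of striking at
# least once during a 50-year building lifetime.
lifetime_risk = exceedance_probability(100, 50)
```

This is why designing a new building to withstand the 100-year event 50 years from now is a meaningful standard rather than an edge case.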
    Information displayed in the dashboard is continually being refined as the science and engineering of flood risk modeling progress. Goldberg explains, “While the dashboard projects water depth next to campus buildings, we’re still testing methods for informing whether water will actually enter buildings via doorways, low windows, or ground-level air vents.”
    Although the dashboard will always contain a certain level of uncertainty, the plan is to continue to evolve a more robust tool. “We called it the MIT Climate Resiliency Dashboard, and not the MIT Flood Viewer, because we plan to visualize more data related to climate resiliency, like extreme and prolonged heat events,” says Goldberg, noting that heat information is expected to be added in late 2021. “As the science advances, understanding heat impacts today and going forward will bring more of that into this dashboard.” Cambridge has already modeled some aspects of future heat risk and developed preparedness plans, allowing MIT to build upon the city’s heat risk modeling, communicate findings through the Climate Resiliency Dashboard, and anticipate how MIT can protect its community, research, academics, and operations from changes in heat over time.
    This Climate Resiliency Dashboard joins many other data sets and visualizations available to the MIT community in the Sustainability DataPool — part of the fulfillment of Pillar E of MIT’s Plan for Action on Climate Change, which calls for using the campus as a test bed for change. “By testing these ideas on campus and sharing our data, findings, and planning frameworks, we’re not only supporting a more climate-resilient MIT, but also providing the tools for others to learn from us, solving these same challenges in their own communities and institutions,” says Goldberg.

  • in

    3 Questions: Claude Grunitzky MBA '12 on launching TRUE Africa University

    Shortly after he sold TRACE, the fast-growing, New York-based media company he founded at age 24, Claude Grunitzky came to MIT as a Sloan Fellow. He chose MIT because he wanted to learn more about digital media and the ways he could leverage it for his next company. He was also interested in MIT’s approach to building new technologies that could scale through network effects.
    While at MIT Sloan, the Togolese-American entrepreneur spent considerable time at the MIT Media Lab, working with Joost Bonsen, a lecturer in media arts and sciences, and Professor Alex “Sandy” Pentland, the Media Lab Entrepreneurship Program director, on shaping what would become TRUE Africa, his digital media company focused on championing young African voices all over the world. Grunitzky graduated in 2012, earning an MBA.
    TRUE Africa was launched as a news and culture website in 2015. Grunitzky used new publishing technologies to promote African perspectives instead of relying on Western perceptions of what Africa was becoming. Grunitzky and his editorial team chose to document Africans and Afro-descendants’ innovations and contributions to global popular culture.
    In 2019, Grunitzky realized that, while useful for telling a different story about modern Africa, a media platform was not enough. He decided to pivot to education. His new vision was to create a remote higher-education platform for African youth. The pandemic, which led to the closure of many universities in Africa, gave a sense of urgency to his launch plans for the new venture, which he called TRUE Africa University (TAU). The venture is currently being incubated at the Abdul Latif Jameel World Education Lab (J-WEL).
    TAU currently consists of a webinar series focused on sustainable development in Africa. Grunitzky serves as host, interviewing thinkers, shapers, and doers he sees as the inventors of the future of Africa. Produced in collaboration with the MIT Center for International Studies, the webinar series features guests including Taiye Selasi, the Ghanaian-Nigerian author; Jeffrey Sachs, the American economist; and Iyinoluwa Aboyeji, the Nigerian serial entrepreneur behind some of Africa’s most valuable startups.
    Here, Grunitzky describes his inspiration for and goals for the TAU project.
    Q: What is the purpose of TRUE Africa University?
    A: Ever since I came to MIT as a Sloan Fellow a decade ago, I’ve wanted to find new ways to tap into MIT’s can-do spirit of innovation and entrepreneurship to help me launch a new type of African company that would play a sizable role in solving some of Africa’s biggest problems.
    At MIT, I met kindred spirits who encouraged our experiments, but I eventually settled on launching another media company, which I named TRUE Africa. With the TRUE Africa website, I relied on my expertise in media, but three years after launching TRUE Africa online, I realized that I wanted to solve a bigger problem than what we could accomplish through reporting about young Africans and their creativity.
    Having seen excellence in motion at MIT, I came to believe that what young Africans need more than anything is quality education. I had been deeply inspired by Salman Khan ever since he launched Khan Academy, and I wanted to achieve something on that scale. I was thinking, conceptually, of a pivot to education, but I didn’t have the confidence to take on something so ambitious until I found myself in another defining MIT moment, in May 2019.
    It happened on the terrace of the Grafenegg Castle outside Vienna, in Austria. I had gone to the MIT Grafenegg Forum as a speaker on media and society in Africa, and I saw an opportunity to pitch my TRUE Africa University idea to Sanjay Sarma, the vice president for open learning at MIT who was one of the forum’s organizers. I was an admirer of Sanjay’s work overseeing edX and MIT’s other digital learning platforms, and I made my case during a short break from the seated dinner.
    He gave the TRUE Africa University idea his blessing on the spot, and three months later my Moroccan co-founder and I were camping out at Sanjay’s office and ideating, with his teams at MIT J-WEL, on curricula for digital learning in developing nations. Another person I became close to at MIT is John Tirman, the political theorist who is also the executive director at MIT’s Center for International Studies (CIS). I have been a research affiliate at CIS since 2011, and I’d organized webinars for CIS before. John and I agreed that the best way to launch the TRUE Africa University platform was through a webinar series. That is when I got to work on the programmatic aspects of the series.
    Q: Why are webinars the medium of choice for accomplishing your goals with TAU?
    A: With my background and aspirations as a storyteller, I’ve been writing, publishing, broadcasting, and operating across various media platforms since I was a 21-year-old journalist. I know that content is king. The problem is, there is way too much content out there now. Social media has opened the floodgates, and the various social networks have dramatically increased content output globally, but not all that content is interesting, or engaging, or useful.
    I wanted to launch the TRUE Africa University webinar series with a film screening. It’s actually a film I executive produced, alongside Fernando Meirelles, the director of some of my favorite films, including “City of God,” “The Constant Gardener,” and last year’s “The Two Popes.” Our documentary, “The Great Green Wall,” premiered at the Venice Film Festival in 2019, and won many awards in many countries. 
    “The Great Green Wall” is an African-led movement with an epic ambition to grow an 8,000-kilometer natural wonder of the world across the entire width of Africa. It’s actually a wall of trees being planted from Senegal in the west all the way to Djibouti in the east. A decade in and roughly 15 percent under way, the initiative is already bringing life back to Africa’s degraded landscapes at an unprecedented scale, providing food security, jobs, and a reason to stay for the millions who live along its path. We launched the webinar series with a screening of that film, and a post-screening panel discussion that I moderated with Meirelles.
    Most people, including in Africa, are not aware of the devastating effects of climate change on the African continent, and on the prospects for African youth. That screening and first webinar discussion sets the tone for our higher learning ambitions with TRUE Africa University, while helping us to bring in experts who can frame some of the major issues facing young Africans, as many of them seek new pathways to a more sustainable future for the continent.
    Q: What are your longer-term goals for the project?
    A: The webinars are meant to provide fresh ideas, out-of-the-box solutions, and new ways of thinking about Africa’s future, post-pandemic. We are exploring new digital solutions to some of Africa’s problems, and how technology can create a virtuous circle for African development. Consider this: At the end of 2000 there were just 15 million Africans with access to mobile devices. Now, more than a quarter of Africa’s population of 1.3 billion have adopted the mobile internet.
    In 2100, there will be close to 800 million people living in Nigeria alone, quadrupling the current population of 200 million. Nigeria will be the world’s second-most populated country, after India. I am launching TRUE Africa University because those young Africans need to be educated, and there is just no way that African governments will have the resources to build enough classrooms for all those students.
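As a back-of-the-envelope check of that projection (editorial arithmetic, not from the interview): growing from 200 million to 800 million over the roughly 79 years from 2021 to 2100, treated as compound growth, implies an average rate near 1.8 percent per year.

```python
def implied_growth_rate(p_start: float, p_end: float, years: int) -> float:
    """Constant annual growth rate that takes a population from p_start to
    p_end over the given number of years (compound growth)."""
    return (p_end / p_start) ** (1.0 / years) - 1.0

# 200 million -> 800 million over the 79 years from 2021 to 2100
rate = implied_growth_rate(200e6, 800e6, 79)   # about 0.0177, i.e. ~1.8%/year
```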
    The solution will have to be online, and in my wildest dreams I see TRUE Africa as a daily resource for millions of young Africans who demand quality education. The journey is just beginning, and I am aware of the hurdles on this long road. I am so fortunate that we have MIT in our corner, as we embark on this ambitious endeavor.

  • in

    MIT Solve announces 2021 global challenges

    On March 1, MIT Solve launched its 2021 Global Challenges, with over $1.5 million in prize funding available to innovators worldwide.
    Solve seeks tech-based solutions from social entrepreneurs around the world that address five challenges. Anyone, anywhere can apply to address the challenges by the June 16 deadline. Solve also announced Eric S. Yuan, founder and CEO of Zoom, and Karlie Kloss, founder of Kode With Klossy, as 2021 Challenge Ambassadors.
    To help with the challenge application process, Solve runs a course with MITx entitled “Business and Impact Planning for Social Enterprises,” which introduces core business model and theory-of-change concepts to early-stage entrepreneurs.
    Finalists will be invited to attend Solve Challenge Finals on Sept. 19 in New York during U.N. General Assembly week. At the event, they will pitch their solutions to Solve’s Challenge Leadership Groups, judging panels composed of industry leaders and MIT faculty. The judges will select the most promising solutions as Solver teams.
    “After a year of turmoil, including a major threat to our collective health, disruption in schooling, lack of access to digital connectivity and meaningful work, a reckoning in the U.S. after centuries of institutionalized racism, or worsening natural hazards — supporting diverse innovators who are solving these challenges is more urgent than ever,” says Alex Amouyel, executive director of MIT Solve. “Solve is committed to bolstering communities in the U.S. and across the world by supporting innovators who are addressing our 2021 Global Challenges — wherever they are — through funding, mentorship, and an MIT-backed community. Whether you’re a prospective Solve partner or applicant, we hope you’ll join us!” 
    Solver teams participate in a nine-month program that connects them to the resources they need to scale. Thanks to its partners, to date Solve has provided over $40 million in commitments for Solver teams and entrepreneurs.
    Solve’s challenge design process collects insights and ideas from industry leaders, MIT faculty, and local community voices alike. 
    Solve’s 2021 Global Challenges are:
    Funders include the Patrick J. McGovern Foundation, General Motors, Comcast NBCUniversal, Vodafone Americas Foundation, HP, Ewing Marion Kauffman Foundation, American Student Assistance, The Robert Wood Johnson Foundation, Andan Foundation, Good Energies Foundation, and the Elevate Prize Foundation. The Solve community will convene at Virtual Solve at MIT on May 3-4 with 2020 Solver teams, Solve members, and partners to build partnerships and tackle global challenges in real time.
    As a marketplace for social impact innovation, Solve’s mission is to solve world challenges. Solve finds promising tech-based social entrepreneurs around the world, then brings together MIT’s innovation ecosystem and a community of members to fund and support these entrepreneurs to help scale their impact. Organizations interested in joining the Solve community can learn more and apply for membership here.