More stories

  • The curse of variety in transportation systems

    Cathy Wu has always delighted in systems that run smoothly. In high school, she designed a project to find the optimal route for getting to class on time. Her research interests and career track are evidence of a propensity for organizing and optimizing, coupled with a strong sense of responsibility to contribute to society instilled by her parents at a young age.

    As an undergraduate at MIT, Wu explored domains like agriculture, energy, and education, eventually homing in on transportation. “Transportation touches each of our lives,” she says. “Every day, we experience the inefficiencies and safety issues as well as the environmental harms associated with our transportation systems. I believe we can and should do better.”

    But doing so is complicated. Consider the long-standing issue of traffic systems control. Wu explains that it is not one problem, but more accurately a family of control problems impacted by variables like time of day, weather, and vehicle type — not to mention the types of sensing and communication technologies used to measure roadway information. Every differentiating factor introduces an exponentially larger set of control problems. There are thousands of control-problem variations and hundreds, if not thousands, of studies and papers dedicated to each problem. Wu refers to the sheer number of variations as the curse of variety — and it is hindering innovation.

    “To prove that a new control strategy can be safely deployed on our streets can take years. As time lags, we lose opportunities to improve safety and equity while mitigating environmental impacts. Accelerating this process has huge potential,” says Wu.  

    Which is why she and her group in the MIT Laboratory for Information and Decision Systems are devising machine learning-based methods to solve not just a single control problem or a single optimization problem, but families of control and optimization problems at scale. “In our case, we’re examining emerging transportation problems that people have spent decades trying to solve with classical approaches. It seems to me that we need a different approach.”
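    To make that framing concrete, here is a minimal Python sketch — an illustration of the idea, not Wu’s actual framework — in which each traffic-control variant is a point in a shared parameter space, so that a single scenario-conditioned policy can be trained across the whole distribution rather than variant by variant.

    ```python
    # A minimal sketch (not Wu's code) of treating traffic-signal control as a
    # *family* of problems: each scenario is a point in a parameter space, and
    # one policy is trained across samples from that space.
    import random
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Scenario:
        """One member of the control-problem family."""
        time_of_day: str        # e.g., "am_peak"
        weather: str            # e.g., "rain"
        demand_veh_per_hr: int  # travel demand
        sensing: str            # roadway information source

    def sample_scenario(rng: random.Random) -> Scenario:
        return Scenario(
            time_of_day=rng.choice(["am_peak", "midday", "pm_peak", "night"]),
            weather=rng.choice(["clear", "rain", "snow"]),
            demand_veh_per_hr=rng.randrange(200, 2000, 100),
            sensing=rng.choice(["loop_detector", "camera", "connected_vehicle"]),
        )

    rng = random.Random(0)
    family = {sample_scenario(rng) for _ in range(1000)}
    print(f"{len(family)} distinct scenarios drawn from one problem family")
    # A scenario-conditioned policy pi(action | state, scenario) would then be
    # trained across this distribution instead of per variant.
    ```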

    Optimizing intersections

    Currently, Wu’s largest research endeavor is called Project Greenwave. There are many sectors that directly contribute to climate change, but transportation is responsible for the largest share of greenhouse gas emissions — 29 percent, of which 81 percent is due to land transportation. And while much of the conversation around mitigating environmental impacts related to mobility is focused on electric vehicles (EVs), electrification has its drawbacks. EV fleet turnover is time-consuming (“on the order of decades,” says Wu), and limited global access to the technology presents a significant barrier to widespread adoption.

    Wu’s research, on the other hand, addresses traffic control problems by leveraging deep reinforcement learning. Specifically, she is looking at traffic intersections — and for good reason. In the United States alone, there are more than 300,000 signalized intersections where vehicles must stop or slow down before re-accelerating. And every re-acceleration burns fossil fuels and contributes to greenhouse gas emissions.

    Highlighting the magnitude of the issue, Wu says, “We have done preliminary analysis indicating that up to 15 percent of land transportation CO2 is wasted through energy spent idling and re-accelerating at intersections.”

    To date, she and her group have modeled 30,000 different intersections across 10 major metropolitan areas in the United States. That means 30,000 different configurations and roadway topologies (e.g., road grade or elevation), along with differing weather conditions and variations in travel demand and fuel mix. Each intersection and its corresponding scenarios represent a unique multi-agent control problem.

    Wu and her team are devising techniques that can solve not just one, but a whole family of problems comprising tens of thousands of scenarios. Put simply, the idea is to coordinate the timing of vehicles so they arrive at intersections when traffic lights are green, thereby eliminating the start, stop, re-accelerate conundrum. Along the way, they are building an ecosystem of tools, datasets, and methods to enable roadway interventions and impact assessments of strategies to significantly reduce carbon-intense urban driving.
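    As a back-of-the-envelope illustration of that coordination (with invented signal timings, not project data), one can search for a speed, within legal limits, that gets a vehicle to the next signal while it is green:

    ```python
    # Toy "green wave" speed advisory: choose a speed so the vehicle arrives
    # at the next signal during its green window. All timings are invented.
    def arrival_phase(dist_m, speed_mps, cycle_s=90.0, green_s=45.0):
        """Return (is_green, seconds into the signal cycle) at arrival."""
        t_in_cycle = (dist_m / speed_mps) % cycle_s
        return t_in_cycle < green_s, t_in_cycle

    def advise_speed(dist_m, v_min=8.0, v_max=15.0, step=0.25):
        """Scan feasible speeds (m/s), fastest first, for one that hits green."""
        v = v_max
        while v >= v_min:
            green, _ = arrival_phase(dist_m, v)
            if green:
                return v
            v -= step
        return None  # no feasible green-wave speed; the vehicle must stop

    speed = advise_speed(dist_m=600.0)
    print(f"advised speed: {speed} m/s" if speed else "no green-wave speed found")
    ```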

    Their collaborator on the project is the Utah Department of Transportation, which Wu says has played an essential role, in part by sharing data and practical knowledge that she and her group otherwise would not have been able to access publicly.

    “I appreciate industry and public sector collaborations,” says Wu. “When it comes to important societal problems, one really needs grounding with practitioners. One needs to be able to hear the perspectives in the field. My interactions with practitioners expand my horizons and help ground my research. You never know when you’ll hear the perspective that is the key to the solution, or perhaps the key to understanding the problem.”

    Finding the best routes

    In a similar vein, she and her research group are tackling large coordination problems, such as vehicle routing. “Every day, delivery trucks route more than a hundred thousand packages for the city of Boston alone,” says Wu. Accomplishing the task requires, among other things, figuring out which trucks to use, which packages to deliver, and the order in which to deliver them as efficiently as possible. If and when the trucks are electrified, they will need to be charged, adding another wrinkle to the process and further complicating route optimization.

    The vehicle routing problem, and therefore the scope of Wu’s work, extends beyond truck routing for package delivery. Ride-hailing cars may need to pick up objects as well as drop them off; and what if delivery is done by bicycle or drone? In partnership with Amazon, for example, Wu and her team addressed routing and path planning for hundreds of robots (up to 800) in its warehouses.

    Every variation requires custom heuristics that are expensive and time-consuming to develop. Again, this is really a family of problems — each one complicated, time-consuming, and currently unsolved by classical techniques — and they are all variations of a central routing problem. The curse of variety meets operations and logistics.

    By combining classical approaches with modern deep-learning methods, Wu is looking for a way to automatically identify heuristics that can effectively solve all of these vehicle routing problems. So far, her approach has proved successful.

    “We’ve contributed hybrid learning approaches that take existing solution methods for small problems and incorporate them into our learning framework to scale and accelerate that existing solver for large problems. And we’re able to do this in a way that can automatically identify heuristics for specialized variations of the vehicle routing problem.” The next step, says Wu, is applying a similar approach to multi-agent robotics problems in automated warehouses.
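    To make the hybrid notion concrete, the toy sketch below (an assumed structure, not the group’s code) uses a classical construction heuristic as the “existing solver” for small subproblems; in a full system, a learned model trained across many problem variants would propose the partition into subproblems and specialize the heuristics.

    ```python
    # Toy hybrid routing sketch: classical nearest-neighbor solves each small
    # subproblem; the partition step is where a learned component would go.
    import math
    import random

    def tour_length(route, pts):
        return sum(math.dist(pts[a], pts[b]) for a, b in zip(route, route[1:]))

    def nearest_neighbor(pts, start=0):
        """Classical construction heuristic for one routing subproblem."""
        todo, route = set(range(len(pts))) - {start}, [start]
        while todo:
            nxt = min(todo, key=lambda j: math.dist(pts[route[-1]], pts[j]))
            route.append(nxt)
            todo.remove(nxt)
        return route

    rng = random.Random(1)
    stops = [(rng.random(), rng.random()) for _ in range(40)]
    # Placeholder split; a learned model would propose this clustering.
    clusters = [stops[:20], stops[20:]]
    for truck, cluster in enumerate(clusters):
        route = nearest_neighbor(cluster)
        print(f"truck {truck}: {len(route)} stops, "
              f"tour length {tour_length(route, cluster):.2f}")
    ```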

    Wu and her group are making big strides, in part due to their dedication to use-inspired basic research. Rather than applying known methods or science to a problem, they develop new methods, new science, to address problems. The methods she and her team employ are necessitated by societal problems with practical implications. The inspiration for the approach? None other than Louis Pasteur, whose style of use-inspired basic research later gave its name to Donald Stokes’ now-famous book “Pasteur’s Quadrant.” Anthrax was decimating the sheep population, and Pasteur wanted to better understand why and what could be done about it. The tools of the time could not solve the problem, so he invented a new field, microbiology, not out of curiosity but out of necessity.

  • Q&A: A high-tech take on Wagner’s “Parsifal” opera

    The world-famous Bayreuth Festival in Germany, annually centered around the works of composer Richard Wagner, launched this summer on July 25 with a production that has been making headlines. Director Jay Scheib, an MIT faculty member, has created a version of Wagner’s celebrated opera “Parsifal” that is set in an apocalyptic future (rather than the original medieval past), and uses augmented reality headset technology for a portion of the audience, among other visual effects. People using the headsets see hundreds of additional visuals, from fast-moving clouds to arrows being shot at them. The AR portion of the production was developed by a team led by designer and MIT Technical Instructor Joshua Higgason.

    The new “Parsifal” has engendered extensive media attention and discussion among opera followers and the viewing public. Five years in the making, it was developed with the encouragement of Bayreuth Festival general manager Katharina Wagner, Richard Wagner’s great-granddaughter. The production runs until Aug. 27, and can also be streamed on Stage+. Scheib, the Class of 1949 Professor in MIT’s Music and Theater Arts program, recently talked to MIT News about the project from Bayreuth.

    Q: Your production of “Parsifal” led off this year’s entire Bayreuth festival. How’s it going?

    A: From my point of view it’s going quite swimmingly. The leading German opera critics and the audiences have been super-supportive and Bayreuth makes it possible for a work to evolve … Given the complexity of the technical challenge of making an AR project function in an opera house, the bar was so high, it was a difficult challenge, and we’re really happy we found a way forward, a way to make it work, and a way to make it fit into an artistic process. I feel great.

    Q: You offer a new interpretation of “Parsifal,” and a new setting for it. What is it, and why did you choose to interpret it this way?

    A: One of the main themes in “Parsifal” is that the long-time king of this holy grail cult is wounded, and his wound will not heal. [With that in mind], we looked at what the world was like when the opera premiered in the late 19th century, around the time of what was known as the Great African Scramble, when Europe re-drew the map of Africa, largely based on resources, including mineral resources.

    Cobalt remains [the focus of] dirty mining practices in the Democratic Republic of Congo, and is a requirement for a lot of our electronic objects, in particular batteries. There are also these massive copper deposits discovered under a Buddhist temple in Afghanistan, and lithium under a sacred site in Nevada. We face an intense challenge in climate change, and the predictions are not good. Some of our solutions like electric cars require these materials, so they’re only solutions for some people, while others suffer [where minerals are being mined]. We started thinking about how wounds never heal, and when the prospect of creating a better world opens new wounds in other communities. … That became a theme. It also comes out of the time when we were making it, when Covid happened and George Floyd was murdered, which created an opportunity in the U.S. to start speaking very openly about wounds that have not healed.

    We set it in a largely post-human environment, where we didn’t succeed, and everything has collapsed. In the third act, there’s derelict mining equipment, and the holy water is this energy-giving force, but in fact it’s this lithium-ion pool, which gives us energy and then poisons us. That’s the theme we created.

    Q: What were your goals about integrating the AR technology into the opera, and how did you achieve that?

    A: First, I was working with my collaborator Joshua Higgason. No one had ever really done this before, so we just started researching whether it was possible. And most of the people we talked to said, “Don’t do it. It’s just not going to work.” Having always been a daredevil at heart, I was like, “Oh, come on, we can figure this out.”

    We were diligent in exploring the possibilities. We made multiple trips to Bayreuth and made these millimeter-accurate laser scans of the auditorium and the stage. We built a variety of models to see how to make AR work in a large environment, where 2,000 headsets could respond simultaneously. We built a team of animators and developers and programmers and designers, from Portugal to Cambridge to New York to Hungary, the U.K., and a group in Germany. Josh led this team, and they got after it, but it took us the better part of two years to make it possible for an audience, some of whom don’t really use smartphones, to put on an AR headset and have it just work.

    I can’t even believe we did this. But it’s working.

    Q: In opera there’s hopefully a productive tension between tradition and innovation. How do you think about that when it comes to Wagner at Bayreuth?

    A: Innovation is the tradition at Bayreuth. Musically and scenographically. “Parsifal” was composed for this particular opera house, and I’m incredibly respectful of what this event is made for. We are trying to create a balanced and unified experience, between the scenic design and the AR and the lighting and the costume design, and create perfect moments of convergence where you really lose yourself in the environment. I believe wholly in the production and the performers are extraordinary. Truly, truly, truly extraordinary.

    Q: People have been focused on the issue of bringing AR to Bayreuth, but what has Bayreuth brought to you as a director?

    A: Working in Bayreuth has been an incredible experience. The level of intellectual integrity among the technicians is extraordinary. The amount of care and patience and curiosity and expertise in Bayreuth is off the charts. This community of artists is the greatest. … People come here because it’s an incredible meeting of the minds, and for that I’m immensely filled with gratitude every day I come into the rehearsal room. The conductor, Pablo Heras-Casado, and I have been working on this for several years. And the music is still first. We’re setting up technology not to overtake the music, but to support it, and visually amplify it.

    It must be said that Katharina Wagner has been one of the most powerfully supportive artistic directors I have ever worked with. I find it inspiring to witness her tenacity and vision in seeing all of this through, despite the hurdles. It’s been a great collaboration. That’s the essence: great collaboration.

  • How forests can cut carbon, restore ecosystems, and create jobs

    To limit the frequency and severity of droughts, wildfires, flooding, and other adverse consequences of climate change, nearly 200 countries committed to the Paris Agreement’s long-term goal of keeping global warming well below 2 degrees Celsius. According to the latest United Nations Intergovernmental Panel on Climate Change (IPCC) Report, achieving that goal will require both large-scale greenhouse gas (GHG) emissions reduction and removal of GHGs from the atmosphere.

    At present, the most efficient and scalable GHG-removal strategy is the massive planting of trees through reforestation or afforestation — a “natural climate solution” (NCS) that extracts atmospheric carbon dioxide through photosynthesis and soil carbon sequestration.

    Despite the potential of forestry-based NCS projects to address climate change, biodiversity loss, unemployment, and other societal needs — and their appeal to policymakers, funders, and citizens — they have yet to achieve critical mass, and often underperform due to a mix of interacting ecological, social, and financial constraints. To better understand these challenges and identify opportunities to overcome them, a team of researchers at Imperial College London and the MIT Joint Program on the Science and Policy of Global Change recently studied how environmental scientists, local stakeholders, and project funders perceive the risks and benefits of NCS projects, and how these perceptions impact project goals and performance. To that end, they surveyed and consulted with dozens of recognized experts and organizations spanning the fields of ecology, finance, climate policy, and social science.

    The team’s analysis, which appears in the journal Frontiers in Climate, found two main factors that have hindered the success of forestry-based NCS projects.

    First, the ambition — levels of carbon removal, ecosystem restoration, job creation, and other environmental and social targets — of selected NCS projects is limited by funders’ perceptions of their overall risk. Among other things, funders aim to minimize operational risk (e.g., Will newly planted trees survive and grow?), political risk (e.g., Just how secure is their access to the land where trees will be planted?), and reputational risk (e.g., Will the project be perceived as an exercise in “greenwashing,” or fall far short of its promised environmental and social benefits?). Funders seeking a financial return on their initial investment are also concerned about the dependability of complex monitoring, reporting, and verification methods used to quantify atmospheric carbon removal, biodiversity gains, and other metrics of project performance.

    Second, the environmental and social benefits of NCS projects are unlikely to be realized unless the local communities impacted by these projects are granted ownership over their implementation and outcomes. But while engaging with local communities is critical to project performance, it can be challenging both legally and financially to set up incentives (e.g., payment and other forms of compensation) to mobilize such engagement.

    “Many carbon offset projects raise legitimate concerns about their effectiveness,” says study lead author Bonnie Waring, a senior lecturer at the Grantham Institute on Climate Change and the Environment, Imperial College London. “However, if natural climate solution projects are done properly, they can help with sustainable development and empower local communities.”

    Drawing on surveys and consultations with NCS experts, stakeholders, and funders, the research team highlighted several recommendations on how to overcome key challenges faced by forestry-based NCS projects and boost their environmental and social performance.

    These recommendations include encouraging funders to evaluate projects based on robust internal governance, support from regional and national governments, secure land tenure, material benefits for local communities, and full participation of community members from across a spectrum of socioeconomic groups; improving the credibility and verifiability of project emissions reductions and related co-benefits; and maintaining an open dialogue and shared costs and benefits among those who fund, implement, and benefit from these projects.

    “Addressing climate change requires approaches that include emissions mitigation from economic activities paired with greenhouse gas reductions by natural ecosystems,” says Sergey Paltsev, a co-author of the study and deputy director of the MIT Joint Program. “Guided by these recommendations, we advocate for a proper scaling-up of NCS activities from project levels to help assure integrity of emissions reductions across entire countries.”

  • A new dataset of Arctic images will spur artificial intelligence research

    As the U.S. Coast Guard (USCG) icebreaker Healy takes part in a voyage across the North Pole this summer, it is capturing images of the Arctic to further the study of this rapidly changing region. Lincoln Laboratory researchers installed a camera system aboard the Healy while at port in Seattle before it embarked on a three-month science mission on July 11. The resulting dataset, which will be one of the first of its kind, will be used to develop artificial intelligence tools that can analyze Arctic imagery.

    “This dataset not only can help mariners navigate more safely and operate more efficiently, but also help protect our nation by providing critical maritime domain awareness and an improved understanding of how AI analysis can be brought to bear in this challenging and unique environment,” says Jo Kurucar, a researcher in Lincoln Laboratory’s AI Software Architectures and Algorithms Group, which led this project.

    As the planet warms and sea ice melts, Arctic passages are opening up to more traffic, both to military vessels and ships conducting illegal fishing. These movements may pose national security challenges to the United States. The opening Arctic also leaves questions about how its climate, wildlife, and geography are changing.

    Today, very few imagery datasets of the Arctic exist to study these changes. Overhead images from satellites or aircraft can only provide limited information about the environment. An outward-looking camera attached to a ship can capture more details of the setting and different angles of objects, such as other ships, in the scene. These types of images can then be used to train AI computer-vision tools, which can help the USCG plan naval missions and automate analysis. According to Kurucar, USCG assets in the Arctic are spread thin and can benefit greatly from AI tools, which can act as a force multiplier.

    The Healy is the USCG’s largest and most technologically advanced icebreaker. Given its current mission, it was a fitting candidate to be equipped with a new sensor to gather this dataset. The laboratory research team collaborated with the USCG Research and Development Center to determine the sensor requirements. Together, they developed the Cold Region Imaging and Surveillance Platform (CRISP).

    “Lincoln Laboratory has an excellent relationship with the Coast Guard, especially with the Research and Development Center. Over a decade, we’ve established ties that enabled the deployment of the CRISP system,” says Amna Greaves, the CRISP project lead and an assistant leader in the AI Software Architectures and Algorithms Group. “We have strong ties not only because of the USCG veterans working at the laboratory and in our group, but also because our technology missions are complementary. Today it was deploying infrared sensing in the Arctic; tomorrow it could be operating quadruped robot dogs on a fast-response cutter.”

    The CRISP system comprises a long-wave infrared camera, manufactured by Teledyne FLIR (for forward-looking infrared), that is designed for harsh maritime environments. The camera can stabilize itself during rough seas and image in complete darkness, fog, and glare. It is paired with a GPS-enabled time-synchronized clock and a network video recorder to record both video and still imagery along with GPS-positional data.  

    The camera is mounted at the front of the ship’s fly bridge, and the electronics are housed in a ruggedized rack on the bridge. The system can be operated manually from the bridge or be placed into an autonomous surveillance mode, in which it slowly pans back and forth, recording 15 minutes of video every three hours and a still image once every 15 seconds.
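    The duty cycle described above lends itself to a simple scheduler. The sketch below uses the timing constants from the article; everything else (names, structure) is assumed for illustration.

    ```python
    # Sketch of the autonomous-mode duty cycle: 15 minutes of video every
    # three hours, plus a still image every 15 seconds.
    from collections import Counter

    VIDEO_EVERY_S = 3 * 3600   # a video burst starts every three hours
    VIDEO_LEN_S = 15 * 60      # each burst lasts 15 minutes
    STILL_EVERY_S = 15         # still-image cadence

    def actions_at(t: int) -> list:
        """Capture actions due at second t of autonomous surveillance mode."""
        acts = []
        if t % STILL_EVERY_S == 0:
            acts.append("capture_still")
        if t % VIDEO_EVERY_S < VIDEO_LEN_S:
            acts.append("record_video")
        return acts

    # One simulated hour: ~240 stills, video active for the first 900 seconds.
    tally = Counter(a for t in range(3600) for a in actions_at(t))
    print(tally)
    ```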

    “The installation of the equipment was a unique and fun experience. As with any good project, our expectations going into the install did not meet reality,” says Michael Emily, the project’s IT systems administrator who traveled to Seattle for the install. Working with the ship’s crew, the laboratory team had to quickly adjust their route for running cables from the camera to the observation station after they discovered that the expected access points weren’t in fact accessible. “We had 100-foot cables made for this project just in case of this type of scenario, which was a good thing because we only had a few inches to spare,” Emily says.

    The CRISP project team plans to publicly release the dataset, anticipated to be about 4 terabytes in size, once the USCG science mission concludes in the fall.

    The goal in releasing the dataset is to enable the wider research community to develop better tools for those operating in the Arctic, especially as this region becomes more navigable. “Collecting and publishing the data allows for faster and greater progress than what we could accomplish on our own,” Kurucar adds. “It also enables the laboratory to engage in more advanced AI applications while others make more incremental advances using the dataset.”

    On top of providing the dataset, the laboratory team plans to provide a baseline object-detection model, from which others can make progress on their own models. More advanced AI applications planned for development are classifiers for specific objects in the scene and the ability to identify and track objects across images.
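    As a hint of what building on the released imagery might look like, the sketch below runs an off-the-shelf, COCO-pretrained detector from torchvision on a hypothetical frame; the laboratory’s actual baseline model, object classes, and file formats may well differ.

    ```python
    # Run a stock torchvision detector on one (hypothetical) dataset frame.
    import torch
    from torchvision.io import read_image
    from torchvision.models.detection import (
        fasterrcnn_resnet50_fpn, FasterRCNN_ResNet50_FPN_Weights)

    weights = FasterRCNN_ResNet50_FPN_Weights.DEFAULT  # COCO-pretrained stand-in
    model = fasterrcnn_resnet50_fpn(weights=weights).eval()
    preprocess = weights.transforms()

    img = read_image("arctic_frame.png")  # hypothetical filename
    with torch.no_grad():
        pred = model([preprocess(img)])[0]

    keep = pred["scores"] > 0.5  # drop low-confidence detections
    for box, label in zip(pred["boxes"][keep], pred["labels"][keep]):
        print(weights.meta["categories"][label], box.tolist())
    ```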

    Beyond assisting with USCG missions, this project could create an influential dataset for researchers looking to apply AI to data from the Arctic to help combat climate change, says Paul Metzger, who leads the AI Software Architectures and Algorithms Group.

    Metzger adds that the group was honored to be a part of this project and is excited to see the advances that come from applying AI to novel challenges facing the United States: “I’m extremely proud of how our group applies AI to the highest-priority challenges in our nation, from predicting outbreaks of Covid-19 and assisting the U.S. European Command in their support of Ukraine to now employing AI in the Arctic for maritime awareness.”

    Once the dataset is available, it will be free to download on the Lincoln Laboratory dataset website.

  • Helping the transportation sector adapt to a changing world

    After graduating from college, Nick Caros took a job as an engineer with a construction company, helping to manage the building of a new highway bridge right near where he grew up outside of Vancouver, British Columbia.  

    “I had a lot of friends that would use that new bridge to get to work,” Caros recalls. “They’d say, ‘You saved me like 20 minutes!’ That’s when I first realized that transportation could be a huge benefit to people’s lives.”

    Now a PhD candidate in the Urban Mobility Lab and the lead researcher for the MIT Transit Research Consortium, Caros works with seven transit agencies across the country to understand how workers’ transportation needs have changed as companies have adopted remote work policies.

    “Another cool thing about working on transportation is that everybody, even if they don’t engage with it on an academic level, has an opinion or wants to talk about it,” says Caros. “As soon as I mention I’ve worked with the T, they have something they want to talk about.”

    Caros is drawn to projects with social impact beyond saving his friends a few minutes during their commutes. He sees public transportation as a crucial component in combating climate change and is passionate about identifying and lowering the psychological barriers that prevent people around the world from taking advantage of their local transit systems.

    “The more I’ve learned about public transportation, the more I’ve come to realize it will play an essential part in decarbonizing urban transportation,” says Caros. “I want to continue working on these kinds of issues, like how we can make transportation more sustainable or promoting public transportation in places where it doesn’t exist or can be improved.”

    Caros says he doesn’t have a “transportation origin story,” like some of his peers who grew up in urban centers with robust public transit systems. As a child growing up in the Vancouver suburbs, he always enjoyed the outdoors, which were as close as his backyard. He chose to study engineering as an undergraduate at the University of British Columbia, fascinated by the hydroelectric dams that supply Vancouver with most of its power. But after two projects with the construction company, the second of which took him to Maryland to work on a fossil fuel project, he decided he needed a change.

    Not quite sure what he wanted to do next, Caros sought out the shortest master’s program he could find that interested him. That ended up being an 18-month master’s program in transportation planning and engineering at New York University. Initially intending to pursue the course-based program, Caros was soon offered the chance to be a research assistant in NYU’s Behavioral Urban Informatics, Logistics, and Transport Laboratory with Professor Joseph Chow. There, he worked to model an experimental transportation system of modular self-driving cars that could link and unlink with each other while in motion.

    “It was this really futuristic stuff,” says Caros. “It turned out to be a really cool project to work on because it’s kind of rare to have a blank-slate problem to try and solve. A lot of transportation engineering problems have largely been solved. We know how to make efficient and sustainable transportation systems; it’s just finding the political support and encouraging behavioral change that remains a challenge.”

    At NYU, Caros fell in love with research and the field of transportation. Later, he was drawn to MIT by its interdisciplinary PhD program spanning urban studies and planning as well as civil engineering, and by the opportunity to work with Professor Jinhua Zhao.

    His research focuses on identifying “third places,” locations where some people go if their job gives them the flexibility to work remotely. Previously, transportation needs revolved around office spaces, typically located in city centers. With more people working from home, the first assumption is that transportation needs would decrease. But that’s not what Caros has found.

    “One major finding from our research is that people have changed where they’re going when they go to work,” says Caros. “A lot of people are working from home, but some are also working in other places, like coffee shops or co-working spaces. And these third places are not evenly distributed in Boston.”

    Identifying the concentration of these third places and what locations would benefit from them is the core of Caros’ dissertation. He’s building an algorithm that identifies ideal locations to build more shared workplaces based on both economic and social factors. Caros seeks to answer how to minimize travel time across the board while leaving room for the spontaneous social interactions that drive a city’s productivity. His research is sponsored by seven of the largest transit agencies in the United States, which are members of the MIT Transit Research Consortium. Rather than a single agency sponsoring a single specific project, funding is pooled to tackle projects that address general topics applicable to multiple cities.
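    One way to read that problem is as a classic facility-location optimization. The sketch below is illustrative only — invented data and a pure travel-time objective, whereas Caros’ algorithm also weighs social and economic factors — but it shows the greedy site-selection pattern:

    ```python
    # Greedy facility location: open the sites that most reduce total travel.
    import math
    import random

    rng = random.Random(42)
    homes = [(rng.random(), rng.random()) for _ in range(200)]      # workers
    candidates = [(rng.random(), rng.random()) for _ in range(15)]  # sites

    def total_travel(open_sites):
        """Each worker uses the nearest open site; distance proxies time."""
        return sum(min(math.dist(h, s) for s in open_sites) for h in homes)

    open_sites, budget = [], 3
    for _ in range(budget):
        best = min((s for s in candidates if s not in open_sites),
                   key=lambda s: total_travel(open_sites + [s]))
        open_sites.append(best)
        print(f"opened {best}, total travel {total_travel(open_sites):.1f}")
    ```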

    These kinds of problems require a multidisciplinary approach that appeals to Caros. Even when diving into the technical details of a solution, he always keeps the bigger picture in mind. He is certain that changing people’s views of public transportation will be crucial in the fight against climate change.

    “A lot of it is not necessarily engineering, but understanding what the motivations of people are,” says Caros. “Transportation is a leading sector for carbon emissions in the U.S., and so figuring out what makes people tick and how you can get them to ride public transit more, for example, would help to reduce the overall carbon cost.”

    Following the completion of his degree, Caros will join the Organization for Economic Cooperation and Development. He already spent six months at its Paris headquarters as an intern during a leave from MIT, something his lab encourages all of its students to do. Last fall, he worked on drafting policy guidelines for new mobility services such as vehicle-share scooters, and addressing transportation equity issues in Ghana. Plus, living in Paris gave him the opportunity to practice his French. Growing up in Canada, he attended a French immersion school, and his internship offered his first opportunity to use the language outside of an academic context.

    Looking forward, Caros hopes to keep tackling projects that promote sustainable public transportation. There is an urgency in getting ahead of the curve, especially in cities experiencing rapid growth.

    “You kind of get locked in,” says Caros. “It becomes much harder to build sustainable transportation systems after the fact. But it’s really just a geometry problem. Trains and buses move people far more efficiently than private cars in the same amount of space.”

  • Harnessing synthetic biology to make sustainable alternatives to petroleum products

    Reducing our reliance on fossil fuels is going to require a transformation in the way we make things. That’s because the hydrocarbons found in fuels like crude oil, natural gas, and coal are also in everyday items like plastics, clothing, and cosmetics.

    Now Visolis, founded by Deepak Dugar SM ’11, MBA ’13, PhD ’13, is combining synthetic biology with chemical catalysis to reinvent the way the world makes things — and reducing gigatons of greenhouse gas emissions in the process.

    The company — which uses a microbe to ferment biomass waste like wood chips and create a molecular building block called mevalonic acid — is more sustainably producing everything from car tires and cosmetics to aviation fuels by tweaking the chemical processes involved to make different byproducts.

    “We started with [the rubber component] isoprene as the main molecule we produce [from mevalonic acid], but we’ve expanded our platform with this unique combination of chemistry and biology that allows us to decarbonize multiple supply chains very rapidly and efficiently,” Dugar explains. “Imagine carbon-negative yoga pants. We can make that happen. Tires can be carbon-negative, personal care can lower its footprint — and we’re already selling into personal care. So in everything from personal care to apparel to industrial goods, our platform is enabling decarbonization of manufacturing.”

    “Carbon-negative” is a term Dugar uses a lot. Visolis has already partnered with some of the world’s largest consumers of isoprene, a precursor to rubber, and now Dugar wants to prove out the company’s process in other emissions-intensive industries.

    “Our process is carbon-negative because plants are taking CO2 from the air, and we take that plant matter and process it into something structural, like synthetic rubber, which is used for things like roofing, tires, and other applications,” Dugar explains. “Generally speaking, most of that material at the end of its life gets recycled, for example to tarmac or road, or, worst-case scenario, it ends up in a landfill, so the CO2 that was captured by the plant matter stays captured in the materials. That means our production can be carbon-negative depending on the emissions of the production process. That allows us to not only reduce climate change but start reversing it. That was an insight I had about 10 years ago at MIT.”

    Finding a path

    For his PhD, Dugar explored the economics of using microbes to make high-octane gas additives. He also took classes at the MIT Sloan School of Management on sustainability and entrepreneurship, including the particularly influential course 15.366 (Climate and Energy Ventures). The experience inspired him to start a company.

    “I wanted to work on something that could have the largest climate impact, and that was replacing petroleum,” Dugar says. “It was about replacing petroleum not just as a fuel but as a material as well. Everything from the clothes we wear to the furniture we sit on is often made using petroleum.”

    By analyzing recent advances in synthetic biology and making some calculations from first principles, Dugar decided that a microbial approach to cleaning up the production of rubber was viable. He participated in the MIT Clean Energy Prize and worked with others at MIT to prove out the idea. But it was still just an idea. After graduation, he took a consulting job at a large company, spending his nights and weekends renting lab space to continue trying to make his sustainable rubber a reality.

    After 18 months, by applying engineering concepts like design-for-scale to synthetic biology, Dugar was able to develop a microbe that met 80 percent of his criteria for making an intermediate molecule called mevalonic acid. From there, he developed a chemical catalysis process that converted mevalonic acid to isoprene, the main component of natural rubber. Visolis has since patented other chemical conversion processes that turn mevalonic acid to aviation fuel, polymers, and fabrics.

    Dugar left his consulting job in 2014 and was awarded a fellowship to work on Visolis full-time at the Lawrence Berkeley National Lab via Activate, an incubator empowering scientists to reinvent the world.

    From rubber to jet fuels

    Today, in addition to isoprene, Visolis is selling skin care products through the brand Ameva Bio, which produces mevalonic acid-based creams by recycling plant byproducts created in other processes. The company offers refillable bottles and even offsets emissions from the shipping of its products.

    “We are working throughout the supply chain,” Dugar says. “It made sense to clean up the isoprene part of the rubber supply chain rather than the entire supply chain. But we’re also producing molecules for skin that are better for you, so you can put something much more sustainable and healthier on your body instead of petrochemicals. We launched Ameva to demonstrate that brands can leverage synthetic biology to turn carbon-negative ingredients into high-performing products.”

    Visolis is also starting the process of gaining regulatory approval for its sustainable aviation fuel, which Dugar believes could have the biggest climate impact of any of the company’s products by cleaning up the production of fuels for commercial flight.

    “We’re working with leading companies to help them decarbonize aviation,” Dugar says. “If you look at the lifecycle of fuel, the current petroleum-based approach is we dig hydrocarbons out of the ground and burn them, emitting CO2 into the air. In our process, we take plant matter, which fixes CO2 and captures renewable energy in those bonds, and then we transfer that into aviation fuel plus things like synthetic rubber, yoga pants, and other things that continue to hold the carbon. So, our factories can still operate at net zero carbon emissions.”

    Visolis is already generating millions of dollars in revenue, and Dugar says his goal is to scale the company rapidly now that its platform molecule has been validated.

    “We have been scaling our technology by 10 times every two to three years and are now looking to increase deployment of our technology at the same pace, which is very exciting,” Dugar says. “If you extrapolate that, very quickly you get to massive impact. That’s our goal.”

  • Cutting urban carbon emissions by retrofitting buildings

    To support the worldwide struggle to reduce carbon emissions, many cities have made public pledges to cut their carbon emissions in half by 2030, and some have promised to be carbon neutral by 2050. Buildings can be responsible for more than half a municipality’s carbon emissions. Today, new buildings are typically designed in ways that minimize energy use and carbon emissions. So attention focuses on cleaning up existing buildings.

    A decade ago, leaders in some cities took the first step in that process: They quantified their problem. Based on data from their utilities on natural gas and electricity consumption and standard pollutant-emission rates, they calculated how much carbon came from their buildings. They then adopted policies to encourage retrofits, such as adding insulation, switching to double-glazed windows, or installing rooftop solar panels. But will those steps be enough to meet their pledges?

    “In nearly all cases, cities have no clear plan for how they’re going to reach their goal,” says Christoph Reinhart, a professor in the Department of Architecture and director of the Building Technology Program. “That’s where our work comes in. We aim to help them perform analyses so they can say, ‘If we, as a community, do A, B, and C to buildings of a certain type within our jurisdiction, then we are going to get there.’”

    To support those analyses, Reinhart and a team in the MIT Sustainable Design Lab (SDL) — PhD candidate Zachary M. Berzolla SM ’21; former doctoral student Yu Qian Ang PhD ’22, now a research collaborator at the SDL; and former postdoc Samuel Letellier-Duchesne, now a senior building performance analyst at the international building engineering and consulting firm Introba — launched a publicly accessible website providing a series of simulation tools and a process for using them to determine the impacts of planned steps on a specific building stock. Says Reinhart: “The takeaway can be a clear technology pathway — a combination of building upgrades, renewable energy deployments, and other measures that will enable a community to reach its carbon-reduction goals for their built environment.”

    Analyses performed in collaboration with policymakers from selected cities around the world yielded insights demonstrating that reaching current goals will require more effort than city representatives and — in a few cases — even the research team had anticipated.

    Exploring carbon-reduction pathways

    The researchers’ approach builds on a physics-based “building energy model,” or BEM, akin to those that architects use to design high-performance green buildings. In 2013, Reinhart and his team developed a method of extending that concept to analyze a cluster of buildings. Based on publicly available geographic information system (GIS) data, including each building’s type, footprint, and year of construction, the method defines the neighborhood — including trees, parks, and so on — and then uses meteorological data to model how the buildings will interact, the airflows among them, and their energy use. The result is an “urban building energy model,” or UBEM, for a neighborhood or a whole city.

    The website developed by the MIT team enables neighborhoods and cities to develop their own UBEM and to use it to calculate their current building energy use and resulting carbon emissions, and then how those outcomes would change assuming different retrofit programs or other measures being implemented or considered. “The website — UBEM.io — provides step-by-step instructions and all the simulation tools that a team will need to perform an analysis,” says Reinhart.

    The website starts by describing three roles required to perform an analysis: a local sustainability champion who is familiar with the municipality’s carbon-reduction efforts; a GIS manager who has access to the municipality’s urban datasets and maintains a digital model of the built environment; and an energy modeler — typically a hired consultant — who has a background in green building consulting and individual building energy modeling.

    The team begins by defining “shallow” and “deep” building retrofit scenarios. To explain, Reinhart offers some examples: “‘Shallow’ refers to things that just happen, like when you replace your old, failing appliances with new, energy-efficient ones, or you install LED light bulbs and weatherstripping everywhere,” he says. “‘Deep’ adds to that list things you might do only every 20 years, such as ripping out walls and putting in insulation or replacing your gas furnace with an electric heat pump.”

    Once those scenarios are defined, the GIS manager uploads to UBEM.io a dataset of information about the city’s buildings, including their locations and attributes such as geometry, height, age, and use (e.g., commercial, retail, residential). The energy modeler then builds a UBEM to calculate the energy use and carbon emissions of the existing building stock. Once that baseline is established, the energy modeler can calculate how specific retrofit measures will change the outcomes.
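    In spirit, the baseline step reduces to aggregating energy use over the stock by building archetype and re-running the tally under retrofit assumptions. The sketch below is a drastic simplification with made-up intensities — a real UBEM simulates building physics — but it shows the bookkeeping:

    ```python
    # Toy stock-level energy tally under baseline and retrofit scenarios.
    buildings = [
        {"use": "residential", "floor_m2": 300,  "year": 1950},
        {"use": "commercial",  "floor_m2": 2000, "year": 1995},
    ]
    # Energy use intensity (kWh/m2/yr) by archetype: placeholder values.
    EUI = {("residential", "pre1980"): 220, ("residential", "post1980"): 140,
           ("commercial",  "pre1980"): 260, ("commercial",  "post1980"): 180}
    RETROFIT_SAVINGS = {"shallow": 0.15, "deep": 0.45}  # assumed fractions

    def vintage(b):
        return "pre1980" if b["year"] < 1980 else "post1980"

    def stock_energy_kwh(scenario=None):
        saving = RETROFIT_SAVINGS.get(scenario, 0.0)
        return sum(EUI[(b["use"], vintage(b))] * b["floor_m2"] * (1 - saving)
                   for b in buildings)

    for sc in (None, "shallow", "deep"):
        print(sc or "baseline", f"{stock_energy_kwh(sc):,.0f} kWh/yr")
    ```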

    Workshop to test-drive the method

    Two years ago, the MIT team set up a three-day workshop to test the website with sample users. Participants included policymakers from eight cities and municipalities around the world: namely, Braga (Portugal), Cairo (Egypt), Dublin (Ireland), Florianopolis (Brazil), Kiel (Germany), Middlebury (Vermont, United States), Montreal (Canada), and Singapore. Taken together, the cities represent a wide range of climates, socioeconomic demographics, cultures, governing structures, and sizes.

    Working with the MIT team, the participants presented their goals, defined shallow- and deep-retrofit scenarios for their city, and selected a limited but representative area for analysis — an approach that would speed up analyses of different options while also generating results valid for the city as a whole.

    They then performed analyses to quantify the impacts of their retrofit scenarios. Finally, they learned how best to present their findings — a critical part of the exercise. “When you do this analysis and bring it back to the people, you can say, ‘This is our homework over the next 30 years. If we do this, we’re going to get there,’” says Reinhart. “That makes you part of the community, so it’s a joint goal.”

    Sample results

    After the close of the workshop, Reinhart and his team confirmed their findings for each city and then added one more factor to the analyses: the state of the city’s electric grid. Several cities in the study had pledged to make their grid carbon-neutral by 2050. Including the grid in the analysis was therefore critical: If a building becomes all-electric and purchases its electricity from a carbon-free grid, then that building will be carbon neutral — even with no on-site energy-saving retrofits.

    The final analysis for each city therefore calculated the total kilograms of carbon dioxide equivalent emitted per square meter of floor space assuming the following scenarios: the baseline; shallow retrofit only; shallow retrofit plus a clean electricity grid; deep retrofit only; deep retrofit plus rooftop photovoltaic solar panels; and deep retrofit plus a clean electricity grid. (Note that “clean electricity grid” is based on the area’s most ambitious decarbonization target for its power grid.)
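    A toy version of that calculation (all numbers invented) makes the scenario logic explicit: emissions per square meter are the stock’s energy use intensity times the carbon intensity of the energy supplying it.

    ```python
    # Six-scenario emissions sketch, kgCO2e per m2 of floor area per year.
    BASE_EUI = 200.0                      # kWh/m2/yr, baseline stock
    SAVING = {"shallow": 0.15, "deep": 0.45}
    GRID_TODAY, GRID_CLEAN = 0.40, 0.05   # kgCO2e/kWh, assumed
    PV_OFFSET = 20.0                      # kWh/m2/yr of rooftop generation

    def kg_per_m2(eui_kwh, grid_kg_per_kwh):
        return eui_kwh * grid_kg_per_kwh

    shallow_eui = BASE_EUI * (1 - SAVING["shallow"])
    deep_eui = BASE_EUI * (1 - SAVING["deep"])
    scenarios = {
        "baseline":             kg_per_m2(BASE_EUI, GRID_TODAY),
        "shallow":              kg_per_m2(shallow_eui, GRID_TODAY),
        "shallow + clean grid": kg_per_m2(shallow_eui, GRID_CLEAN),
        "deep":                 kg_per_m2(deep_eui, GRID_TODAY),
        "deep + rooftop PV":    kg_per_m2(deep_eui - PV_OFFSET, GRID_TODAY),
        "deep + clean grid":    kg_per_m2(deep_eui, GRID_CLEAN),
    }
    for name, kg in scenarios.items():
        print(f"{name:22s} {kg:6.1f} kgCO2e/m2/yr")
    ```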

    The following paragraphs provide highlights of the analyses for three of the eight cities. Included are the city’s setting, emission-reduction goals, current and proposed measures, and calculations of how implementation of those measures would affect their energy use and carbon emissions.

    Singapore

    Singapore is generally hot and humid, and its building energy use is largely in the form of electricity for cooling. The city is dominated by high-rise buildings, so there’s not much space for rooftop solar installations to generate the needed electricity. Therefore, plans for decarbonizing the current building stock must involve retrofits. The shallow-retrofit scenario focuses on installing energy-efficient lighting and appliances. To those steps, the deep-retrofit scenario adds adopting a district cooling system. Singapore’s stated goals are to cut the baseline carbon emissions by about a third by 2030 and to cut them in half by 2050.

    The analysis shows that, with just the shallow retrofits, Singapore won’t achieve its 2030 goal. But with the deep retrofits, it should come close. Notably, decarbonizing the electric grid would enable Singapore to meet and substantially exceed its 2050 target assuming either retrofit scenario.

    Dublin

    Dublin has a mild climate with relatively comfortable summers but cold, humid winters. As a result, the city’s energy use is dominated by fossil fuels, in particular, natural gas for space heating and domestic hot water. The city presented just one target — a 40 percent reduction by 2030.

    Dublin has many neighborhoods made up of Georgian row houses, and, at the time of the workshop, the city already had a program in place encouraging groups of owners to insulate their walls. The shallow-retrofit scenario therefore focuses on weatherization upgrades (adding weatherstripping to windows and doors, insulating crawlspaces, and so on). To that list, the deep-retrofit scenario adds insulating walls and installing upgraded windows. The participants didn’t include electric heat pumps, as the city was then assessing the feasibility of expanding the existing district heating system.

    Results of the analyses show that implementing the shallow-retrofit scenario won’t enable Dublin to meet its 2030 target. But the deep-retrofit scenario will. However, like Singapore, Dublin could make major gains by decarbonizing its electric grid. The analysis shows that a decarbonized grid — with or without the addition of rooftop solar panels where possible — could more than halve the carbon emissions that remain in the deep-retrofit scenario. Indeed, a decarbonized grid plus electrification of the heating system by incorporating heat pumps could enable Dublin to meet a future net-zero target.

    Middlebury

    Middlebury, Vermont, has warm, wet summers and frigid winters. Like Dublin, its energy demand is dominated by natural gas for heating. But unlike Dublin, it already has a largely decarbonized electric grid with a high penetration of renewables.

    For the analysis, the Middlebury team chose to focus on an aging residential neighborhood similar to many that surround the city core. The shallow-retrofit scenario calls for installing heat pumps for space heating, and the deep-retrofit scenario adds improvements in building envelopes (the façade, roof, and windows). The town’s targets are a 40 percent reduction from the baseline by 2030 and net-zero carbon by 2050.

    Results of the analyses showed that implementing the shallow-retrofit scenario won’t achieve the 2030 target. The deep-retrofit scenario would get the city to the 2030 target but not to the 2050 target. Indeed, even with the deep retrofits, fossil fuel use remains high. The explanation? While both retrofit scenarios call for installing heat pumps for space heating, the city would continue to use natural gas to heat its hot water.

    Lessons learned

    For several policymakers, seeing the results of their analyses was a wake-up call. They learned that the strategies they had planned might not be sufficient to meet their stated goals — an outcome that could prove publicly embarrassing for them in the future.

    Like the policymakers, the researchers learned from the experience. Reinhart notes three main takeaways.

    First, he and his team were surprised to find how much of a building’s energy use and carbon emissions can be traced to domestic hot water. With Middlebury, for example, even switching from natural gas to heat pumps for space heating didn’t yield the expected effect: On the bar graphs generated by their analyses, the gray bars indicating carbon from fossil fuel use remained. As Reinhart recalls, “I kept saying, ‘What’s all this gray?’” While the policymakers talked about using heat pumps, they were still going to use natural gas to heat their hot water. “It’s just stunning that hot water is such a big-ticket item. It’s huge,” says Reinhart.

    Second, the results demonstrate the importance of including the state of the local electric grid in this type of analysis. “Looking at the results, it’s clear that if we want to have a successful energy transition, the building sector and the electric grid sector both have to do their homework,” notes Reinhart. Moreover, in many cases, reaching carbon neutrality by 2050 would require not only a carbon-free grid but also all-electric buildings.

    Third, Reinhart was struck by how different the bar graphs presenting results for the eight cities look. “This really celebrates the uniqueness of different parts of the world,” he says. “The physics used in the analysis is the same everywhere, but differences in the climate, the building stock, construction practices, electric grids, and other factors make the consequences of making the same change vary widely.”

    In addition, says Reinhart, “there are sometimes deeply ingrained conflicts of interest and cultural norms, which is why you cannot just say everybody should do this and do this.” For instance, in one case, the city owned both the utility and the natural gas it burned. As a result, the policymakers didn’t consider putting in heat pumps because “the natural gas was a significant source of municipal income, and they didn’t want to give that up,” explains Reinhart.

    Finally, the analyses quantified two other important measures: energy use and “peak load,” which is the maximum electricity demanded from the grid over a specific time period. Reinhart says that energy use “is probably mostly a plausibility check. Does this make sense?” And peak load is important because the utilities need to keep a stable grid.

    Middlebury’s analysis provides an interesting look at how certain measures could influence peak electricity demand. There, the introduction of electric heat pumps for space heating more than doubles the peak demand from buildings, suggesting that substantial additional capacity would have to be added to the grid in that region. But when heat pumps are combined with other retrofitting measures, the peak demand drops to levels lower than the starting baseline.

    The aftermath: An update

    Reinhart stresses that the specific results from the workshop provide just a snapshot in time; that is, where the cities were at the time of the workshop. “This is not the fate of the city,” he says. “If we were to do the same exercise today, we’d no doubt see a change in thinking, and the outcomes would be different.”

    For example, heat pumps are now familiar technology and have demonstrated their ability to handle even bitterly cold climates. And in some regions, they’ve become economically attractive, as the war in Ukraine has made natural gas both scarce and expensive. Also, there’s now awareness of the need to deal with hot water production.

    Reinhart notes that performing the analyses at the workshop did have the intended impact: It brought about change. Two years after the project had ended, most of the cities reported that they had implemented new policy measures or had expanded their analysis across their entire building stock. “That’s exactly what we want,” comments Reinhart. “This is not an academic exercise. It’s meant to change what people focus on and what they do.”

    Designing policies with socioeconomics in mind

    Reinhart notes a key limitation of the UBEM.io approach: It looks only at technical feasibility. But will the building owners be willing and able to make the energy-saving retrofits? Data show that — even with today’s incentive programs and subsidies — current adoption rates are only about 1 percent. “That’s way too low to enable a city to achieve its emission-reduction goals in 30 years,” says Reinhart. “We need to take into account the socioeconomic realities of the residents to design policies that are both effective and equitable.”

    To that end, the MIT team extended their UBEM.io approach to create a socio-techno-economic analysis framework that can predict the rate of retrofit adoption throughout a city. Based on census data, the framework creates a UBEM that includes demographics for the specific types of buildings in a city. Accounting for the cost of making a specific retrofit plus financial benefits from policy incentives and future energy savings, the model determines the economic viability of the retrofit package for representative households.
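    At its core, that viability test resembles a net-present-value calculation for a representative household; the sketch below uses hypothetical figures to show why incentives can flip the outcome.

    ```python
    # Toy retrofit-viability test: discounted bill savings vs. net upfront cost.
    def npv(cost, incentive, annual_savings, years=20, discount=0.05):
        pv = sum(annual_savings / (1 + discount) ** t
                 for t in range(1, years + 1))
        return pv - (cost - incentive)

    households = {
        "eligible for incentives":   dict(cost=18000, incentive=9000,
                                          annual_savings=900),
        "ineligible, same retrofit": dict(cost=18000, incentive=0,
                                          annual_savings=900),
    }
    for name, h in households.items():
        v = npv(**h)
        print(f"{name}: NPV ${v:,.0f} -> {'viable' if v > 0 else 'not viable'}")
    ```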

    Sample analyses for two Boston neighborhoods suggest that high-income households are largely ineligible for need-based incentives or the incentives are insufficient to prompt action. Lower-income households are eligible and could benefit financially over time, but they don’t act, perhaps due to limited access to information, a lack of time or capital, or a variety of other reasons.

    Reinhart notes that their work thus far “is mainly looking at technical feasibility. Next steps are to better understand occupants’ willingness to pay, and then to determine what set of federal and local incentive programs will trigger households across the demographic spectrum to retrofit their apartments and houses, helping the worldwide effort to reduce carbon emissions.”

    This work was supported by Shell through the MIT Energy Initiative. Zachary Berzolla was supported by the U.S. National Science Foundation Graduate Research Fellowship. Samuel Letellier-Duchesne was supported by the postdoctoral fellowship of the Natural Sciences and Engineering Research Council of Canada.

    This article appears in the Spring 2023 issue of Energy Futures, the magazine of the MIT Energy Initiative.

  • in

    Study: The ocean’s color is changing as a consequence of climate change

    The ocean’s color has changed significantly over the last 20 years, and the global trend is likely a consequence of human-induced climate change, report scientists at MIT, the National Oceanography Center in the U.K., and elsewhere.  

    In a study appearing today in Nature, the team writes that they have detected changes in ocean color over the past two decades that cannot be explained by natural, year-to-year variability alone. These color shifts, though subtle to the human eye, have occurred over 56 percent of the world’s oceans — an expanse that is larger than the total land area on Earth.

    In particular, the researchers found that tropical ocean regions near the equator have become steadily greener over time. The shift in ocean color indicates that ecosystems within the surface ocean must also be changing, as the color of the ocean is a literal reflection of the organisms and materials in its waters.

    At this point, the researchers cannot say how exactly marine ecosystems are changing to reflect the shifting color. But they are pretty sure of one thing: Human-induced climate change is likely the driver.

    “I’ve been running simulations that have been telling me for years that these changes in ocean color are going to happen,” says study co-author Stephanie Dutkiewicz, senior research scientist in MIT’s Department of Earth, Atmospheric and Planetary Sciences and the Center for Global Change Science. “To actually see it happening for real is not surprising, but frightening. And these changes are consistent with man-induced changes to our climate.”

    “This gives additional evidence of how human activities are affecting life on Earth over a huge spatial extent,” adds lead author B. B. Cael PhD ’19 of the National Oceanography Center in Southampton, U.K. “It’s another way that humans are affecting the biosphere.”

    The study’s co-authors also include Stephanie Henson of the National Oceanography Center, Kelsey Bisson at Oregon State University, and Emmanuel Boss of the University of Maine.

    Above the noise

    The ocean’s color is a visual product of whatever lies within its upper layers. Generally, waters that are deep blue reflect very little life, whereas greener waters indicate the presence of ecosystems, mainly phytoplankton — plant-like microbes that are abundant in the upper ocean and contain the green pigment chlorophyll. The pigment helps plankton harvest sunlight, which they use to capture carbon dioxide from the atmosphere and convert it into sugars.

    Phytoplankton are the foundation of the marine food web that sustains progressively more complex organisms, on up to krill, fish, seabirds, and marine mammals. Phytoplankton are also a powerful muscle in the ocean’s ability to capture and store carbon dioxide. Scientists are therefore keen to monitor phytoplankton across the surface oceans and to see how these essential communities might respond to climate change. To do so, scientists have tracked changes in chlorophyll based on the ratio of blue to green light reflected from the ocean surface, a ratio that can be monitored from space.
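
    For readers curious about the mechanics, the retrieval works roughly like the Python sketch below; the polynomial coefficients are illustrative placeholders, not the values used in operational satellite retrievals.

```python
import numpy as np

def chlorophyll_from_ratio(blue_reflectance, green_reflectance,
                           coeffs=(0.3, -2.9, 1.7, -0.6, -1.4)):
    """Estimate chlorophyll (mg per cubic meter) from the blue:green
    reflectance ratio. Chlorophyll absorbs blue light, so greener
    water (a lower ratio) maps to higher chlorophyll."""
    r = np.log10(blue_reflectance / green_reflectance)
    # Evaluate an empirical polynomial in log10(ratio), as standard
    # band-ratio algorithms do; coefficients here are illustrative.
    log_chl = sum(c * r ** i for i, c in enumerate(coeffs))
    return 10.0 ** log_chl

print(chlorophyll_from_ratio(0.012, 0.004))  # bluer water: ~0.15 mg/m^3
print(chlorophyll_from_ratio(0.004, 0.004))  # greener water: ~2 mg/m^3
```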

    But around a decade ago, Henson, a co-author of the current study, published a paper with others showing that, if scientists tracked chlorophyll alone, it would take at least 30 years of continuous monitoring to detect any trend driven specifically by climate change. The reason, the team argued, was that the large natural variations in chlorophyll from year to year would overwhelm any anthropogenic influence on chlorophyll concentrations. It would therefore take several decades to pick out a meaningful, climate-change-driven signal amid the normal noise.

    In 2019, Dutkiewicz and her colleagues published a separate paper showing, through a new model, that the natural variation in other ocean colors is much smaller than that of chlorophyll. Any signal of climate-change-driven change should therefore be easier to detect above the smaller, normal variations of those other colors. They predicted that such changes should be apparent within 20, rather than 30, years of monitoring.
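
    A toy Monte Carlo experiment in Python makes the logic of that prediction tangible (purely illustrative, not the statistics of either paper): the quieter the natural year-to-year noise, the fewer years of record it takes for a fixed trend to become statistically detectable.

```python
import numpy as np

rng = np.random.default_rng(0)

def years_to_detect(trend, noise_std, power=0.9, max_years=60, n_trials=400):
    """Smallest record length (in years) at which an ordinary
    least-squares trend is significant (|slope| > 2 standard errors)
    in at least `power` of simulated noisy records."""
    for n in range(5, max_years + 1):
        t = np.arange(n)
        hits = 0
        for _ in range(n_trials):
            y = trend * t + rng.normal(0.0, noise_std, n)
            slope, intercept = np.polyfit(t, y, 1)
            resid = y - (slope * t + intercept)
            se = np.sqrt(resid @ resid / (n - 2) / ((t - t.mean()) ** 2).sum())
            if abs(slope) > 2 * se:
                hits += 1
        if hits / n_trials >= power:
            return n
    return max_years

# Same underlying trend; halving the noise shortens the wait markedly.
print(years_to_detect(trend=0.01, noise_std=0.20))  # noisy, chlorophyll-like
print(years_to_detect(trend=0.01, noise_std=0.10))  # quieter color signal
```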

    “So I thought, doesn’t it make sense to look for a trend in all these other colors, rather than in chlorophyll alone?” Cael says. “It’s worth looking at the whole spectrum, rather than just trying to estimate one number from bits of the spectrum.”

    The power of seven

    In the current study, Cael and the team analyzed measurements of ocean color taken by the Moderate Resolution Imaging Spectroradiometer (MODIS) aboard the Aqua satellite, which has been monitoring ocean color for 21 years. MODIS takes measurements in seven visible wavelengths, including the two colors researchers traditionally use to estimate chlorophyll.

    The differences in color that the satellite picks up are too subtle for human eyes to differentiate. Much of the ocean appears blue to our eye, whereas the true color may contain a mix of subtler wavelengths, from blue to green and even red.

    Cael carried out a statistical analysis of all seven ocean colors measured by the satellite from 2002 to 2022, taken together. He first looked at how much the seven colors changed from region to region during a given year, which gave him an idea of their natural variations. He then zoomed out to see how these annual variations in ocean color changed over a longer stretch of two decades. This analysis turned up a clear trend, above the normal year-to-year variability.
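
    In spirit, the test resembles the Python sketch below (a simplification running on synthetic data, not the study's actual code): fit a linear trend per band and per grid cell, then ask whether the total change over the record stands out against the detrended year-to-year scatter.

```python
import numpy as np

years = np.arange(20)  # annual composites, roughly 2002-2022
# Hypothetical reflectance data, shape (n_years, n_bands, n_cells).
reflectance = np.random.default_rng(1).normal(0.01, 0.001, (20, 7, 1000))

def significant_trend_fraction(reflectance, years, threshold=2.0):
    """Fraction of grid cells where at least one band's total change
    over the record exceeds `threshold` times the standard deviation
    of the detrended year-to-year residuals."""
    n_years, n_bands, n_cells = reflectance.shape
    significant = np.zeros(n_cells, dtype=bool)
    for b in range(n_bands):
        for c in range(n_cells):
            slope, intercept = np.polyfit(years, reflectance[:, b, c], 1)
            resid = reflectance[:, b, c] - (slope * years + intercept)
            if abs(slope * (n_years - 1)) > threshold * resid.std(ddof=1):
                significant[c] = True
    return significant.mean()

# Small for pure noise; the study found changes over 56 percent of the ocean.
print(significant_trend_fraction(reflectance, years))
```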

    To see whether this trend is related to climate change, he then looked to Dutkiewicz’s model from 2019. This model simulated the Earth’s oceans under two scenarios: one with the addition of greenhouse gases, and the other without them. The greenhouse-gas model predicted that a significant trend should show up within 20 years and that this trend should cause changes to ocean color in about 50 percent of the world’s surface oceans — almost exactly what Cael found in his analysis of real-world satellite data.

    “This suggests that the trends we observe are not a random variation in the Earth system,” Cael says. “This is consistent with anthropogenic climate change.”

    The team’s results show that monitoring ocean colors beyond chlorophyll could give scientists a clearer, faster way to detect climate-change-driven changes to marine ecosystems.

    “The color of the oceans has changed,” Dutkiewicz says. “And we can’t say how. But we can say that changes in color reflect changes in plankton communities, which will impact everything that feeds on plankton. It will also change how much the ocean will take up carbon, because different types of plankton have different abilities to do that. So, we hope people take this seriously. It’s not only models that are predicting these changes will happen. We can now see it happening, and the ocean is changing.”

    This research was supported, in part, by NASA.