More stories

  • Crowdsourcing data on road quality and excess fuel consumption

    America has over 4 million miles of roads and, as one might expect, monitoring them can be a monumental task.  

    To collect high-quality data on the conditions of their roads, departments of transportation (DOTs) can expect to spend $200 per mile for state-of-the-art laser profilers. For cities and states, these costs are prohibitive and often force them to resort to rudimentary approaches, like visual inspection.

    Over the past three years, a collaboration between the MIT Concrete Sustainability Hub (CSHub), the University of Massachusetts at Dartmouth, Birzeit University, and the American University of Beirut has sought to give DOTs a cheaper, but equally accurate, alternative.

    Their solution, “Carbin,” is an app that allows users to crowdsource road-quality data with their smartphones. An algorithm built into the software can then estimate how that road quality affects a user’s fuel consumption.

    The Carbin framework is the most sophisticated road-quality crowdsourcing tool of its kind. Using the accelerometers found in smartphones, Carbin converts vehicle acceleration signals into the standard measurements of road roughness used by most DOTs. It then collates these measurements onto fixmyroad.us, a publicly available global map.

    Since its release in 2019, Carbin has gathered almost 600,000 miles of road-quality data in more than three dozen countries. During 2020, its developers continued to advance the app. Not only have they validated their approach in two papers — one in Data-Centric Engineering and another in The Proceedings of the Royal Society — they have also collected more than 300,000 miles of data with the help of Concrete Supply Co., a ready-mix concrete manufacturer in the Carolinas. In addition, they are initiating collaborations with automotive manufacturers and vehicle telematics companies to gather data on an even greater scale.

    Roughly speaking

    Carbin is not the first phone accelerometer-based approach for crowdsourcing road quality. Several other apps, including the City of Boston’s “Street Bump,” have sought to assess road quality based on one of the most recognizable signs of poor roads: potholes.

    Though potholes have been the focus of prior apps, they are not the main metric used by DOTs for measuring road quality and planning maintenance. Instead, DOTs rely on what is called road roughness.

    “The shortcoming of previous crowdsourcing approaches is that they would record the acceleration signal and look for outliers, which would indicate potholes,” explains Botshekan. “However, they could not infer the road roughness, since that is defined over longer length scales — typically from tens of centimeters to tens of meters.”

    Though roughness can seem almost imperceptible, it can have outsized effects. Rough roads not only lead to higher maintenance costs but can also increase vehicle fuel consumption — by as much as 15 percent in cities. To measure roughness, DOTs use the International Roughness Index (IRI).

    “IRI is the accumulated motion of the suspension system over a specific distance,” says Arghavan Louhghalam, an assistant professor of civil and environmental engineering at the University of Massachusetts at Dartmouth. “Higher IRI indicates lower road quality and higher fuel consumption.”

    To derive IRI, DOTs don’t actually measure suspension travel explicitly. Instead, they first capture the profile of the road — essentially, the undulations of its surface — and then simulate how a car’s suspension system would respond to it using what’s called a “quarter car model.”
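The quarter-car computation described above can be sketched in a few lines. This is a minimal illustration, not the Carbin team's actual implementation: it uses the conventional normalized "Golden Car" parameters and a simple fixed-step integrator, and the sample road profiles are made up for demonstration.

```python
import numpy as np

# Standard "Golden Car" quarter-car parameters, normalized per unit of
# sprung mass (the conventional values used for IRI computation).
K1, K2, C, MU = 653.0, 63.3, 6.0, 0.15  # tire stiffness, spring, damper, mass ratio
V = 80 / 3.6                            # IRI reference speed: 80 km/h in m/s

def iri(profile, dx):
    """Estimate IRI (m/km) for a road elevation profile sampled every dx meters."""
    dt = dx / V
    zs = zs_d = zu = zu_d = 0.0         # sprung/unsprung positions and velocities
    travel = 0.0                        # accumulated suspension motion
    for y in profile:
        # quarter-car dynamics: sprung mass rides on suspension,
        # unsprung mass (wheel) rides on the tire spring over profile y
        zs_dd = -K2 * (zs - zu) - C * (zs_d - zu_d)
        zu_dd = (K2 * (zs - zu) + C * (zs_d - zu_d) - K1 * (zu - y)) / MU
        zs_d += zs_dd * dt; zu_d += zu_dd * dt   # semi-implicit Euler step
        zs += zs_d * dt; zu += zu_d * dt
        travel += abs(zs_d - zu_d) * dt          # suspension travel this step
    return travel / (len(profile) * dx / 1000)   # meters of motion per km of road

# Illustrative profiles: 1 km sampled every 5 cm, flat vs. 1 cm undulations.
x = np.arange(20000) * 0.05
print(iri(np.zeros_like(x), 0.05))                   # smooth road: IRI = 0
print(iri(0.01 * np.sin(2 * np.pi * x / 10), 0.05))  # rough road: IRI > 0
```

Higher output values indicate a rougher ride; real profilers apply the same model to measured elevation data rather than synthetic sinusoids.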

    From quarter car to complete picture

    A quarter car model is essentially what it sounds like: a model of a quarter of a car. Specifically, it refers to a model of the tires, vehicle mass, and suspension system based on one wheel of a vehicle. By developing their own car dynamics model in a probabilistic setting, Botshekan and his colleagues were able to map the acceleration signals collected by Carbin users onto the behavior of a virtual vehicle and its interaction with the road. From there, they could estimate suspension properties and road roughness in terms of IRI. Using an algorithm based on past CSHub research, Carbin then estimates how IRI values can impact vehicle fuel consumption.

    “At the end of the day, the vehicle is like a filter,” explains Mazdak Tootkaboni, associate professor of civil and environmental engineering at UMass Dartmouth. “The excitation of the road goes through the vehicle and is then sensed by the cellphone. So, what we do is understand this filter and take it out of the equation.”

    After developing their model, the Carbin team then sought to test it against more costly, conventional methods. They did this through two different validations. 

    In the first, they measured road quality on two test tracks in the Greater Boston area — a major thoroughfare and then a highway — using a conventional laser profiler and several phones equipped with Carbin. When they compared the data afterward, they found that Carbin could predict laser-based roughness measurements with 90 percent accuracy.

    The second validation probed Carbin’s crowdsourcing capabilities. In it, they analyzed over 22,000 kilometers of Federal Highway Administration road data from California alongside 27,000 kilometers of data gathered by 84 Carbin users from the same state. The results of their analysis revealed a remarkable resemblance between the crowdsourced and official data — a sign that Carbin could augment or even entirely replace conventional methods.

    21st century infrastructure, 21st century tools

    Now that they’ve thoroughly validated their model, Carbin’s developers want to expand the app to provide users, governments, and companies with unparalleled insights into both vehicles and infrastructure.

    The most apparent use for Carbin, says Jake Roxon, a CSHub postdoc and Carbin’s creator, would be as a tool for DOTs to improve America’s roads — which recently received a grade of D from the American Society of Civil Engineers.

    “On average, America’s roads are terrible,” he explains. “But the problem isn’t always in the funding of DOTs themselves, but rather how they allocate that funding. By knowing the quality of an entire road network, which is impossible with current technologies, they could fix roads more efficiently.”

    The issue, then, is how Carbin can transition from gathering data to also recommending resource allocation. To make this possible, the Carbin team is beginning to incorporate prior CSHub research on network asset management — the process through which DOTs monitor pavement performance and plan maintenance to meet performance targets.

    Besides serving the needs of DOTs, Carbin could also help private companies. “There are private firms, fleet companies especially, that would benefit from this technology,” says Roxon. “Eventually, they could use Carbin for ‘eco-routing,’ which is when you identify the route that is most fuel-efficient.”

    Such a routing option could help companies reduce both their environmental impact and their running costs — for those with thousands of vehicles, the aggregate savings could be substantial.

    While further development is needed to incorporate eco-routing and asset management into Carbin, its developers see it as a promising tool. Franz-Josef Ulm, professor at the MIT Department of Civil and Environmental Engineering and faculty director of CSHub, believes that Carbin represents a necessary step forward.

    “To develop the infrastructure of the 21st century, we need 21st-century means of assessing the state of that infrastructure to ensure that any dollar spent today is well spent for the future,” he says. “That’s precisely where Carbin enters the picture.”

  • 3 Questions: Nadia Christidi on the arts and the future of water

    In this ongoing series, MIT faculty, students, and alumni in the humanistic fields share perspectives that are significant for solving climate change and mitigating its myriad social and ecological impacts. Nadia Christidi is a PhD student in MIT HASTS, a program that combines research in history, anthropology, science, technology, and society. Her dissertation examines how three cities that face water supply challenges are imagining, planning, and preparing for the future of water. Christidi has a particular interest in the roles that art, design, and architecture are playing in that future imagining and future planning process. MIT SHASS Communications spoke with her on the ways that her field and visual cultures contribute to solving issues of climate change.   

    Q: There are many sensible approaches to addressing the climate crisis. Increasingly, it looks as if we’ll need all of them. What perspectives from the HASTS fields are significant for addressing climate change and its ecological and social impacts?

    A: My research focuses on how three cities that face water supply challenges are imagining, planning, and preparing for the future of water. The three cities I focus on are Los Angeles, Dubai, and Cape Town. Water is one of the key issues when it comes to adapting to climate change and my work tries to understand how climate change impacts are understood and adaptation policies developed.

    My approach to climate change and adaptation brings together various disciplines — history, anthropology, science and technology studies, and visual cultures; each of these helps me see and elucidates very particular aspects of climate change.

    I think history reminds us that our ways of being and systems are historically constructed rather than given, inevitable, or natural, and that there is an alternative. Anthropology elucidates that while we may all talk about “climate change,” what is meant by it, how it is understood and experienced, and how it is dealt with as a problem will differ from place to place; climate change is as much a social and cultural phenomenon and experience as it is a scientific or environmental one, as much a global issue as it is a local one. The social, cultural, and local, anthropology reminds us, have to be factored into meaningful policy.

    Science and technology studies sheds light on the various communities involved in developing climate change knowledge; the role that their investments, stakes, and interests play; and the translation between science and policy that needs to happen for scientifically-informed policy to emerge. The STS perspective also points out that science is one of many systems for understanding climate change and that there may be other valid, useful worldviews from which we can learn.

    And finally, visual cultures underscore how pop cultural and visual references, symbols, and imagery shape imaginaries and expectations of climate change, including scientific ones, and sometimes open up or foreclose pathways to action.

    Q: What pathways of thought and action do you personally think might be most fruitful for alleviating climate change and its impacts — and for forging a more sustainable future?

    A: I think we are going to need a lot of imagination going forward. As climate change gets underway, we’re seeing a lot more emphasis on adaptation, and imagination is key to adapting to a set of totally different circumstances.

    This belief has led me to explore the “imaginative capacities” of planning institutions, the impact of popular culture imaginaries, from the utopian to the dystopian, on our preparations for the future, and the role that creative practitioners — including artists, architects, and designers — can play in expanding our imaginative possibilities.

    One of my interlocutors aptly uses the phrase “crisis of imagination” to describe the present. In order for the necessary imagination work to take place, we must take seriously different actors as sources of knowledge, expertise, and perspectives, and make the process of imagining and planning more inclusive.

    Partly, my work considers how creative practitioners are imagining climate change and the future of water and the alternative knowledge or perspectives they can offer. Most of the works that I look at involve collaborations between artists/architects, scientists, engineers, and/or policymakers. They see artists contributing to science or transforming urban space or impacting policy.

    For instance, the UAE pavilion at the Venice Architecture Biennale, Wetland, will unveil a locally-produced salt-based building material as an alternative to cement. Developed by Dubai-based architects Wael Al Awar and Kenichi Teramoto, the pavilion tackles the issues of brine — a salty byproduct of desalination, which is the country’s main source of potable water — and the carbon footprint of cement use in Dubai’s robust construction industry.

    Inspired by historical examples of salt architecture and by the natural architectures of local salt flat ecosystems, the architects worked with scientists from NYU Abu Dhabi to develop the material. Such work shows how interdisciplinary collaborations with creative practitioners can not only advance the sciences, but also reimagine established industries and practices, and develop innovative approaches to the carbon emissions problem.

    Peggy Weil, an artist based in Los Angeles, rethinks landscape as a genre in our climate-changed present. Holding that the traditional horizontal format of the landscape is no longer representative, she develops “underscapes,” where she films the length of ice cores or aquifers, and “overscapes,” which involve studies of the air, as portraits of the Earth. These ‘scapes’ argue for a need to re-perceive our surroundings in order to more fully understand how we have chemically, hydrogeologically, and climatically transformed them.

    Peggy and I have talked extensively about how important “re-perceiving” will be for encouraging behavior changes and generating economic and political support for the work of water managers and policymakers as well as the role of the arts in driving this “re-perception.”

    Q: What dimensions of the emerging climate crisis affect you most deeply — causing uncertainty, and/or altering the ways you think about the present and the future? When you confront an issue as formidable as climate change, what gives you hope?    

    A: I think one dimension of the climate crisis I find especially disturbing is its configuration at times and in certain places as an economic opportunity, where new devastating environmental conditions are taken to be opportunities for innovation and technological development that will enable economic growth.

    This becomes especially compelling in times of economic deceleration or as the specter of the end of oil grows stronger. But we need to ask: economic growth for whom, at what costs, and with what effects? And is growth what we really need?

    I don’t think that the economy should be pitted against the environment; I am a total believer in sustainability as an issue that must encompass the economic, social, and environmental. But the real problems are with economic distribution rather than growth, and the promise of unlimited growth — as further stoked by renewables — which is a fallacy or fantasy.

    I tend to agree with journalist Naomi Klein that the market, green or not, isn’t going to solve climate change challenges because we need more than just a technofix; we need policy and behavioral changes and new investment directions, many of which go against established economic arrangements and priorities. Locally produced salt-based building materials are a good start, but not enough.

    Some of the most challenging and consequential imaginative work we will have to do will be on the social front; this will entail reconsidering some things we take for granted. I love theorist Fredric Jameson’s suggestion that “it is easier to imagine the end of the world than it is to imagine the end of capitalism,” as well as Mark Fisher’s concept of “capitalist realism,” which captures the ideological underpinnings of that worldview.

    The privatization of water is one of the scariest intensifying developments in my mind, especially given anticipated climate change effects, but I take some reassurance from projects that aim to counter such trends. One of the promising architectural proposals I’ve studied in Los Angeles is by Stephanie Newcomb. Stephanie’s work, Coopelluvia, aims to complement the stormwater capture projects that governmental entities in LA County are developing on public land, which form a major prong of the City of LA’s water planning strategy. It explores the possibility of turning stormwater captured in side setback spaces between private properties into a communal water resource in the low-income, predominantly Latino neighborhoods of Pacoima and Arleta in the San Fernando Valley.

    Stephanie’s proposed intervention blurs the boundary between public and private and empowers marginalized communities through developing communal resource management systems with multiple environmental and social benefits. Her work is guided by theories of the commons, rather than privatization and market-oriented solutions — and I think such projects and theories hold a lot of promise for facilitating the kinds of radical change we need.

    Series prepared by SHASS Communications. Editorial and Design Director: Emily Hiestand. Co-Editor: Kathryn O’Neill

  • Analytics platform for coastal desalination plants wins 2021 Water Innovation Prize

    Coastal desalination plants are a source of drinking water for an increasing number of people around the world. But their proximity to the ocean can cause disruptions from events like riptides and oil spills. Such disruptions reduce the productivity, lifespan, and sustainability of desalination plants.

    The winner of this year’s MIT Water Innovation Prize, Bloom Alert, is seeking to improve desalination plant operations with a new kind of data monitoring platform. The platform tracks ocean and desalination plant activity and provides early warnings about events that could interrupt clean water production or lead to coastal pollution.

    At the heart of Bloom Alert’s solution are models that crunch satellite data in real time to understand what’s going on in the ocean near the plants.

    “Coastal events can reduce a plant’s water production capacity by up to 30 percent — that means 30 percent less water for coastal communities,” Bloom Alert team member Enzo Garcia said in the winning pitch. “Our models allow plant operators to apply mitigation measures during emergencies, which improve not only plant efficiency, but also overall water security for potentially millions of people.”

    Bloom Alert’s models, which were trained on 20 years of satellite data, are capable of predicting disruption events up to 14 days in advance. That can lead to major savings: Garcia estimates severe riptide events can cost plants up to $200,000 a day.

    Team members said their subscription-based platform, developed in their home country of Chile, can be used around the globe at a fraction of the cost of existing solutions.

    The company recently completed its first pilot project with the biggest desalination plant in South America. Through the project, Garcia says Bloom is already helping to secure 20 percent of Chile’s desalinated water production.

    Now, with $18,000 in new funding earned from the competition’s grand prize, the company is targeting plants in the Middle East, where about half of the world’s desalination plants are located.

    “We seek to position ourselves as the worldwide leader in satellite intelligence for the desalination industry,” Garcia says.

    The Water Innovation Prize, which helps translate water-related research and ideas into businesses and impact, has been hosted by the MIT Water Club since its first year in 2015. Each year, student-led finalist teams from around the world pitch their innovations to students, faculty, investors, and people working in various water-related industries.

    This year’s event, held virtually on Thursday, included six finalist teams. The second-place, $10,000 award went to Nymphea Labs.

    Mosquitos are the deadliest animal on the planet. Every 30 seconds, a child dies of malaria. At a park one day, Nymphea team member Pranav Agarwal noticed a pond that had no mosquito larvae on its surface because wind was causing ripples. A nearby pond, with no wind ripples, was filled with larvae. The observation led to an idea.

    Nymphea Labs is marketing a device called Ripple that uses solar power to create tiny waves on the surface of still water. The device, which costs about $10, requires no maintenance to run. It can also be deployed in fleets to cause ripples across larger bodies of water.

    “Today the most widespread solutions are insecticides and insecticide treated bed nets, but more and more we’re seeing that mosquitos are developing a resistance to these chemicals, making these efforts less effective,” Agarwal says.

    Nymphea says the device has already led to decreased mosquito populations in small tests. Now the company will be producing 100 units for further testing. The team is hoping Ripple will be helping to protect more than 10 million people by 2025.

    The third-place prize was awarded to NERAMCO, which has invented a more sustainable, high-performance polyethylene fabric called SVETEX. SVETEX is a breathable, quick drying, and stain resistant textile, and NERAMCO CEO Maren Cattonar says its production uses 100 times less water than cotton.

    “Fiber production is extremely water intensive, consuming 86 trillion liters of water per year, enough to supply the global population with drinking water for 14 years,” says Cattonar, who works as a mentor for MIT’s Sandbox Innovation Fund Program and MIT’s iTeams initiative.
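Taken at face value, that figure is roughly self-consistent. A quick back-of-the-envelope check (the population and per-capita drinking-water values below are assumptions for illustration, not from the article):

```python
# Sanity check: does 86 trillion liters per year really equal ~14 years
# of global drinking water? Population and intake are assumed values.
ANNUAL_FIBER_WATER_L = 86e12   # 86 trillion liters, as quoted
POPULATION = 7.9e9             # approximate global population
DRINKING_L_PER_DAY = 2.0       # typical per-person drinking-water need

days = ANNUAL_FIBER_WATER_L / (POPULATION * DRINKING_L_PER_DAY)
years = days / 365
print(round(years, 1))         # lands in the neighborhood of 14-15 years
```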

    In addition to using less water, SVETEX production also eliminates aquatic dye pollution using a dry spin coloring process.

    “A staggering amount of pollution comes from textile dyeing,” Cattonar says. “Twenty percent of industrial water pollution originates from the textile industry. Each year 6.3 trillion liters of water are used to dye textiles.”

    The other finalist teams were:

    AgroBeads, which has developed biodegradable water beads designed to reduce the amount of water used in irrigation while providing plants with nutrients;

    Brineys, which seeks to fund new water desalination plants in water-insecure countries by selling artisanal salt created as a byproduct of the desalination process; and

    FinsTrust, a blockchain-based e-commerce platform designed to improve transparency and traceability in fishery products while empowering Indonesian fishermen and fish farmers.

    Many of the finalist teams seek to address problems expected to worsen over time due to climate change. Efforts to address water shortages in particular have taken on new urgency as communities around the world increasingly face water distress.

    “Demand is outpacing supply, and it’s not happening in five years or 10 years — it’s happening now,” Tom Ferguson, a managing partner at Burnt Island Ventures, said in the keynote to the event.

  • Climate solutions depend on technology, policy, and businesses working together

    “The challenge for humanity now is how to decarbonize the global economy by 2050. To do that, we need a supercharged decade of energy innovation,” said Ernest J. Moniz, the Cecil and Ida Green Professor of Physics and Engineering Systems Emeritus, founding director of the MIT Energy Initiative, and a former U.S. secretary of energy, as he opened the MIT Forefront virtual event on April 21. “But we also need practical visionaries, in every economic sector, to develop new business models that allow them to remain profitable while achieving zero-carbon emissions.”

    The event, “Addressing Climate and Sustainability through Technology, Policy, and Business Models,” was the third in the MIT Forefront series, which invites top minds from the worlds of science, industry, and policy to propose bold new answers to urgent global problems. Moniz moderated the event, and more than 12,000 people tuned in online.

    MIT and other universities play an important role in preparing the world’s best minds to take on big climate challenges and develop the technology needed to advance sustainability efforts, a point illustrated in the main session with a video about Via Separations, a company supported by MIT’s The Engine. Co-founded by Shreya Dave ’09, SM ’12, PhD ’16, Via Separations customizes filtration technology to reduce waste and save money across multiple industries. “By next year, we are going to be eliminating carbon dioxide emissions from our customers’ facilities,” Dave said.

    Via Separations is one of many innovative companies born of MIT’s energy and climate initiatives — the work of which, as the panel went on to discuss, is critical to achieving net-zero emissions and deploying successful environmental sustainability efforts. As Moniz put it, the company embodies “the spirit of science and technology in action for the good of humankind” and exemplifies how universities and businesses, as well as technology and policy, must work together to make the best environmental choices.

    How businesses confront climate change

    Innovation in sustainable practices can face substantial challenges when applied to business models, particularly on the policy side, the panelists noted. But they shared some key ways their respective organizations have employed current technologies, along with the obstacles they face in reaching their sustainability goals. Despite each business’s different products and services, a common thread emerged: all of them need new technologies to achieve those goals.

    Although 2050 is the long-term goal for net-zero emissions put forth by the Paris Agreement, the businesses represented by the panelists are thinking about the shorter term. “IBM has committed to net-zero emissions by 2030 ― without carbon offsets,” said Arvind Krishna, chairman and chief executive officer of IBM. “We believe that some carbon taxes would be a good policy tool. But policy alone is insufficient. We need advanced technological tools to reach our goal.” 

    Jeff Wilke SM ’93, who retired as Amazon’s chief executive officer of Worldwide Consumer in February 2021, outlined a number of initiatives that the online retail giant is undertaking to curb emissions. Transportation is one of their biggest hurdles to reaching zero emissions, leading to a significant investment in Class 8 electric trucks. “Another objective is to remove the need for plane shipments by getting inventory closer to urban areas, and that has been happening steadily over the years,” he said.

    Jim Fitterling, chair and chief executive officer of Dow, explained that Dow has reduced its carbon emissions by 15 percent in the past decade and is poised to reduce them further in the next. Future goals include working toward electrifying ethylene production. “If we can electrify that, it will allow us to make major strides toward carbon-dioxide reduction,” he said. “But we need more reliable and stable power to get to that point.” 

    Collaboration is key to advancing climate solutions

    Maria T. Zuber, MIT’s vice president for research, who was recently appointed by U.S. President Joe Biden as co-chair of the President’s Council of Advisors on Science and Technology, stressed that MIT innovators and industry leaders must work together to implement climate solutions. 

    “Innovation is a team sport,” said Zuber, who is also the E. A. Griswold Professor of Geophysics. “Even if MIT researchers make a huge discovery, deploying it requires cooperation on a policy level and often industry support. Policymakers need to solve problems and seize opportunities in ways that are popular. It’s not just solving technical problems ― there is a human behavior component.”

    But businesses, Zuber said, can play a huge role in advancing innovation. “If a company becomes convinced of the potential of a new technology, they can be the best advocates with policymakers,” she said.

    The question of “sustainability vs. shareholders” 

    During the Q&A session, an audience member pointed out that environmentalists are often distrustful of companies’ sustainability policies when their focus is on shareholders and profit.

    “Companies have to show that they’re part of the solution,” Fitterling said. “Investors will be afraid of high costs up front, so, say, completely electrifying a plant overnight is off the table. You have to make a plan to get there, and then incentivize that plan through policy. Carbon taxes are one way, but miss the market leverage.”

    Krishna also pushed back on the idea that companies have to choose between sustainability and profit. “It’s a false dichotomy,” he said. “If companies were only interested in short-term profits, they wouldn’t last for long.”

    “A belief I’ve heard from some environmental groups is that ‘anything a company does is greenwashing,’ and that they’ll abandon those efforts if the economy tanks,” Zuber said, referring to a practice wherein organizations spend more time marketing themselves as environmentally sustainable than on maximizing their sustainability efforts. “The economy tanked in 2020, though, and we saw companies double down on their sustainability plans. They see that it’s good for business.”

    The role of universities and businesses in sustainability innovation

    “Amazon and all corporations are adapting to the effects of climate change, like extreme weather patterns, and will need to adapt more — but I’m not ready to throw in the towel for decarbonization,” Wilke said. “Either way, companies will have to invest in decarbonization. There is no way we are going to make the progress we have to make without it.” 

    Another consideration is the role of artificial intelligence (AI) and quantum computing. Krishna noted multiple ways that AI and quantum computing will figure into IBM’s plans, including finding the most environmentally sustainable and cost-efficient ways to improve carbon separation from exhaust gases and extend the life of lithium batteries in electric cars. 

    AI, quantum computing, and alternative energy sources such as fusion, which have the potential to achieve net-zero emissions, are key areas that students, researchers, and faculty members are pursuing at MIT.

    “Universities like MIT need to go as fast as we can as far as we can with the science and technology we have now,” Zuber said. “In parallel, we need to invest in and deploy a suite of new tools in science and technology breakthroughs that we need to reach the 2050 goal of decarbonizing. Finally, we need to continue to train the next generation of students and researchers who are solving these issues and deploy them to these companies to figure it out.”

  • Invitations to powerful climate action at MIT Better World (Sustainability)

    “We’re in an emergency, and we need a coordinated effort with all hands and all minds on deck trying to solve this problem.” The urgency in that call to confront climate change, issued by MIT faculty member Asegun Henry SM ’06, PhD ’09, reverberated throughout MIT Better World (Sustainability), a recent virtual gathering of the global MIT community.

    More than 830 attendees from 57 countries logged on to learn about climate change solutions in development at MIT and to consider how, in the words of Provost Martin A. Schmidt SM ’83, PhD ’88, “Every academic discipline in every corner of our community can contribute to solving this global challenge.” Schmidt, who is the Ray and Maria Stata Professor of Electrical Engineering and Computer Science, moderated the main session of the program, which also featured Vice President for Research Maria Zuber and linguistics graduate student Annauk Denise Olin.

    Henry is the Robert N. Noyce Career Development Associate Professor in the Department of Mechanical Engineering and director of the Atomistic Simulation and Energy Research Group. “The laws of thermodynamics tell us that if there is an imbalance in the rate at which we are heated by the sun … the planet will become too hot for human beings to live here. So that means we must make radical change,” he told the online audience of MIT alumni and friends. Henry’s own research focuses on energy storage, one of the greatest challenges to sustainable energy adoption. “We have to store renewable energy when we have an overabundance, and then discharge it back to the grid whenever it’s needed,” he explained. “We need the price of solar, plus batteries, to be cheaper than gas. And today that’s not true.”

    Zuber, the E. A. Griswold Professor of Geophysics, touched on the psychological and economic barriers to moving societies away from the use of fossil fuels, noting that both of her grandfathers were coal miners in Eastern Pennsylvania. “The burning of a fossil fuel, anthracite coal, was the foundation of the community and the way of life where I grew up,” she said.

    Still, Zuber — who was recently tapped by the Biden Administration to co-chair the President’s Council of Advisors on Science and Technology — expressed optimism for a sustainable future: “Our past is full of scientific and technological breakthroughs that have changed our species’ course — and changed countless lives for the better.” She highlighted three promising areas of research at MIT: improved battery storage technology, carbon capture, and nuclear fusion.

    “People used to laugh when I talked about fusion,” she said, “but they’re not laughing anymore.” This long-sought energy source may finally be coming within humanity’s reach, transforming the fight against climate change: “The key ingredient for fusion energy — hydrogen — is essentially both free and inexhaustible,” Zuber noted. In collaboration with private fusion startup Commonwealth Fusion Systems, MIT is designing and building SPARC, a compact, high-field fusion device that will demonstrate net energy — producing more energy than it consumes — for the first time in history. SPARC is a key step toward building a fusion power plant capable of producing electricity continuously within as few as 15 years.

    The third presenter was Olin, a graduate student in the MIT Indigenous Languages Initiative, where she works to preserve her Native language of Iñupiaq. “Embedded in our indigenous languages are lessons in how to take care of the environment,” she said. For example, Iñupiaq has more than 100 terms to describe ice conditions. But now, “The climate is changing so much, so fast, our elders literally don’t have words for the way sea ice is behaving.”

    During her mother’s childhood in the Alaskan village of Shishmaref, several feet of sea ice would form and remain from October to June, offering protection from storm surges. “In February 2018 and 2019,” she said, “there was no ice at all.” Erosion has occurred so fast that houses and roads have dropped into the sea without warning, and villages like Shishmaref are being forced to move away from the ocean they rely on for food. In fact, according to Olin, the word “erosion” does not capture the magnitude of the crisis. She has helped to coin an Inuit word, “usteq,” to describe the intersection of coastal flooding, permafrost degradation, and erosion that results in catastrophic land collapse.

    Olin hopes that a broader understanding of usteq will enable these events to be classified as a natural hazard by the Federal Emergency Management Agency, unlocking federal funding to help Native Alaskan villages move to stable ground. “We need more people to understand and talk about what’s at stake for our villages, for our people, and our shared humanity,” she said.

    “The work we heard about tonight,” remarked Schmidt, bringing the main presentations to a close, “embodies the MIT commitment to curiosity and discovery in pursuit of a better, more sustainable world.”

  • in

    Undergraduates explore practical applications of artificial intelligence

    Deep neural networks excel at finding patterns in datasets too vast for the human brain to pick apart. That ability has made deep learning indispensable to just about anyone who deals with data. This year, the MIT Quest for Intelligence and the MIT-IBM Watson AI Lab sponsored 17 undergraduates to work with faculty on yearlong research projects through MIT’s Advanced Undergraduate Research Opportunities Program (SuperUROP).

    Students got to explore AI applications in climate science, finance, cybersecurity, and natural language processing, among other fields. And faculty got to work with students from outside their departments, an experience they describe in glowing terms. “Adeline is a shining testament of the value of the UROP program,” says Raffaele Ferrari, a professor in MIT’s Department of Earth and Planetary Sciences, of his advisee. “Without UROP, an oceanography professor might have never had the opportunity to collaborate with a student in computer science.”

    Highlighted below are four SuperUROP projects from this past year.

    A faster algorithm to manage cloud-computing jobs

    The shift from desktop computing to far-flung data centers in the “cloud” has created bottlenecks for companies selling computing services. Faced with a constant flux of orders and cancellations, their profits depend heavily on efficiently pairing machines with customers.

    Approximation algorithms are used to carry out this feat of optimization. Among all the possible ways of assigning machines to customers by price and other criteria, they find a schedule that achieves near-optimal profit. For the last year, junior Spencer Compton worked on a virtual whiteboard with MIT Professor Ronitt Rubinfeld and postdoc Slobodan Mitrović to find a faster scheduling method.

    “We didn’t write any code,” he says. “We wrote proofs and used mathematical ideas to find a more efficient way to solve this optimization problem. The same ideas that improve cloud-computing scheduling can be used to assign flight crews to planes, among other tasks.”
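    The flavor of this assignment problem can be sketched with a deliberately simple greedy heuristic. This is not the algorithm from Compton's paper, just an illustrative baseline: accept machine-customer offers in decreasing price order whenever both parties are still free (for maximum-weight matching, this greedy rule is known to guarantee at least half the optimal profit). The function name and data shape are invented for the example.

```python
def greedy_assign(bids):
    """Greedy machine-customer assignment (illustrative sketch only).

    bids: list of (customer, machine, price) offers.
    Accepts offers in decreasing price order whenever both the
    customer and the machine are still unassigned.
    Returns (schedule, total_profit).
    """
    taken_customers, taken_machines = set(), set()
    schedule, profit = [], 0
    for customer, machine, price in sorted(bids, key=lambda b: -b[2]):
        if customer not in taken_customers and machine not in taken_machines:
            taken_customers.add(customer)
            taken_machines.add(machine)
            schedule.append((customer, machine))
            profit += price
    return schedule, profit
```

For example, with offers [("A", "m1", 10), ("B", "m1", 9), ("B", "m2", 8)], the greedy pass assigns A to m1 and B to m2. The research problem is harder than this sketch suggests: the schedule must be maintained efficiently as orders and cancellations arrive dynamically.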

    In a pre-print paper on arXiv, Compton and his co-authors show how to speed up an approximation algorithm under dynamic conditions. They also show how to locate machines assigned to individual customers without computing the entire schedule.

    A big challenge was finding the crux of the project, he says. “There’s a lot of literature out there, and a lot of people who have thought about related problems. It was fun to look at everything that’s been done and brainstorm to see where we could make an impact.”

    How much heat and carbon can the oceans absorb?

    Earth’s oceans regulate climate by drawing down excess heat and carbon dioxide from the air. But as the oceans warm, it’s unclear if they will soak up as much carbon as they do now. A slowed uptake could bring about more warming than what today’s climate models predict. It’s one of the big questions facing climate modelers as they try to refine their predictions for the future.

    The biggest obstacle in their way is the complexity of the problem: today’s global climate models lack the computing power to get a high-resolution view of the dynamics influencing key variables like sea-surface temperatures. To compensate for the lost accuracy, researchers are building surrogate models to approximate the missing dynamics without explicitly solving for them.

    In a project with MIT Professor Raffaele Ferrari and research scientist Andre Souza, MIT junior Adeline Hillier is exploring how deep learning solutions can be used to improve or replace physical models of the uppermost layer of the ocean, which drives the rate of heat and carbon uptake. “If the model has a small footprint and succeeds under many of the physical conditions encountered in the real world, it could be incorporated into a global climate model and hopefully improve climate projections,” she says.

    In the course of the project, Hillier learned how to code in the programming language Julia. She also got a crash course in fluid dynamics. “You’re trying to model the effects of turbulent dynamics in the ocean,” she says. “It helps to know what the processes and physics behind them look like.”

    In search of more efficient deep learning models

    There are thousands of ways to design a deep learning model to solve a given task. Automating the design process promises to narrow the options and make these tools more accessible. But finding the optimal architecture is anything but simple. Most automated searches pick the model that maximizes validation accuracy without considering the structure of the underlying data, which may suggest a simpler, more robust solution. As a result, more reliable or data-efficient architectures are passed over.

    “Instead of looking at the accuracy of the model alone, we should focus on the structure of the data,” says MIT senior Kristian Georgiev. In a project with MIT Professor Asu Ozdaglar and graduate student Alireza Fallah, Georgiev is looking at ways to automatically query the data to find the model that best suits its constraints. “If you choose your architecture based on the data, you’re more likely to get a good and robust solution from a learning theory perspective,” he says.

    The hardest part of the project was the exploratory phase at the start, he says. To find a good research question, he read through papers on topics ranging from AutoML to representation theory. But it was worth it, he says, to be able to work at the intersection of optimization and generalization. “To make good progress in machine learning you need to combine both of these fields.”

    What makes humans so good at recognizing faces?

    Face recognition comes easily to humans. Picking out familiar faces in a blurred or distorted photo is a cinch. But we don’t really understand why, or how to replicate this superpower in machines. To home in on the principles important to recognizing faces, researchers have shown human subjects headshots that are progressively degraded, to see where recognition starts to break down. They are now performing similar experiments on computers to see if deeper insights can be gained.

    In a project with MIT Professor Pawan Sinha and the MIT Quest for Intelligence, junior Ashika Verma applied a set of filters to a dataset of celebrity photos. She blurred their faces, distorted them, and changed their color to see if a face-recognition model could pick out photos of the same face. She found that the model did best when the photos were either natural color or grayscale, consistent with the human studies. Accuracy slipped when a color filter was added, but not as much as it did for the human subjects — a wrinkle that Verma plans to investigate further.
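    A degradation pipeline of this kind can be mocked up in a few lines. The sketch below is an illustrative stand-in, not the actual filters used in the study: it converts an RGB image (a nested list of pixel tuples) to grayscale with the standard Rec. 601 luma weights, then applies a simple box blur. The function names are invented for the example.

```python
def to_grayscale(img):
    """img: 2-D list of (r, g, b) tuples -> 2-D list of luma values.
    Uses the Rec. 601 weights (0.299 R + 0.587 G + 0.114 B)."""
    return [[round(0.299 * r + 0.587 * g + 0.114 * b) for r, g, b in row]
            for row in img]

def box_blur(gray, k=1):
    """Average each pixel over a (2k+1) x (2k+1) window, clamped
    at the image borders -- a crude blur filter."""
    h, w = len(gray), len(gray[0])
    out = []
    for i in range(h):
        row = []
        for j in range(w):
            window = [gray[y][x]
                      for y in range(max(0, i - k), min(h, i + k + 1))
                      for x in range(max(0, j - k), min(w, j + k + 1))]
            row.append(sum(window) // len(window))
        out.append(row)
    return out
```

In practice a study like this would use an image library and stronger distortions; the point of the sketch is only that each filter is a deterministic transform applied uniformly to the dataset before it reaches the recognition model.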

    The work is part of a broader effort to understand what makes humans so good at recognizing faces, and how machine vision might be improved as a result. It also ties in with Project Prakash, a nonprofit in India that treats blind children and tracks their recovery to learn more about the visual system and brain plasticity. “Running human experiments takes more time and resources than running computational experiments,” says Verma’s advisor, Kyle Keane, a researcher with MIT Quest. “We’re trying to make AI as human-like as possible so we can run a lot of computational experiments to identify the most promising experiments to run on humans.”

    Degrading the images to use in the experiments, and then running them through the deep nets, was a challenge, says Verma. “It’s very slow,” she says. “You work 20 minutes at a time and then you wait.” But working in a lab with an advisor made it worth it, she says. “It was fun to dip my toes into neuroscience.”

    SuperUROP projects were funded, in part, by the MIT-IBM Watson AI Lab, MIT Quest Corporate, and by Eric Schmidt, technical advisor to Alphabet Inc., and his wife, Wendy.

  • in

    Robotic solution for disinfecting food production plants wins agribusiness prize

    The winners of this year’s Rabobank-MIT Food and Agribusiness Innovation Prize got a good indication their pitch was striking a chord when a judge offered to have his company partner with the team for an early demonstration. The offer signified demand for their solution — to say nothing of their chances of winning the pitch competition.

    The annual competition’s MIT-based grand-prize winner, Human Dynamics, is seeking to improve sanitation in food production plants with a robotic drone — a “drobot” — that flies through facilities spraying soap and disinfectant.

    The company says the product addresses major labor shortages for food production facilities, which often must carry out daily sanitation processes.

    “They have to sanitize every night, and it’s extremely labor intensive and expensive,” says co-founder Tom Okamoto, a master’s student in MIT’s System Design and Management (SDM) program.

    In the winning pitch, Okamoto said the average large food manufacturer spends $13 million on sanitation annually. Combining the production time lost to sanitation with delays due to human error, Human Dynamics estimates it’s tackling an $80 billion problem.

    The company’s prototype uses a quadcopter drone that carries a tank, nozzle, and spray hose. Under the hood, the drone uses visual detection technology to validate that each area is clean, LIDAR to map out its path, and algorithms for route optimization.
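    The article does not describe the route-optimization algorithm itself; as a hypothetical illustration, the simplest greedy baseline for visiting a set of cleaning waypoints is a nearest-neighbor tour, sketched below (the function name and data shape are assumptions).

```python
import math

def nearest_neighbor_route(points, start=0):
    """Greedy nearest-neighbor tour over 2-D waypoints.

    points: list of (x, y) coordinates; start: index of the first stop.
    Repeatedly moves to the closest unvisited waypoint and returns
    the visiting order as a list of indices.
    """
    unvisited = set(range(len(points))) - {start}
    route = [start]
    while unvisited:
        here = points[route[-1]]
        nxt = min(unvisited, key=lambda i: math.dist(here, points[i]))
        route.append(nxt)
        unvisited.remove(nxt)
    return route
```

Nearest-neighbor tours are not optimal in general, which is why practical systems layer real optimization (and obstacle maps from sensors such as LIDAR) on top of a baseline like this.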

    The product is designed to automate repetitive tasks while complementing other cleaning efforts currently done by humans. Workers will still be required for certain aspects of cleaning and tasks like preparing and inspecting facilities during sanitation.

    The company has already developed several proofs of concept and is planning to run a pilot project with a local food producer and distributor this summer.

    The Human Dynamics team also includes MIT researcher Takahiro Nozaki, MIT master’s student Julia Chen, and Harvard Business School students Mike Mancinelli and Kaz Yoshimaru.

    The company estimates that the addressable market for sanitation in food production facilities in the country is $3 billion.

    The second-place prize went to Resourceful, which aims to help connect buyers and sellers of food waste byproducts through an online platform. The company says there’s a growing market for upcycled products made by companies selling things like edible chips made from juice pulp, building materials made from potato skins, and eyeglasses made from orange peels. But establishing a byproduct supply chain can be difficult.

    “Being paid for byproducts should be low-hanging fruit for food manufacturers, but the system is broken,” says co-founder and CEO Kyra Atekwana, an MBA candidate at the University of Chicago’s Booth School of Business. “There are tens of millions of pounds of food waste produced in the U.S. every year, and there’s a variety of tech solutions … enabling this food waste and surplus to be captured by consumers. But there’s virtually nothing in the middle to unlock access to the 10.6 million tons of byproduct waste produced every year.”

    Buyers and sellers can offer and browse food waste byproducts on the company’s subscription-based platform. The businesses can also connect and establish contracts through the platform. Resourceful charges a small fee for each transaction.

    The company is currently launching pilots in the Chicago region before making a public launch later this year. It has also partnered with the Upcycled Food Association, a nonprofit focused on reducing food waste.

    The winners were chosen from a group of seven finalist teams. Other finalists included:

    Chicken Haus, a vertically integrated, fast-casual restaurant concept dedicated to serving locally sourced, bone-in fried chicken;
    Joise Food Technologies, which is 3-D printing the next generation of meat alternatives and other foods using 3-D biofabrication technology and sustainable food ink formulation;
    Marble, which is developing a small-footprint robot to remove fat from the surface of meat cuts to achieve optimal yield;
    Nice Rice, which is developing a rice alternative made from pea starch, which can be upcycled; and
    Roofscapes, which deploys accessible wooden platforms to “vegetalize” roofs in dense urban areas to combat food insecurity and climate change.

    This was the sixth year of the event, which was hosted by the MIT Food and Agriculture Club. The event was sponsored by Rabobank and MIT’s Abdul Latif Jameel World Water and Food Systems Lab (J-WAFS).

  • in

    Cave deposits show surprising shift in permafrost over the last 400,000 years

    Nearly one quarter of the land in the Northern Hemisphere, amounting to some 9 million square miles, is layered with permafrost — soil, sediment, and rocks that are frozen solid for years at a time. Vast stretches of permafrost can be found in Alaska, Siberia, and the Canadian Arctic, where persistently freezing temperatures have kept carbon, in the form of decayed bits of plants and animals, locked in the ground.

    Scientists estimate that more than 1,400 gigatons of carbon is trapped in the Earth’s permafrost. As global temperatures climb, and permafrost thaws, this frozen reservoir could potentially escape into the atmosphere as carbon dioxide and methane, significantly amplifying climate change. However, little is known about permafrost’s stability, today or in the past.

    Now geologists at MIT, Boston College, and elsewhere have reconstructed permafrost’s history over the last 1.5 million years. The researchers analyzed cave deposits in locations across western Canada and found evidence that, between 1.5 million and 400,000 years ago, permafrost was prone to thawing, even in high Arctic latitudes. Since then, however, permafrost thaw has been limited to sub-Arctic regions.

    The results, published today in Science Advances, suggest that the planet’s permafrost shifted to a more stable state in the last 400,000 years, and has been less susceptible to thawing since then. In this more stable state, permafrost likely has retained much of the carbon that it has built up during this time, having little opportunity to gradually release it.

    “The stability of the last 400,000 years may actually work against us, in that it has allowed carbon to steadily accumulate in permafrost over this time. Melting now might lead to substantially greater releases of carbon to the atmosphere than in the past,” says study co-author David McGee, associate professor in MIT’s Department of Earth, Atmospheric, and Planetary Sciences.

    McGee’s co-authors are Ben Hardt and Irit Tal at MIT; Nicole Biller-Celander, Jeremy Shakun, and Corinne Wong at Boston College; Alberto Reyes at the University of Alberta; Bernard Lauriol at the University of Ottawa; and Derek Ford at McMaster University.

    Stacked warming

    Periods of past warming are considered interglacial periods, or times between global ice ages. These geologically brief windows can warm permafrost enough for it to thaw. Signs of ancient permafrost thaw can be seen in stalagmites and other mineral deposits left behind as water moves through the ground and into caves. These caves, particularly at high Arctic latitudes, are often remote and difficult to access; as a result, little has been known about the history of permafrost and its past stability in warming climates.

    However, in 2013, researchers at Oxford University were able to sample cave deposits from a few locations across Siberia; their analysis suggested that permafrost thaw was widespread throughout Siberia prior to 400,000 years ago, and much more limited after that point.

    Shakun and Biller-Celander wondered whether the trend toward a more stable permafrost was a global one, and looked to carry out similar studies in Canada to reconstruct the permafrost history there. They linked up with pioneering cave scientists Lauriol and Ford, who provided samples of cave deposits that they collected over the years from three distinct permafrost regions: the southern Canadian Rockies, Nahanni National Park in the Northwest Territories, and the northern Yukon.

    In total, the team obtained 74 samples of speleothems, or sections of stalagmites, stalactites, and flowstones, from at least five caves in each region, representing various cave depths, geometries, and glacial histories. The sampled caves were located on exposed slopes that were likely the first parts of the permafrost landscape to thaw with warming.

    The samples were flown to MIT, where McGee and his lab used precise geochronology techniques to determine the ages of each sample’s layers, each layer reflecting a period of permafrost thaw.

    “Each speleothem was deposited over time like stacked traffic cones,” says McGee. “We started with the outermost, youngest layers to date the most recent time that the permafrost thawed.”

    Arctic shift

    McGee and his colleagues used uranium/thorium geochronology to date the layers of each speleothem. The dating technique relies on the natural decay of uranium to its daughter isotope, thorium-230, and on the fact that uranium is soluble in water, whereas thorium is not.

    “In the rocks above the cave, as waters percolate through, they accumulate uranium and leave thorium behind,” McGee explains. “Once that water gets to the stalagmite surface and precipitates at time zero, you have uranium, and no thorium. Then gradually, uranium decays and produces thorium.”

    The team drilled out small amounts from each sample and dissolved them through various chemical steps to isolate uranium and thorium. Then they ran the two elements through a mass spectrometer to measure their amounts, the ratio of which they used to calculate a given layer’s age.
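    In the idealized case, with no initial thorium and with uranium-234 in secular equilibrium with uranium-238, the 230Th/238U activity ratio grows toward 1 as 1 − e^(−λ·t), where λ is the decay constant of thorium-230, so the age follows directly from the measured ratio. The sketch below implements only that textbook simplification; real U/Th analyses, like those in this study, also correct for initial thorium and for 234U excess.

```python
import math

# Thorium-230 half-life in years (a commonly cited value).
HALF_LIFE_TH230 = 75_584
LAMBDA_TH230 = math.log(2) / HALF_LIFE_TH230  # decay constant, per year

def uth_age(activity_ratio):
    """Age in years from the (230Th/238U) activity ratio.

    Simplified model: assumes no initial 230Th in the calcite and
    234U/238U in secular equilibrium, so ratio = 1 - exp(-lambda * t).
    A ratio approaching 1 indicates a sample near secular equilibrium,
    i.e., hundreds of thousands of years old.
    """
    return -math.log(1.0 - activity_ratio) / LAMBDA_TH230
```

Because the ratio saturates near 1 after several half-lives, the method loses resolution beyond roughly 500,000-600,000 years, which is consistent with the study focusing on the last few hundred thousand years of thaw events.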

    From their analysis, the researchers observed that samples collected from the northern Yukon and the other farthest-north sites contained no layers younger than 400,000 years, suggesting permafrost thaw has not occurred at those sites since then.

    “There may have been some shallow thaw, but in terms of the entire rock above the cave being thawed, that hasn’t occurred for the last 400,000 years, and was much more common prior to that,” McGee says.

    The results suggest that the Earth’s permafrost was much less stable prior to 400,000 years ago and was more prone to thawing, even during interglacial periods when temperatures and atmospheric carbon dioxide were on par with modern levels, as other work has shown.

    “To see this evidence of a much less stable Arctic prior to 400,000 years ago suggests even under similar conditions, the Arctic can be a very different place,” McGee says. “It raises questions for me about what caused the Arctic to shift into this more stable condition, and what can cause it to shift out of it.”

    This research was supported, in part, by the National Science Foundation, the Natural Sciences and Engineering Research Council of Canada, and the Polar Continental Shelf Program.