More stories

  • MIT collaborates with Biogen on three-year, $7 million initiative to address climate, health, and equity

    MIT and Biogen have announced a collaboration aimed at accelerating science and action on climate change to improve human health. The collaboration is supported by a three-year, $7 million commitment from the company and the Biogen Foundation. The biotechnology company, headquartered in Cambridge, Massachusetts’ Kendall Square, discovers and develops therapies for people living with serious neurological diseases.

    “We have long believed it is imperative for Biogen to make the fight against climate change central to our long-term corporate responsibility commitments. Through this collaboration with MIT, we aim to identify and share innovative climate solutions that will deliver co-benefits for both health and equity,” says Michel Vounatsos, CEO of Biogen. “We are also proud to support the MIT Museum, which promises to make world-class science and education accessible to all, and honor Biogen co-founder Phillip A. Sharp with a dedication inside the museum that recognizes his contributions to its development.”

    Biogen and the Biogen Foundation are supporting research and programs across a range of areas at MIT.

    Advancing climate, health, and equity

    The first such effort involves new work within the MIT Joint Program on the Science and Policy of Global Change to establish a state-of-the-art integrated model of climate and health aimed at identifying targets that deliver climate and health co-benefits.

    “Evidence suggests that not all climate-related actions deliver equal health benefits, yet policymakers, planners, and stakeholders traditionally lack the tools to consider how decisions in one arena impact the other,” says C. Adam Schlosser, deputy director of the MIT Joint Program. “Biogen’s collaboration with the MIT Joint Program — and its support of a new distinguished Biogen Fellow who will develop the new climate/health model — will accelerate our efforts to provide decision-makers with these tools.”

    Biogen is also supporting the MIT Technology and Policy Program’s Research to Policy Engagement Initiative to infuse human health as a key new consideration in decision-making on the best pathways forward to address the global climate crisis, and bridge the knowledge-to-action gap by connecting policymakers, researchers, and diverse stakeholders. As part of this work, Biogen is underwriting a distinguished Biogen Fellow to advance new research on climate, health, and equity.

    “Our work with Biogen has allowed us to make progress on key questions that matter to human health and well-being under climate change,” says Noelle Eckley Selin, who directs the MIT Technology and Policy Program and is a professor in the MIT Institute for Data, Systems, and Society and the Department of Earth, Atmospheric and Planetary Sciences. “Further, their support of the Research to Policy Engagement Initiative helps all of our research become more effective in making change.”

    In addition, Biogen has joined 13 other companies in the MIT Climate and Sustainability Consortium (MCSC), which is supporting faculty and student research and developing impact pathways that present a range of actionable steps that companies can take — within and across industries — to advance progress toward climate targets.

    “Biogen joining the MIT Climate and Sustainability Consortium represents our commitment to working with member companies across a diverse range of industries, an approach that aims to drive changes swift and broad enough to match the scale of the climate challenge,” says Jeremy Gregory, executive director of the MCSC. “We are excited to welcome a member from the biotechnology space and look forward to harnessing Biogen’s perspectives as we continue to collaborate and work together with the MIT community in exciting and meaningful ways.”

    Making world-class science and education available to MIT Museum visitors

    Support from Biogen will honor Nobel laureate, MIT Institute professor, and Biogen co-founder Phillip A. Sharp with a named space inside the new Kendall Square location of the MIT Museum, set to open in spring 2022. Biogen also is supporting one of the museum’s opening exhibitions, “Essential MIT,” with a section focused on solving real-world problems such as climate change. It is also providing programmatic support for the museum’s Life Sciences Maker Engagement Program.

    “Phil has provided fantastic support to the MIT Museum for more than a decade as an advisory board member and now as board chair, and he has been deeply involved in plans for the new museum at Kendall Square,” says John Durant, the Mark R. Epstein (Class of 1963) Director of the museum. “Seeing his name on the wall will be a constant reminder of his key role in this development, as well as a mark of our gratitude.”

    Inspiring and empowering the next generation of scientists

    Biogen funding is also being directed to engage the next generation of scientists through support for the Biogen-MIT Biotech in Action: Virtual Lab, a program designed to foster a love of science among diverse and under-served student populations.

    Biogen’s support is part of its Healthy Climate, Healthy Lives initiative, a $250 million, 20-year commitment to eliminate fossil fuels across its operations and collaborate with renowned institutions to advance the science of climate and health and support under-served communities. Additional support is provided by the Biogen Foundation to further its long-standing focus on providing students with equitable access to outstanding science education.

  • Saving seaweed with machine learning

    Last year, Charlene Xia ’17, SM ’20 found herself at a crossroads. She was finishing up her master’s degree in media arts and sciences from the MIT Media Lab and had just submitted applications to doctoral degree programs. All Xia could do was sit and wait. In the meantime, she narrowed down her career options, regardless of whether she was accepted to any program.

    “I had two thoughts: I’m either going to get a PhD to work on a project that protects our planet, or I’m going to start a restaurant,” recalls Xia.

    Xia pored over her extensive cookbook collection, researching international cuisines as she anxiously awaited word about her graduate school applications. She even looked into the cost of a food truck permit in the Boston area. Just as she started hatching plans to open a plant-based skewer restaurant, Xia received word that she had been accepted into the mechanical engineering graduate program at MIT.

    Shortly after starting her doctoral studies, Xia’s advisor, Professor David Wallace, approached her with an interesting opportunity. MathWorks, a software company known for developing the MATLAB computing platform, had announced a new seed funding program in MIT’s Department of Mechanical Engineering. The program encouraged collaborative research projects focused on the health of the planet.

    “I saw this as a super-fun opportunity to combine my passion for food, my technical expertise in ocean engineering, and my interest in sustainably helping our planet,” says Xia.

    From MIT Mechanical Engineering: “Saving Seaweed with Machine Learning”

    Wallace knew Xia would be up to the task of taking an interdisciplinary approach to solve an issue related to the health of the planet. “Charlene is a remarkable student with extraordinary talent and deep thoughtfulness. She is pretty much fearless, embracing challenges in almost any domain with the well-founded belief that, with effort, she will become a master,” says Wallace.

    Alongside Wallace and Associate Professor Stefanie Mueller, Xia proposed a project to predict and prevent the spread of diseases in aquaculture. The team focused on seaweed farms in particular.

    Already popular in East Asian cuisines, seaweed holds tremendous potential as a sustainable food source for the world’s ever-growing population. In addition to its nutritive value, seaweed combats various environmental threats. It helps fight climate change by absorbing excess carbon dioxide in the atmosphere, and can also absorb fertilizer run-off, keeping coasts cleaner.

    As with so much of marine life, seaweed is threatened by the very thing it helps mitigate: climate change. Climate stressors like warm temperatures or minimal sunlight encourage the growth of harmful bacteria, causing outbreaks such as ice-ice disease. Within days, entire seaweed farms can be decimated by unchecked bacterial growth.

    To solve this problem, Xia turned to the microbiota present in these seaweed farms as a predictive indicator of any threat to the seaweed or livestock. “Our project is to develop a low-cost device that can detect and prevent diseases before they affect seaweed or livestock by monitoring the microbiome of the environment,” says Xia.

    The team pairs old technology with the latest in computing. Using a submersible digital holographic microscope, they take a 2D image. They then use a machine learning system known as a neural network to convert the 2D image into a representation of the microbiome present in the 3D environment.

    “Using a machine learning network, you can take a 2D image and reconstruct it almost in real time to get an idea of what the microbiome looks like in a 3D space,” says Xia.
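The 2D-to-3D step described above can be illustrated with a deliberately tiny sketch. This is hypothetical code, not the team's system: a single dense layer stands in for a trained network, and the toy dimensions are chosen only to show the input and output shapes (one 2D hologram in, a stack of depth slices out — which is what makes near-real-time reconstruction possible).

```python
import numpy as np

# Hypothetical sketch: a learned mapping from a 2-D hologram to a 3-D
# volume (depth slices) describing where microbes sit in the water column.
# Real holographic-reconstruction networks are far more elaborate; this
# only illustrates the shape transformation.

rng = np.random.default_rng(0)
H = W = 16   # hologram resolution (toy size)
D = 8        # number of reconstructed depth slices

hologram = rng.random((H, W))   # 2-D image from the microscope

# One dense layer standing in for a trained network: flatten the image
# and emit the full 3-D stack in a single forward pass.
weights = rng.standard_normal((D * H * W, H * W)) * 0.01
volume = (weights @ hologram.ravel()).reshape(D, H, W)

print(volume.shape)  # one 2-D slice per depth plane
```

In a trained system the weights would come from learning, and the output volume would localize individual microbes in 3D rather than being random numbers.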

    The software can run on a small Raspberry Pi attached to the holographic microscope. To figure out how to communicate these data back to the research team, Xia drew upon her master’s degree research.

    In that work, under the guidance of Professor Allan Adams and Professor Joseph Paradiso in the Media Lab, Xia focused on developing small underwater communication devices that can relay data about the ocean back to researchers. Rather than the usual $4,000, these devices were designed to cost less than $100, helping lower the cost barrier for those interested in uncovering the many mysteries of our oceans. The communication devices can be used to relay data about the ocean environment from the machine learning algorithms.

    By combining these low-cost communication devices along with microscopic images and machine learning, Xia hopes to design a low-cost, real-time monitoring system that can be scaled to cover entire seaweed farms.

    “It’s almost like having the ‘internet of things’ underwater,” adds Xia. “I’m developing this whole underwater camera system alongside the wireless communication I developed that can give me the data while I’m sitting on dry land.”

    Armed with these data about the microbiome, Xia and her team can detect whether or not a disease is about to strike and jeopardize seaweed or livestock before it is too late.

    While Xia still daydreams about opening a restaurant, she hopes the seaweed project will prompt people to rethink how they consider food production in general.

    “We should think about farming and food production in terms of the entire ecosystem,” she says. “My meta-goal for this project would be to get people to think about food production in a more holistic and natural way.”

  • How diet affects tumors

    In recent years, there has been some evidence that dietary interventions can help to slow the growth of tumors. A new study from MIT, which analyzed two different diets in mice, reveals how those diets affect cancer cells, and offers an explanation for why restricting calories may slow tumor growth.

    The study examined the effects of a calorically restricted diet and a ketogenic diet in mice with pancreatic tumors. While both of these diets reduce the amount of sugar available to tumors, the researchers found that only the calorically restricted diet reduced the availability of fatty acids, and this was linked to a slowdown in tumor growth.

    The findings do not suggest that cancer patients should try to follow either of these diets, the researchers say. Instead, they believe the findings warrant further study to determine how dietary interventions might be combined with existing or emerging drugs to help patients with cancer.

    “There’s a lot of evidence that diet can affect how fast your cancer progresses, but this is not a cure,” says Matthew Vander Heiden, director of MIT’s Koch Institute for Integrative Cancer Research and the senior author of the study. “While the findings are provocative, further study is needed, and individual patients should talk to their doctor about the right dietary interventions for their cancer.”

    MIT postdoc Evan Lien is the lead author of the paper, which appears today in Nature.

    Metabolic mechanism

    Vander Heiden, who is also a medical oncologist at Dana-Farber Cancer Institute, says his patients often ask him about the potential benefits of various diets, but there is not enough scientific evidence available to offer any definitive advice. Many of the dietary questions that patients have focus on either a calorie-restricted diet, which reduces calorie consumption by 25 to 50 percent, or a ketogenic diet, which is low in carbohydrates and high in fat and protein.

    Previous studies have suggested that a calorically restricted diet might slow tumor growth in some contexts, and such a diet has been shown to extend lifespan in mice and many other animal species. A smaller number of studies exploring the effects of a ketogenic diet on cancer have produced inconclusive results.

    “A lot of the advice or cultural fads that are out there aren’t necessarily always based on very good science,” Lien says. “It seemed like there was an opportunity, especially with our understanding of cancer metabolism having evolved so much over the past 10 years or so, that we could take some of the biochemical principles that we’ve learned and apply those concepts to understanding this complex question.”

    Cancer cells consume a great deal of glucose, so some scientists had hypothesized that either the ketogenic diet or calorie restriction might slow tumor growth by reducing the amount of glucose available. However, the MIT team’s initial experiments in mice with pancreatic tumors showed that calorie restriction has a much greater effect on tumor growth than the ketogenic diet, so the researchers suspected that glucose levels were not playing a major role in the slowdown.

    To dig deeper into the mechanism, the researchers analyzed tumor growth and nutrient concentration in mice with pancreatic tumors, which were fed either a normal, ketogenic, or calorie-restricted diet. In both the ketogenic and calorie-restricted mice, glucose levels went down. In the calorie-restricted mice, lipid levels also went down, but in mice on the ketogenic diet, they went up.

    Lipid shortages impair tumor growth because cancer cells need lipids to construct their cell membranes. Normally, when lipids aren’t available in a tissue, cells can make their own. As part of this process, they need to maintain the right balance of saturated and unsaturated fatty acids, which requires an enzyme called stearoyl-CoA desaturase (SCD). This enzyme is responsible for converting saturated fatty acids into unsaturated fatty acids.

    Both calorie-restricted and ketogenic diets reduce SCD activity, but mice on the ketogenic diet had lipids available to them from their diet, so they didn’t need to use SCD. Mice on the calorie-restricted diet, however, couldn’t get fatty acids from their diet or produce their own. In these mice, tumor growth slowed significantly, compared to mice on the ketogenic diet.

    “Not only does caloric restriction starve tumors of lipids, it also impairs the process that allows them to adapt to it. That combination is really contributing to the inhibition of tumor growth,” Lien says.

    Dietary effects

    In addition to their mouse research, the researchers also looked at some human data. Working with Brian Wolpin, an oncologist at Dana-Farber Cancer Institute and an author of the paper, the team obtained data from a large cohort study that allowed them to analyze the relationship between dietary patterns and survival times in pancreatic cancer patients. From that study, the researchers found that the type of fat consumed appears to influence how patients on a low-sugar diet fare after a pancreatic cancer diagnosis, although the data are not complete enough to draw any conclusions about the effect of diet, the researchers say.

    Although this study showed that calorie restriction has beneficial effects in mice, the researchers say they do not recommend that cancer patients follow a calorie-restricted diet, which is difficult to maintain and can have harmful side effects. However, they believe that cancer cells’ dependence on the availability of unsaturated fatty acids could be exploited to develop drugs that might help slow tumor growth.

    One possible therapeutic strategy could be inhibition of the SCD enzyme, which would cut off tumor cells’ ability to produce unsaturated fatty acids.

    “The purpose of these studies isn’t necessarily to recommend a diet, but it’s to really understand the underlying biology,” Lien says. “They provide some sense of the mechanisms of how these diets work, and that can lead to rational ideas on how we might mimic those situations for cancer therapy.”

    The researchers now plan to study how diets with a variety of fat sources — including plant or animal-based fats with defined differences in saturated, monounsaturated, and polyunsaturated fatty acid content — alter tumor fatty acid metabolism and the ratio of unsaturated to saturated fatty acids.

    The research was funded by the Damon Runyon Cancer Research Foundation, the National Institutes of Health, the Lustgarten Foundation, the Dana-Farber Cancer Institute Hale Family Center for Pancreatic Cancer Research, Stand Up to Cancer, the Pancreatic Cancer Action Network, the Noble Effort Fund, the Wexler Family Fund, Promises for Purple, the Bob Parsons Fund, the Emerald Foundation, the Howard Hughes Medical Institute, the MIT Center for Precision Cancer Medicine, and the Ludwig Center at MIT.

  • How marsh grass protects shorelines

    Marsh plants, which are ubiquitous along the world’s shorelines, can play a major role in mitigating the damage to coastlines as sea levels rise and storm surges increase. Now, a new MIT study provides greater detail about how these protective benefits work under real-world conditions shaped by waves and currents.

    The study combined laboratory experiments, using simulated plants in a large wave tank, with mathematical modeling. It appears in the journal Physical Review Fluids, in a paper by former MIT visiting doctoral student Xiaoxia Zhang, now a postdoc at Dalian University of Technology, and professor of civil and environmental engineering Heidi Nepf.

    It’s already clear that coastal marsh plants provide significant protection from surges and devastating storms. For example, it has been estimated that the damage caused by Hurricane Sandy was reduced by $625 million thanks to the damping of wave energy provided by extensive areas of marsh along the affected coasts. But the new MIT analysis incorporates details of plant morphology, such as the number and spacing of flexible leaves versus stiffer stems, and the complex interactions of currents and waves that may be coming from different directions.

    This level of detail could enable coastal restoration planners to determine the area of marsh needed to mitigate expected amounts of storm surge or sea-level rise, and to decide which types of plants to introduce to maximize protection.

    “When you go to a marsh, you often will see that the plants are arranged in zones,” says Nepf, who is the Donald and Martha Harleman Professor of Civil and Environmental Engineering. “Along the edge, you tend to have plants that are more flexible, because they are using their flexibility to reduce the wave forces they feel. In the next zone, the plants are a little more rigid and have a bit more leaves.”

    As the zones progress, the plants become stiffer, leafier, and more effective at absorbing wave energy thanks to their greater leaf area. The new modeling done in this research, which incorporated work with simulated plants in the 24-meter-long wave tank at MIT’s Parsons Lab, can enable coastal planners to take these kinds of details into account when planning protection, mitigation, or restoration projects.

    “If you put the stiffest plants at the edge, they might not survive, because they’re feeling very high wave forces. By describing why Mother Nature organizes plants in this way, we can hopefully design a more sustainable restoration,” Nepf says.

    Once established, the marsh plants provide a positive feedback cycle that helps to not only stabilize but also build up these delicate coastal lands, Zhang says. “After a few years, the marsh grasses start to trap and hold the sediment, and the elevation gets higher and higher, which might keep up with sea level rise,” she says.

    Awareness of the protective effects of marshland has been growing, Nepf says. For example, the Netherlands has been restoring lost marshland outside the dikes that surround much of the nation’s agricultural land, finding that the marsh can protect the dikes from erosion; the marsh and dikes work together much more effectively than the dikes alone at preventing flooding.

    But most such efforts so far have been largely empirical, trial-and-error plans, Nepf says. Now, they could take advantage of this modeling to know just how much marshland with what types of plants would be needed to provide the desired level of protection.

    It also provides a more quantitative way to estimate the value provided by marshes, she says. “It could allow you to more accurately say, ‘40 meters of marsh will reduce waves this much and therefore will reduce overtopping of your levee by this much.’ Someone could use that to say, ‘I’m going to save this much money over the next 10 years if I reduce flooding by maintaining this marsh.’ It might help generate some political motivation for restoration efforts.”
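The kind of back-of-envelope estimate Nepf describes can be sketched with one common analytical form for wave decay through vegetation, H(x) = H0 / (1 + βx), where the damping coefficient β bundles together the plant traits the study quantifies (stiffness, leaf area, spacing). This is an illustration only; the β value below is purely hypothetical, not a result from the paper.

```python
# Hedged sketch of a marsh wave-damping estimate using a standard
# hyperbolic decay form, H(x) = H0 / (1 + beta * x). The coefficient
# beta is illustrative, not a value from the MIT study.

def wave_height(H0, beta, x):
    """Wave height (m) after traveling x meters through vegetation."""
    return H0 / (1.0 + beta * x)

H0 = 1.0      # incident wave height, meters
beta = 0.02   # illustrative damping coefficient, 1/m

H40 = wave_height(H0, beta, 40.0)
print(f"After 40 m of marsh: {H40:.2f} m "
      f"({100 * (1 - H40 / H0):.0f}% reduction)")
```

A planner could invert the same relation to ask how many meters of marsh are needed to bring wave height below a levee's overtopping threshold.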

    Nepf herself is already trying to get some of these findings included in coastal planning processes. She serves on a practitioner panel led by Chris Esposito of the Water Institute of the Gulf, which serves the storm-battered Louisiana coastline. “We’d like to get this work into the coastal simulations that are used for large-scale restoration and coastal planning,” she says.

    “Understanding the wave damping process in real vegetation wetlands is of critical value, as it is needed in the assessment of the coastal defense value of these wetlands,” says Zhan Hu, an associate professor of marine sciences at Sun Yat-Sen University, who was not associated with this work. “The challenge, however, lies in the quantitative representation of the wave damping process, in which many factors are at play, such as plant flexibility, morphology, and coexisting currents.”

    The new study, Hu says, “neatly combines experimental findings and analytical modeling to reveal the impact of each factor in the wave damping process. … Overall, this work is a solid step forward toward a more accurate assessment of wave damping capacity of real coastal wetlands, which is needed for science-based design and management of nature-based coastal protection.”

    The work was partly supported by the National Science Foundation and the China Scholarship Council.

  • Rover images confirm Jezero crater is an ancient Martian lake

    The first scientific analysis of images taken by NASA’s Perseverance rover has now confirmed that Mars’ Jezero crater — which today is a dry, wind-eroded depression — was once a quiet lake, fed steadily by a small river some 3.7 billion years ago.

    The images also reveal evidence that the crater endured flash floods. This flooding was energetic enough to sweep up large boulders from tens of miles upstream and deposit them into the lakebed, where the massive rocks lie today.

    The new analysis, published today in the journal Science, is based on images of the outcropping rocks inside the crater on its western side. Satellites had previously shown that this outcrop, seen from above, resembled river deltas on Earth, where layers of sediment are deposited in the shape of a fan as the river feeds into a lake.

    Perseverance’s new images, taken from inside the crater, confirm that this outcrop was indeed a river delta. Based on the sedimentary layers in the outcrop, it appears that the river delta fed into a lake that was calm for much of its existence, until a dramatic shift in climate triggered episodic flooding at or toward the end of the lake’s history.

    “If you look at these images, you’re basically staring at this epic desert landscape. It’s the most forlorn place you could ever visit,” says Benjamin Weiss, professor of planetary sciences in MIT’s Department of Earth, Atmospheric and Planetary Sciences and a member of the analysis team. “There’s not a drop of water anywhere, and yet, here we have evidence of a very different past. Something very profound happened in the planet’s history.”

    As the rover explores the crater, scientists hope to uncover more clues to its climatic evolution. Now that they have confirmed the crater was once a lake environment, they believe its sediments could hold traces of ancient aqueous life. In its mission going forward, Perseverance will look for locations to collect and preserve sediments. These samples will eventually be returned to Earth, where scientists can probe them for Martian biosignatures.

    “We now have the opportunity to look for fossils,” says team member Tanja Bosak, associate professor of geobiology at MIT. “It will take some time to get to the rocks that we really hope to sample for signs of life. So, it’s a marathon, with a lot of potential.”

    Tilted beds

    On Feb. 18, 2021, the Perseverance rover landed on the floor of Jezero crater, a little more than a mile away from its western fan-shaped outcrop. In the first three months, the vehicle remained stationary as NASA engineers performed remote checks of the rover’s many instruments.

    During this time, two of Perseverance’s cameras, Mastcam-Z and the SuperCam Remote Micro-Imager (RMI), captured images of their surroundings, including long-distance photos of the outcrop’s edge and a formation known as Kodiak butte. Planetary geologists surmise that this smaller outcrop may once have been connected to the main fan-shaped outcrop but has since partially eroded.

    Once the rover downlinked images to Earth, NASA’s Perseverance science team processed and combined them, and was able to observe distinct beds of sediment along Kodiak butte in surprisingly high resolution. The researchers measured each layer’s thickness, slope, and lateral extent, finding that the sediment must have been deposited by flowing water into a lake, rather than by wind, sheet-like floods, or other geologic processes.

    The rover also captured similar tilted sediment beds along the main outcrop. These images, together with those of Kodiak, confirm that the fan-shaped formation was indeed an ancient delta and that this delta fed into an ancient Martian lake.

    “Without driving anywhere, the rover was able to solve one of the big unknowns, which was that this crater was once a lake,” Weiss says. “Until we actually landed there and confirmed it was a lake, it was always a question.”

    Boulder flow

    When the researchers took a closer look at images of the main outcrop, they noticed large boulders and cobbles embedded in the youngest, topmost layers of the delta. Some boulders measured as much as 1 meter across and were estimated to weigh up to several tons. These massive rocks, the team concluded, must have come from outside the crater and were likely part of bedrock on the crater rim or 40 or more miles upstream.

    Judging from their current location and dimensions, the team says the boulders were carried downstream and into the lakebed by a flash flood that flowed at up to 9 meters per second and moved up to 3,000 cubic meters of water per second.

    “You need energetic flood conditions to carry rocks that big and heavy,” Weiss says. “It’s a special thing that may be indicative of a fundamental change in the local hydrology or perhaps the regional climate on Mars.”

    Because the huge rocks lie in the upper layers of the delta, they represent the most recently deposited material. The boulders sit atop layers of older, much finer sediment. This stratification, the researchers say, indicates that for much of its existence, the ancient lake was filled by a gently flowing river. Fine sediments — and possibly organic material — drifted down the river, and settled into a gradual, sloping delta.

    However, the crater later experienced sudden flash floods that deposited large boulders onto the delta. Eventually the lake dried up, and over billions of years wind eroded the landscape, leaving the crater we see today.

    The cause of this climate turnaround is unknown, although Weiss says the delta’s boulders may hold some answers.

    “The most surprising thing that’s come out of these images is the potential opportunity to catch the time when this crater transitioned from an Earth-like habitable environment, to this desolate landscape wasteland we see now,” he says. “These boulder beds may be records of this transition, and we haven’t seen this in other places on Mars.”

    This research was supported, in part, by NASA.

  • A robot that finds lost items

    A busy commuter is ready to walk out the door, only to realize they’ve misplaced their keys and must search through piles of stuff to find them. Rapidly sifting through clutter, they wish they could figure out which pile was hiding the keys.

    Researchers at MIT have created a robotic system that can do just that. The system, RFusion, is a robotic arm with a camera and radio frequency (RF) antenna attached to its gripper. It fuses signals from the antenna with visual input from the camera to locate and retrieve an item, even if the item is buried under a pile and completely out of view.

    The RFusion prototype the researchers developed relies on RFID tags, which are cheap, battery-less tags that can be stuck to an item and reflect signals sent by an antenna. Because RF signals can travel through most surfaces (like the mound of dirty laundry that may be obscuring the keys), RFusion is able to locate a tagged item within a pile.

    Using machine learning, the robotic arm automatically zeroes in on the object’s exact location, moves the items on top of it, grasps the object, and verifies that it picked up the right thing. The camera, antenna, robotic arm, and AI are fully integrated, so RFusion can work in any environment without requiring a special setup.

    While finding lost keys is helpful, RFusion could have many broader applications in the future, like sorting through piles to fulfill orders in a warehouse, identifying and installing components in an auto manufacturing plant, or helping an elderly individual perform daily tasks in the home, though the current prototype isn’t quite fast enough yet for these uses.

    “This idea of being able to find items in a chaotic world is an open problem that we’ve been working on for a few years. Having robots that are able to search for things under a pile is a growing need in industry today. Right now, you can think of this as a Roomba on steroids, but in the near term, this could have a lot of applications in manufacturing and warehouse environments,” says senior author Fadel Adib, associate professor in the Department of Electrical Engineering and Computer Science and director of the Signal Kinetics group in the MIT Media Lab.

    Co-authors include research assistant Tara Boroushaki, the lead author; electrical engineering and computer science graduate student Isaac Perper; research associate Mergen Nachin; and Alberto Rodriguez, the Class of 1957 Associate Professor in the Department of Mechanical Engineering. The research will be presented at the Association for Computing Machinery Conference on Embedded Networked Sensor Systems next month.

    Sending signals

    RFusion begins searching for an object using its antenna, which bounces signals off the RFID tag (like sunlight being reflected off a mirror) to identify a spherical area in which the tag is located. It combines that sphere with the camera input, which narrows down the object’s location. For instance, the item can’t be located on an area of a table that is empty.
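The fusion step can be pictured as intersecting the RF-derived sphere with the surfaces the camera can actually see. The short sketch below is illustrative only (the function name, tolerance, and coordinates are assumptions, not taken from the paper): it keeps only camera-observed candidate points that lie near the sphere of measured radius around the antenna.

```python
import numpy as np

def filter_candidates(antenna_pos, measured_range, candidate_points, tol=0.05):
    """Keep only camera-derived candidate locations lying near the sphere
    implied by one RF range measurement (a rough sketch of the fusion idea)."""
    dists = np.linalg.norm(candidate_points - antenna_pos, axis=1)
    return candidate_points[np.abs(dists - measured_range) < tol]

antenna = np.array([0.0, 0.0, 0.5])
# Candidate surface points seen by the camera (e.g., spots on a tabletop).
candidates = np.array([[1.0, 0.0, 0.0], [0.3, 0.4, 0.5], [2.0, 2.0, 0.0]])
near = filter_candidates(antenna, measured_range=0.5, candidate_points=candidates)
```

Points on empty stretches of the table fall far from the sphere and are discarded, which is the "the item can't be located on an empty area" observation in code form.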

    But once the robot has a general idea of where the item is, it would still need to swing its arm widely around the room, taking additional measurements to come up with the exact location, which is slow and inefficient.

    The researchers used reinforcement learning to train a neural network that can optimize the robot’s trajectory to the object. In reinforcement learning, the algorithm is trained through trial and error with a reward system.

    “This is also how our brain learns. We get rewarded from our teachers, from our parents, from a computer game, etc. The same thing happens in reinforcement learning. We let the agent make mistakes or do something right and then we punish or reward the network. This is how the network learns something that is really hard for it to model,” Boroushaki explains.

    In the case of RFusion, the optimization algorithm was rewarded when it limited the number of moves it had to make to localize the item and the distance it had to travel to pick it up.
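A minimal reward of that shape might look like the sketch below. The penalty weights, bonus, and function name are invented for illustration; the paper's actual reward function is not reproduced here. Each measurement step costs a fixed move penalty plus a cost proportional to distance traveled, and successfully localizing the item earns a bonus.

```python
def step_reward(distance_traveled_m, localized,
                move_penalty=1.0, dist_penalty=0.5, success_bonus=10.0):
    """Toy RL reward: fewer, shorter arm movements score higher, and
    localizing the tagged item earns a one-time bonus (illustrative values)."""
    reward = -move_penalty - dist_penalty * distance_traveled_m
    if localized:
        reward += success_bonus
    return reward
```

An agent maximizing the cumulative sum of such rewards is pushed toward short trajectories with few measurement stops, which is exactly the behavior the researchers rewarded.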

    Once the system identifies the exact right spot, the neural network uses combined RF and visual information to predict how the robotic arm should grasp the object, including the angle of the hand and the width of the gripper, and whether it must remove other items first. It also scans the item’s tag one last time to make sure it picked up the right object.

    Cutting through clutter

    The researchers tested RFusion in several different environments. They buried a keychain in a box full of clutter and hid a remote control under a pile of items on a couch.

    But if they fed all the camera data and RF measurements to the reinforcement learning algorithm, it would have overwhelmed the system. So, drawing on the method a GPS uses to consolidate data from satellites, they summarized the RF measurements and limited the visual data to the area right in front of the robot.
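The GPS analogy is trilateration: several range measurements taken from known positions jointly pin down one location. A standard linearized least-squares version (a textbook method, offered here as a stand-in rather than RFusion's exact summarization scheme) can be sketched as follows.

```python
import numpy as np

def trilaterate(anchors, ranges):
    """Estimate a tag position from range measurements at known antenna
    positions by subtracting the first sphere equation from the rest,
    yielding a linear system (classic GPS-style trilateration)."""
    anchors = np.asarray(anchors, dtype=float)
    ranges = np.asarray(ranges, dtype=float)
    a0, r0 = anchors[0], ranges[0]
    A = 2.0 * (anchors[1:] - a0)
    b = (r0**2 - ranges[1:]**2
         + np.sum(anchors[1:]**2, axis=1) - np.sum(a0**2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Three antenna stops around a tag at (0.3, 0.4); ranges are exact here.
stops = [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]]
tag = np.array([0.3, 0.4])
dists = [np.linalg.norm(tag - s) for s in stops]
estimate = trilaterate(stops, dists)
```

With noisy real-world ranges, the least-squares solve averages out errors across measurements rather than trusting any single one.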

    Their approach worked well — RFusion had a 96 percent success rate when retrieving objects that were fully hidden under a pile.

    “Sometimes, if you only rely on RF measurements, there is going to be an outlier, and if you rely only on vision, there is sometimes going to be a mistake from the camera. But if you combine them, they are going to correct each other. That is what made the system so robust,” Boroushaki says.

    In the future, the researchers hope to increase the speed of the system so it can move smoothly, rather than stopping periodically to take measurements. This would enable RFusion to be deployed in a fast-paced manufacturing or warehouse setting.

    Beyond its potential industrial uses, a system like this could even be incorporated into future smart homes to assist people with any number of household tasks, Boroushaki says.

    “Every year, billions of RFID tags are used to identify objects in today’s complex supply chains, including clothing and lots of other consumer goods. The RFusion approach points the way to autonomous robots that can dig through a pile of mixed items and sort them out using the data stored in the RFID tags, much more efficiently than having to inspect each item individually, especially when the items look similar to a computer vision system,” says Matthew S. Reynolds, CoMotion Presidential Innovation Fellow and associate professor of electrical and computer engineering at the University of Washington, who was not involved in the research. “The RFusion approach is a great step forward for robotics operating in complex supply chains where identifying and ‘picking’ the right item quickly and accurately is the key to getting orders fulfilled on time and keeping demanding customers happy.”

    The research is sponsored by the National Science Foundation, a Sloan Research Fellowship, NTT DATA, Toppan, Toppan Forms, and the Abdul Latif Jameel Water and Food Systems Lab.

    New “risk triage” platform pinpoints compounding threats to US infrastructure

    Over a 36-hour period in August, Hurricane Henri delivered record rainfall in New York City, where an aging storm-sewer system was not built to handle the deluge, resulting in street flooding. Meanwhile, an ongoing drought in California continued to overburden aquifers and extend statewide water restrictions. As climate change amplifies the frequency and intensity of extreme events in the United States and around the world, and the populations and economies they threaten grow and change, there is a critical need to make infrastructure more resilient. But how can this be done in a timely, cost-effective way?

    An emerging discipline called multi-sector dynamics (MSD) offers a promising solution. MSD homes in on compounding risks and potential tipping points across interconnected natural and human systems. Tipping points occur when these systems can no longer sustain multiple, co-evolving stresses, such as extreme events, population growth, land degradation, drinkable water shortages, air pollution, aging infrastructure, and increased human demands. MSD researchers use observations and computer models to identify key precursory indicators of such tipping points, providing decision-makers with critical information that can be applied to mitigate risks and boost resilience in infrastructure and managed resources.

    At MIT, the Joint Program on the Science and Policy of Global Change has since 2018 been developing MSD expertise and modeling tools and using them to explore compounding risks and potential tipping points in selected regions of the United States. In a two-hour webinar on Sept. 15, MIT Joint Program researchers presented an overview of the program’s MSD research tool set and its applications.  

    MSD and the risk triage platform

    “Multi-sector dynamics explores interactions and interdependencies among human and natural systems, and how these systems may adapt, interact, and co-evolve in response to short-term shocks and long-term influences and stresses,” says MIT Joint Program Deputy Director C. Adam Schlosser, noting that such analysis can reveal and quantify potential risks that would likely evade detection in siloed investigations. “These systems can experience cascading effects or failures after crossing tipping points. The real question is not just where these tipping points are in each system, but how they manifest and interact across all systems.”

    To address that question, the program’s MSD researchers have developed the MIT Socio-Environmental Triage (MST) platform, now publicly available for the first time. Focused on the continental United States, the first version of the platform analyzes present-day risks related to water, land, climate, the economy, energy, demographics, health, and infrastructure, and where these compound to create risk hot spots. It’s essentially a screening-level visualization tool that allows users to examine risks, identify hot spots when combining risks, and make decisions about how to deploy more in-depth analysis to solve complex problems at regional and local levels. For example, MST can identify hot spots for combined flood and poverty risks in the lower Mississippi River basin, and thereby alert decision-makers as to where more concentrated flood-control resources are needed.
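Conceptually, a screening-level triage of this kind overlays normalized risk layers on a shared grid and flags cells where the combined score is high. The sketch below is an illustrative stand-in; the layer names, equal weighting, and threshold are assumptions, not the MST platform's actual method.

```python
import numpy as np

def combine_risks(layers, threshold=0.75):
    """Min-max normalize each risk layer to [0, 1], average them, and
    flag grid cells whose combined score exceeds the threshold."""
    norm = [(l - l.min()) / (l.max() - l.min()) for l in layers]
    combined = np.mean(norm, axis=0)
    return combined, combined >= threshold

# Toy 2x2 grid: flood risk and poverty rate over the same cells.
flood = np.array([[0.0, 0.2], [0.8, 1.0]])
poverty = np.array([[0.0, 1.0], [0.5, 1.0]])
score, hotspot = combine_risks([flood, poverty])
```

In MST's workflow, the flagged cells are where decision-makers would deploy the deeper, region-specific analysis the article describes.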

    Successive versions of the platform will incorporate projections based on the MIT Joint Program’s Integrated Global System Modeling (IGSM) framework of how different systems and stressors may co-evolve into the future and thereby change the risk landscape. This enhanced capability could help uncover cost-effective pathways for mitigating and adapting to a wide range of environmental and economic risks.  

    MSD applications

    Five webinar presentations explored how MIT Joint Program researchers are applying the program’s risk triage platform and other MSD modeling tools to identify potential tipping points and risks in five key domains: water quality, land use, economics and energy, health, and infrastructure. 

    Joint Program Principal Research Scientist Xiang Gao described her efforts to apply a high-resolution U.S. water-quality model to calculate a location-specific, water-quality index over more than 2,000 river basins in the country. By accounting for interactions among climate, agriculture, and socioeconomic systems, various water-quality measures can be obtained ranging from nitrate and phosphate levels to phytoplankton concentrations. This modeling approach advances a unique capability to identify potential water-quality risk hot spots for freshwater resources.

    Joint Program Research Scientist Angelo Gurgel discussed his MSD-based analysis of how climate change, population growth, changing diets, crop-yield improvements, and other forces that drive land-use change at the global level may ultimately impact how land is used in the United States. Drawing upon national observational data and the IGSM framework, the analysis shows that while current U.S. land-use trends are projected to persist or intensify between now and 2050, there is no evidence of any concerning tipping points arising throughout this period.  

    MIT Joint Program Research Scientist Jennifer Morris presented several examples of how the risk triage platform can be used to combine existing U.S. datasets and the IGSM framework to assess energy and economic risks at the regional level. For example, by aggregating separate data streams on fossil-fuel employment and poverty, one can target selected counties for clean energy job training programs as the nation moves toward a low-carbon future. 

    “Our modeling and risk triage frameworks can provide pictures of current and projected future economic and energy landscapes,” says Morris. “They can also highlight interactions among different human, built, and natural systems, including compounding risks that occur in the same location.”  

    MIT Joint Program research affiliate Sebastian Eastham, a research scientist at the MIT Laboratory for Aviation and the Environment, described an MSD approach to the study of air pollution and public health. Linking the IGSM with an atmospheric chemistry model, Eastham ultimately aims to better understand where the greatest health risks are in the United States and how they may compound throughout this century under different policy scenarios. Using the risk triage tool to combine current risk metrics for air quality and poverty in a selected county based on current population and air-quality data, he showed how one can rapidly identify cardiovascular and other air-pollution-induced disease risk hot spots.

    Finally, MIT Joint Program research affiliate Alyssa McCluskey, a lecturer at the University of Colorado at Boulder, showed how the risk triage tool can be used to pinpoint potential risks to roadways, waterways, and power distribution lines from flooding, extreme temperatures, population growth, and other stressors. In addition, McCluskey described how transportation and energy infrastructure development and expansion can threaten critical wildlife habitats.

    Enabling comprehensive, location-specific analyses of risks and hot spots within and among multiple domains, the Joint Program’s MSD modeling tools can be used to inform policymaking and investment from the municipal to the global level.

    “MSD takes on the challenge of linking human, natural, and infrastructure systems in order to inform risk analysis and decision-making,” says Schlosser. “Through our risk triage platform and other MSD models, we plan to assess important interactions and tipping points, and to provide foresight that supports action toward a sustainable, resilient, and prosperous world.”

    This research is funded by the U.S. Department of Energy’s Office of Science as an ongoing project.

    For campus “porosity hunters,” climate resilience is the goal

    At MIT, it’s not uncommon to see groups navigating campus with smartphones and measuring devices in hand, using the Institute as a test bed for research. During one week this summer, more than a dozen students, researchers, and faculty, plus an altimeter, could be seen doing just that as they traveled across MIT to measure the points of entry into campus buildings — including windows, doors, and vents — known as a building’s porosity.

    Why measure campus building porosity?

    The group was part of the MIT Porosity Hunt, a citizen-science effort that is using the MIT campus as a place to test emerging methodologies, instruments, and data collection processes to better understand the potential impact of a changing climate — and specifically storm scenarios resulting from it — on infrastructure. The hunt is a collaborative effort between the Urban Risk Lab, led by director and associate professor of architecture and urbanism Miho Mazereeuw, and the Office of Sustainability (MITOS), aimed at supporting an MIT that is resilient to the impacts of climate change, including flooding and extreme heat events. Working over three days, members of the hunt catalogued openings in dozens of buildings across campus to better support flood mapping and resiliency planning at MIT.

    For Mazereeuw, the data collection project lies at the nexus of her work with the Urban Risk Lab and as a member of MIT’s Climate Resiliency Committee. While the lab’s mission is to “develop methods, prototypes, and technologies to embed risk reduction and preparedness into the design of cities and regions to increase resilience,” the Climate Resiliency Committee — made up of faculty, staff, and researchers — is focused on assessing, planning, and operationalizing a climate-resilient MIT. The work of both the lab and the committee is embedded in the recently released MIT Climate Resiliency Dashboard, a visualization tool that allows users to understand potential flooding impacts of a number of storm scenarios and drive decision-making.

    While the debut of the tool signaled a big advancement in resiliency planning at MIT, some, including Mazereeuw, saw an opportunity for enhancement. In working with Ken Strzepek, a MITOS Faculty Fellow and research scientist at the MIT Center for Global Change Science who was also an integral part of this work, Mazereeuw says she was surprised to learn that even the most sophisticated flood modeling treats buildings as solid blocks. With all buildings being treated the same, despite varying porosity, the dashboard is limited in some flood scenario analysis. To address this, Mazereeuw and others got to work to fill in that additional layer of data, with the citizen science efforts a key factor of that work. “Understanding the porosity of the building is important to understanding how much water actually goes in the building in these scenarios,” she explains.

    Though surveyors are often used to collect and map this type of information, Mazereeuw wanted to leverage the MIT community in order to collect data quickly while engaging students, faculty, and researchers as resiliency stewards for the campus. “It’s important for projects like this to encourage awareness,” she explains. “Generally, when something fails, we notice it, but otherwise we don’t. With climate change bringing on more uncertainty in the scale and intensity of events, we need everyone to be more aware and help us understand things like vulnerabilities.”

    To do this, MITOS and the Urban Risk Lab reached out to more than a dozen students, who were joined by faculty, staff, and researchers, to map porosity of 31 campus buildings connected by basements. The buildings were chosen based on this connectivity, understanding that water that reaches one basement could potentially flow to another.

    Urban Risk Lab research scientists Aditya Barve and Mayank Ojha aided the group’s efforts by creating a mapping app and chatbot to support consistency in reporting and ease of use. Each team member used the app to find buildings where porosity points needed to be mapped. As teams arrived at the building exteriors, they entered their location in the app, which then triggered the Facebook and LINE-powered chatbot on their phone. There, students were guided through measuring the opening, adjusting for elevation to correlate to the City of Cambridge base datum, and, based on observable features, noting the materials and quality of the opening on a one-through-three scale. Over just three days, the team, which included Mazereeuw herself, mapped 1,030 porosity points that will aid in resiliency planning and preparation on campus in a number of ways.
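A porosity record of the kind the chatbot walks surveyors through could be modeled roughly as below. The field names, datum handling, and validation are illustrative assumptions, not the app's actual schema.

```python
from dataclasses import dataclass

@dataclass
class PorosityPoint:
    building: str
    opening_type: str          # e.g., "window", "door", "vent"
    sill_height_m: float       # opening height above local ground
    ground_elevation_m: float  # local ground relative to the city base datum
    quality: int               # condition on the hunt's 1-through-3 scale

    def __post_init__(self):
        if self.quality not in (1, 2, 3):
            raise ValueError("quality must be 1, 2, or 3")

    @property
    def absolute_elevation_m(self) -> float:
        # Elevation of the opening relative to the City of Cambridge base
        # datum, so points across campus are directly comparable.
        return self.ground_elevation_m + self.sill_height_m

door = PorosityPoint("E15", "door", sill_height_m=0.15,
                     ground_elevation_m=3.20, quality=2)
```

Normalizing every measurement to one datum is what lets a flood model compare an opening in one building against projected water heights campus-wide.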

    “The goal is to understand various heights for flood waters around porous spots on campus,” says Mazereeuw. “But the impact can be different depending on the space. We hope this data can inform safety as well as understanding potential damage to research or disruption to campus operations from future storms.”

    The porosity data collection is complete for this round — future hunts will likely be conducted to confirm and converge data — but one team member’s work continues at the basement level of MIT. Katarina Boukin, a PhD student in civil and environmental engineering and PhD student fellow with MITOS, has been focused on methods of collecting data beneath buildings at MIT to understand how they would be impacted if flood water were to enter. “We have a number of connected basements on campus, and if one of them floods, potentially all of them do,” explains Boukin. “By looking at absolute elevation and porosity, we’re connecting the outside to the inside and tracking how much and where water may flow.” With the added data from the Porosity Hunt, a complete picture of vulnerabilities and resiliency opportunities can be shared.
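Boukin's observation that one flooded basement can flood them all is a graph-connectivity statement: buildings linked by tunnels form components, and water entering any node can reach the rest of its component. A breadth-first sketch (the building names and tunnel list are made up for illustration):

```python
from collections import defaultdict, deque

def reachable_basements(tunnels, breach):
    """Return every building whose basement water could reach from an
    initial breach, treating tunnel connections as undirected edges."""
    graph = defaultdict(set)
    for a, b in tunnels:
        graph[a].add(b)
        graph[b].add(a)
    seen, queue = {breach}, deque([breach])
    while queue:
        for nxt in graph[queue.popleft()]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# Hypothetical tunnel links between building basements.
links = [("Bldg 1", "Bldg 3"), ("Bldg 3", "Bldg 5"), ("Bldg 7", "Bldg 9")]
wet = reachable_basements(links, "Bldg 1")
```

Buildings in a separate component (here, Bldg 7 and Bldg 9) stay dry, which is why the hunt prioritized the 31 basement-connected buildings as a group.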

    Synthesizing much of this data is where Eva Then ’21 comes in. Then was among the students who worked to capture data points over the three days and is now working in ArcGIS — an online mapping software that also powers the Climate Resiliency Dashboard — to process and visualize the data collected. Once completed, the data will be incorporated into the campus flood model to increase the accuracy of projections on the Climate Resiliency Dashboard. “Over the next decades, the model will serve as an adaptive planning tool to make campus safe and resilient amid growing climate risks,” Then says.

    For Mazereeuw, the Porosity Hunt and the data collected additionally serve as a study in scalability, providing valuable insight on how similar research efforts inspired by the MIT test bed approach could be undertaken and inform policy beyond MIT. She also hopes it will inspire students to launch their own hunts in the future, becoming resiliency stewards for their campus and dorms. “Going through measuring and documenting turns on and shows a new set of goggles — you see campus and buildings in a slightly different way,” she says. “Having people look carefully and document change is a powerful tool in climate and resiliency planning.”

    Mazereeuw also notes that recent devastating flooding events across the country, including those resulting from Hurricane Ida, have put a special focus on this work. “The loss of life that occurred in that storm, including those who died as waters flooded their basement homes, underscores the urgency of this type of research, planning, and readiness.”