More stories

  • MIT students explore food sustainability

    As students approached the homestretch of the fall semester, many were focused on completing final projects and preparing for exams. During this time of year, some students may neglect their well-being to the point of skipping meals. To help alleviate end-of-term stress and to give students a delicious study break, the Food Security Action Team recently offered a group of first-year students the opportunity to join a food tour of Daily Table, a new grocer located in Cambridge’s Central Square.

    Seventeen students, along with staff from Student Financial Services, the Office of the First Year, and the Office of Sustainability, walked from the steps of 77 Massachusetts Avenue a few blocks down the street to Daily Table in Central Square. As part of the program, students received a $25 TechCash gift card to shop for groceries during the trip. To make things even more fun, MIT staff created a recipe challenge that encouraged students to work together on their own variation of quesadillas.

    Healthy, affordable, sustainable

    At Daily Table, students were greeted by Celia Grant, Daily Table’s director of community engagement and programs, who led them on a tour of the space and highlighted the history and model of the grocery store, as well as some of its unique features. Founded by former Trader Joe’s president Doug Rauch in 2015, Daily Table operates three retail stores in Dorchester, Roxbury, and Central Square, and a commissary kitchen in the Boston metro area. Two more stores are in the works: one in Mattapan and another in Salem. For added convenience, Daily Table also offers free grocery delivery within a two-mile radius of its three locations.

    The Daily Table’s ethos is that delicious and wholesome food should be available, accessible, and affordable for everyone. To achieve these goals, Daily Table provides a wide selection of fresh produce, nutritious grocery staples, and made-from-scratch prepared grab-n-go foods at affordable prices. “All of our products meet strict nutritional guidelines for sodium and sugar so that customers can make food choices based on their diets, not based on price,” says Grant.

    In addition to buying from a large network of farmers, manufacturers, and distributors who supply food to its stores, Daily Table often recovers perfectly good food that would otherwise be sent to landfills. Surplus food, items with packaging or label changes, and products nearing their expiration dates are often discarded by larger grocery stores in the supply chain. But Daily Table steps in to break this cycle of waste, selling these products to customers at a much lower cost.

    The pandemic has uncovered how difficult it can be for individuals and families to budget for necessities like utilities, rent, and even food. Daily Table seeks to create a more sustainable future by providing access to more well-balanced, nutritious food. “Even before the pandemic, it was challenging for families on limited incomes to meet the nutrition needs of their families. Post-pandemic, this challenge has now encompassed even more households, even those that have never before been challenged in this way,” says Grant. “As winter moves through, and inflation increases, the need for more affordable food and nutrition will rise. Daily Table is prepared to help meet those needs, and more.” 

    Food resources at MIT

    Downstairs at the Daily Table Central Square store, MIT staff members led a discussion about the components of a sustainable food system at MIT and beyond, shared advice on how to budget for food, and offered tips on how to make grocery shopping or cooking fun with fellow classmates and peers. “Shopping at Daily Table provides an experiential case study in solving for multiple goals at once — from the environmental impacts of food waste to healthy eating to affordability — an important framework to consider when tackling climate challenges,” says Susy Jones, senior sustainability project manager in the MIT Office of Sustainability.

    The group also discussed budgeting expenses, including food. “By taking students to the grocery store and providing some small but meaningful tips, we provided them the opportunity to put their learning into practice!” says Erica Aguiar, associate director for financial education in Student Financial Services. “We saw students taking a closer look at prices and even coming together to share groceries.”

    MIT senior and DormCon Dining Chair Ashley Holton shared her grocery shopping strategies with the group and explained how she utilizes resources available at MIT. “Having a plan before you enter the grocery store is really important,” says Holton. “Not only does it save time, but it helps you avoid potentially getting more than what your budget allows for, while also making sure you get all the food you’ll need.”

    This program, along with many others, is part of MIT’s larger effort to foster a more food-secure and sustainable campus for all students. Food Security Action Team members, including students, staff, and campus partners, are working toward this goal by ensuring that well-organized, coordinated action around food security continues each year. For example, to make shopping at Daily Table even easier, MIT has made it a priority to ensure the store accepts TechCash.

    No MIT student should go hungry due to lack of money or resources, and no student should feel like they need to be “really hungry” to ask for help. MIT offers several other resources to help students find the nutrition and other support they need. In addition, the Office of Student Wellbeing launched its DoingWell website, which offers programs and resources to help students prioritize their well-being by practicing healthy habits and getting support when they need it.

    “In my own cost-analysis comparison of staple grocery items of all the local grocery stores, no other store comes close to being able to offer what Daily Table does for the prices it does. It’s really remarkable to learn and experience just how Daily Table is changing the food system,” says Holton. “Its model is one of the many ways that will continue to foster a more food-secure community where everyone — including MIT students — can access affordable, nutritious food.”

  • Meet the 2021-22 Accenture Fellows

    Launched in October 2020, the MIT and Accenture Convergence Initiative for Industry and Technology underscores the ways in which industry and technology come together to spur innovation. The five-year initiative aims to achieve its mission through research, education, and fellowships. To that end, Accenture has once again awarded five annual fellowships to MIT graduate students from underrepresented groups, including by race, ethnicity, and gender, whose research focuses on the convergence of industry and technology.

    This year’s Accenture Fellows work across disciplines including robotics, manufacturing, artificial intelligence, and biomedicine. Their research covers a wide array of subjects, including: advancing manufacturing through computational design, with the potential to benefit global vaccine production; designing low-energy robotics for both consumer electronics and the aerospace industry; developing robotics and machine learning systems that may aid the elderly in their homes; and creating ingestible biomedical devices that can help gather medical data from inside a patient’s body.

    Student nominations from each unit within the School of Engineering, as well as from the four other MIT schools and the MIT Schwarzman College of Computing, were invited as part of the application process. Five exceptional students were selected as fellows in the initiative’s second year.

    Xinming (Lily) Liu is a PhD student in operations research at MIT Sloan School of Management. Her work is focused on behavioral and data-driven operations for social good, incorporating human behaviors into traditional optimization models, designing incentives, and analyzing real-world data. Her current research looks at the convergence of social media, digital platforms, and agriculture, with particular attention to expanding technological equity and economic opportunity in developing countries. Liu earned her BS from Cornell University, with a double major in operations research and computer science.

    Caris Moses is a PhD student in electrical engineering and computer science specializing in artificial intelligence. Moses’ research focuses on using machine learning, optimization, and electromechanical engineering to build robotic systems that are robust, flexible, and intelligent, and that can learn on the job. The technology she is developing holds promise for industries including flexible, small-batch manufacturing; robots to assist the elderly in their households; and warehouse management and fulfillment. Moses earned her BS in mechanical engineering from Cornell University and her MS in computer science from Northeastern University.

    Sergio Rodriguez Aponte is a PhD student in biological engineering. He is working on the convergence of computational design and manufacturing practices, which have the potential to impact industries such as biopharmaceuticals, food, and wellness/nutrition. His current research aims to develop strategies for applying computational tools, such as multiscale modeling and machine learning, to the design and production of manufacturable and accessible vaccine candidates that could eventually be available globally. Rodriguez Aponte earned his BS in industrial biotechnology from the University of Puerto Rico at Mayaguez.

    Soumya Sudhakar SM ’20 is a PhD student in aeronautics and astronautics. Her work is focused on the co-design of new algorithms and integrated circuits for autonomous low-energy robotics that could have novel applications in aerospace and consumer electronics. Her contributions bring together the emerging robotics, integrated circuits, aerospace, and consumer electronics industries. Sudhakar earned her BSE in mechanical and aerospace engineering from Princeton University and her MS in aeronautics and astronautics from MIT.

    So-Yoon Yang is a PhD student in electrical engineering and computer science. Her work on the development of low-power, wireless, ingestible biomedical devices for health care is at the intersection of the medical device, integrated circuit, artificial intelligence, and pharmaceutical fields. Currently, the majority of wireless biomedical devices can only provide a limited range of medical data measured from outside the body. Ingestible devices hold promise for the next generation of personal health care because they do not require surgical implantation, can be useful for detecting physiological and pathophysiological signals, and can also function as therapeutic alternatives when treatment cannot be done externally. Yang earned her BS in electrical and computer engineering from Seoul National University in South Korea and her MS in electrical engineering from Caltech.

  • Scientists build new atlas of ocean’s oxygen-starved waters

    Life is teeming nearly everywhere in the oceans, except in certain pockets where oxygen naturally plummets and waters become unlivable for most aerobic organisms. These desolate pools are “oxygen-deficient zones,” or ODZs. And though they make up less than 1 percent of the ocean’s total volume, they are a significant source of nitrous oxide, a potent greenhouse gas. Their boundaries can also limit the extent of fisheries and marine ecosystems.

    Now MIT scientists have generated the most detailed, three-dimensional “atlas” of the largest ODZs in the world. The new atlas provides high-resolution maps of the two major, oxygen-starved bodies of water in the tropical Pacific. These maps reveal the volume, extent, and varying depths of each ODZ, along with fine-scale features, such as ribbons of oxygenated water that intrude into otherwise depleted zones.

    The team used a new method to process over 40 years’ worth of ocean data, comprising nearly 15 million measurements taken by many research cruises and autonomous robots deployed across the tropical Pacific. The researchers compiled and then analyzed this vast, fine-grained dataset to generate maps of oxygen-deficient zones at various depths, similar to the many slices of a three-dimensional scan.

    From these maps, the researchers estimated the total volume of the two major ODZs in the tropical Pacific, more precisely than previous efforts. The first zone, which stretches out from the coast of South America, measures about 600,000 cubic kilometers — roughly the volume of water that would fill 240 billion Olympic-sized pools. The second zone, off the coast of Central America, is roughly three times larger.

    The atlas serves as a reference for where ODZs lie today. The team hopes scientists can add to this atlas with continued measurements, to better track changes in these zones and predict how they may shift as the climate warms.

    “It’s broadly expected that the oceans will lose oxygen as the climate gets warmer. But the situation is more complicated in the tropics where there are large oxygen-deficient zones,” says Jarek Kwiecinski ’21, who developed the atlas along with Andrew Babbin, the Cecil and Ida Green Career Development Professor in MIT’s Department of Earth, Atmospheric and Planetary Sciences. “It’s important to create a detailed map of these zones so we have a point of comparison for future change.”

    The team’s study appears today in the journal Global Biogeochemical Cycles.

    Airing out artifacts

    Oxygen-deficient zones are large, persistent regions of the ocean that occur naturally as a consequence of marine microbes gobbling up sinking phytoplankton, along with all the available oxygen in the surroundings. These zones lie in regions bypassed by the ocean currents that would normally replenish them with oxygenated water. As a result, ODZs are locations of relatively permanent, oxygen-depleted waters, and can exist at mid-ocean depths of roughly 35 to 1,000 meters below the surface. For some perspective, the oceans on average run about 4,000 meters deep.

    Over the last 40 years, research cruises have explored these regions by dropping bottles down to various depths and hauling up seawater that scientists then measure for oxygen.

    “But there are a lot of artifacts that come from a bottle measurement when you’re trying to measure truly zero oxygen,” Babbin says. “All the plastic that we deploy at depth is full of oxygen that can leach out into the sample. When all is said and done, that artificial oxygen inflates the ocean’s true value.”

    Rather than rely on measurements from bottle samples, the team looked at data from sensors attached to the outside of the bottles or integrated with robotic platforms that can change their buoyancy to measure water at different depths. These sensors measure a variety of signals, including changes in electrical currents or the intensity of light emitted by a photosensitive dye to estimate the amount of oxygen dissolved in water. In contrast to seawater samples that represent a single discrete depth, the sensors record signals continuously as they descend through the water column.

    Scientists have attempted to use these sensor data to estimate the true value of oxygen concentrations in ODZs, but have found it incredibly tricky to convert these signals accurately, particularly at concentrations approaching zero.

    “We took a very different approach, using measurements not to look at their true value, but rather how that value changes within the water column,” Kwiecinski says. “That way we can identify anoxic waters, regardless of what a specific sensor says.”

    Bottoming out

    The team reasoned that, if sensors showed a constant, unchanging value of oxygen in a continuous, vertical section of the ocean, regardless of the true value, then it would likely be a sign that oxygen had bottomed out, and that the section was part of an oxygen-deficient zone.

    The researchers brought together nearly 15 million sensor measurements collected over 40 years by various research cruises and robotic floats, and mapped the regions where oxygen did not change with depth.
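
    To make this concrete, here is a minimal Python sketch of the idea: scan a vertical profile for stretches where the sensor signal stops changing with depth and flag them as anoxic. The function name, thresholds, and toy profile are invented for illustration; the published analysis applies a far more careful statistical treatment to its roughly 15 million measurements.

    ```python
    import numpy as np

    def find_anoxic_layers(depth_m, sensor_signal, grad_tol=1e-3, min_span_m=50.0):
        """Flag depth ranges where a sensor's oxygen signal stops changing.

        Hypothetical helper for illustration only: it looks for contiguous
        stretches of a vertical profile where the absolute vertical gradient
        of the (possibly uncalibrated) signal stays below grad_tol per meter
        for at least min_span_m meters.
        """
        depth_m = np.asarray(depth_m, dtype=float)
        sensor_signal = np.asarray(sensor_signal, dtype=float)

        grad = np.abs(np.gradient(sensor_signal, depth_m))  # signal units per meter
        flat = grad < grad_tol  # True where the signal is effectively constant

        layers, start = [], None
        for i, is_flat in enumerate(flat):
            if is_flat and start is None:
                start = i
            elif not is_flat and start is not None:
                if depth_m[i - 1] - depth_m[start] >= min_span_m:
                    layers.append((depth_m[start], depth_m[i - 1]))
                start = None
        if start is not None and depth_m[-1] - depth_m[start] >= min_span_m:
            layers.append((depth_m[start], depth_m[-1]))
        return layers

    # Toy profile: the signal falls with depth and bottoms out below ~200 meters.
    depths = np.linspace(0, 1000, 501)
    signal = np.clip(5.0 - 0.025 * depths, 0.05, None) + 0.0005 * np.random.randn(501)
    print(find_anoxic_layers(depths, signal))  # roughly [(~200.0, 1000.0)]
    ```

    Because the test looks only at how the signal changes with depth, not at its absolute value, the result does not depend on each sensor’s calibration, which is the point Kwiecinski makes above.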

    “We can now see how the distribution of anoxic water in the Pacific changes in three dimensions,” Babbin says. 

    The team mapped the boundaries, volume, and shape of two major ODZs in the tropical Pacific, one in the Northern Hemisphere, and the other in the Southern Hemisphere. They were also able to see fine details within each zone. For instance, oxygen-depleted waters are “thicker,” or more concentrated towards the middle, and appear to thin out toward the edges of each zone.

    “We could also see gaps, where it looks like big bites were taken out of anoxic waters at shallow depths,” Babbin says. “There’s some mechanism bringing oxygen into this region, making it oxygenated compared to the water around it.”

    Such observations of the tropical Pacific’s oxygen-deficient zones are more detailed than what’s been measured to date.

    “How the borders of these ODZs are shaped, and how far they extend, could not be previously resolved,” Babbin says. “Now we have a better idea of how these two zones compare in terms of areal extent and depth.”

    “This gives you a sketch of what could be happening,” Kwiecinski says. “There’s a lot more one can do with this data compilation to understand how the ocean’s oxygen supply is controlled.”

    This research is supported, in part, by the Simons Foundation.

  • Q&A: Options for the Diablo Canyon nuclear plant

    The Diablo Canyon nuclear plant in California, the only one still operating in the state, is set to close in 2025. A team of researchers at MIT’s Center for Advanced Nuclear Energy Systems, Abdul Latif Jameel Water and Food Systems Lab, and Center for Energy and Environmental Policy Research; Stanford’s Precourt Energy Institute; and energy analysis firm LucidCatalyst LLC have analyzed the potential benefits the plant could provide if its operation were extended to 2030 or 2045.

    They found that this nuclear plant could simultaneously help to stabilize the state’s electric grid, provide desalinated water to supplement the state’s chronic water shortages, and provide carbon-free hydrogen fuel for transportation. MIT News asked report co-authors Jacopo Buongiorno, the TEPCO Professor of Nuclear Science and Engineering, and John Lienhard, the Jameel Professor of Water and Food, to discuss the group’s findings.

    Q: Your report suggests co-locating a major desalination plant alongside the existing Diablo Canyon power plant. What would be the potential benefits from operating a desalination plant in conjunction with the power plant?

    Lienhard: The cost of desalinated water produced at Diablo Canyon would be lower than for a stand-alone plant because the cost of electricity would be significantly lower and you could take advantage of the existing infrastructure for the intake of seawater and the outfall of brine. Electricity would be cheaper because the location takes advantage of Diablo Canyon’s unique capability to provide low cost, zero-carbon baseload power.

    Depending on the scale at which the desalination plant is built, you could make a very significant impact on the water shortfalls of state and federal projects in the area. In fact, one of the numbers that came out of this study was that an intermediate-sized desalination plant there would produce more fresh water than the highest estimate of the net yield from the proposed Delta Conveyance Project on the Sacramento River. You could get that amount of water at Diablo Canyon for an investment cost less than half as large, and without the associated impacts that would come with the Delta Conveyance Project.

    And the technology envisioned for desalination here, reverse osmosis, is available off the shelf. You can buy this equipment today. In fact, it’s already in use in California and thousands of other places around the world.

    Q: You discuss in the report three potential products from the Diablo Canyon plant: desalinated water, power for the grid, and clean hydrogen. How well can the plant accommodate all of those efforts, and are there advantages to combining them as opposed to doing any one of them separately?

    Buongiorno: California, like many other regions in the world, is facing multiple challenges as it seeks to reduce carbon emissions on a grand scale. First, the wide deployment of intermittent energy sources such as solar and wind creates a great deal of variability on the grid that can be balanced by dispatchable firm power generators like Diablo. So, the first mission for Diablo is to continue to provide reliable, clean electricity to the grid.

    The second challenge is the prolonged drought and water scarcity for the state in general. And one way to address that is water desalination co-located with the nuclear plant at the Diablo site, as John explained.

    The third challenge is related to decarbonizing the transportation sector. A possible approach is replacing conventional cars and trucks with vehicles powered by fuel cells, which consume hydrogen. Hydrogen has to be produced from a primary energy source. Nuclear power can do that quite efficiently, and in a carbon-free manner, through a process called electrolysis.

    Our economic analysis took into account the expected revenue from selling these multiple products — electricity for the grid, hydrogen for the transportation sector, water for farmers or other local users — as well as the costs associated with deploying the new facilities needed to produce desalinated water and hydrogen. We found that, if Diablo’s operating license was extended until 2035, it would cut carbon emissions by an average of 7 million metric tons a year — a more than 11 percent reduction from 2017 levels — and save ratepayers $2.6 billion in power system costs.

    Further delaying the retirement of Diablo to 2045 would spare 90,000 acres of land that would need to be dedicated to renewable energy production to replace the facility’s capacity, and it would save ratepayers up to $21 billion in power system costs.

    Finally, if Diablo was operated as a polygeneration facility that provides electricity, desalinated water, and hydrogen simultaneously, its value, quantified in terms of dollars per unit electricity generated, could increase by 50 percent.

    Lienhard: Most of the desalination scenarios that we considered did not consume the full electrical output of that plant, meaning that under most scenarios you would continue to make electricity and do something with it, beyond just desalination. I think it’s also important to remember that this power plant produces 15 percent of California’s carbon-free electricity today and is responsible for 8 percent of the state’s total electrical production. In other words, Diablo Canyon is a very large factor in California’s decarbonization. When or if this plant goes offline, the near-term outcome is likely to be increased reliance on natural gas to produce electricity, meaning a rise in California’s carbon emissions.

    Q: This plant in particular has been highly controversial since its inception. What’s your assessment of the plant’s safety beyond its scheduled shutdown, and how do you see this report as contributing to the decision-making about that shutdown?

    Buongiorno: The Diablo Canyon Nuclear Power Plant has a very strong safety record. The potential safety concern for Diablo is related to its proximity to several fault lines. Being located in California, the plant was designed to withstand large earthquakes to begin with. Following the Fukushima accident in 2011, the Nuclear Regulatory Commission reviewed the plant’s ability to withstand external events (e.g., earthquakes, tsunamis, floods, tornadoes, wildfires, hurricanes) of exceptionally rare and severe magnitude. After nine years of assessment, the NRC concluded that “existing seismic capacity or effective flood protection [at Diablo Canyon] will address the unbounded reevaluated hazards.” That is, Diablo was designed and built to withstand even the rarest and strongest earthquakes that are physically possible at this site.

    As an additional level of protection, the plant has been retrofitted with special equipment and procedures meant to ensure reliable cooling of the reactor core and spent fuel pool under a hypothetical scenario in which all design-basis safety systems have been disabled by a severe external event.

    Lienhard: As for the potential impact of this report, PG&E [the California utility] has already made the decision to shut down the plant, and we and others hope that decision will be revisited and reversed. We believe that this report gives the relevant stakeholders and policymakers a lot of information about options and value associated with keeping the plant running, and about how California could benefit from clean water and clean power generated at Diablo Canyon. It’s not up to us to make the decision, of course — that is a decision that must be made by the people of California. All we can do is provide information.

    Q: What are the biggest challenges or obstacles to seeing these ideas implemented?

    Lienhard: California has very strict environmental protection regulations, and it’s good that they do. One of the areas of great concern to California is the health of the ocean and protection of the coastal ecosystem. As a result, very strict rules are in place about the intake and outfall of both power plants and desalination plants, to protect marine life. Our analysis suggests that this combined plant can be implemented within the parameters prescribed by the California Ocean Plan and that it can meet the regulatory requirements.

    We believe that deeper analysis would be needed before you could proceed. You would need to do site studies and really get out into the water and look in detail at what’s there. But the preliminary analysis is positive. A second challenge is that the discourse in California around nuclear power has generally not been very supportive, and similarly some groups in California oppose desalination. We expect that both of those points of view would be part of the conversation about whether or not to proceed with this project.

    Q: How particular is this analysis to the specifics of this location? Are there aspects of it that apply to other nuclear plants, domestically or globally?

    Lienhard: Hundreds of nuclear plants around the world are situated along the coast, and many are in water-stressed regions. Although our analysis focused on Diablo Canyon, we believe that the general findings are applicable to many other seaside nuclear plants, so that this approach and these conclusions could potentially be applied at hundreds of sites worldwide.

  • Saving seaweed with machine learning

    Last year, Charlene Xia ’17, SM ’20, found herself at a crossroads. She was finishing up her master’s degree in media arts and sciences from the MIT Media Lab and had just submitted applications to doctoral degree programs. All Xia could do was sit and wait. In the meantime, she narrowed down her career options, regardless of whether she was accepted to any program.

    “I had two thoughts: I’m either going to get a PhD to work on a project that protects our planet, or I’m going to start a restaurant,” recalls Xia.

    Xia pored over her extensive cookbook collection, researching international cuisines as she anxiously awaited word about her graduate school applications. She even looked into the cost of a food truck permit in the Boston area. Just as she started hatching plans to open a plant-based skewer restaurant, Xia received word that she had been accepted into the mechanical engineering graduate program at MIT.

    Shortly after starting her doctoral studies, Xia’s advisor, Professor David Wallace, approached her with an interesting opportunity. MathWorks, a software company known for developing the MATLAB computing platform, had announced a new seed funding program in MIT’s Department of Mechanical Engineering. The program encouraged collaborative research projects focused on the health of the planet.

    “I saw this as a super-fun opportunity to combine my passion for food, my technical expertise in ocean engineering, and my interest in sustainably helping our planet,” says Xia.

    Video from MIT Mechanical Engineering: “Saving Seaweed with Machine Learning”

    Wallace knew Xia would be up to the task of taking an interdisciplinary approach to solve an issue related to the health of the planet. “Charlene is a remarkable student with extraordinary talent and deep thoughtfulness. She is pretty much fearless, embracing challenges in almost any domain with the well-founded belief that, with effort, she will become a master,” says Wallace.

    Alongside Wallace and Associate Professor Stefanie Mueller, Xia proposed a project to predict and prevent the spread of diseases in aquaculture. The team focused on seaweed farms in particular.

    Already popular in East Asian cuisines, seaweed holds tremendous potential as a sustainable food source for the world’s ever-growing population. In addition to its nutritive value, seaweed combats various environmental threats. It helps fight climate change by absorbing excess carbon dioxide in the atmosphere, and can also absorb fertilizer run-off, keeping coasts cleaner.

    As with so much of marine life, seaweed is threatened by the very thing it helps mitigate: climate change. Climate stressors like warm temperatures or minimal sunlight encourage the growth of the harmful bacteria that cause ice-ice disease. Within days, entire seaweed farms can be decimated by unchecked bacterial growth.

    To solve this problem, Xia turned to the microbiota present in these seaweed farms as a predictive indicator of any threat to the seaweed or livestock. “Our project is to develop a low-cost device that can detect and prevent diseases before they affect seaweed or livestock by monitoring the microbiome of the environment,” says Xia.

    The team pairs old technology with the latest in computing. Using a submersible digital holographic microscope, they take a 2D image. They then use a machine learning system known as a neural network to convert the 2D image into a representation of the microbiome present in the 3D environment.

    “Using a machine learning network, you can take a 2D image and reconstruct it almost in real time to get an idea of what the microbiome looks like in a 3D space,” says Xia.
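
    The article does not spell out the team’s network architecture, but a minimal PyTorch sketch conveys the general shape of the task: a small stack of convolutions that maps a single 2D hologram to a set of depth slices approximating the 3D volume. The class name, layer widths, and slice count below are placeholders, not the team’s implementation.

    ```python
    import torch
    import torch.nn as nn

    class Hologram2DTo3D(nn.Module):
        """Toy model mapping one 2D hologram (1 x H x W) to D depth slices (D x H x W).

        The layer widths, depth-slice count, and even the use of plain 2D
        convolutions are placeholders; the MIT team's actual architecture is
        not described in the article.
        """
        def __init__(self, depth_slices: int = 32):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(1, 32, kernel_size=3, padding=1),
                nn.ReLU(),
                nn.Conv2d(32, 64, kernel_size=3, padding=1),
                nn.ReLU(),
                # Each output channel is read as one slice of the 3D volume.
                nn.Conv2d(64, depth_slices, kernel_size=3, padding=1),
            )

        def forward(self, hologram: torch.Tensor) -> torch.Tensor:
            # hologram: (batch, 1, H, W) -> volume: (batch, depth_slices, H, W)
            return self.net(hologram)

    model = Hologram2DTo3D(depth_slices=32)
    fake_hologram = torch.randn(1, 1, 128, 128)  # stand-in for a microscope frame
    volume = model(fake_hologram)
    print(volume.shape)  # torch.Size([1, 32, 128, 128])
    ```

    In practice such a model would be trained on pairs of holograms and known 3D reconstructions; the forward pass here only demonstrates the input and output shapes.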

    The software can be run on a small Raspberry Pi that could be attached to the holographic microscope. To figure out how to communicate these data back to the research team, Xia drew upon her master’s degree research.

    In that work, under the guidance of Professor Allan Adams and Professor Joseph Paradiso in the Media Lab, Xia focused on developing small underwater communication devices that can relay data about the ocean back to researchers. Rather than the usual $4,000, these devices were designed to cost less than $100, helping lower the cost barrier for those interested in uncovering the many mysteries of our oceans. The communication devices can be used to relay data about the ocean environment from the machine learning algorithms.

    By combining these low-cost communication devices along with microscopic images and machine learning, Xia hopes to design a low-cost, real-time monitoring system that can be scaled to cover entire seaweed farms.

    “It’s almost like having the ‘internet of things’ underwater,” adds Xia. “I’m developing this whole underwater camera system alongside the wireless communication I developed that can give me the data while I’m sitting on dry land.”

    Armed with these data about the microbiome, Xia and her team can detect whether or not a disease is about to strike and jeopardize seaweed or livestock before it is too late.

    While Xia still daydreams about opening a restaurant, she hopes the seaweed project will prompt people to rethink how they consider food production in general.

    “We should think about farming and food production in terms of the entire ecosystem,” she says. “My meta-goal for this project would be to get people to think about food production in a more holistic and natural way.”

  • How diet affects tumors

    In recent years, there has been some evidence that dietary interventions can help to slow the growth of tumors. A new study from MIT, which analyzed two different diets in mice, reveals how those diets affect cancer cells, and offers an explanation for why restricting calories may slow tumor growth.

    The study examined the effects of a calorically restricted diet and a ketogenic diet in mice with pancreatic tumors. While both of these diets reduce the amount of sugar available to tumors, the researchers found that only the calorically restricted diet reduced the availability of fatty acids, and this was linked to a slowdown in tumor growth.

    The findings do not suggest that cancer patients should try to follow either of these diets, the researchers say. Instead, they believe the findings warrant further study to determine how dietary interventions might be combined with existing or emerging drugs to help patients with cancer.

    “There’s a lot of evidence that diet can affect how fast your cancer progresses, but this is not a cure,” says Matthew Vander Heiden, director of MIT’s Koch Institute for Integrative Cancer Research and the senior author of the study. “While the findings are provocative, further study is needed, and individual patients should talk to their doctor about the right dietary interventions for their cancer.”

    MIT postdoc Evan Lien is the lead author of the paper, which appears today in Nature.

    Metabolic mechanism

    Vander Heiden, who is also a medical oncologist at Dana-Farber Cancer Institute, says his patients often ask him about the potential benefits of various diets, but there is not enough scientific evidence available to offer any definitive advice. Many of the dietary questions that patients have focus on either a calorie-restricted diet, which reduces calorie consumption by 25 to 50 percent, or a ketogenic diet, which is low in carbohydrates and high in fat and protein.

    Previous studies have suggested that a calorically restricted diet might slow tumor growth in some contexts, and such a diet has been shown to extend lifespan in mice and many other animal species. A smaller number of studies exploring the effects of a ketogenic diet on cancer have produced inconclusive results.

    “A lot of the advice or cultural fads that are out there aren’t necessarily always based on very good science,” Lien says. “It seemed like there was an opportunity, especially with our understanding of cancer metabolism having evolved so much over the past 10 years or so, that we could take some of the biochemical principles that we’ve learned and apply those concepts to understanding this complex question.”

    Cancer cells consume a great deal of glucose, so some scientists had hypothesized that either the ketogenic diet or calorie restriction might slow tumor growth by reducing the amount of glucose available. However, the MIT team’s initial experiments in mice with pancreatic tumors showed that calorie restriction has a much greater effect on tumor growth than the ketogenic diet, so the researchers suspected that glucose levels were not playing a major role in the slowdown.

    To dig deeper into the mechanism, the researchers analyzed tumor growth and nutrient concentration in mice with pancreatic tumors, which were fed either a normal, ketogenic, or calorie-restricted diet. In both the ketogenic and calorie-restricted mice, glucose levels went down. In the calorie-restricted mice, lipid levels also went down, but in mice on the ketogenic diet, they went up.

    Lipid shortages impair tumor growth because cancer cells need lipids to construct their cell membranes. Normally, when lipids aren’t available in a tissue, cells can make their own. As part of this process, they need to maintain the right balance of saturated and unsaturated fatty acids, which requires an enzyme called stearoyl-CoA desaturase (SCD). This enzyme is responsible for converting saturated fatty acids into unsaturated fatty acids.

    Both calorie-restricted and ketogenic diets reduce SCD activity, but mice on the ketogenic diet had lipids available to them from their diet, so they didn’t need to use SCD. Mice on the calorie-restricted diet, however, couldn’t get fatty acids from their diet or produce their own. In these mice, tumor growth slowed significantly, compared to mice on the ketogenic diet.

    “Not only does caloric restriction starve tumors of lipids, it also impairs the process that allows them to adapt to it. That combination is really contributing to the inhibition of tumor growth,” Lien says.

    Dietary effects

    In addition to their mouse research, the researchers also looked at some human data. Working with Brian Wolpin, an oncologist at Dana-Farber Cancer Institute and an author of the paper, the team obtained data from a large cohort study that allowed them to analyze the relationship between dietary patterns and survival times in pancreatic cancer patients. From that study, the researchers found that the type of fat consumed appears to influence how patients on a low-sugar diet fare after a pancreatic cancer diagnosis, although the data are not complete enough to draw any conclusions about the effect of diet, the researchers say.

    Although this study showed that calorie restriction has beneficial effects in mice, the researchers say they do not recommend that cancer patients follow a calorie-restricted diet, which is difficult to maintain and can have harmful side effects. However, they believe that cancer cells’ dependence on the availability of unsaturated fatty acids could be exploited to develop drugs that might help slow tumor growth.

    One possible therapeutic strategy could be inhibition of the SCD enzyme, which would cut off tumor cells’ ability to produce unsaturated fatty acids.

    “The purpose of these studies isn’t necessarily to recommend a diet, but it’s to really understand the underlying biology,” Lien says. “They provide some sense of the mechanisms of how these diets work, and that can lead to rational ideas on how we might mimic those situations for cancer therapy.”

    The researchers now plan to study how diets with a variety of fat sources — including plant- or animal-based fats with defined differences in saturated, monounsaturated, and polyunsaturated fatty acid content — alter tumor fatty acid metabolism and the ratio of unsaturated to saturated fatty acids.

    The research was funded by the Damon Runyon Cancer Research Foundation, the National Institutes of Health, the Lustgarten Foundation, the Dana-Farber Cancer Institute Hale Family Center for Pancreatic Cancer Research, Stand Up to Cancer, the Pancreatic Cancer Action Network, the Noble Effort Fund, the Wexler Family Fund, Promises for Purple, the Bob Parsons Fund, the Emerald Foundation, the Howard Hughes Medical Institute, the MIT Center for Precision Cancer Medicine, and the Ludwig Center at MIT.

  • Rover images confirm Jezero crater is an ancient Martian lake

    The first scientific analysis of images taken by NASA’s Perseverance rover has now confirmed that Mars’ Jezero crater — which today is a dry, wind-eroded depression — was once a quiet lake, fed steadily by a small river some 3.7 billion years ago.

    The images also reveal evidence that the crater endured flash floods. This flooding was energetic enough to sweep up large boulders from tens of miles upstream and deposit them into the lakebed, where the massive rocks lie today.

    The new analysis, published today in the journal Science, is based on images of the outcropping rocks inside the crater on its western side. Satellites had previously shown that this outcrop, seen from above, resembled river deltas on Earth, where layers of sediment are deposited in the shape of a fan as the river feeds into a lake.

    Perseverance’s new images, taken from inside the crater, confirm that this outcrop was indeed a river delta. Based on the sedimentary layers in the outcrop, it appears that the river delta fed into a lake that was calm for much of its existence, until a dramatic shift in climate triggered episodic flooding at or toward the end of the lake’s history.

    “If you look at these images, you’re basically staring at this epic desert landscape. It’s the most forlorn place you could ever visit,” says Benjamin Weiss, professor of planetary sciences in MIT’s Department of Earth, Atmospheric and Planetary Sciences and a member of the analysis team. “There’s not a drop of water anywhere, and yet, here we have evidence of a very different past. Something very profound happened in the planet’s history.”

    As the rover explores the crater, scientists hope to uncover more clues to its climatic evolution. Now that they have confirmed the crater was once a lake environment, they believe its sediments could hold traces of ancient aqueous life. In its mission going forward, Perseverance will look for locations to collect and preserve sediments. These samples will eventually be returned to Earth, where scientists can probe them for Martian biosignatures.

    “We now have the opportunity to look for fossils,” says team member Tanja Bosak, associate professor of geobiology at MIT. “It will take some time to get to the rocks that we really hope to sample for signs of life. So, it’s a marathon, with a lot of potential.”

    Tilted beds

    On Feb. 18, 2021, the Perseverance rover landed on the floor of Jezero crater, a little more than a mile away from its western fan-shaped outcrop. In the first three months, the vehicle remained stationary as NASA engineers performed remote checks of the rover’s many instruments.

    During this time, two of Perseverance’s cameras, Mastcam-Z and the SuperCam Remote Micro-Imager (RMI), captured images of their surroundings, including long-distance photos of the outcrop’s edge and a formation known as Kodiak butte, a smaller outcrop that planetary geologists surmise may have once been connected to the main fan-shaped outcrop but has since partially eroded.

    Once the rover downlinked images to Earth, NASA’s Perseverance science team processed and combined them, and was able to observe distinct beds of sediment along Kodiak butte in surprisingly high resolution. The researchers measured each layer’s thickness, slope, and lateral extent, finding that the sediment must have been deposited by flowing water into a lake, rather than by wind, sheet-like floods, or other geologic processes.

    The rover also captured similar tilted sediment beds along the main outcrop. These images, together with those of Kodiak, confirm that the fan-shaped formation was indeed an ancient delta and that this delta fed into an ancient Martian lake.

    “Without driving anywhere, the rover was able to solve one of the big unknowns, which was that this crater was once a lake,” Weiss says. “Until we actually landed there and confirmed it was a lake, it was always a question.”

    Boulder flow

    When the researchers took a closer look at images of the main outcrop, they noticed large boulders and cobbles embedded in the youngest, topmost layers of the delta. Some boulders measured as much as 1 meter across and were estimated to weigh up to several tons. These massive rocks, the team concluded, must have come from outside the crater, and were likely part of bedrock located on the crater rim or carried from 40 or more miles upstream.

    Judging from their current location and dimensions, the team says the boulders were carried downstream and into the lakebed by a flash flood that flowed at up to 9 meters per second and moved up to 3,000 cubic meters of water per second.

    “You need energetic flood conditions to carry rocks that big and heavy,” Weiss says. “It’s a special thing that may be indicative of a fundamental change in the local hydrology or perhaps the regional climate on Mars.”

    Because the huge rocks lie in the upper layers of the delta, they represent the most recently deposited material. The boulders sit atop layers of older, much finer sediment. This stratification, the researchers say, indicates that for much of its existence, the ancient lake was filled by a gently flowing river. Fine sediments — and possibly organic material — drifted down the river, and settled into a gradual, sloping delta.

    However, the crater later experienced sudden flash floods that deposited large boulders onto the delta. Once the lake dried up, wind eroded the landscape over billions of years, leaving the crater we see today.

    The cause of this climate turnaround is unknown, although Weiss says the delta’s boulders may hold some answers.

    “The most surprising thing that’s come out of these images is the potential opportunity to catch the time when this crater transitioned from an Earth-like habitable environment, to this desolate landscape wasteland we see now,” he says. “These boulder beds may be records of this transition, and we haven’t seen this in other places on Mars.”

    This research was supported, in part, by NASA.

  • A robot that finds lost items

    A busy commuter is ready to walk out the door, only to realize they’ve misplaced their keys and must search through piles of stuff to find them. Rapidly sifting through clutter, they wish they could figure out which pile was hiding the keys.

    Researchers at MIT have created a robotic system that can do just that. The system, RFusion, is a robotic arm with a camera and radio frequency (RF) antenna attached to its gripper. It fuses signals from the antenna with visual input from the camera to locate and retrieve an item, even if the item is buried under a pile and completely out of view.

    The RFusion prototype the researchers developed relies on RFID tags, which are cheap, battery-less tags that can be stuck to an item and reflect signals sent by an antenna. Because RF signals can travel through most surfaces (like the mound of dirty laundry that may be obscuring the keys), RFusion is able to locate a tagged item within a pile.

    Using machine learning, the robotic arm automatically zeroes in on the object’s exact location, moves the items on top of it, grasps the object, and verifies that it picked up the right thing. The camera, antenna, robotic arm, and AI are fully integrated, so RFusion can work in any environment without requiring a special setup.

    While finding lost keys is helpful, RFusion could have many broader applications in the future, like sorting through piles to fulfill orders in a warehouse, identifying and installing components in an auto manufacturing plant, or helping an elderly individual perform daily tasks in the home, though the current prototype isn’t quite fast enough yet for these uses.

    “This idea of being able to find items in a chaotic world is an open problem that we’ve been working on for a few years. Having robots that are able to search for things under a pile is a growing need in industry today. Right now, you can think of this as a Roomba on steroids, but in the near term, this could have a lot of applications in manufacturing and warehouse environments,” said senior author Fadel Adib, associate professor in the Department of Electrical Engineering and Computer Science and director of the Signal Kinetics group in the MIT Media Lab.

    Co-authors include research assistant Tara Boroushaki, the lead author; electrical engineering and computer science graduate student Isaac Perper; research associate Mergen Nachin; and Alberto Rodriguez, the Class of 1957 Associate Professor in the Department of Mechanical Engineering. The research will be presented at the Association for Computing Machinery Conference on Embedded Networked Sensor Systems next month.

    Sending signals

    RFusion begins searching for an object using its antenna, which bounces signals off the RFID tag (like sunlight being reflected off a mirror) to identify a spherical area in which the tag is located. It combines that sphere with the camera input, which narrows down the object’s location. For instance, the item can’t be located on an area of a table that is empty.
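
    As a rough illustration of that first fusion step, the Python sketch below (with invented names and numbers, not the RFusion code) keeps only the camera-derived candidate locations that are consistent with the sphere implied by the RFID range estimate.

    ```python
    import numpy as np

    def narrow_candidates(antenna_pos, rf_range_m, candidate_points, tol_m=0.05):
        """Keep camera-derived candidate locations consistent with the RF range.

        Illustrative fusion step: the RFID measurement constrains the tag to
        lie near a sphere of radius rf_range_m around the antenna, while the
        camera supplies 3D points belonging to visible clutter. All names and
        numbers here are invented for the example.
        """
        antenna_pos = np.asarray(antenna_pos, dtype=float)
        candidate_points = np.asarray(candidate_points, dtype=float)
        dists = np.linalg.norm(candidate_points - antenna_pos, axis=1)
        return candidate_points[np.abs(dists - rf_range_m) < tol_m]

    # Toy scene: three clutter points on a table; the RF reading says ~0.8 m.
    clutter = np.array([[0.20, 0.10, 0.00],
                        [0.40, 0.40, 0.10],
                        [0.79, 0.05, 0.10]])
    print(narrow_candidates([0.0, 0.0, 0.0], 0.8, clutter))  # keeps only the third point
    ```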

    But once the robot has a general idea of where the item is, it would need to swing its arm widely around the room taking additional measurements to come up with the exact location, which is slow and inefficient.

    The researchers used reinforcement learning to train a neural network that can optimize the robot’s trajectory to the object. In reinforcement learning, the algorithm is trained through trial and error with a reward system.

    “This is also how our brain learns. We get rewarded from our teachers, from our parents, from a computer game, etc. The same thing happens in reinforcement learning. We let the agent make mistakes or do something right and then we punish or reward the network. This is how the network learns something that is really hard for it to model,” Boroushaki explains.

    In the case of RFusion, the optimization algorithm was rewarded when it limited the number of moves it had to make to localize the item and the distance it had to travel to pick it up.
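
    A toy version of that reward shaping might look like the following sketch; the penalty and bonus values are hypothetical placeholders, since the article does not give the actual reward function.

    ```python
    def step_reward(moved_distance_m, localized, move_penalty=0.1,
                    distance_penalty=1.0, success_bonus=10.0):
        """Toy per-step reward in the spirit of the description above.

        Each additional measurement move and each meter of arm travel is
        penalized, and a bonus is paid once the tag is localized. The penalty
        and bonus values are hypothetical; the actual reward shaping used for
        RFusion is not given in the article.
        """
        reward = -move_penalty - distance_penalty * moved_distance_m
        if localized:
            reward += success_bonus
        return reward

    # Example episode: three short moves, the last of which pins down the tag.
    trajectory = [(0.15, False), (0.10, False), (0.05, True)]
    total = sum(step_reward(d, done) for d, done in trajectory)
    print(round(total, 2))  # 9.4
    ```

    A policy trained against a reward like this is pushed toward trajectories that localize the tag in as few, and as short, moves as possible.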

    Once the system identifies the exact spot, the neural network uses combined RF and visual information to predict how the robotic arm should grasp the object, including the angle of the hand and the width of the gripper, and whether it must remove other items first. It also scans the item’s tag one last time to make sure it picked up the right object.

    Cutting through clutter

    The researchers tested RFusion in several different environments. They buried a keychain in a box full of clutter and hid a remote control under a pile of items on a couch.

    Feeding all the camera data and RF measurements to the reinforcement learning algorithm would have overwhelmed the system. So, drawing on the method a GPS uses to consolidate data from satellites, they summarized the RF measurements and limited the visual data to the area right in front of the robot.

    Their approach worked well — RFusion had a 96 percent success rate when retrieving objects that were fully hidden under a pile.

    “Sometimes, if you only rely on RF measurements, there is going to be an outlier, and if you rely only on vision, there is sometimes going to be a mistake from the camera. But if you combine them, they are going to correct each other. That is what made the system so robust,” Boroushaki says.

    In the future, the researchers hope to increase the speed of the system so it can move smoothly, rather than stopping periodically to take measurements. This would enable RFusion to be deployed in a fast-paced manufacturing or warehouse setting.

    Beyond its potential industrial uses, a system like this could even be incorporated into future smart homes to assist people with any number of household tasks, Boroushaki says.

    “Every year, billions of RFID tags are used to identify objects in today’s complex supply chains, including clothing and lots of other consumer goods. The RFusion approach points the way to autonomous robots that can dig through a pile of mixed items and sort them out using the data stored in the RFID tags, much more efficiently than having to inspect each item individually, especially when the items look similar to a computer vision system,” says Matthew S. Reynolds, CoMotion Presidential Innovation Fellow and associate professor of electrical and computer engineering at the University of Washington, who was not involved in the research. “The RFusion approach is a great step forward for robotics operating in complex supply chains where identifying and ‘picking’ the right item quickly and accurately is the key to getting orders fulfilled on time and keeping demanding customers happy.”

    The research is sponsored by the National Science Foundation, a Sloan Research Fellowship, NTT DATA, Toppan, Toppan Forms, and the Abdul Latif Jameel Water and Food Systems Lab.