More stories

  • Study: The ocean’s color is changing as a consequence of climate change

    The ocean’s color has changed significantly over the last 20 years, and the global trend is likely a consequence of human-induced climate change, report scientists at MIT, the National Oceanography Center in the U.K., and elsewhere.  

    In a study appearing today in Nature, the team writes that they have detected changes in ocean color over the past two decades that cannot be explained by natural, year-to-year variability alone. These color shifts, though subtle to the human eye, have occurred over 56 percent of the world’s oceans — an expanse that is larger than the total land area on Earth.

    In particular, the researchers found that tropical ocean regions near the equator have become steadily greener over time. The shift in ocean color indicates that ecosystems within the surface ocean must also be changing, as the color of the ocean is a literal reflection of the organisms and materials in its waters.

    At this point, the researchers cannot say how exactly marine ecosystems are changing to reflect the shifting color. But they are pretty sure of one thing: Human-induced climate change is likely the driver.

    “I’ve been running simulations that have been telling me for years that these changes in ocean color are going to happen,” says study co-author Stephanie Dutkiewicz, senior research scientist in MIT’s Department of Earth, Atmospheric and Planetary Sciences and the Center for Global Change Science. “To actually see it happening for real is not surprising, but frightening. And these changes are consistent with man-induced changes to our climate.”

    “This gives additional evidence of how human activities are affecting life on Earth over a huge spatial extent,” adds lead author B. B. Cael PhD ’19 of the National Oceanography Center in Southampton, U.K. “It’s another way that humans are affecting the biosphere.”

    The study’s co-authors also include Stephanie Henson of the National Oceanography Center, Kelsey Bisson at Oregon State University, and Emmanuel Boss of the University of Maine.

    Above the noise

    The ocean’s color is a visual product of whatever lies within its upper layers. Generally, waters that are deep blue reflect very little life, whereas greener waters indicate the presence of ecosystems, mainly phytoplankton — plant-like microbes that are abundant in the upper ocean and that contain the green pigment chlorophyll. The pigment helps plankton harvest sunlight, which they use to capture carbon dioxide from the atmosphere and convert it into sugars.

    Phytoplankton are the foundation of the marine food web that sustains progressively more complex organisms, on up to krill, fish, seabirds, and marine mammals. Phytoplankton are also a powerful muscle in the ocean’s ability to capture and store carbon dioxide. Scientists are therefore keen to monitor phytoplankton across the surface oceans and to see how these essential communities might respond to climate change. To do so, scientists have tracked changes in chlorophyll, based on the ratio of how much blue versus green light is reflected from the ocean surface, which can be monitored from space.
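
    As a rough illustration of how this estimate works, operational ocean-color algorithms retrieve chlorophyll from a blue-to-green reflectance ratio with a log-polynomial fit of the general form

    $$\log_{10}(\mathrm{Chl}) = a_0 + \sum_{i=1}^{4} a_i \left[\log_{10}\!\left(\frac{R_{rs}(\lambda_{\mathrm{blue}})}{R_{rs}(\lambda_{\mathrm{green}})}\right)\right]^{i},$$

    where $R_{rs}(\lambda)$ is the remote-sensing reflectance at wavelength $\lambda$. The coefficients $a_i$ are sensor-specific empirical fits, so this should be read as the generic shape of widely used band-ratio algorithms, not the exact retrieval behind any particular dataset.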

    But around a decade ago, Henson, who is a co-author of the current study, published a paper with others showing that, if scientists were tracking chlorophyll alone, it would take at least 30 years of continuous monitoring to detect any trend driven specifically by climate change. The reason, the team argued, was that the large, natural variations in chlorophyll from year to year would overwhelm any anthropogenic influence on chlorophyll concentrations. It would therefore take several decades to pick out a meaningful, climate-change-driven signal amid the normal noise.

    In 2019, Dutkiewicz and her colleagues published a separate paper, showing through a new model that the natural variation in other ocean colors is much smaller compared to that of chlorophyll. Therefore, any signal of climate-change-driven changes should be easier to detect over the smaller, normal variations of other ocean colors. They predicted that such changes should be apparent within 20, rather than 30 years of monitoring.

    “So I thought, doesn’t it make sense to look for a trend in all these other colors, rather than in chlorophyll alone?” Cael says. “It’s worth looking at the whole spectrum, rather than just trying to estimate one number from bits of the spectrum.”

    The power of seven

    In the current study, Cael and the team analyzed measurements of ocean color taken by the Moderate Resolution Imaging Spectroradiometer (MODIS) aboard the Aqua satellite, which has been monitoring ocean color for 21 years. MODIS takes measurements in seven visible wavelengths, including the two colors researchers traditionally use to estimate chlorophyll.

    The differences in color that the satellite picks up are too subtle for human eyes to differentiate. Much of the ocean appears blue to our eye, whereas the true color may contain a mix of subtler wavelengths, from blue to green and even red.

    Cael carried out a statistical analysis using all seven ocean colors measured by the satellite from 2002 to 2022 together. He first looked at how much the seven colors changed from region to region during a given year, which gave him an idea of their natural variations. He then zoomed out to see how these annual variations in ocean color changed over a longer stretch of two decades. This analysis turned up a clear trend, above the normal year-to-year variability.
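
    A minimal sketch of this kind of signal-versus-noise test, assuming a hypothetical array of annual mean reflectances per band (the random stand-in data and the simple least-squares trend comparison are illustrative, not the study’s actual statistics):

    ```python
    import numpy as np

    # Stand-in data: 21 annual means (2002-2022) for 7 MODIS bands at one location.
    rng = np.random.default_rng(0)
    years = np.arange(2002, 2023)
    reflectance = rng.normal(size=(len(years), 7))  # replace with real satellite data

    for band in range(reflectance.shape[1]):
        y = reflectance[:, band]
        coeffs = np.polyfit(years, y, 1)            # least-squares linear trend
        total_change = coeffs[0] * (years[-1] - years[0])
        residual = y - np.polyval(coeffs, years)    # year-to-year scatter about the trend
        print(f"band {band}: |change| / noise = {abs(total_change) / residual.std():.2f}")
    ```

    A ratio well above one flags a two-decade change that stands out from the natural year-to-year variability in that band.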

    To see whether this trend is related to climate change, he then looked to Dutkiewicz’s model from 2019. This model simulated the Earth’s oceans under two scenarios: one with the addition of greenhouse gases, and the other without it. The greenhouse-gas model predicted that a significant trend should show up within 20 years and that this trend should cause changes to ocean color in about 50 percent of the world’s surface oceans — almost exactly what Cael found in his analysis of real-world satellite data.

    “This suggests that the trends we observe are not a random variation in the Earth system,” Cael says. “This is consistent with anthropogenic climate change.”

    The team’s results show that monitoring ocean colors beyond chlorophyll could give scientists a clearer, faster way to detect climate-change-driven changes to marine ecosystems.

    “The color of the oceans has changed,” Dutkiewicz says. “And we can’t say how. But we can say that changes in color reflect changes in plankton communities that will impact everything that feeds on plankton. It will also change how much the ocean will take up carbon, because different types of plankton have different abilities to do that. So, we hope people take this seriously. It’s not only models that are predicting these changes will happen. We can now see it happening, and the ocean is changing.”

    This research was supported, in part, by NASA.

  • Studying rivers from worlds away

    Rivers have flowed on two other worlds in the solar system besides Earth: Mars, where dry tracks and craters are all that’s left of ancient rivers and lakes, and Titan, Saturn’s largest moon, where rivers of liquid methane still flow today.

    A new technique developed by MIT geologists allows scientists to see how intensely rivers used to flow on Mars, and how they currently flow on Titan. The method uses satellite observations to estimate the rate at which rivers move fluid and sediment downstream.

    Applying their new technique, the MIT team calculated how fast and deep rivers were in certain regions on Mars more than 1 billion years ago. They also made similar estimates for currently active rivers on Titan, even though the moon’s thick atmosphere and distance from Earth make it harder to explore, with far fewer available images of its surface than those of Mars.

    “What’s exciting about Titan is that it’s active. With this technique, we have a method to make real predictions for a place where we won’t get more data for a long time,” says Taylor Perron, the Cecil and Ida Green Professor in MIT’s Department of Earth, Atmospheric and Planetary Sciences (EAPS). “And on Mars, it gives us a time machine, to take the rivers that are dead now and get a sense of what they were like when they were actively flowing.”

    Perron and his colleagues have published their results today in the Proceedings of the National Academy of Sciences. Perron’s MIT co-authors are first author Samuel Birch, Paul Corlies, and Jason Soderblom, with Rose Palermo and Andrew Ashton of the Woods Hole Oceanographic Institution (WHOI), Gary Parker of the University of Illinois at Urbana-Champaign, and collaborators from the University of California at Los Angeles, Yale University, and Cornell University.

    River math

    The team’s study grew out of Perron and Birch’s puzzlement over Titan’s rivers. The images taken by NASA’s Cassini spacecraft have shown a curious lack of fan-shaped deltas at the mouths of most of the moon’s rivers, in contrast to many rivers on Earth. Could it be that Titan’s rivers don’t carry enough flow or sediment to build deltas?

    The group built on the work of co-author Gary Parker, who in the 2000s developed a series of mathematical equations to describe river flow on Earth. Parker had studied measurements of rivers taken directly in the field by others. From these data, he found there were certain universal relationships between a river’s physical dimensions — its width, depth, and slope — and the rate at which it flowed. He drew up equations to describe these relationships mathematically, accounting for other variables such as the gravitational field acting on the river, and the size and density of the sediment being pushed along a river’s bed.

    “This means that rivers with different gravity and materials should follow similar relationships,” Perron says. “That opened up a possibility to apply this to other planets too.”

    Getting a glimpse

    On Earth, geologists can make field measurements of a river’s width, slope, and average sediment size, all of which can be fed into Parker’s equations to accurately predict a river’s flow rate, or how much water and sediment it can move downstream. But for rivers on other planets, measurements are more limited, and largely based on images and elevation measurements collected by remote satellites. For Mars, multiple orbiters have taken high-resolution images of the planet. For Titan, views are few and far between.

    Birch realized that any estimate of river flow on Mars or Titan would have to be based on the few characteristics that can be measured from remote images and topography — namely, a river’s width and slope. With some algebraic tinkering, he adapted Parker’s equations to work only with width and slope inputs. He then assembled data from 491 rivers on Earth, tested the modified equations on these rivers, and found that the predictions based solely on each river’s width and slope were accurate.
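
    The flavor of these relations can be sketched with textbook open-channel hydraulics (a generic illustration, not Parker’s actual equations). The bed shear stress in a river of depth $H$ and slope $S$ is $\tau = \rho g H S$, and in an alluvial channel it stays near the threshold needed to move sediment of size $D$, which fixes the depth; a flow-resistance law then gives the velocity $U$, and the discharge follows:

    $$\rho g H S \approx \tau_c \propto (\rho_s - \rho)\,g D \;\Rightarrow\; H \propto \frac{D}{S}, \qquad U = \sqrt{\frac{8 g H S}{f}}, \qquad Q = W H U.$$

    Because gravity appears at every step, the same chain of relations can be re-dimensioned for Mars or Titan.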

    Then, he applied the equations to Mars, and specifically, to the ancient rivers leading into Gale and Jezero Craters, both of which are thought to have been water-filled lakes billions of years ago. To predict the flow rate of each river, he plugged into the equations Mars’ gravity, and estimates of each river’s width and slope, based on images and elevation measurements taken by orbiting satellites.

    From their predictions of flow rate, the team found that rivers likely flowed for at least 100,000 years at Gale Crater and at least 1 million years at Jezero Crater — long enough to have possibly supported life. They were also able to compare their predictions of the average size of sediment on each river’s bed with actual field measurements of Martian grains near each river, taken by NASA’s Curiosity and Perseverance rovers. These few field measurements allowed the team to check that their equations, applied on Mars, were accurate.

    The team then took their approach to Titan. They zeroed in on two locations where river slopes can be measured, including a river that flows into a lake the size of Lake Ontario. This river appears to form a delta as it feeds into the lake. However, the delta is one of only a few thought to exist on the moon — nearly every viewable river flowing into a lake mysteriously lacks a delta. The team also applied their method to one of these other delta-less rivers.

    They calculated both rivers’ flow and found that it may be comparable to that of some of the biggest rivers on Earth, with the delta-building river estimated to have a flow rate as large as that of the Mississippi. Both rivers should move enough sediment to build up deltas. Yet, most rivers on Titan lack the fan-shaped deposits. Something else must be at work to explain this lack of river deposits.

    In another finding, the team calculated that rivers on Titan should be wider and have a gentler slope than rivers carrying the same flow on Earth or Mars. “Titan is the most Earth-like place,” Birch says. “We’ve only gotten a glimpse of it. There’s so much more that we know is down there, and this remote technique is pushing us a little closer.”

    This research was supported, in part, by NASA and the Heising-Simons Foundation.

  • Chemists discover why photosynthetic light-harvesting is so efficient

    When photosynthetic cells absorb light from the sun, packets of energy called photons leap between a series of light-harvesting proteins until they reach the photosynthetic reaction center. There, cells convert the energy into electrons, which eventually power the production of sugar molecules.

    This transfer of energy through the light-harvesting complex occurs with extremely high efficiency: Nearly every photon of light absorbed generates an electron, a phenomenon known as near-unity quantum efficiency.

    A new study from MIT chemists offers a potential explanation for how proteins of the light-harvesting complex, also called the antenna, achieve that high efficiency. For the first time, the researchers were able to measure the energy transfer between light-harvesting proteins, allowing them to discover that the disorganized arrangement of these proteins boosts the efficiency of the energy transduction.

    “In order for that antenna to work, you need long-distance energy transduction. Our key finding is that the disordered organization of the light-harvesting proteins enhances the efficiency of that long-distance energy transduction,” says Gabriela Schlau-Cohen, an associate professor of chemistry at MIT and the senior author of the new study.

    MIT postdocs Dihao Wang and Dvir Harris and former MIT graduate student Olivia Fiebig PhD ’22 are the lead authors of the paper, which appears this week in the Proceedings of the National Academy of Sciences. Jianshu Cao, an MIT professor of chemistry, is also an author of the paper.

    Energy capture

    For this study, the MIT team focused on purple bacteria, which are often found in oxygen-poor aquatic environments and are commonly used as a model for studies of photosynthetic light-harvesting.

    Within these cells, captured photons travel through light-harvesting complexes consisting of proteins and light-absorbing pigments such as chlorophyll. Using ultrafast spectroscopy, a technique that uses extremely short laser pulses to study events that happen on timescales of femtoseconds to nanoseconds, scientists have been able to study how energy moves within a single one of these proteins. However, studying how energy travels between these proteins has proven much more challenging because it requires positioning multiple proteins in a controlled way.

    To create an experimental setup where they could measure how energy travels between two proteins, the MIT team designed synthetic nanoscale membranes with a composition similar to those of naturally occurring cell membranes. By controlling the size of these membranes, known as nanodiscs, they were able to control the distance between two proteins embedded within the discs.

    For this study, the researchers embedded two versions of the primary light-harvesting protein found in purple bacteria, known as LH2 and LH3, into their nanodiscs. LH2 is the protein that is present during normal light conditions, and LH3 is a variant that is usually expressed only during low light conditions.

    Using the cryo-electron microscope at the MIT.nano facility, the researchers could image their membrane-embedded proteins and show that they were positioned at distances similar to those seen in the native membrane. They were also able to measure the distances between the light-harvesting proteins, which were on the scale of 2.5 to 3 nanometers.

    Disordered is better

    Because LH2 and LH3 absorb slightly different wavelengths of light, it is possible to use ultrafast spectroscopy to observe the energy transfer between them. For proteins spaced closely together, the researchers found that it takes about 6 picoseconds for a photon of energy to travel between them. For proteins farther apart, the transfer takes up to 15 picoseconds.

    Faster travel translates to more efficient energy transfer, because the longer the journey takes, the more energy is lost during the transfer.
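
    A back-of-envelope calculation shows the stakes (the 1-nanosecond loss timescale below is an assumed, typical order of magnitude for excited-state decay, not a number from the study). If energy hops onward on a timescale $\tau_t$ while competing losses act on a timescale $\tau_\ell$, the per-hop transfer efficiency is

    $$\eta = \frac{1/\tau_t}{1/\tau_t + 1/\tau_\ell} = \frac{\tau_\ell}{\tau_\ell + \tau_t}.$$

    With $\tau_\ell \approx 1$ ns, a 6-picosecond hop gives $\eta \approx 99.4$ percent while a 15-picosecond hop gives $\eta \approx 98.5$ percent, a difference that compounds over the many hops between absorption and the reaction center.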

    “When a photon gets absorbed, you only have so long before that energy gets lost through unwanted processes such as nonradiative decay, so the faster it can get converted, the more efficient it will be,” Schlau-Cohen says.

    The researchers also found that proteins arranged in a lattice structure showed less efficient energy transfer than proteins that were arranged in randomly organized structures, as they usually are in living cells.

    “Ordered organization is actually less efficient than the disordered organization of biology, which we think is really interesting because biology tends to be disordered. This finding tells us that that may not just be an inevitable downside of biology, but organisms may have evolved to take advantage of it,” Schlau-Cohen says.

    Now that they have established the ability to measure inter-protein energy transfer, the researchers plan to explore energy transfer between other proteins, such as the transfer between proteins of the antenna to proteins of the reaction center. They also plan to study energy transfer between antenna proteins found in organisms other than purple bacteria, such as green plants.

    The research was funded primarily by the U.S. Department of Energy.

  • MIT engineering students take on the heat of Miami

    Think back to the last time you had to wait for a bus. How miserable were you? If you were in Boston, your experience might have included punishing wind and icy sleet — or, more recently, a punch of pollen straight to the sinuses. But in Florida’s Miami-Dade County, where the effects of climate change are both drastic and intensifying, commuters have to contend with an entirely different set of challenges: blistering temperatures and stifling humidity, making long stints waiting in the sun nearly unbearable.

    One of Miami’s most urgent transportation needs is shared by car-clogged Boston: coaxing citizens to use the municipal bus network, rather than the emissions-heavy individual vehicles currently contributing to climate change. But buses can be a tough sell in a sunny city where humidity hovers between 60 and 80 percent year-round. 

    Enter MIT’s Department of Electrical Engineering and Computer Science (EECS) and the MIT Priscilla King Gray (PKG) Public Service Center. The result of close collaboration between the two organizations, class 6.900 (Engineering For Impact) challenges EECS students to apply their engineering savvy to real-world problems beyond the MIT campus.

    This spring semester, the real-world problem was heat. 

    Miami-Dade County Department of Transportation and Public Works Chief Innovation Officer Carlos Cruz-Casas explains: “We often talk about the city we want to live in, about how the proper mix of public transportation, on-demand transit, and other mobility solutions, such as e-bikes and e-scooters, could help our community live a car-light life. However, none of this will be achievable if the riders are not comfortable when doing so.” 

    “When people think of South Florida and climate change, they often think of sea level rise,” says Juan Felipe Visser, deputy director of equity and engagement within the Office of the Mayor in Miami-Dade. “But heat really is the silent killer. So the focus of this class, on heat at bus stops, is very apt.” With little tree cover to give relief at some of the hottest stops, Miami-Dade commuters cluster in tiny patches of shade behind bus stops, sometimes giving up when the heat becomes unbearable. 

    A more conventional electrical engineering course might use temperature monitoring as an abstract example, building sample monitors in isolation and grading them as a merely academic exercise. But Professor Joel Voldman, EECS faculty head of electrical engineering, and Joe Steinmeyer, senior lecturer in EECS, had something more impactful in mind.

    “Miami-Dade has a large population of people who are living in poverty, undocumented, or who are otherwise marginalized,” says Voldman. “Waiting, sometimes for a very long time, in scorching heat for the bus is just one aspect of how a city population can be underserved, but by measuring patterns in how many people are waiting for a bus, how long they wait, and in what conditions, we can begin to see where services are not keeping up with demand.”

    Only after that gap is quantified can the work of city and transportation planners begin, Cruz-Casas explains: “We needed to quantify the time riders are exposed to extreme heat and prioritize improvements, including on-time performance improvements, increasing service frequency, or looking to enhance the tree canopy near the bus stop.” 

    Quantifying that time — and the subjective experience of the wait — proved tricky, however. With over 7,500 bus stops along 101 bus routes, Miami-Dade’s transportation network presents a considerable data-collection challenge. A network of physical temperature monitors could be useful, but only if it were carefully calibrated to meet the budgetary, environmental, privacy, and implementation requirements of the city. But how do you work with city officials — not to mention all of bus-riding Miami — from over 2,000 miles away? 

    This is where the PKG Center comes in. “We are a hub and a connector and facilitator of best practices,” explains Jill Bassett, associate dean and director of the center, who worked with Voldman and Steinmeyer to find a municipal partner organization for the course. “We bring knowledge of current pedagogy around community-engaged learning, which includes: help with framing a partnership that centers community-identified concerns and is mutually beneficial; identifying and learning from a community partner; talking through ways to build in opportunities for student learners to reflect on power dynamics, reciprocity, systems thinking, long-term planning, continuity, ethics, all the types of things that come up with this kind of shared project.”

    Through a series of brainstorming conversations, Bassett helped Voldman and Steinmeyer structure a well-defined project plan, as Cruz-Casas weighed in on the county’s needed technical specifications (including affordability, privacy protection, and implementability).

    “This course brings together a lot of subject area experts,” says Voldman. “We brought in guest lecturers, including Abby Berenson from the Sloan Leadership Center, to talk about working in teams; engineers from Bose to talk about product design, certification, and environmental resistance; the co-founder and head of engineering from MIT spinout Butlr to talk about their low-power occupancy sensor; Tony Hu from MIT IDM [Integrated Design and Management] to talk about industrial design; and Katrina LaCurts from EECS to talk about communications and networking.”

    With the support of two generous donations and a gift of software from Altium, 6.900 developed into a hands-on exercise in hardware/software product development with a tangible goal in sight: build a better bus monitor.

    The challenges involved in this undertaking became apparent as soon as the 6.900 students began designing their monitors. “The most challenging requirement to meet was that the monitor be able to count how many people were waiting — and for how long they’d been standing there — while still maintaining privacy,” says Fabian Velazquez ’23, a recent EECS graduate. The task was complicated by commuters’ natural tendency to stand where the shade goes — whether beneath a tree or awning or snaking against a nearby wall in a line — rather than directly next to the bus sign or inside the bus shelter. “Accurately measuring people count with a camera — the most straightforward choice — is already quite difficult since you have to incorporate machine learning to identify which objects in frame are people. Maintaining privacy added an extra layer of constraint … since there is no guarantee the collected data wouldn’t be vulnerable.”

    As the groups weighed various privacy-preserving options, including lidar, radar, and thermal imaging, the class realized that Wi-Fi “sniffers,” which count the number of Wi-Fi enabled signals in the immediate area, were their best option to count waiting passengers. “We were all excited and ready for this amazing, answer-to-all-our-problems radar sensor to count people,” says Velazquez. “That component was extremely complex, however, and the complexity would have ultimately made my team use a lot of time and resources to integrate with our system. We also had a short time-to-market for this system we developed. We made the trade-off of complexity for robustness.”
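
    A minimal sketch of the sniffer idea, assuming a Wi-Fi interface already in monitor mode and the scapy library (the interface name, salting scheme, and window length are illustrative choices, not the students’ implementation):

    ```python
    import hashlib
    import secrets

    from scapy.all import sniff
    from scapy.layers.dot11 import Dot11ProbeReq

    SALT = secrets.token_bytes(16)  # rotate periodically so hashes can't be linked over time
    seen = set()

    def handle(pkt):
        # Phones scanning for networks broadcast probe requests.
        if pkt.haslayer(Dot11ProbeReq) and pkt.addr2:
            # Store only a salted hash, never the raw MAC address.
            seen.add(hashlib.sha256(SALT + pkt.addr2.encode()).hexdigest())

    # Count distinct (hashed) devices seen during a 60-second window.
    sniff(iface="wlan0mon", prn=handle, timeout=60)
    print(f"approximate devices waiting: {len(seen)}")
    ```

    Because modern phones randomize their MAC addresses while scanning, a count like this is approximate, which is generally acceptable for tracking occupancy trends rather than individuals.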

    The weather also posed its own set of challenges. “Environmental conditions were big factors on the structure and design of our devices,” says Yong Yan (Crystal) Liang, a rising junior majoring in EECS. “We incorporated humidity and temperature sensors into our data to show the weather at individual stops. We also considered how our enclosure may be affected by extreme heat or potential hurricanes.”

    The heat variable proved problematic in multiple ways. “People detection was especially difficult, for in the Miami heat, thermal cameras may not be able to distinguish human body temperature from the surrounding air temperature, and the glare of the sun off of other surfaces in the area makes most forms of imaging very buggy,” says Katherine Mohr ’23. “My team had considered using mmWave sensors to get around these constraints, but we found the processing to be too difficult, and (like the rest of the class), we decided to only move forward with Wi-Fi/BLE [Bluetooth Low Energy] sniffers.”

    The most valuable component of the new class may well have been the students’ exposure to real-world hardware/software engineering product development, where limitations on time and budget always exist, and where client requests must be carefully considered.  “Having an actual client to work with forced us to learn how to turn their wants into more specific technical specifications,” says Mohr. “We chose deliverables each week to complete by Friday, prioritizing tasks which would get us to a minimum viable product, as well as tasks that would require extra manufacturing time, like designing the printed-circuit board and enclosure.”

    Joel Voldman, who co-designed 6.900 (Engineering For Impact) with Joe Steinmeyer and MIT’s Priscilla King Gray (PKG) Public Service Center, describes how the course allowed students to help develop systems for the public good. Voldman is the winner of the 2023 Teaching with Digital Technology Award, which is co-sponsored by MIT Open Learning and the Office of the Vice Chancellor. Video: MIT Open Learning

    Crystal Liang counted her conversations with city representatives as among her most valuable 6.900 experiences. “We generated a lot of questions and were able to communicate with the community leaders of this project from Miami-Dade, who made time to answer all of them and gave us ideas from the goals they were trying to achieve,” she reports. “This project gave me a new perspective on problem-solving because it taught me to see things from the community members’ point of view.” Some of those community leaders, including Marta Viciedo, co-founder of Transit Alliance Miami, joined the class’s final session on May 16 to review the students’ proposed solutions. 

    The students’ thoughtful approach paid off when it was time to present the heat monitors to the class’s client. In a group conference call with Miami-Dade officials toward the end of the semester, the student teams shared their findings and the prototypes they’d created, along with videos of the devices at work. Juan Felipe Visser was among those in attendance. “This is a lot of work,” he told the students following their presentation. “So first of all, thank you for doing that, and for presenting to us. I love the concept. I took the bus this morning, as I do every morning, and was battered by the sun and the heat. So I personally appreciated the focus.” 

    Cruz-Casas agreed: “I am pleasantly surprised by the diverse approach the students are taking. We presented a challenge, and they have responded to it and managed to think beyond the problem at hand. I’m very optimistic about how the outcomes of this project will have a long-lasting impact for our community. At a minimum, I’m thinking that the more awareness we raise about this topic, the more opportunities we have to have the brightest minds seeking for a solution.”

    The creators of 6.900 agree, and hope that their class helps more MIT engineers to broaden their perspective on the meaning and application of their work. 

    “We are really excited about students applying their skills within a real-world, complex environment that will impact real people,” says Bassett. “We are excited that they are learning that it’s not just the design of technology that matters, but that climate; environment and built environment; and issues around socioeconomics, race, and equity, all come into play. There are layers and layers to the creation and deployment of technology in a demographically diverse multilingual community that is at the epicenter of climate change.”

  • A new mathematical “blueprint” is accelerating fusion device development

    Developing commercial fusion energy requires scientists to understand sustained processes that have never before existed on Earth. But with so many unknowns, how do we make sure we’re designing a device that can successfully harness fusion power?

    We can fill gaps in our understanding using computational tools like algorithms and data simulations to knit together experimental data and theory, which allows us to optimize fusion device designs before they’re built, saving much time and resources.

    Currently, classical supercomputers are used to run simulations of plasma physics and fusion energy scenarios, but to address the many design and operating challenges that still remain, more powerful computers are a necessity, and of great interest to plasma researchers and physicists.

    Quantum computers’ exponentially faster computing speeds have offered plasma and fusion scientists the tantalizing possibility of vastly accelerated fusion device development. Quantum computers could reconcile a fusion device’s many design parameters — for example, vessel shape, magnet spacing, and component placement — at a greater level of detail, while also completing the tasks faster. However, upgrading to a quantum computer is no simple task.

    In a paper, “Dyson maps and unitary evolution for Maxwell equations in tensor dielectric media,” recently published in Physical Review A, Abhay K. Ram, a research scientist at the MIT Plasma Science and Fusion Center (PSFC), and his co-authors Efstratios Koukoutsis, Kyriakos Hizanidis, and George Vahala present a framework that would facilitate the use of quantum computers to study electromagnetic waves in plasma and its manipulation in magnetic confinement fusion devices.

    Quantum computers excel at simulating quantum physics phenomena, but many topics in plasma physics are predicated on the classical physics model. A plasma (which is the “dielectric media” referenced in the paper’s title) consists of many particles — electrons and ions — the collective behaviors of which are effectively described using classical statistical physics. In contrast, quantum effects that influence atomic and subatomic scales are averaged out in classical plasma physics.

    Furthermore, plasma phenomena don’t lend themselves naturally to description in quantum-mechanical terms. In a fusion device, plasmas are heated and manipulated using electromagnetic waves, which are one of the most important and ubiquitous occurrences in the universe. The behaviors of electromagnetic waves, including how waves are formed and interact with their surroundings, are described by Maxwell’s equations — a foundational component of classical plasma physics, and of general physics as well. The standard form of Maxwell’s equations is not expressed in “quantum terms,” however, so implementing the equations on a quantum computer is like fitting a square peg in a round hole: it doesn’t work.

    Consequently, for plasma physicists to take advantage of quantum computing’s power for solving problems, classical physics must be translated into the language of quantum mechanics. The researchers tackled this translational challenge, and in their paper, they reveal that a Dyson map can bridge the translational divide between classical physics and quantum mechanics. Maps are mathematical functions that demonstrate how to take an input from one kind of space and transform it to an output that is meaningful in a different kind of space. In the case of Maxwell’s equations, a Dyson map allows classical electromagnetic waves to be studied in the space utilized by quantum computers. In essence, it reconfigures the square peg so it will fit into the round hole without compromising any physics.
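
    One standard way to make this concrete in vacuum (a simplified cousin of the paper’s construction for dielectric media) is to combine the fields into the Riemann–Silberstein vector $\mathbf{F} = \mathbf{E} + ic\mathbf{B}$, which turns the source-free Maxwell equations into a Schrödinger-like evolution,

    $$i\,\partial_t \mathbf{F} = c\,\nabla \times \mathbf{F},$$

    where the curl operator plays the role of a Hermitian Hamiltonian, so the evolution is unitary, which is exactly the kind of dynamics a quantum computer natively implements. In a spatially varying dielectric the analogous operator is no longer Hermitian, and the Dyson map, built loosely speaking from factors involving the material’s permittivity, is the transformation that restores a unitary form.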

    The work also gives a blueprint of a quantum circuit encoded with equations expressed in quantum bits (“qubits”) rather than classical bits so the equations may be used on quantum computers. Most importantly, these blueprints can be coded and tested on classical computers.

    “For years we have been studying wave phenomena in plasma physics and fusion energy science using classical techniques. Quantum computing and quantum information science is challenging us to step out of our comfort zone, thereby ensuring that I have not ‘become comfortably numb,’” says Ram, quoting a Pink Floyd song.

    The paper’s Dyson map and circuits have put quantum computing power within reach, fast-tracking an improved understanding of plasmas and electromagnetic waves, and putting us that much closer to the ideal fusion device design.

  • Advancing material innovation to address the polymer waste crisis

    Products made from polymers — ranging from plastic bags to clothing to cookware to electronics — provide many comforts and support today’s standard of living, but since they do not decompose easily, they pose long-term environmental challenges. Developing polymers, a large class of materials, with a more sustainable life cycle is a critical step in making progress toward a green economy and addressing this piece of the global climate change crisis. The development of biodegradable polymers, however, remains limited by current biodegradation testing methods.

    To address this limitation, a team of MIT researchers led by Bradley D. Olsen, the Alexander and I. Michael Kasser (1960) Professor in the Department of Chemical Engineering, has developed an expansive biodegradation dataset to help determine whether or not a polymer is biodegradable.

    Their findings were recently published in the Proceedings of the National Academy of Sciences (PNAS), a peer-reviewed journal of the National Academy of Sciences (NAS), in a paper titled “High-Throughput Experimentation for Discovery of Biodegradable Polyesters.” The MIT team is led by Olsen and PhD candidates Katharina A. Fransen and Sarah H. M. Av-Ron, and also includes postdoc Dylan J. Walsh and undergraduate students Tess R. Buchanan, Dechen T. Rota, and Lana Van Note.

    “Despite polymer waste being a known and significant contributor to the climate crisis, the study of polymer biodegradation has been limited to a small number of polymers because current biodegradation testing methods are time- and resource-intensive,” says Olsen. “This limited scope slows new material innovation, so we are working to open that up to a much broader portfolio of materials.”

    Unique high-throughput approach

    The dataset Olsen’s team has developed, with support from the MIT Climate and Sustainability Consortium (MCSC), the Abdul Latif Jameel Water and Food Systems Lab (J-WAFS), and DIC Corporation, includes more than 600 distinct polyester chemistries.

    “The ingenuity of our work is pushing the screening to be high-throughput, which accelerates the pace of discovery,” says Av-Ron. High-throughput synthesis methods enable large quantities of samples to be screened rapidly, identifying products with the desired property or function. In this case, the high-throughput approach used a method called a clear-zone assay, which detects polymer biofragmentation and identifies polymer-degrading bacteria. The biodegradation dataset can then yield structure-property relationships, a concept central to materials science and engineering in which links between chemical details and properties are established, and be used to build a biodegradation prediction model. When developing these models to predict biodegradation, the researchers were interested in the potential linearity and nonlinearity of the relationships between structure and biodegradability.

    “We consider our scientific breakthrough to be having this large dataset, and the qualitative relationships and predictive models such a substantial amount of data enabled,” adds Av-Ron. “It was captivating to figure out how to integrate the high complexity of polymer chemical representation with predictive machine-learning models. I was very excited to get a validation accuracy of 82 percent for one representation/model combination. With additional data we might be able to improve our predictions even more.”
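
    A minimal sketch of this kind of structure-to-property classifier, assuming a precomputed feature matrix X of polymer chemical representations and binary biodegradation labels y (the stand-in data, featurization, and model choice are illustrative; the paper’s actual representations and models may differ):

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    # Stand-in data: 600 polyesters x 64 chemical descriptors, with binary labels.
    rng = np.random.default_rng(42)
    X = rng.normal(size=(600, 64))        # e.g., monomer compositions and descriptors
    y = rng.integers(0, 2, size=600)      # 1 = degraded in the clear-zone assay

    model = RandomForestClassifier(n_estimators=300, random_state=0)
    # Cross-validated accuracy estimates how well the model generalizes
    # to polyester chemistries it has never seen.
    scores = cross_val_score(model, X, y, cv=5, scoring="accuracy")
    print(f"validation accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
    ```

    With real chemical features in place of the random stand-ins, a loop like this is how a validation accuracy such as the 82 percent quoted above would be measured.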

    The team’s work focuses largely on polyesters; the development of biodegradable polyesters presents a key opportunity for addressing the polymer sustainability crisis and reducing the environmental burden of the polymer life cycle.

    One strain of bacteria, many chemistries

    The biodegradation test that generated these data is accessible and cost-effective to put in place; initial industry feedback has been positive. The datasets are also more reproducible than many other standards in this space.

    “With our method, there is one strain of bacteria, so you know exactly what you’re testing,” says Av-Ron. This speaks to the uniqueness of the team’s approach.

    “When polymers are developed, normally the strength of the material is examined first, and then once the material is developed, whether or not it biodegrades comes second,” says Fransen.

    Olsen and his team are examining the opposite — developing the biodegradability screen first, to help filter and focus what to look for in a material. This way, the team’s infrastructure can assess a lot of different options, quickly.

    “There has been big movement recently in developing sustainable polymers,” concludes Fransen, “and having something like this that is quick, tangible, and relatively inexpensive, could add a lot of value to that community.”

    Fransen received a 2022 J-WAFS Fellowship for this work, and she and Av-Ron together won second place in the 2022 J-WAFS World Food Day Student Video Competition, as this research can be applied to creating more sustainable food packaging.

  • Megawatt electrical motor designed by MIT engineers could help electrify aviation

    Aviation’s huge carbon footprint could shrink significantly with electrification. To date, however, only small all-electric planes have gotten off the ground. Their electric motors generate hundreds of kilowatts of power. To electrify larger, heavier jets, such as commercial airliners, megawatt-scale motors are required. These would be propelled by hybrid or turbo-electric propulsion systems where an electrical machine is coupled with a gas turbine aero-engine.

    To meet this need, a team of MIT engineers is now creating a 1-megawatt motor that could be a key stepping stone toward electrifying larger aircraft. The team has designed and tested the major components of the motor, and shown through detailed computations that the coupled components can work as a whole to generate one megawatt of power, at a weight and size competitive with current small aero-engines.

    For all-electric applications, the team envisions the motor could be paired with a source of electricity such as a battery or a fuel cell. The motor could then turn the electrical energy into mechanical work to power a plane’s propellers. The electrical machine could also be paired with a traditional turbofan jet engine to run as a hybrid propulsion system, providing electric propulsion during certain phases of a flight.

    “No matter what we use as an energy carrier — batteries, hydrogen, ammonia, or sustainable aviation fuel — independent of all that, megawatt-class motors will be a key enabler for greening aviation,” says Zoltan Spakovszky, the T. Wilson Professor in Aeronautics and the Director of the Gas Turbine Laboratory (GTL) at MIT, who leads the project.

    Spakovszky and members of his team, along with industry collaborators, will present their work at a special session of the American Institute of Aeronautics and Astronautics – Electric Aircraft Technologies Symposium (EATS) at the Aviation conference in June.

    The MIT team is composed of faculty, students, and research staff from GTL and the MIT Laboratory for Electromagnetic and Electronic Systems: Henry Andersen, Yuankang Chen, Zachary Cordero, David Cuadrado, Edward Greitzer, Charlotte Gump, James Kirtley, Jr., Jeffrey Lang, David Otten, David Perreault, and Mohammad Qasim, along with Marc Amato of Innova-Logic LLC. The project is sponsored by Mitsubishi Heavy Industries (MHI).

    Heavy stuff

    To prevent the worst impacts from human-induced climate change, scientists have determined that global emissions of carbon dioxide must reach net zero by 2050. Meeting this target for aviation, Spakovszky says, will require “step-change achievements” in the design of unconventional aircraft, smart and flexible fuel systems, advanced materials, and safe and efficient electrified propulsion. Multiple aerospace companies are focused on electrified propulsion and the design of megawatt-scale electric machines that are powerful and light enough to propel passenger aircraft.

    “There is no silver bullet to make this happen, and the devil is in the details,” Spakovszky says. “This is hard engineering, in terms of co-optimizing individual components and making them compatible with each other while maximizing overall performance. To do this means we have to push the boundaries in materials, manufacturing, thermal management, structures and rotordynamics, and power electronics.”

    Broadly speaking, an electric motor uses electromagnetic force to generate motion. Electric motors, such as those that power the fan in your laptop, use electrical energy — from a battery or power supply — to generate a magnetic field, typically through copper coils. In response, a magnet, set near the coils, then spins in the direction of the generated field and can then drive a fan or propeller.

    Electric machines have been around for over 150 years, with the understanding that the bigger the appliance or vehicle, the larger the copper coils and the magnetic rotor, making the machine heavier. The more power the electrical machine generates, the more heat it produces, which requires additional elements to keep the components cool — all of which can take up space and add significant weight to the system, making it challenging for airplane applications.

    “Heavy stuff doesn’t go on airplanes,” Spakovszky says. “So we had to come up with a compact, lightweight, and powerful architecture.”

    Good trajectory

    As designed, the MIT electric motor and power electronics are each about the size of a checked suitcase, weighing less than an adult passenger.

    The motor’s main components are: a high-speed rotor, lined with an array of magnets with varying orientation of polarity; a compact low-loss stator that fits inside the rotor and contains an intricate array of copper windings; an advanced heat exchanger that keeps the components cool while transmitting the torque of the machine; and a distributed power electronics system, made from 30 custom-built circuit boards, that precisely changes the currents running through each of the stator’s copper windings at high frequency.

    “I believe this is the first truly co-optimized integrated design,” Spakovszky says. “Which means we did a very extensive design space exploration where all considerations from thermal management, to rotor dynamics, to power electronics and electrical machine architecture were assessed in an integrated way to find out what is the best possible combination to get the required specific power at one megawatt.”

    As a whole system, the motor is designed such that the distributed circuit boards are close coupled with the electrical machine to minimize transmission loss and to allow effective air cooling through the integrated heat exchanger.

    “This is a high-speed machine, and to keep it rotating while creating torque, the magnetic fields have to be traveling very quickly, which we can do through our circuit boards switching at high frequency,” Spakovszky says.
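
    The arithmetic behind that statement follows from basic machine relations (the numbers here are assumptions for illustration, not the MIT design’s specifications): a rotor with $p$ magnetic pole pairs spinning at $N$ revolutions per minute requires a stator electrical frequency of

    $$f_e = \frac{p\,N}{60}.$$

    A hypothetical 10-pole-pair rotor at 12,000 rpm, for instance, would need currents driven at $f_e = 2$ kHz, and the power electronics must switch many times faster still to synthesize clean current waveforms.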

    To mitigate risk, the team has built and tested each of the major components individually, and shown that they can operate as designed and at conditions exceeding normal operational demands. The researchers plan to assemble the first fully working electric motor, and start testing it in the fall.

    “The electrification of aircraft has been on a steady rise,” says Phillip Ansell, director of the Center for Sustainable Aviation at the University of Illinois Urbana-Champaign, who was not involved in the project. “This group’s design uses a wonderful combination of conventional and cutting-edge methods for electric machine development, allowing it to offer both robustness and efficiency to meet the practical needs of aircraft of the future.”

    Once the MIT team can demonstrate the electric motor as a whole, they say the design could power regional aircraft and could also be a companion to conventional jet engines, to enable hybrid-electric propulsion systems. The team also envisions that multiple one-megawatt motors could power multiple fans distributed along the wing on future aircraft configurations. Looking ahead, the foundations of the one-megawatt electrical machine design could potentially be scaled up to multi-megawatt motors, to power larger passenger planes.

    “I think we’re on a good trajectory,” says Spakovszky, whose group and research have focused on more than just gas turbines. “We are not electrical engineers by training, but addressing the 2050 climate grand challenge is of utmost importance; working with electrical engineering faculty, staff and students for this goal can draw on MIT’s breadth of technologies so the whole is greater than the sum of the parts. So we are reinventing ourselves in new areas. And MIT gives you the opportunity to do that.”

  • River erosion can shape fish evolution, study suggests

    If we could rewind the tape of species evolution around the world and play it forward over hundreds of millions of years to the present day, we would see biodiversity clustering around regions of tectonic turmoil. Tectonically active regions such as the Himalayan and Andean mountains are especially rich in flora and fauna due to their shifting landscapes, which act to divide and diversify species over time.

    But biodiversity can also flourish in some geologically quieter regions, where tectonics hasn’t shaken up the land for millennia. The Appalachian Mountains are a prime example: The range has not seen much tectonic activity in hundreds of millions of years, and yet the region is a notable hotspot of freshwater biodiversity.

    Now, an MIT study identifies a geological process that may shape the diversity of species in tectonically inactive regions. In a paper appearing today in Science, the researchers report that river erosion can be a driver of biodiversity in these older, quieter environments.

    They make their case in the southern Appalachians, and specifically the Tennessee River Basin, a region known for its huge diversity of freshwater fishes. The team found that as rivers eroded through different rock types in the region, the changing landscape pushed a species of fish known as the greenfin darter into different tributaries of the river network. Over time, these separated populations developed into their own distinct lineages.

    The team speculates that erosion likely drove the greenfin darter to diversify. Although the separated populations appear outwardly similar, with the greenfin darter’s characteristic green-tinged fins, they differ substantially in their genetic makeup. For now, the separated populations are classified as one single species. 

    “Give this process of erosion more time, and I think these separate lineages will become different species,” says Maya Stokes PhD ’21, who carried out part of the work as a graduate student in MIT’s Department of Earth, Atmospheric and Planetary Sciences (EAPS).

    The greenfin darter may not be the only species to diversify as a consequence of river erosion. The researchers suspect that erosion may have driven many other species to diversify throughout the basin, and possibly other tectonically inactive regions around the world.

    “If we can understand the geologic factors that contribute to biodiversity, we can do a better job of conserving it,” says Taylor Perron, the Cecil and Ida Green Professor of Earth, Atmospheric, and Planetary Sciences at MIT.

    The study’s co-authors include collaborators at Yale University, Colorado State University, the University of Tennessee, the University of Massachusetts at Amherst, and the Tennessee Valley Authority (TVA). Stokes is currently an assistant professor at Florida State University.

    Fish in trees

    The new study grew out of Stokes’ PhD work at MIT, where she and Perron were exploring connections between geomorphology (the study of how landscapes evolve) and biology. They came across work at Yale by Thomas Near, who studies lineages of North American freshwater fishes. Near uses DNA sequence data collected from freshwater fishes across various regions of North America to show how and when certain species evolved and diverged in relation to each other.

    Near brought a curious observation to the team: a habitat distribution map of the greenfin darter showing that the fish was found in the Tennessee River Basin — but only in the southern half. What’s more, Near had mitochondrial DNA sequence data showing that the fish’s populations appeared to be different in their genetic makeup depending on the tributary in which they were found.

    To investigate the reasons for this pattern, Stokes gathered greenfin darter tissue samples from Near’s extensive collection at Yale, as well as from the field with help from TVA colleagues. She then analyzed DNA sequences from across the entire genome, and compared the genes of each individual fish to every other fish in the dataset. The team then created a phylogenetic tree of the greenfin darter, based on the genetic similarity between fish.

    From this tree, they observed that fish within a tributary were more related to each other than to fish in other tributaries. What’s more, fish within neighboring tributaries were more similar to each other than fish from more distant tributaries.

    “Our question was, could there have been a geological mechanism that, over time, took this single species, and splintered it into different, genetically distinct groups?” Perron says.

    A changing landscape

    Stokes and Perron started to observe a “tight correlation” between greenfin darter habitats and the type of rock where they are found. In particular, much of the southern half of the Tennessee River Basin, where the species abounds, is made of metamorphic rock, whereas the northern half consists of sedimentary rock, where the fish are not found.

    They also observed that the rivers running through metamorphic rock are steeper and more narrow, which generally creates more turbulence, a characteristic greenfin darters seem to prefer. The team wondered: Could the distribution of greenfin darter habitat have been shaped by a changing landscape of rock type, as rivers eroded into the land over time?

    To check this idea, the researchers developed a model to simulate how a landscape evolves as rivers erode through various rock types. They fed the model information about the rock types in the Tennessee River Basin today, then ran the simulation back to see how the same region may have looked millions of years ago, when more metamorphic rock was exposed.
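
    A minimal sketch of such a simulation, using the standard stream-power erosion law with a rock-type-dependent erodibility (a generic illustration with made-up parameters, not the study’s actual model):

    ```python
    import numpy as np

    # 1D river profile eroding by the stream-power law, dz/dt = U - K * A**m * S**n,
    # where the erodibility K depends on which rock type is currently exposed.
    n_nodes, dx, dt = 200, 500.0, 1000.0          # nodes, spacing [m], time step [yr]
    x = np.arange(n_nodes) * dx                   # distance upstream of the outlet
    z = 1e-3 * x                                  # initial gentle profile [m]
    A = (x[::-1] + dx) * 1e3                      # crude drainage-area proxy [m^2]
    U, m, n = 1e-4, 0.5, 1.0                      # uplift rate [m/yr] and exponents

    K_metamorphic, K_sedimentary = 2e-6, 8e-6     # harder rock erodes more slowly
    contact = 50.0                                # elevation of the rock contact [m]

    for _ in range(5000):
        S = np.maximum(np.gradient(z, dx), 1e-6)  # local slope, kept positive
        K = np.where(z > contact, K_sedimentary, K_metamorphic)
        z += dt * (U - K * A**m * S**n)
        z[0] = 0.0                                # fixed base level at the outlet
    ```

    In a setup like this, channels cutting down through the softer cover rock slow abruptly when they reach the harder rock beneath, changing which parts of the network stay connected as the landscape evolves.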

    They then ran the model forward and observed how the exposure of metamorphic rock shrank over time. They took special note of where and when connections between tributaries crossed into non-metamorphic rock, blocking fish from passing between those tributaries. They drew up a simple timeline of these blocking events and compared this to the phylogenetic tree of diverging greenfin darters. The two were remarkably similar: The fish seemed to form separate lineages in the same order as when their respective tributaries became separated from the others.

    “It means it’s plausible that erosion through different rock layers caused isolation between different populations of the greenfin darter and caused lineages to diversify,” Stokes says.

    “This study is highly compelling because it reveals a much more subtle but powerful mechanism for speciation in passive margins,” says Josh Roering, professor of Earth sciences at the University of Oregon, who was not involved in the study. “Stokes and Perron have revealed some of the intimate connections between aquatic species and geology that may be much more common than we realize.”

    This research was supported, in part, by the mTerra Catalyst Fund and the U.S. National Science Foundation through the AGeS Geochronology Program and the Graduate Research Fellowship Program. While at MIT, Stokes received support through the Martin Fellowship for Sustainability and the Hugh Hampton Young Fellowship.