More stories

  • Shining a light on oil fields to make them more sustainable

    Operating an oil field is complex and there is a staggeringly long list of things that can go wrong.

    One of the most common problems is spills of the salty brine that’s a toxic byproduct of pumping oil. Another is over- or under-pumping, which can lead to machine failure and methane leaks. (The oil and gas industry is the largest industrial emitter of methane in the U.S.) Then there are extreme weather events, ranging from winter frosts to blazing heat, that can put equipment out of commission for months. One of the wildest problems Sebastien Mannai SM ’14, PhD ’18 has encountered is hogs that pop open oil tanks with their snouts to enjoy on-demand oil baths.

    Mannai helps oil field owners detect and respond to these problems while optimizing the operation of their machinery to prevent the issues from occurring in the first place. He is the founder and CEO of Amplified Industries, a company selling oil field monitoring and control tools that help make the industry more efficient and sustainable.

    Amplified Industries’ sensors and analytics give oil well operators real-time alerts when things go wrong, allowing them to respond to issues before they become disasters.

    “We’re able to find 99 percent of the issues affecting these machines, from mechanical failures to human errors, including issues happening thousands of feet underground,” Mannai explains. “With our AI solution, operators can put the wells on autopilot, and the system automatically adjusts or shuts the well down as soon as there’s an issue.”

    Amplified currently works with private companies, in states from Texas to Wyoming, that own and operate as many as 3,000 wells. Such companies make up the majority of oil well operators in the U.S. and operate both new and older, more failure-prone equipment that has been in the field for decades.

    Such operators also have a harder time responding to environmental regulations like the Environmental Protection Agency’s new methane guidelines, which seek to dramatically reduce emissions of the potent greenhouse gas in the industry over the next few years.

    “These operators don’t want to be releasing methane,” Mannai explains. “Additionally, when gas gets into the pumping equipment, it leads to premature failures. We can detect gas and slow the pump down to prevent it. It’s the best of both worlds: The operators benefit because their machines are working better, saving them money while also giving them a smaller environmental footprint with fewer spills and methane leaks.”

    Leveraging “every MIT resource I possibly could”

    Mannai learned about the cutting-edge technology used in the space and aviation industries as he pursued his master’s degree at the Gas Turbine Laboratory in MIT’s Department of Aeronautics and Astronautics. Then, during his PhD at MIT, he worked with an oil services company and discovered the oil and gas industry was still relying on decades-old technologies and equipment.

    “When I first traveled to the field, I could not believe how old-school the actual operations were,” says Mannai, who has previously worked in rocket engine and turbine factories. “A lot of oil wells have to be adjusted by feel and rules of thumb. The operators have been let down by industrial automation and data companies.”

    Monitoring oil wells for problems typically requires someone in a pickup truck to drive hundreds of miles between wells looking for obvious issues, Mannai says. The sensors that are deployed are expensive and difficult to replace. Over time, they’re also often damaged in the field to the point of being unusable, forcing technicians to make educated guesses about the status of each well.

    “We often see that equipment unplugged or programmed incorrectly because it is incredibly over-complicated and ill-designed for the reality of the field,” Mannai says. “Workers on the ground often have to rip it out and bypass the control system to pump by hand. That’s how you end up with so many spills and wells pumping at suboptimal levels.”

    To build a better oil field monitoring system, Mannai received support from the MIT Sandbox Innovation Fund and the Venture Mentoring Service (VMS). He also participated in the delta V summer accelerator at the Martin Trust Center for MIT Entrepreneurship, the fuse program during IAP, and the MIT I-Corps program, and took a number of classes at the MIT Sloan School of Management. In 2019, Amplified Industries — which operated under the name Acoustic Wells until recently — won the MIT $100K Entrepreneurship competition.

    “My approach was to sign up to every possible entrepreneurship-related program and to leverage every MIT resource I possibly could,” Mannai says. “MIT was amazing for us.”

    Mannai officially launched the company after his postdoc at MIT, and Amplified raised its first round of funding in early 2020. That year, Amplified’s small team moved into the Greentown Labs startup incubator in Somerville.

    Mannai says building the company’s battery-powered, low-cost sensors was a huge challenge. The sensors run machine-learning inference models and their batteries last for 10 years. They also had to be able to handle extreme conditions, from the scorching hot New Mexico desert to the swamps of Louisiana and the freezing cold winters in North Dakota.

    “We build very rugged, resilient hardware; it’s a must in those environments,” Mannai says. “But it’s also very simple to deploy, so if a device does break, it’s like changing a lightbulb: We ship them a new one and it takes them a couple of minutes to swap it out.”

    Customers equip each well with four or five of Amplified’s sensors, which attach to the well’s cables and pipes to measure variables like tension, pressure, and amps. Vast amounts of data are then sent to Amplified’s cloud and processed by their analytics engine. Signal processing methods and AI models are used to diagnose problems and control the equipment in real-time, while generating notifications for the operators when something goes wrong. Operators can then remotely adjust the well or shut it down.
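    In rough terms, the control loop described here — stream readings in, compare them against each well’s recent baseline, then alert or shut the well in — might look like the sketch below. The data fields, thresholds, and anomaly rule are illustrative assumptions, not Amplified’s actual software.

    ```python
    # Minimal sketch of a real-time well-monitoring loop (illustrative only).
    from dataclasses import dataclass

    @dataclass
    class SensorReading:
        well_id: str
        tension_lbf: float    # cable/rod tension
        pressure_psi: float   # line pressure
        current_amps: float   # pump motor current

    def is_anomalous(reading: SensorReading, history: list) -> bool:
        """Flag readings that deviate sharply from this well's recent baseline."""
        if len(history) < 10:
            return False                   # not enough data yet to form a baseline
        baseline = sum(r.current_amps for r in history[-10:]) / 10
        return abs(reading.current_amps - baseline) > 0.5 * baseline

    def process(reading: SensorReading, history: list) -> str:
        """Continue pumping, or shut the well in and notify the operator."""
        if is_anomalous(reading, history):
            return "SHUT_IN_AND_ALERT"     # e.g., suspected gas interference or mechanical failure
        history.append(reading)
        return "CONTINUE"

    history = []
    print(process(SensorReading("well-17", 8500.0, 120.0, 42.0), history))  # CONTINUE
    ```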

    “That’s where AI is important, because if you just record everything and put it in a giant dashboard, you create way more work for people,” Mannai says. “The critical part is the ability to process and understand this newly recorded data and make it readily usable in the real world.”

    Amplified’s dashboard is customized for different people in the company, so field technicians can quickly respond to problems and managers or owners can get a high-level view of how everything is running.

    Mannai says that often, when Amplified’s sensors are installed, they’ll immediately start detecting problems that were unknown to engineers and technicians in the field. To date, Amplified has prevented hundreds of thousands of gallons’ worth of brine water spills, which are particularly damaging to surrounding vegetation because of their high salt and sulfur content.

    Preventing those spills is only part of Amplified’s positive environmental impact; the company is now turning its attention toward the detection of methane leaks.

    Helping a changing industry

    The EPA’s proposed new Waste Emissions Charge for oil and gas companies would start at $900 per metric ton of reported methane emissions in 2024 and increase to $1,500 per metric ton in 2026 and beyond.

    Mannai says Amplified is well-positioned to help companies comply with the new rules. Its equipment has already shown it can detect various kinds of leaks across the field, purely based on analytics of existing data.

    “Detecting methane leaks typically requires someone to walk around every valve and piece of piping with a thermal camera or sniffer, but these operators often have thousands of valves and hundreds of miles of pipes,” Mannai says. “What we see in the field is that a lot of times people don’t know where the pipes are because oil wells change owners so frequently, or they will miss an intermittent leak.”

    Ultimately, Mannai believes a strong data backend and modernized sensing equipment will become the backbone of the industry, and are a necessary prerequisite to both improving efficiency and cleaning up operations.

    “We’re selling a service that ensures your equipment is working optimally all the time,” Mannai says. “That means a lot fewer fines from the EPA, but it also means better-performing equipment. There’s a mindset change happening across the industry, and we’re helping make that transition as easy and affordable as possible.”

  • A new sensor detects harmful “forever chemicals” in drinking water

    MIT chemists have designed a sensor that detects tiny quantities of perfluoroalkyl and polyfluoroalkyl substances (PFAS) — chemicals found in food packaging, nonstick cookware, and many other consumer products.

    These compounds, also known as “forever chemicals” because they do not break down naturally, have been linked to a variety of harmful health effects, including cancer, reproductive problems, and disruption of the immune and endocrine systems.

    Using the new sensor technology, the researchers showed that they could detect PFAS levels as low as 200 parts per trillion in a water sample. The device they designed could offer a way for consumers to test their drinking water, and it could also be useful in industries that rely heavily on PFAS chemicals, including the manufacture of semiconductors and firefighting equipment.

    “There’s a real need for these sensing technologies. We’re stuck with these chemicals for a long time, so we need to be able to detect them and get rid of them,” says Timothy Swager, the John D. MacArthur Professor of Chemistry at MIT and the senior author of the study, which appears this week in the Proceedings of the National Academy of Sciences.

    Other authors of the paper are former MIT postdoc and lead author Sohyun Park and MIT graduate student Collette Gordon.

    Detecting PFAS

    Coatings containing PFAS chemicals are used in thousands of consumer products. In addition to nonstick coatings for cookware, they are also commonly used in water-repellent clothing, stain-resistant fabrics, grease-resistant pizza boxes, cosmetics, and firefighting foams.

    These fluorinated chemicals, which have been in widespread use since the 1950s, can be released into water, air, and soil, from factories, sewage treatment plants, and landfills. They have been found in drinking water sources in all 50 states.

    In 2023, the Environmental Protection Agency created an “advisory health limit” for two of the most hazardous PFAS chemicals, known as perfluorooctanoic acid (PFOA) and perfluorooctyl sulfonate (PFOS). These advisories call for a limit of 0.004 parts per trillion for PFOA and 0.02 parts per trillion for PFOS in drinking water.

    Currently, the only way that a consumer could determine if their drinking water contains PFAS is to send a water sample to a laboratory that performs mass spectrometry testing. However, this process takes several weeks and costs hundreds of dollars.

    To create a cheaper and faster way to test for PFAS, the MIT team designed a sensor based on lateral flow technology — the same approach used for rapid Covid-19 tests and pregnancy tests. Instead of a test strip coated with antibodies, the new sensor is embedded with a special polymer known as polyaniline, which can switch between semiconducting and conducting states when protons are added to the material.

    The researchers deposited these polymers onto a strip of nitrocellulose paper and coated them with a surfactant that can pull fluorocarbons such as PFAS out of a drop of water placed on the strip. When this happens, protons from the PFAS are drawn into the polyaniline and turn it into a conductor, reducing the electrical resistance of the material. This change in resistance, which can be measured precisely using electrodes and sent to an external device such as a smartphone, gives a quantitative measurement of how much PFAS is present.
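    As a rough illustration of how such a resistance readout becomes a number, the sketch below interpolates a measured resistance drop against a calibration curve built from standards of known concentration. The calibration values and the interpolation approach are invented for illustration; they are not the paper’s data or method.

    ```python
    import numpy as np

    # Hypothetical calibration standards: PFAS concentration (parts per trillion)
    # versus the fractional drop in the polyaniline strip's resistance.
    cal_ppt  = np.array([0.0, 200.0, 400.0, 800.0, 1600.0])
    cal_drop = np.array([0.00, 0.05, 0.09, 0.16, 0.27])

    def estimate_ppt(r_baseline_ohm: float, r_sample_ohm: float) -> float:
        """Estimate PFAS concentration from the relative change in strip resistance."""
        drop = (r_baseline_ohm - r_sample_ohm) / r_baseline_ohm
        return float(np.interp(drop, cal_drop, cal_ppt))

    print(estimate_ppt(1.0e5, 9.1e4))  # a 9 percent drop maps to ~400 ppt on this made-up curve
    ```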

    This approach works only with PFAS that are acidic, which includes two of the most harmful PFAS — PFOA and perfluorobutanoic acid (PFBA).

    A user-friendly system

    The current version of the sensor can detect concentrations as low as 200 parts per trillion for PFBA, and 400 parts per trillion for PFOA. This is not quite low enough to meet the current EPA guidelines, but the sensor uses only a fraction of a milliliter of water. The researchers are now working on a larger-scale device that would be able to filter about a liter of water through a membrane made of polyaniline, and they believe this approach should increase the sensitivity by more than a hundredfold, with the goal of meeting the very low EPA advisory levels.

    “We do envision a user-friendly, household system,” Swager says. “You can imagine putting in a liter of water, letting it go through the membrane, and you have a device that measures the change in resistance of the membrane.”

    Such a device could offer a less expensive, rapid alternative to current PFAS detection methods. If PFAS are detected in drinking water, there are commercially available filters that can be used on household drinking water to reduce those levels. The new testing approach could also be useful for factories that manufacture products with PFAS chemicals, so they could test whether the water used in their manufacturing process is safe to release into the environment.

    The research was funded by an MIT School of Science Fellowship to Gordon, a Bose Research Grant, and a Fulbright Fellowship to Park.

  • Technologies for water conservation and treatment move closer to commercialization

    The Abdul Latif Jameel Water and Food Systems Lab (J-WAFS) provides Solutions Grants to help MIT researchers launch startup companies or products to commercialize breakthrough technologies in water and food systems. The Solutions Grant Program began in 2015 and is supported by Community Jameel. In addition to one-year, renewable grants of up to $150,000, the program also matches grantees with industry mentors and facilitates introductions to potential investors. Since its inception, the J-WAFS Solutions Program has awarded over $3 million in funding to the MIT community. Numerous startups and products, including a portable desalination device and a company commercializing a novel food safety sensor, have spun out of this support.

    The 2023 J-WAFS Solutions Grantees are Professor C. Cem Tasan of the Department of Materials Science and Engineering and Professor Andrew Whittle of the Department of Civil and Environmental Engineering. Tasan’s project involves reducing water use in steel manufacturing and Whittle’s project tackles harmful algal blooms in water. Project work commences this September.

    “This year’s Solutions Grants are being awarded to professors Tasan and Whittle to help commercialize technologies they have been developing at MIT,” says J-WAFS executive director Renee J. Robins. “With J-WAFS’ support, we hope to see the teams move their technologies from the lab to the market, so they can have a beneficial impact on water use and water quality challenges,” Robins adds.

    Reducing water consumption by solid-state steelmaking

    Water is a major requirement for steel production. The steel industry ranks fourth in industrial freshwater consumption worldwide, since large amounts of water are needed mainly for cooling purposes in the process. Unfortunately, a strong correlation has also been shown to exist between freshwater use in steelmaking and water contamination. As the global demand for steel increases and freshwater availability decreases due to climate change, improved methods for more sustainable steel production are needed.

    A strategy to reduce the water footprint of steelmaking is to explore steel recycling processes that avoid liquid metal processing. With this motivation, Cem Tasan, the Thomas B. King Associate Professor of Metallurgy in the Department of Materials Science and Engineering, and postdoc Onur Guvenc PhD created a new process called Scrap Metal Consolidation (SMC). SMC is based on a well-established metal forming process known as roll bonding. Conventionally, roll bonding requires intensive prior surface treatment of the raw material, specific atmospheric conditions, and high deformation levels. Tasan and Guvenc’s research revealed that SMC can overcome these restrictions by enabling the solid-state bonding of scrap into a sheet metal form, even when the surface quality, atmospheric conditions, and deformation levels are suboptimal. Through lab-scale proof-of-principle investigations, they have already identified SMC process conditions and validated the mechanical formability of resulting steel sheets, focusing on mild steel, the most common sheet metal scrap.

    The J-WAFS Solutions Grant will help the team to build customer product prototypes, design the processing unit, and develop a scale-up strategy and business model. By simultaneously decreasing water usage, energy demand, contamination risk, and carbon dioxide burden, SMC has the potential to decrease the energy need for steel recycling by up to 86 percent, as well as reduce the linked carbon dioxide emissions and safeguard the freshwater resources that would otherwise be directed to industrial consumption. 

    Detecting harmful algal blooms in water before it’s too late

    Harmful algal blooms (HABs) are a growing problem in both freshwater and saltwater environments worldwide, causing an estimated $13 billion in annual damage to drinking water, water for recreational use, commercial fishing areas, and desalination activities. HABs pose a threat to both human health and aquaculture, thereby threatening the food supply. Toxins in HABs are produced by some cyanobacteria, or blue-green algae, whose communities change in composition in response to eutrophication from agricultural runoff, sewer overflows, or other events. Mitigation of risks from HABs is most effective when there is advance warning of these changes in algal communities.

    Most in situ measurements of algae are based on fluorescence spectroscopy that is conducted with LED-induced fluorescence (LEDIF) devices, or probes that induce fluorescence of specific algal pigments using LED light sources. While LEDIFs provide reasonable estimates of concentrations of individual pigments, they lack resolution to discriminate algal classes within complex mixtures found in natural water bodies. In prior research, Andrew Whittle, the Edmund K. Turner Professor of Civil and Environmental Engineering, worked with colleagues to design REMORA, a low-cost, field-deployable prototype spectrofluorometer for measuring induced fluorescence. This research was part of a collaboration between MIT and the AMS Institute. Whittle and the team successfully trained a machine learning model to discriminate and quantify cell concentrations for mixtures of different algal groups in water samples through an extensive laboratory calibration program using various algae cultures. The group demonstrated these capabilities in a series of field measurements at locations in Boston and Amsterdam. 

    Whittle will work with Fábio Duarte of the Department of Urban Studies and Planning, the Senseable City Lab, and MIT’s Center for Real Estate to refine the design of REMORA. They will develop software for autonomous operation of the sensor that can be deployed remotely on mobile vessels or platforms to enable high-resolution spatiotemporal monitoring for harmful algae. Sensor commercialization will hopefully be able to exploit the unique capabilities of REMORA for long-term monitoring applications by water utilities, environmental regulatory agencies, and water-intensive industries.

  • Device offers long-distance, low-power underwater communication

    MIT researchers have demonstrated the first system for ultra-low-power underwater networking and communication, which can transmit signals across kilometer-scale distances.

    This technique, which the researchers began developing several years ago, uses about one-millionth the power that existing underwater communication methods use. By expanding their battery-free system’s communication range, the researchers have made the technology more feasible for applications such as aquaculture, coastal hurricane prediction, and climate change modeling.

    “What started as a very exciting intellectual idea a few years ago — underwater communication with a million times lower power — is now practical and realistic. There are still a few interesting technical challenges to address, but there is a clear path from where we are now to deployment,” says Fadel Adib, associate professor in the Department of Electrical Engineering and Computer Science and director of the Signal Kinetics group in the MIT Media Lab.

    Underwater backscatter enables low-power communication by encoding data in sound waves that a device reflects, or scatters, back toward a receiver. The team’s new innovations enable these reflected signals to be more precisely directed at their source.

    Due to this “retrodirectivity,” less signal scatters in the wrong directions, allowing for more efficient and longer-range communication.

    When tested in a river and an ocean, the retrodirective device exhibited a communication range that was more than 15 times farther than previous devices. However, the experiments were limited by the length of the docks available to the researchers.

    To better understand the limits of underwater backscatter, the team also developed an analytical model to predict the technology’s maximum range. The model, which they validated using experimental data, showed that their retrodirective system could communicate across kilometer-scale distances.

    The researchers shared these findings in two papers which will be presented at this year’s ACM SIGCOMM and MobiCom conferences. Adib, senior author on both papers, is joined on the SIGCOMM paper by co-lead authors Aline Eid, a former postdoc who is now an assistant professor at the University of Michigan, and Jack Rademacher, a research assistant; as well as research assistants Waleed Akbar and Purui Wang, and postdoc Ahmed Allam. The MobiCom paper is also written by co-lead authors Akbar and Allam.

    Communicating with sound waves

    Underwater backscatter communication devices utilize an array of nodes made from “piezoelectric” materials to receive and reflect sound waves. These materials produce an electric signal when mechanical force is applied to them.

    When sound waves strike the nodes, they vibrate and convert the mechanical energy to an electric charge. The nodes use that charge to scatter some of the acoustic energy back to the source, transmitting data that a receiver decodes based on the sequence of reflections.

    But because the backscattered signal travels in all directions, only a small fraction reaches the source, reducing the signal strength and limiting the communication range.

    To overcome this challenge, the researchers leveraged a 70-year-old radio device called a Van Atta array, in which symmetric pairs of antennas are connected in such a way that the array reflects energy back in the direction it came from.

    But connecting piezoelectric nodes to make a Van Atta array reduces their efficiency. The researchers avoided this problem by placing a transformer between pairs of connected nodes. The transformer, which transfers electric energy from one circuit to another, allows the nodes to reflect the maximum amount of energy back to the source.

    “Both nodes are receiving and both nodes are reflecting, so it is a very interesting system. As you increase the number of elements in that system, you build an array that allows you to achieve much longer communication ranges,” Eid explains.

    In addition, they used a technique called cross-polarity switching to encode binary data in the reflected signal. Each node has a positive and a negative terminal (like a car battery), so when the positive terminals of two nodes are connected and the negative terminals of two nodes are connected, that reflected signal is a “bit one.”

    But if the researchers switch the polarity, and the negative and positive terminals are connected to each other instead, then the reflection is a “bit zero.”

    “Just connecting the piezoelectric nodes together is not enough. By alternating the polarities between the two nodes, we are able to transmit data back to the remote receiver,” Rademacher explains.
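    Reduced to a toy sketch, the encoding amounts to a lookup from each data bit to a wiring state (the real system drives analog switches on the piezoelectric nodes; the names below are illustrative only):

    ```python
    def polarity_for_bit(bit: int) -> str:
        """How the two nodes' terminals are wired for a given data bit."""
        # bit 1: positive-to-positive and negative-to-negative (one reflection state)
        # bit 0: terminals swapped, positive-to-negative (the inverted reflection state)
        return "same-polarity" if bit else "cross-polarity"

    def encode(bits: list) -> list:
        """Map a bit sequence onto a sequence of reflection polarities."""
        return [polarity_for_bit(b) for b in bits]

    print(encode([1, 0, 1, 1]))
    # ['same-polarity', 'cross-polarity', 'same-polarity', 'same-polarity']
    ```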

    When building the Van Atta array, the researchers found that if the connected nodes were too close, they would block each other’s signals. They devised a new design with staggered nodes that enables signals to reach the array from any direction. With this scalable design, the more nodes an array has, the greater its communication range.

    They tested the array in more than 1,500 experimental trials in the Charles River in Cambridge, Massachusetts, and in the Atlantic Ocean, off the coast of Falmouth, Massachusetts, in collaboration with the Woods Hole Oceanographic Institution. The device achieved communication ranges of 300 meters, more than 15 times longer than they previously demonstrated.

    However, they had to cut the experiments short because they ran out of space on the dock.

    Modeling the maximum

    That inspired the researchers to build an analytical model to determine the theoretical and practical communication limits of this new underwater backscatter technology.

    Building off their group’s work on RFIDs, the team carefully crafted a model that captured the impact of system parameters, like the size of the piezoelectric nodes and the input power of the signal, on the underwater operation range of the device.

    “It is not a traditional communication technology, so you need to understand how you can quantify the reflection. What are the roles of the different components in that process?” Akbar says.

    For instance, the researchers needed to derive a function that captures the amount of signal reflected out of an underwater piezoelectric node with a specific size, which was among the biggest challenges of developing the model, he adds.

    They used these insights to create a plug-and-play model into which a user could enter information like input power and piezoelectric node dimensions, and receive an output showing the expected range of the system.
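    To give a sense of what such a plug-and-play estimator looks like in use, here is a toy version built on a simple two-way spreading-plus-absorption loss budget. The loss model and every constant below are assumptions chosen for illustration; this is not the team’s published model.

    ```python
    import math

    def max_range_m(source_level_db: float, node_area_m2: float,
                    receiver_floor_db: float = 60.0,
                    absorption_db_per_km: float = 1.0) -> float:
        """Largest range at which the reflected signal stays above the receiver floor."""
        # Crude assumption: larger nodes intercept and re-radiate more energy.
        node_gain_db = 10 * math.log10(node_area_m2 * 1e4)   # relative to 1 cm^2
        for r in range(1, 5001):                             # candidate ranges in meters
            spreading = 2 * 20 * math.log10(r)               # two-way spherical spreading
            absorption = 2 * absorption_db_per_km * r / 1000.0
            if source_level_db + node_gain_db - spreading - absorption < receiver_floor_db:
                return float(r - 1)
        return 5000.0

    # With these made-up inputs the toy model lands in the low-kilometer range.
    print(max_range_m(source_level_db=180.0, node_area_m2=0.01))
    ```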

    They evaluated the model on data from their experimental trials and found that it could accurately predict the range of retrodirected acoustic signals with an average error of less than one decibel.

    Using this model, they showed that an underwater backscatter array can potentially achieve kilometer-long communication ranges.

    “We are creating a new ocean technology and propelling it into the realm of the things we have been doing for 6G cellular networks. For us, it is very rewarding because we are starting to see this now very close to reality,” Adib says.

    The researchers plan to continue studying underwater backscatter Van Atta arrays, perhaps using boats so they could evaluate longer communication ranges. Along the way, they intend to release tools and datasets so other researchers can build on their work. At the same time, they are beginning to move toward commercialization of this technology.

    “Limited range has been an open problem in underwater backscatter networks, preventing them from being used in real-world applications. This paper takes a significant step forward in the future of underwater communication, by enabling them to operate on minimum energy while achieving long range,” says Omid Abari, assistant professor of computer science at the University of California at Los Angeles, who was not involved with this work. “The paper is the first to bring Van Atta Reflector array technique into underwater backscatter settings and demonstrate its benefits in improving the communication range by orders of magnitude. This can take battery-free underwater communication one step closer to reality, enabling applications such as underwater climate change monitoring and coastal monitoring.”

    This research was funded, in part, by the Office of Naval Research, the Sloan Research Fellowship, the National Science Foundation, the MIT Media Lab, and the Doherty Chair in Ocean Utilization.

  • Uncovering how biomes respond to climate change

    Before Leila Mirzagholi arrived at MIT’s Department of Civil and Environmental Engineering (CEE) to begin her postdoc appointment, she had spent most of her time in academia building cosmological models to detect properties of gravitational waves in the cosmos.

    But as a member of Assistant Professor César Terrer’s lab in CEE, Mirzagholi uses her physics and mathematical background to improve our understanding of the different factors that influence how much carbon land ecosystems can store under climate change.

    “What was always important to me was thinking about how to solve a problem and putting all the pieces together and building something from scratch,” Mirzagholi says, adding this was one of the reasons that it was possible for her to switch fields — and what drives her today as a climate scientist.

    Growing up in Iran, Mirzagholi knew she wanted to be a scientist from an early age. As a kid, she became captivated by physics, spending most of her free time in a local cultural center that hosted science events. “I remember in that center there was an observatory that held observational tours, and it drew me into science,” says Mirzagholi. She also remembers watching the science fiction film “Contact” as a kid, which introduces a female scientist who finds evidence of extraterrestrial life and builds a spaceship to make first contact: “After that movie my mind was set on pursuing astrophysics.”

    With the encouragement of her parents to develop a strong mathematical background before pursuing physics, she earned a bachelor’s degree in mathematics from Tehran University. She then completed a one-year master class in mathematics at Utrecht University before finishing her PhD in theoretical physics at the Max Planck Institute for Astrophysics in Munich. There, Mirzagholi’s thesis focused on developing cosmological models, with an emphasis on phenomenological aspects such as the propagation of gravitational waves on the cosmic microwave background.

    Midway through her PhD, Mirzagholi became discouraged with building models to explain the dynamics of the early universe, because there is little new data to test them against. “It starts to get personal and becomes a game of: ‘Is it my model or your model?’” she explains. She grew frustrated not knowing when the models she’d built would ever be tested.

    It was at this time that Mirzagholi started reading more about climate change and climate science. “I was really motivated by the problems and the nature of the problems, especially to make global terrestrial ecology more quantitative,” she says. She also liked the idea of contributing to a global problem that we are all facing. She started to think, “maybe I can do my part, I can work on research beneficial for society and the planet.”

    She made the switch following her PhD and started as a postdoc in the Crowther Lab at ETH Zurich, working on understanding the effects of environmental changes on global vegetation activity. After a stint at ETH, where her colleagues collaborated on projects with the Terrer Lab, she relocated to Cambridge, Massachusetts, to join the lab and CEE.

    Her latest article in Science, which was published in July and co-authored by researchers from ETH, shows how global warming affects the timing of autumn leaf senescence. “It’s important to understand the length of the growing season, and how much the forest or other biomes will have the capacity to take in carbon from the atmosphere.” Using remote sensing data, she was able to understand when the growing season will end under a warming climate. “We distinguish two dates — when autumn is onsetting and the leaves are starting to turn yellow, versus when the leaves are 50 percent yellow — to represent the progression of leaf senescence,” she says.

    In the context of rising temperature, when the warming is happening plays a crucial role. If warming temperatures happen before the summer solstice, it triggers trees to begin their seasonal cycles faster, leading to reduced photosynthesis, ending in an earlier autumn. On the other hand, if the warming happens after the summer solstice, it delays the discoloration process, making autumn last longer. “For every degree Celsius of pre-solstice warming, the onset of leaf senescence advances by 1.9 days, while each degree Celsius of post-solstice warming delays the senescence process by 2.6 days,” she explains. Understanding the timing of autumn leaf senescence is essential in efforts to predict carbon storage capacity when modeling global carbon cycles.
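    Expressed as a quick calculation using those coefficients (the warming values plugged in are hypothetical, and combining the two effects into one net shift is a simplification for illustration):

    ```python
    ADVANCE_DAYS_PER_C_PRE = 1.9   # senescence onset advances per degree C of pre-solstice warming
    DELAY_DAYS_PER_C_POST  = 2.6   # senescence is delayed per degree C of post-solstice warming

    def senescence_shift_days(pre_solstice_warming_c: float, post_solstice_warming_c: float) -> float:
        """Net shift in senescence timing (negative = earlier, positive = later)."""
        return (-ADVANCE_DAYS_PER_C_PRE * pre_solstice_warming_c
                + DELAY_DAYS_PER_C_POST * post_solstice_warming_c)

    print(senescence_shift_days(1.0, 0.5))  # -1.9 + 1.3 = -0.6 days, i.e. slightly earlier
    ```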

    Another problem she’s working on in the Terrer Lab is discovering how deforestation is changing our local climate. How much is it cooling or warming the temperature, and how is the hydrological cycle changing because of deforestation? Investigating these questions will give insight into how much we can depend on natural solutions for carbon uptake to help mitigate climate change. “Quantitatively, we want to put a number to the amount of carbon uptake from various natural solutions, as opposed to other solutions,” she says.

    With a year and a half left in her postdoc appointment, Mirzagholi has begun considering her next career steps. She likes the idea of applying to climate scientist jobs in industry or at national labs, as well as to tenure-track faculty positions. Whether she pursues a career in academia or industry, Mirzagholi aims to continue conducting fundamental climate science research. Her multidisciplinary background in physics, mathematics, and climate science has given her a multifaceted perspective, which she applies to every research problem.

    “Looking back, I’m grateful for all my educational experiences from spending time in the cultural center as a kid, my background in physics, the support from colleagues at the Crowther lab at ETH who facilitated my transition from physics to ecology, and now working at MIT alongside Professor Terrer, because it’s shaped my career path and the researcher I am today.”

  • A new dataset of Arctic images will spur artificial intelligence research

    As the U.S. Coast Guard (USCG) icebreaker Healy takes part in a voyage across the North Pole this summer, it is capturing images of the Arctic to further the study of this rapidly changing region. Lincoln Laboratory researchers installed a camera system aboard the Healy while at port in Seattle before it embarked on a three-month science mission on July 11. The resulting dataset, which will be one of the first of its kind, will be used to develop artificial intelligence tools that can analyze Arctic imagery.

    “This dataset not only can help mariners navigate more safely and operate more efficiently, but also help protect our nation by providing critical maritime domain awareness and an improved understanding of how AI analysis can be brought to bear in this challenging and unique environment,” says Jo Kurucar, a researcher in Lincoln Laboratory’s AI Software Architectures and Algorithms Group, which led this project.

    As the planet warms and sea ice melts, Arctic passages are opening up to more traffic, both from military vessels and from ships conducting illegal fishing. These movements may pose national security challenges to the United States. The opening Arctic also raises questions about how the region’s climate, wildlife, and geography are changing.

    Today, very few imagery datasets of the Arctic exist to study these changes. Overhead images from satellites or aircraft can only provide limited information about the environment. An outward-looking camera attached to a ship can capture more details of the setting and different angles of objects, such as other ships, in the scene. These types of images can then be used to train AI computer-vision tools, which can help the USCG plan naval missions and automate analysis. According to Kurucar, USCG assets in the Arctic are spread thin and can benefit greatly from AI tools, which can act as a force multiplier.

    The Healy is the USCG’s largest and most technologically advanced icebreaker. Given its current mission, it was a fitting candidate to be equipped with a new sensor to gather this dataset. The laboratory research team collaborated with the USCG Research and Development Center to determine the sensor requirements. Together, they developed the Cold Region Imaging and Surveillance Platform (CRISP).

    “Lincoln Laboratory has an excellent relationship with the Coast Guard, especially with the Research and Development Center. Over a decade, we’ve established ties that enabled the deployment of the CRISP system,” says Amna Greaves, the CRISP project lead and an assistant leader in the AI Software Architectures and Algorithms Group. “We have strong ties not only because of the USCG veterans working at the laboratory and in our group, but also because our technology missions are complementary. Today it was deploying infrared sensing in the Arctic; tomorrow it could be operating quadruped robot dogs on a fast-response cutter.”

    The CRISP system comprises a long-wave infrared camera, manufactured by Teledyne FLIR (for forward-looking infrared), that is designed for harsh maritime environments. The camera can stabilize itself during rough seas and image in complete darkness, fog, and glare. It is paired with a GPS-enabled time-synchronized clock and a network video recorder to record both video and still imagery along with GPS-positional data.  

    The camera is mounted at the front of the ship’s fly bridge, and the electronics are housed in a ruggedized rack on the bridge. The system can be operated manually from the bridge or be placed into an autonomous surveillance mode, in which it slowly pans back and forth, recording 15 minutes of video every three hours and a still image once every 15 seconds.
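    The duty cycle of that autonomous mode follows directly from those numbers (the intervals come from the description above; the scheduling code itself is an illustrative sketch, not the actual CRISP software):

    ```python
    def capture_schedule(hours: float):
        """Yield (time_s, action) events: a still every 15 s, a 15-minute video every 3 h."""
        for t in range(0, int(hours * 3600), 15):
            if t % (3 * 3600) == 0:
                yield t, "start 15-minute video recording"
            yield t, "capture still image"

    # One day of autonomous operation: 8 video recordings and 5,760 still images.
    events = list(capture_schedule(24))
    print(sum(1 for _, action in events if action.startswith("start")))       # 8
    print(sum(1 for _, action in events if action == "capture still image"))  # 5760
    ```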

    “The installation of the equipment was a unique and fun experience. As with any good project, our expectations going into the install did not meet reality,” says Michael Emily, the project’s IT systems administrator who traveled to Seattle for the install. Working with the ship’s crew, the laboratory team had to quickly adjust their route for running cables from the camera to the observation station after they discovered that the expected access points weren’t in fact accessible. “We had 100-foot cables made for this project just in case of this type of scenario, which was a good thing because we only had a few inches to spare,” Emily says.

    The CRISP project team plans to publicly release the dataset, anticipated to be about 4 terabytes in size, once the USCG science mission concludes in the fall.

    The goal in releasing the dataset is to enable the wider research community to develop better tools for those operating in the Arctic, especially as this region becomes more navigable. “Collecting and publishing the data allows for faster and greater progress than what we could accomplish on our own,” Kurucar adds. “It also enables the laboratory to engage in more advanced AI applications while others make more incremental advances using the dataset.”

    On top of providing the dataset, the laboratory team plans to provide a baseline object-detection model, from which others can make progress on their own models. More advanced AI applications planned for development are classifiers for specific objects in the scene and the ability to identify and track objects across images.

    Beyond assisting with USCG missions, this project could create an influential dataset for researchers looking to apply AI to data from the Arctic to help combat climate change, says Paul Metzger, who leads the AI Software Architectures and Algorithms Group.

    Metzger adds that the group was honored to be a part of this project and is excited to see the advances that come from applying AI to novel challenges facing the United States: “I’m extremely proud of how our group applies AI to the highest-priority challenges in our nation, from predicting outbreaks of Covid-19 and assisting the U.S. European Command in their support of Ukraine to now employing AI in the Arctic for maritime awareness.”

    Once the dataset is available, it will be free to download on the Lincoln Laboratory dataset website.

  • Study: The ocean’s color is changing as a consequence of climate change

    The ocean’s color has changed significantly over the last 20 years, and the global trend is likely a consequence of human-induced climate change, report scientists at MIT, the National Oceanography Center in the U.K., and elsewhere.  

    In a study appearing today in Nature, the team writes that they have detected changes in ocean color over the past two decades that cannot be explained by natural, year-to-year variability alone. These color shifts, though subtle to the human eye, have occurred over 56 percent of the world’s oceans — an expanse that is larger than the total land area on Earth.

    In particular, the researchers found that tropical ocean regions near the equator have become steadily greener over time. The shift in ocean color indicates that ecosystems within the surface ocean must also be changing, as the color of the ocean is a literal reflection of the organisms and materials in its waters.

    At this point, the researchers cannot say how exactly marine ecosystems are changing to reflect the shifting color. But they are pretty sure of one thing: Human-induced climate change is likely the driver.

    “I’ve been running simulations that have been telling me for years that these changes in ocean color are going to happen,” says study co-author Stephanie Dutkiewicz, senior research scientist in MIT’s Department of Earth, Atmospheric and Planetary Sciences and the Center for Global Change Science. “To actually see it happening for real is not surprising, but frightening. And these changes are consistent with man-induced changes to our climate.”

    “This gives additional evidence of how human activities are affecting life on Earth over a huge spatial extent,” adds lead author B. B. Cael PhD ’19 of the National Oceanography Center in Southampton, U.K. “It’s another way that humans are affecting the biosphere.”

    The study’s co-authors also include Stephanie Henson of the National Oceanography Center, Kelsey Bisson at Oregon State University, and Emmanuel Boss of the University of Maine.

    Above the noise

    The ocean’s color is a visual product of whatever lies within its upper layers. Generally, waters that are deep blue reflect very little life, whereas greener waters indicate the presence of ecosystems, mainly phytoplankton — plant-like microbes that are abundant in the upper ocean and contain the green pigment chlorophyll. The pigment helps plankton harvest sunlight, which they use to capture carbon dioxide from the atmosphere and convert it into sugars.

    Phytoplankton are the foundation of the marine food web that sustains progressively more complex organisms, on up to krill, fish, seabirds, and marine mammals. Phytoplankton are also a powerful muscle in the ocean’s ability to capture and store carbon dioxide. Scientists are therefore keen to monitor phytoplankton across the surface oceans and to see how these essential communities might respond to climate change. To do so, scientists have tracked changes in chlorophyll, based on the ratio of how much blue versus green light is reflected from the ocean surface, which can be monitored from space.

    But around a decade ago, Henson, who is a co-author of the current study, published a paper with others, which showed that, if scientists were tracking chlorophyll alone, it would take at least 30 years of continuous monitoring to detect any trend that was driven specifically by climate change. The reason, the team argued, was that the large, natural variations in chlorophyll from year to year would overwhelm any anthropogenic influence on chlorophyll concentrations. It would therefore take several decades to pick out a meaningful, climate-change-driven signal amid the normal noise.

    In 2019, Dutkiewicz and her colleagues published a separate paper, showing through a new model that the natural variation in other ocean colors is much smaller compared to that of chlorophyll. Therefore, any signal of climate-change-driven changes should be easier to detect over the smaller, normal variations of other ocean colors. They predicted that such changes should be apparent within 20, rather than 30 years of monitoring.

    “So I thought, doesn’t it make sense to look for a trend in all these other colors, rather than in chlorophyll alone?” Cael says. “It’s worth looking at the whole spectrum, rather than just trying to estimate one number from bits of the spectrum.”

     The power of seven

    In the current study, Cael and the team analyzed measurements of ocean color taken by the Moderate Resolution Imaging Spectroradiometer (MODIS) aboard the Aqua satellite, which has been monitoring ocean color for 21 years. MODIS takes measurements in seven visible wavelengths, including the two colors researchers traditionally use to estimate chlorophyll.

    The differences in color that the satellite picks up are too subtle for human eyes to differentiate. Much of the ocean appears blue to our eye, whereas the true color may contain a mix of subtler wavelengths, from blue to green and even red.

    Cael carried out a statistical analysis using all seven ocean colors measured by the satellite from 2002 to 2022 together. He first looked at how much the seven colors changed from region to region during a given year, which gave him an idea of their natural variations. He then zoomed out to see how these annual variations in ocean color changed over a longer stretch of two decades. This analysis turned up a clear trend, above the normal year-to-year variability.
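    The sketch below mimics that logic on synthetic data for a single region and wavelength: fit a linear trend across the annual values, then compare the 20-year change to the year-to-year scatter around that trend. It illustrates the general approach only, not the study’s actual statistical method.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    years = np.arange(2002, 2023)                       # 21 years of annual means
    # Synthetic reflectance for one wavelength in one region: small trend plus noise.
    series = 0.002 * (years - years[0]) + rng.normal(0, 0.01, size=years.size)

    slope, intercept = np.polyfit(years, series, 1)     # linear trend per year
    interannual_sd = np.std(series - (slope * years + intercept))
    total_change = slope * (years[-1] - years[0])

    # A change several times larger than the year-to-year noise counts as a detectable
    # trend; repeating this across regions and wavelengths maps where color has shifted.
    print(f"20-year change: {total_change:.3f}, interannual noise: {interannual_sd:.3f}")
    ```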

    To see whether this trend is related to climate change, he then looked to Dutkiewicz’s model from 2019. This model simulated the Earth’s oceans under two scenarios: one with the addition of greenhouse gases, and the other without it. The greenhouse-gas model predicted that a significant trend should show up within 20 years and that this trend should cause changes to ocean color in about 50 percent of the world’s surface oceans — almost exactly what Cael found in his analysis of real-world satellite data.

    “This suggests that the trends we observe are not a random variation in the Earth system,” Cael says. “This is consistent with anthropogenic climate change.”

    The team’s results show that monitoring ocean colors beyond chlorophyll could give scientists a clearer, faster way to detect climate-change-driven changes to marine ecosystems.

    “The color of the oceans has changed,” Dutkiewicz says. “And we can’t say how. But we can say that changes in color reflect changes in plankton communities that will impact everything that feeds on plankton. It will also change how much carbon the ocean will take up, because different types of plankton have different abilities to do that. So, we hope people take this seriously. It’s not only models that are predicting these changes will happen. We can now see it happening, and the ocean is changing.”

    This research was supported, in part, by NASA.

  • MIT engineering students take on the heat of Miami

    Think back to the last time you had to wait for a bus. How miserable were you? If you were in Boston, your experience might have included punishing wind and icy sleet — or, more recently, a punch of pollen straight to the sinuses. But in Florida’s Miami-Dade County, where the effects of climate change are both drastic and intensifying, commuters have to contend with an entirely different set of challenges: blistering temperatures and scorching humidity, making long stints waiting in the sun nearly unbearable.

    One of Miami’s most urgent transportation needs is shared by car-clogged Boston: coaxing citizens to use the municipal bus network, rather than the emissions-heavy individual vehicles currently contributing to climate change. But buses can be a tough sell in a sunny city where humidity hovers between 60 and 80 percent year-round. 

    Enter MIT’s Department of Electrical Engineering and Computer Science (EECS) and the MIT Priscilla King Gray (PKG) Public Service Center. The result of close collaboration between the two organizations, class 6.900 (Engineering For Impact) challenges EECS students to apply their engineering savvy to real-world problems beyond the MIT campus.

    This spring semester, the real-world problem was heat. 

    Miami-Dade County Department of Transportation and Public Works Chief Innovation Officer Carlos Cruz-Casas explains: “We often talk about the city we want to live in, about how the proper mix of public transportation, on-demand transit, and other mobility solutions, such as e-bikes and e-scooters, could help our community live a car-light life. However, none of this will be achievable if the riders are not comfortable when doing so.” 

    “When people think of South Florida and climate change, they often think of sea level rise,” says Juan Felipe Visser, deputy director of equity and engagement within the Office of the Mayor in Miami-Dade. “But heat really is the silent killer. So the focus of this class, on heat at bus stops, is very apt.” With little tree cover to give relief at some of the hottest stops, Miami-Dade commuters cluster in tiny patches of shade behind bus stops, sometimes giving up when the heat becomes unbearable. 

    A more conventional electrical engineering course might use temperature monitoring as an abstract example, building sample monitors in isolation and grading them as a merely academic exercise. But Professor Joel Voldman, EECS faculty head of electrical engineering, and Joe Steinmeyer, senior lecturer in EECS, had something more impactful in mind.

    “Miami-Dade has a large population of people who are living in poverty, undocumented, or who are otherwise marginalized,” says Voldman. “Waiting, sometimes for a very long time, in scorching heat for the bus is just one aspect of how a city population can be underserved, but by measuring patterns in how many people are waiting for a bus, how long they wait, and in what conditions, we can begin to see where services are not keeping up with demand.”

    Only after that gap is quantified can the work of city and transportation planners begin, Cruz-Casas explains: “We needed to quantify the time riders are exposed to extreme heat and prioritize improvements, including on-time performance improvements, increasing service frequency, or looking to enhance the tree canopy near the bus stop.” 

    Quantifying that time — and the subjective experience of the wait — proved tricky, however. With over 7,500 bus stops along 101 bus routes, Miami-Dade’s transportation network presents a considerable data-collection challenge. A network of physical temperature monitors could be useful, but only if it were carefully calibrated to meet the budgetary, environmental, privacy, and implementation requirements of the city. But how do you work with city officials — not to mention all of bus-riding Miami — from over 2,000 miles away? 

    This is where the PKG Center comes in. “We are a hub and a connector and facilitator of best practices,” explains Jill Bassett, associate dean and director of the center, who worked with Voldman and Steinmeyer to find a municipal partner organization for the course. “We bring knowledge of current pedagogy around community-engaged learning, which includes: help with framing a partnership that centers community-identified concerns and is mutually beneficial; identifying and learning from a community partner; talking through ways to build in opportunities for student learners to reflect on power dynamics, reciprocity, systems thinking, long-term planning, continuity, ethics, all the types of things that come up with this kind of shared project.”

    Through a series of brainstorming conversations, Bassett helped Voldman and Steinmeyer structure a well-defined project plan, as Cruz-Casas weighed in on the county’s needed technical specifications (including affordability, privacy protection, and implementability).

    “This course brings together a lot of subject area experts,” says Voldman. “We brought in guest lecturers, including Abby Berenson from the Sloan Leadership Center, to talk about working in teams; engineers from BOSE to talk about product design, certification, and environmental resistance; the co-founder and head of engineering from MIT spinout Butlr to talk about their low-power occupancy sensor; Tony Hu from MIT IDM [Integrated Design and Management] to talk about industrial design; and Katrina LaCurts from EECS to talk about communications and networking.”

    With the support of two generous donations and a gift of software from Altium, 6.900 developed into a hands-on exercise in hardware/software product development with a tangible goal in sight: build a better bus monitor.

    The challenges involved in this undertaking became apparent as soon as the 6.900 students began designing their monitors. “The most challenging requirement to meet was that the monitor be able to count how many people were waiting — and for how long they’d been standing there — while still maintaining privacy,” says Fabian Velazquez ’23, a recent EECS graduate. The task was complicated by commuters’ natural tendency to stand where the shade goes — whether beneath a tree or awning or snaking against a nearby wall in a line — rather than directly next to the bus sign or inside the bus shelter. “Accurately measuring people count with a camera — the most straightforward choice — is already quite difficult since you have to incorporate machine learning to identify which objects in frame are people. Maintaining privacy added an extra layer of constraint … since there is no guarantee the collected data wouldn’t be vulnerable.”

    As the groups weighed various privacy-preserving options, including lidar, radar, and thermal imaging, the class realized that Wi-Fi “sniffers,” which count the number of Wi-Fi enabled signals in the immediate area, were their best option to count waiting passengers. “We were all excited and ready for this amazing, answer-to-all-our-problems radar sensor to count people,” says Velazquez. “That component was extremely complex, however, and the complexity would have ultimately made my team use a lot of time and resources to integrate with our system. We also had a short time-to-market for this system we developed. We made the trade-off of complexity for robustness.”
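    A minimal sketch of that kind of privacy-preserving counting is shown below: each detected identifier is salted and hashed before being stored, and the monitor reports only the number of distinct hashes seen within a recent window. The window length, salt handling, and data structures are assumptions, not the students’ implementation.

    ```python
    import hashlib
    import time

    SALT = b"rotate-me-daily"     # a rotating salt limits any long-term tracking
    WINDOW_S = 10 * 60            # count devices seen within the last 10 minutes

    seen = {}                     # salted hash of device ID -> last time seen

    def record_probe(mac_address: str, now=None) -> None:
        """Store only a salted hash of the detected identifier, never the raw MAC."""
        now = time.time() if now is None else now
        hashed = hashlib.sha256(SALT + mac_address.encode()).hexdigest()
        seen[hashed] = now

    def waiting_count(now=None) -> int:
        """Estimate how many distinct devices (a proxy for people) are at the stop."""
        now = time.time() if now is None else now
        for stale in [h for h, t in seen.items() if now - t > WINDOW_S]:
            del seen[stale]
        return len(seen)

    record_probe("aa:bb:cc:dd:ee:ff")
    record_probe("11:22:33:44:55:66")
    print(waiting_count())  # 2
    ```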

    The weather also posed its own set of challenges. “Environmental conditions were big factors on the structure and design of our devices,” says Yong Yan (Crystal) Liang, a rising junior majoring in EECS. “We incorporated humidity and temperature sensors into our data to show the weather at individual stops. Additionally, we also considered how our enclosure may be affected by extreme heat or potential hurricanes.”

    The heat variable proved problematic in multiple ways. “People detection was especially difficult, for in the Miami heat, thermal cameras may not be able to distinguish human body temperature from the surrounding air temperature, and the glare of the sun off of other surfaces in the area makes most forms of imaging very buggy,” says Katherine Mohr ’23. “My team had considered using mmWave sensors to get around these constraints, but we found the processing to be too difficult, and (like the rest of the class), we decided to only move forward with Wi-Fi/BLE [Bluetooth Low Energy] sniffers.”

    The most valuable component of the new class may well have been the students’ exposure to real-world hardware/software engineering product development, where limitations on time and budget always exist, and where client requests must be carefully considered.  “Having an actual client to work with forced us to learn how to turn their wants into more specific technical specifications,” says Mohr. “We chose deliverables each week to complete by Friday, prioritizing tasks which would get us to a minimum viable product, as well as tasks that would require extra manufacturing time, like designing the printed-circuit board and enclosure.”

    Joel Voldman, who co-designed 6.900 (Engineering For Impact) with Joe Steinmeyer and MIT’s Priscilla King Gray (PKG) Public Service Center, describes how the course allowed students help develop systems for the public good. Voldman is the winner of the 2023 Teaching with Digital Technology Award, which is co-sponsored by MIT Open Learning and the Office of the Vice Chancellor. Video: MIT Open Learning

    Crystal Liang counted her conversations with city representatives as among her most valuable 6.900 experiences. “We generated a lot of questions and were able to communicate with the community leaders of this project from Miami-Dade, who made time to answer all of them and gave us ideas from the goals they were trying to achieve,” she reports. “This project gave me a new perspective on problem-solving because it taught me to see things from the community members’ point of view.” Some of those community leaders, including Marta Viciedo, co-founder of Transit Alliance Miami, joined the class’s final session on May 16 to review the students’ proposed solutions. 

    The students’ thoughtful approach paid off when it was time to present the heat monitors to the class’s client. In a group conference call with Miami-Dade officials toward the end of the semester, the student teams shared their findings and the prototypes they’d created, along with videos of the devices at work. Juan Felipe Visser was among those in attendance. “This is a lot of work,” he told the students following their presentation. “So first of all, thank you for doing that, and for presenting to us. I love the concept. I took the bus this morning, as I do every morning, and was battered by the sun and the heat. So I personally appreciated the focus.” 

    Cruz-Casas agreed: “I am pleasantly surprised by the diverse approach the students are taking. We presented a challenge, and they have responded to it and managed to think beyond the problem at hand. I’m very optimistic about how the outcomes of this project will have a long-lasting impact for our community. At a minimum, I’m thinking that the more awareness we raise about this topic, the more opportunities we have to get the brightest minds seeking a solution.”

    The creators of 6.900 agree, and hope that their class helps more MIT engineers to broaden their perspective on the meaning and application of their work. 

    “We are really excited about students applying their skills within a real-world, complex environment that will impact real people,” says Bassett. “We are excited that they are learning that it’s not just the design of technology that matters, but that climate; environment and built environment; and issues around socioeconomics, race, and equity, all come into play. There are layers and layers to the creation and deployment of technology in a demographically diverse multilingual community that is at the epicenter of climate change.”