More stories

  • Advancing technology for aquaculture

    According to the National Oceanic and Atmospheric Administration, aquaculture in the United States represents a $1.5 billion industry annually. Like land-based farming, shellfish aquaculture requires healthy seed production in order to maintain a sustainable industry. Aquaculture hatchery production of shellfish larvae — seeds — requires close monitoring to track mortality rates and assess health from the earliest stages of life. 

    Careful observation is necessary to inform production scheduling, determine effects of naturally occurring harmful bacteria, and ensure sustainable seed production. This is an essential step for shellfish hatcheries but is currently a time-consuming manual process prone to human error. 

    With funding from MIT’s Abdul Latif Jameel Water and Food Systems Lab (J-WAFS), MIT Sea Grant is working with Associate Professor Otto Cordero of the MIT Department of Civil and Environmental Engineering; Professor Taskin Padir and Research Scientist Mark Zolotas at the Northeastern University Institute for Experiential Robotics; and others at the Aquaculture Research Corporation (ARC) and the Cape Cod Commercial Fishermen’s Alliance to advance technology for the aquaculture industry. Located on Cape Cod, ARC is a leading shellfish hatchery, farm, and wholesaler that plays a vital role in providing high-quality shellfish seed to local and regional growers.

    Two MIT students have joined the effort this semester, working with Robert Vincent, MIT Sea Grant’s assistant director of advisory services, through the Undergraduate Research Opportunities Program (UROP). 

    First-year student Unyime Usua and sophomore Santiago Borrego are using microscopy images of shellfish seed from ARC to train machine learning algorithms that will help automate the identification and counting process. The resulting user-friendly image recognition tool aims to aid aquaculturists in differentiating and counting healthy, unhealthy, and dead shellfish larvae, improving accuracy and reducing time and effort.
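    The counting step such a tool automates can be sketched in miniature. The snippet below is a toy illustration only (a synthetic frame with bright blobs, simple thresholding, and connected-component counting); the actual system uses trained machine learning models on real microscopy images and also classifies healthy, unhealthy, and dead larvae:

```python
import numpy as np

# Toy stand-in for a microscopy frame: dark background with a few
# bright, well-separated "larvae" blobs.
img = np.zeros((40, 40))
for cy, cx in [(8, 8), (20, 30), (32, 12)]:
    img[cy - 2:cy + 3, cx - 2:cx + 3] = 1.0  # 5x5 bright square

def count_blobs(image, threshold=0.5):
    """Count connected bright regions via flood fill (4-connectivity)."""
    mask = image > threshold
    seen = np.zeros_like(mask, dtype=bool)
    count = 0
    for y in range(mask.shape[0]):
        for x in range(mask.shape[1]):
            if mask[y, x] and not seen[y, x]:
                count += 1
                stack = [(y, x)]  # flood-fill this blob
                while stack:
                    i, j = stack.pop()
                    if (0 <= i < mask.shape[0] and 0 <= j < mask.shape[1]
                            and mask[i, j] and not seen[i, j]):
                        seen[i, j] = True
                        stack += [(i + 1, j), (i - 1, j),
                                  (i, j + 1), (i, j - 1)]
    return count

print(count_blobs(img))  # 3
```

    A production tool replaces the threshold with a learned classifier, but the pipeline shape (segment, classify, count) is the same.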

    Vincent explains that AI is a powerful tool for environmental science, enabling researchers, industry, and resource managers to address long-standing pinch points in accurate data collection, analysis, and prediction, and to streamline processes. “Funding support from programs like J-WAFS enables us to tackle these problems head-on,” he says.

    ARC faces challenges with manually quantifying larvae classes, an important step in their seed production process. “When larvae are in their growing stages they are constantly being sized and counted,” explains Cheryl James, ARC larval/juvenile production manager. “This process is critical to encourage optimal growth and strengthen the population.” 

    Developing an automated identification and counting system will help to improve this step in the production process with time and cost benefits. “This is not an easy task,” says Vincent, “but with the guidance of Dr. Zolotas at the Northeastern University Institute for Experiential Robotics and the work of the UROP students, we have made solid progress.” 

    The UROP program benefits both researchers and students. Involving MIT UROP students in developing these systems gives them insight into AI applications they might not have considered, along with opportunities to explore, learn, and apply themselves while contributing to solving real challenges.

    Borrego saw this project as an opportunity to apply what he’d learned in class 6.390 (Introduction to Machine Learning) to a real-world issue. “I was starting to form an idea of how computers can see images and extract information from them,” he says. “I wanted to keep exploring that.”

    Usua decided to pursue the project because of the direct industry impacts it could have. “I’m pretty interested in seeing how we can utilize machine learning to make people’s lives easier. We are using AI to help biologists make this counting and identification process easier.” While Usua wasn’t familiar with aquaculture before starting this project, she explains, “Just hearing about the hatcheries that Dr. Vincent was telling us about, it was unfortunate that not a lot of people know what’s going on and the problems that they’re facing.”

    On Cape Cod alone, aquaculture is an $18 million per year industry. But the Massachusetts Division of Marine Fisheries estimates that hatcheries are only able to meet 70–80 percent of seed demand annually, which impacts local growers and economies. Through this project, the partners aim to develop technology that will increase seed production, advance industry capabilities, and help understand and improve the hatchery microbiome.

    Borrego explains the initial challenge of having limited data to work with. “Starting out, we had to go through and label all of the data, but going through that process helped me learn a lot.” In true MIT fashion, he shares his takeaway from the project: “Try to get the best out of what you’re given with the data you have to work with. You’re going to have to adapt and change your strategies depending on what you have.”

    Usua describes her experience going through the research process, communicating in a team, and deciding what approaches to take. “Research is a difficult and long process, but there is a lot to gain from it because it teaches you to look for things on your own and find your own solutions to problems.”

    In addition to increasing seed production and reducing the human labor required in the hatchery process, the collaborators expect this project to contribute to cost savings and technology integration to support one of the most underserved industries in the United States. 

    Borrego and Usua both plan to continue their work for a second semester with MIT Sea Grant. Borrego is interested in learning more about how technology can be used to protect the environment and wildlife. Usua says she hopes to explore more projects related to aquaculture. “It seems like there’s an infinite amount of ways to tackle these issues.”

  • Using deep learning to image the Earth’s planetary boundary layer

    Although the troposphere is often thought of as the closest layer of the atmosphere to the Earth’s surface, the planetary boundary layer (PBL) — the lowest layer of the troposphere — is actually the part that most significantly influences weather near the surface. In the 2018 Earth science decadal survey, the PBL was raised as an important scientific issue that has the potential to enhance storm forecasting and improve climate projections.

    “The PBL is where the surface interacts with the atmosphere, including exchanges of moisture and heat that help lead to severe weather and a changing climate,” says Adam Milstein, a technical staff member in Lincoln Laboratory’s Applied Space Systems Group. “The PBL is also where humans live, and the turbulent movement of aerosols throughout the PBL is important for air quality that influences human health.” 

    Although vital for studying weather and climate, important features of the PBL, such as its height, are difficult to resolve with current technology. In the past four years, Lincoln Laboratory staff have been studying the PBL, focusing on two different tasks: using machine learning to make 3D-scanned profiles of the atmosphere, and resolving the vertical structure of the atmosphere more clearly in order to better predict droughts.  

    This PBL-focused research effort builds on more than a decade of related work on fast, operational neural network algorithms developed by Lincoln Laboratory for NASA missions. These missions include the Time-Resolved Observations of Precipitation structure and storm Intensity with a Constellation of Smallsats (TROPICS) mission as well as Aqua, a satellite that collects data about Earth’s water cycle and observes variables such as ocean temperature, precipitation, and water vapor in the atmosphere. These algorithms retrieve temperature and humidity from the satellite instrument data and have been shown to significantly improve the accuracy and usable global coverage of the observations over previous approaches. For TROPICS, the algorithms help retrieve data that are used to characterize a storm’s rapidly evolving structures in near-real time, and for Aqua, they have helped improve forecasting models, drought monitoring, and fire prediction.

    These operational algorithms for TROPICS and Aqua are based on classic “shallow” neural networks to maximize speed and simplicity, creating a one-dimensional vertical profile for each spectral measurement collected by the instrument over each location. While this approach has improved observations of the atmosphere down to the surface overall, including the PBL, laboratory staff determined that newer “deep” learning techniques that treat the atmosphere over a region of interest as a three-dimensional image are needed to improve PBL details further.
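    Loosely, the contrast between the two approaches can be sketched as follows. Every shape and layer size here is an illustrative assumption, not the laboratory’s actual architecture: the shallow network maps each pixel’s spectral channels to a vertical profile independently, while the 3D-image view convolves across a region so that neighboring pixels inform each retrieval.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Shallow" operational retrieval: each spectral measurement maps
# independently to a 1D vertical profile (sizes are made up).
n_channels, n_levels, n_hidden = 12, 20, 32
W1 = rng.normal(size=(n_channels, n_hidden))
W2 = rng.normal(size=(n_hidden, n_levels))

def shallow_retrieval(spectra):
    """(n_pixels, n_channels) -> (n_pixels, n_levels) profiles."""
    return np.tanh(spectra @ W1) @ W2

# "Deep" 3D view: treat a region of interest as an image cube and
# apply a spatial convolution so neighbors share information.
kernel = rng.normal(size=(3, 3, n_channels, n_levels))

def conv_retrieval(cube):
    """(lat, lon, n_channels) -> (lat-2, lon-2, n_levels), valid 3x3 conv."""
    lat, lon, _ = cube.shape
    out = np.zeros((lat - 2, lon - 2, n_levels))
    for i in range(lat - 2):
        for j in range(lon - 2):
            out[i, j] = np.tensordot(cube[i:i+3, j:j+3, :], kernel, axes=3)
    return out

profiles = shallow_retrieval(rng.normal(size=(5, n_channels)))
field = conv_retrieval(rng.normal(size=(10, 10, n_channels)))
print(profiles.shape, field.shape)  # (5, 20) (8, 8, 20)
```

    The pixel-independent mapping is fast and simple; the convolutional version is what lets a deep model exploit the 3D statistical structure of temperature and humidity fields.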

    “We hypothesized that deep learning and artificial intelligence (AI) techniques could improve on current approaches by incorporating a better statistical representation of 3D temperature and humidity imagery of the atmosphere into the solutions,” Milstein says. “But it took a while to figure out how to create the best dataset — a mix of real and simulated data needed to be prepared to train these techniques.”

    The team collaborated with Joseph Santanello of the NASA Goddard Space Flight Center and William Blackwell, also of the Applied Space Systems Group, in a recent NASA-funded effort showing that these retrieval algorithms can improve PBL detail, including more accurate determination of the PBL height than the previous state of the art. 

    While improved knowledge of the PBL is broadly useful for increasing understanding of climate and weather, one key application is prediction of droughts. According to a Global Drought Snapshot report released last year, droughts are a pressing planetary issue that the global community needs to address. Lack of humidity near the surface, specifically at the level of the PBL, is the leading indicator of drought. While previous studies using remote-sensing techniques have examined the humidity of soil to determine drought risk, studying the atmosphere can help predict when droughts will happen.  

    In an effort funded by Lincoln Laboratory’s Climate Change Initiative, Milstein, along with laboratory staff member Michael Pieper, is working with scientists at NASA’s Jet Propulsion Laboratory (JPL) to use neural network techniques to improve drought prediction over the continental United States. While the work builds on existing operational work JPL has done incorporating (in part) the laboratory’s operational “shallow” neural network approach for Aqua, the team believes that this work and the PBL-focused deep learning research can be combined to further improve the accuracy of drought prediction.

    “Lincoln Laboratory has been working with NASA for more than a decade on neural network algorithms for estimating temperature and humidity in the atmosphere from space-borne infrared and microwave instruments, including those on the Aqua spacecraft,” Milstein says. “Over that time, we have learned a lot about this problem by working with the science community, including learning about what scientific challenges remain. Our long experience working on this type of remote sensing with NASA scientists, as well as our experience with using neural network techniques, gave us a unique perspective.”

    According to Milstein, the next step for this project is to compare the deep learning results to datasets from the National Oceanic and Atmospheric Administration, NASA, and the Department of Energy collected directly in the PBL using radiosondes, a type of instrument flown on a weather balloon. “These direct measurements can be considered a kind of ‘ground truth’ to quantify the accuracy of the techniques we have developed,” Milstein says.

    This improved neural network approach holds promise to demonstrate drought prediction that can exceed the capabilities of existing indicators, Milstein says, and to be a tool that scientists can rely on for decades to come.

  • New major crosses disciplines to address climate change

    Lauren Aguilar knew she wanted to study energy systems at MIT, but before Course 1-12 (Climate System Science and Engineering) became a new undergraduate major, she didn’t see an obvious path to study the systems aspects of energy, policy, and climate associated with the energy transition.

    Aguilar was drawn to the new major, launched jointly by the departments of Civil and Environmental Engineering (CEE) and Earth, Atmospheric and Planetary Sciences (EAPS) in 2023, where she could take engineering systems classes while building a foundation in climate science.

    “Having climate knowledge enriches my understanding of how to build reliable and resilient energy systems for climate change mitigation. Understanding upon what scale we can forecast and predict climate change is crucial to build the appropriate level of energy infrastructure,” says Aguilar.

    The interdisciplinary structure of the 1-12 major has students engaging with and learning from professors in different disciplines across the Institute. The blended major was designed to provide a foundational understanding of the Earth system and engineering principles — as well as an understanding of human and institutional behavior as it relates to the climate challenge. Students learn the fundamental sciences through subjects like an atmospheric chemistry class focused on the global carbon cycle or a physics class on low-carbon energy systems. The major also covers topics in data science and machine learning as they relate to forecasting climate risks and building resilience, in addition to policy, economics, and environmental justice studies.

    Junior Ananda Figueiredo was one of the first students to declare the 1-12 major. Her decision to change majors stemmed from a motivation to improve people’s lives, especially when it comes to equality. “I like to look at things from a systems perspective, and climate change is such a complicated issue connected to many different pieces of our society,” says Figueiredo.

    A multifaceted field of study

    The 1-12 major prepares students with the necessary foundational expertise across disciplines to confront climate change. Andrew Babbin, an academic advisor in the new degree program and the Cecil and Ida Green Career Development Associate Professor in EAPS, says the new major harnesses rigorous training encompassing science, engineering, and policy to design and execute a way forward for society.

    Within its first year, Course 1-12 has attracted students with a diverse set of interests, ranging from machine learning for sustainability to nature-based solutions for carbon management to developing the next renewable energy technology and integrating it into the power system.

    Academic advisor Michael Howland, the Esther and Harold E. Edgerton Assistant Professor of Civil and Environmental Engineering, says the best part of this degree is the students, and the enthusiasm and optimism they bring to the climate challenge.

    “We have students seeking to impact policy and students double-majoring in computer science. For this generation, climate change is a challenge for today, not for the future. Their actions inside and outside the classroom speak to the urgency of the challenge and the promise that we can solve it,” Howland says.

    The degree program also leaves plenty of space for students to develop and follow their interests. Sophomore Katherine Kempff began this spring semester as a 1-12 major interested in sustainability and renewable energy. Kempff was worried she wouldn’t be able to finish 1-12 once she made the switch to a different set of classes, but Howland assured her there would be no problems, based on the structure of 1-12.

    “I really like how flexible 1-12 is. There’s a lot of classes that satisfy the requirements, and you are not pigeonholed. I feel like I’m going to be able to do what I’m interested in, rather than just following a set path of a major,” says Kempff.

    Kempff is leveraging the skills she developed this semester and exploring different career interests. She is interviewing for sustainability and energy-sector internships in Boston and at MIT this summer, and is particularly interested in helping MIT meet its new sustainability goals.

    Engineering a sustainable future

    The new major dovetails with MIT’s commitment to addressing climate change and with its steps to prioritize and enhance climate education. As the Institute continues making strides to accelerate solutions, students can play a leading role in changing the future.

    “Climate awareness is critical to all MIT students, most of whom will face the consequences of the projection models for the end of the century,” says Babbin. “One-12 will be a focal point of the climate education mission to train the brightest and most creative students to engineer a better world and understand the complex science necessary to design and verify any solutions they invent.”

    Justin Cole, who transferred to MIT in January from the University of Colorado, served in the U.S. Air Force for nine years. Over the course of his service, he had a front row seat to the changing climate. From helping with the wildfire cleanup in Black Forest, Colorado — after the state’s most destructive fire at the time — to witnessing two category 5 typhoons in Japan in 2018, Cole’s experiences of these natural disasters impressed upon him that climate security was a prerequisite to international security. 

    Cole was recently accepted into the MIT Energy and Climate Club Launchpad initiative where he will work to solve real-world climate and energy problems with professionals in industry.

    “All of the dots are connecting so far in my classes, and all the hopes that I have for studying the climate crisis and the solutions to it at MIT are coming true,” says Cole.

    As the field grows, there is rising demand for scientists and engineers who have both deep knowledge of environmental and climate systems and expertise in methods for climate change mitigation.

    “Climate science must be coupled with climate solutions. As we experience worsening climate change, the environmental system will increasingly behave in new ways that we haven’t seen in the past,” says Howland. “Solutions to climate change must go beyond good engineering of small-scale components. We need to ensure that our system-scale solutions are maximally effective in reducing climate change, but are also resilient to climate change. And there is no time to waste,” he says.

  • A home where world-changing innovations take flight

    In a large, open space on the first floor of 750 Main Street in Cambridge, Massachusetts, a carbon-capture company is heating up molten salts to 600 degrees Celsius right next to a quantum computing company’s device for supercooling qubits. The difference is about 900 degrees across 15 feet.

    It doesn’t take long in the tour of The Engine Accelerator to realize this isn’t your typical co-working space. Companies here are working at the extremes to develop new technologies with world-changing impact — what The Engine Accelerator’s leaders call “tough tech.”

    Comprising four floors and 150,000 square feet next door to MIT’s campus, the new space offers startups specialized lab equipment, advanced machining, fabrication facilities, office space, and a range of startup support services.

    The goal is to give young companies merging science and engineering all of the resources they need to move ideas from the lab bench to their own mass manufacturing lines.

    “The infrastructure has always been a really important accelerant for getting these kinds of companies off and running,” The Engine Accelerator President Emily Knight says. “Now you can start a company and, on day one, start building. Real estate is such a big factor. Our thought was, let’s make this investment in the infrastructure for the founders. It’s an agile lease that enables them to be very flexible as they grow.”

    Since the new facility opened its doors in the summer of 2022, the Accelerator has welcomed around 100 companies that employ close to 1,000 people. In addition to the space, members enjoy educational workshops on topics like fundraising and hiring, plus events and networking opportunities that the Accelerator team hopes will foster a sense of community across the tough tech space.

    “We’re not just advocates for the startups in the space,” Knight says. “We’re advocates for tough tech as a whole. We think it’s important for the state of Massachusetts to create a tough tech hub here, and we think it’s important for national competitiveness.”

    Tough tech gets a home

    The Engine was spun out of MIT in 2016 as a public benefit corporation with the mission of bridging the gap between discovery and commercialization. Since its inception, it has featured an investment component, now known as Engine Ventures, and a shared services component.

    From the moment The Engine opened its doors to startups in its original headquarters on Massachusetts Avenue in Cambridge, the services team got a firsthand look at the unique challenges faced by tough tech startups. After speaking with founders, they realized their converted office space would need more power, stronger floors, and full lab accommodations.

    The team rose to the challenge. They turned a closet into a bio lab. They turned an unused wellness room into a laser lab. They managed to accommodate Commonwealth Fusion Systems when the founders informed them a 5,000-pound magnet would soon arrive for testing.

    But supporting ambitious founders in their quest to build world-changing companies was always going to require a bigger boat. As early as 2017, MIT’s leaders were considering turning the old Polaroid building, which had sat empty next to MIT’s campus for nearly 20 years, into the new home for tough tech.

    Speaking of tough, construction crews began the extensive building renovations for the Accelerator at the end of 2019, a few months before the Covid-19 pandemic. The team managed to avoid the worst of the supply chain disruptions, but they quickly learned the building has its quirks. Each floor has a different ceiling height, and massive pillars known as mushroom columns punctuate each floor.

    Based on conversations with founders, The Engine’s Accelerator team outfitted the renovated building with office and co-working space, a full machine shop, labs for biology and chemistry work, an array of 3D printers, bike storage, and, perhaps most important, cold brew on tap.

    “I think of the Accelerator as a really great Airbnb host rather than a landlord, where maybe you rented a bedroom in a large house, but you feel like you rented the whole thing because you have access to all kinds of amazing equipment,” says Bernardo Cervantes PhD ’20, co-founder of Concerto Biosciences, which is developing microbes for a variety of uses in human health and agriculture.

    The Engine Accelerator’s team credits MIT leadership with helping them manage the project, noting that the MIT Environment, Health and Safety office was particularly helpful.

    A week after the Accelerator opened its doors in August 2022, on a single sweltering day, 35 companies moved in. By 2023, the Accelerator was home to 55 companies. Since then, the Accelerator’s team has done everything they could to continue to grow.

    “At one point, one of our team members came to me with her tail between her legs and sheepishly said, ‘I gave our office space to a startup,’” Knight recalls. “I said, ‘Yes! That means you get it! We don’t need an office — we can sit anywhere.’”

    The first floor holds some of the largest machinery, including that molten salt device (developed by Mantel Capture) and the quantum computer (developed by Atlantic Quantum). On the next level, a machine shop and a fabrication space featuring every 3D printer imaginable offer ways for companies to quickly build prototype products or parts. Another floor is dubbed “the Avenue” and features a kitchen and tables for networking and serendipitous meetings. The Avenue is lined by huge garage doors that open to accommodate larger crowds for workshops and meeting spaces.

    “Even though the founders are working in different spaces, we wanted to create an area where people can connect and run into each other and get help with 3D printing or hiring or anything else,” Knight says. “It fosters those casual interactions that are very important for startups.”

    An ecosystem to change the world

    Only about one-fifth of the companies in the Accelerator space are portfolio companies of Engine Ventures. The two entities operate separately, but they pool their shared learning about supporting tough tech, and Engine Ventures has an office in the Accelerator’s space.

    Engine Ventures CEO Katie Rae sees it as a symbiotic partnership.

    “We needed to have all these robust services for everyone in tough tech, not just the portfolio companies,” Rae says. “We’ll always work together and produce the Tough Tech Summit together because of our overarching missions. It’s very much like a rising tide lifts all boats. All of these companies are working to change the world in their own verticals, so we’re just focusing on the impact they’re trying to have and making that the story.”

    Rae says MIT has helped both of The Engine’s teams think through the best way to support tough tech startups.

    “Being a partner with MIT, which understands innovation and safety better than anyone, has allowed us to say yes to more things and have more flexibility,” Rae says. “If you’re going to go at breakneck speed to solve global problems, you better have a mentality of getting things done fast and safely, and I think that’s been a core tenet of The Engine.”

    Meanwhile, Knight says her team hasn’t stopped learning from the tough tech community and will continue to adapt.

    “There’s just a waterfall of information coming from these companies,” Knight says. “It’s about iterating on our services to best support them, so we can go to people on our team and ask, ‘Can you learn to run this type of program, because we just learned these five founders need it?’ Every founder we know in the area has a badge so they can come in. We want to create a hub for tough tech within this Kendall Square area that’s already a hub in so many ways.”

  • Has remote work changed how people travel in the U.S.?

    The prevalence of remote work since the start of the Covid-19 pandemic has significantly changed urban transportation patterns in the U.S., according to a new study led by MIT researchers.

    The research finds significant variation between the effects of remote work on vehicle miles driven and on mass-transit ridership across the U.S.

    “A 1 percent decrease in onsite workers leads to a roughly 1 percent reduction in [automobile] vehicle miles driven, but a 2.3 percent reduction in mass transit ridership,” says Yunhan Zheng SM ’21, PhD ’24, an MIT postdoc and co-author of the study.

    “This is one of the first studies that identifies the causal effect of remote work on vehicle miles traveled and transit ridership across the U.S.,” adds Jinhua Zhao, an MIT professor and another co-author of the paper.

    By accounting for many of the nuances of the issue, across the lower 48 states and the District of Columbia as well as 217 metropolitan areas, the scholars believe they have arrived at a robust conclusion demonstrating the effects of working from home on larger mobility patterns.
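    Read as elasticities, the quoted effects act as simple multipliers. A toy illustration (the 5 percent drop below is a hypothetical input, not a figure from the study):

```python
# Estimated effects per 1 percent drop in onsite workers,
# as quoted by Zheng: ~1 percent fewer vehicle miles driven,
# ~2.3 percent fewer mass-transit trips.
elasticity_vmt = 1.0
elasticity_transit = 2.3

onsite_drop_pct = 5.0  # hypothetical shift to remote work

vmt_reduction = onsite_drop_pct * elasticity_vmt          # about 5 percent
transit_reduction = onsite_drop_pct * elasticity_transit  # about 11.5 percent
print(vmt_reduction, transit_reduction)
```

    The asymmetry is the point: the same shift to remote work hits transit ridership more than twice as hard as driving.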

    The paper, “Impacts of remote work on vehicle miles traveled and transit ridership in the USA,” appears today in the journal Nature Cities. The authors are Zheng, a doctoral graduate of MIT’s Department of Civil and Environmental Engineering and a postdoc at the Singapore–MIT Alliance for Research and Technology (SMART); Shenhao Wang PhD ’20, an assistant professor at the University of Florida; Lun Liu, an assistant professor at Peking University; Jim Aloisi, a lecturer in MIT’s Department of Urban Studies and Planning (DUSP); and Zhao, the Professor of Cities and Transportation, founder of the MIT Mobility Initiative, and director of MIT’s JTL Urban Mobility Lab and Transit Lab.

    The researchers gathered data on the prevalence of remote work from multiple sources, including Google location data, travel data from the U.S. Federal Highway Administration and the National Transit Database, and the monthly U.S. Survey of Working Arrangements and Attitudes (run jointly by Stanford University, the University of Chicago, ITAM, and MIT).

    The study reveals significant variation among U.S. states when it comes to how much the rise of remote work has affected mileage driven.

    “The impact of a 1 percent change in remote work on the reduction of vehicle miles traveled in New York state is only about one-quarter of that in Texas,” Zheng observes. “There is real variation there.”

    At the same time, remote work has had the biggest effect on mass-transit revenues in places with widely used systems, with New York City, Chicago, San Francisco, Boston, and Philadelphia making up the top five hardest-hit metro areas.

    The overall effect is surprisingly consistent over time, from early 2020 through late 2022.

    “In terms of the temporal variation, we found that the effect is quite consistent across our whole study period,” Zheng says. “It’s not just significant in the early stage of the pandemic, when remote work was a necessity for many. The magnitude remains consistent into the later period, when many people have the flexibility to choose where they want to work. We think this may have long-term implications.”

    Additionally, the study estimates the impact that still larger numbers of remote workers could have on the environment and mass transit.

    “On a national basis, we estimate that a 10 percent decrease in the number of onsite workers compared to prepandemic levels will reduce the annual total vehicle-related CO2 emissions by 191.8 million metric tons,” Wang says.

    The study also projects that across the 217 metropolitan areas in the study, a 10 percent decrease in the number of onsite workers, compared to prepandemic levels, would lead to an annual loss of 2.4 billion transit trips and $3.7 billion in fare revenue — equal to roughly 27 percent of the annual transit ridership and fare revenue in 2019.

    “The substantial influence of remote work on transit ridership highlights the need for transit agencies to adapt their services accordingly, investing in services tailored to noncommuting trips and implementing more flexible schedules to better accommodate the new demand patterns,” Zhao says.

    The research received support from the MIT Energy Initiative; the Barr Foundation; the National Research Foundation, Prime Minister’s Office, Singapore under its Campus for Research Excellence and Technological Enterprise program; the Research Opportunity Seed Fund 2023 from the University of Florida; and the Beijing Social Science Foundation.

  • Shining a light on oil fields to make them more sustainable

    Operating an oil field is complex and there is a staggeringly long list of things that can go wrong.

    One of the most common problems is spills of the salty brine that’s a toxic byproduct of pumping oil. Another is over- or under-pumping that can lead to machine failure and methane leaks. (The oil and gas industry is the largest industrial emitter of methane in the U.S.) Then there are extreme weather events, which range from winter frosts to blazing heat, that can put equipment out of commission for months. One of the wildest problems Sebastien Mannai SM ’14, PhD ’18 has encountered is hogs that pop open oil tanks with their snouts to enjoy on-demand oil baths.

    Mannai helps oil field owners detect and respond to these problems while optimizing the operation of their machinery to prevent the issues from occurring in the first place. He is the founder and CEO of Amplified Industries, a company selling oil field monitoring and control tools that help make the industry more efficient and sustainable.

    Amplified Industries’ sensors and analytics give oil well operators real-time alerts when things go wrong, allowing them to respond to issues before they become disasters.

    “We’re able to find 99 percent of the issues affecting these machines, from mechanical failures to human errors, including issues happening thousands of feet underground,” Mannai explains. “With our AI solution, operators can put the wells on autopilot, and the system automatically adjusts or shuts the well down as soon as there’s an issue.”

    Amplified currently works with private companies that own and operate as many as 3,000 wells, in states spanning from Texas to Wyoming. Such companies make up the majority of oil well operators in the U.S. and run both new equipment and older, more failure-prone machinery that has been in the field for decades.

    Such operators also have a harder time responding to environmental regulations like the Environmental Protection Agency’s new methane guidelines, which seek to dramatically reduce emissions of the potent greenhouse gas in the industry over the next few years.

    “These operators don’t want to be releasing methane,” Mannai explains. “Additionally, when gas gets into the pumping equipment, it leads to premature failures. We can detect gas and slow the pump down to prevent it. It’s the best of both worlds: The operators benefit because their machines are working better, saving them money while also giving them a smaller environmental footprint with fewer spills and methane leaks.”

    Leveraging “every MIT resource I possibly could”

    Mannai learned about the cutting-edge technology used in the space and aviation industries as he pursued his master’s degree at the Gas Turbine Laboratory in MIT’s Department of Aeronautics and Astronautics. Then, during his PhD at MIT, he worked with an oil services company and discovered the oil and gas industry was still relying on decades-old technologies and equipment.

    “When I first traveled to the field, I could not believe how old-school the actual operations were,” says Mannai, who has previously worked in rocket engine and turbine factories. “A lot of oil wells have to be adjusted by feel and rules of thumb. The operators have been let down by industrial automation and data companies.”

    Monitoring oil wells for problems typically requires someone in a pickup truck to drive hundreds of miles between wells looking for obvious issues, Mannai says. The sensors that are deployed are expensive and difficult to replace. Over time, they’re also often damaged in the field to the point of being unusable, forcing technicians to make educated guesses about the status of each well.

    “We often see that equipment unplugged or programmed incorrectly because it is incredibly over-complicated and ill-designed for the reality of the field,” Mannai says. “Workers on the ground often have to rip it out and bypass the control system to pump by hand. That’s how you end up with so many spills and wells pumping at suboptimal levels.”

    To build a better oil field monitoring system, Mannai received support from the MIT Sandbox Innovation Fund and the Venture Mentoring Service (VMS). He also participated in the delta V summer accelerator at the Martin Trust Center for MIT Entrepreneurship, the fuse program during IAP, and the MIT I-Corps program, and took a number of classes at the MIT Sloan School of Management. In 2019, Amplified Industries — which operated under the name Acoustic Wells until recently — won the MIT $100K Entrepreneurship competition.

    “My approach was to sign up to every possible entrepreneurship related program and to leverage every MIT resource I possibly could,” Mannai says. “MIT was amazing for us.”

    Mannai officially launched the company after his postdoc at MIT, and Amplified raised its first round of funding in early 2020. That year, Amplified’s small team moved into the Greentown Labs startup incubator in Somerville.

    Mannai says building the company’s battery-powered, low-cost sensors was a huge challenge. The sensors run machine-learning inference models and their batteries last for 10 years. They also had to be able to handle extreme conditions, from the scorching hot New Mexico desert to the swamps of Louisiana and the freezing cold winters in North Dakota.

    “We build very rugged, resilient hardware; it’s a must in those environments,” Mannai says. “But it’s also very simple to deploy, so if a device does break, it’s like changing a lightbulb: We ship them a new one and it takes them a couple of minutes to swap it out.”

    Customers equip each well with four or five of Amplified’s sensors, which attach to the well’s cables and pipes to measure variables like tension, pressure, and amps. Vast amounts of data are then sent to Amplified’s cloud and processed by their analytics engine. Signal processing methods and AI models are used to diagnose problems and control the equipment in real-time, while generating notifications for the operators when something goes wrong. Operators can then remotely adjust the well or shut it down.
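
    As a toy illustration of that alerting flow (the function, parameter names, and thresholds here are hypothetical, not Amplified’s actual logic), a minimal detector might compare each reading against a rolling baseline and escalate from an alert to a shutdown only after several consecutive anomalous samples:

    ```python
    from collections import deque

    def make_anomaly_detector(window=20, threshold=0.3, patience=3):
        """Flag a sensor channel when readings deviate from a rolling
        baseline by more than `threshold` (fractional) for `patience`
        consecutive samples. All parameters are illustrative."""
        history = deque(maxlen=window)
        strikes = 0

        def check(reading):
            nonlocal strikes
            if len(history) >= window:
                baseline = sum(history) / len(history)
                if abs(reading - baseline) > threshold * abs(baseline):
                    strikes += 1
                else:
                    strikes = 0
            history.append(reading)
            if strikes >= patience:
                return "shutdown"   # sustained anomaly: stop the well
            if strikes > 0:
                return "alert"      # deviation detected: notify operator
            return "ok"

        return check

    # Simulated pump-amp readings: steady load, then a sudden sustained rise.
    detector = make_anomaly_detector()
    readings = [10.0] * 20 + [10.1, 14.0, 14.2, 14.5]
    statuses = [detector(r) for r in readings]
    print(statuses[-4:])  # ['ok', 'alert', 'alert', 'shutdown']
    ```

    In the simulated trace, a small fluctuation produces no action, while a sustained jump in amperage first alerts the operator and then stops the pump.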

    “That’s where AI is important, because if you just record everything and put it in a giant dashboard, you create way more work for people,” Mannai says. “The critical part is the ability to process and understand this newly recorded data and make it readily usable in the real world.”

    Amplified’s dashboard is customized for different people in the company, so field technicians can quickly respond to problems and managers or owners can get a high-level view of how everything is running.

    Mannai says often when Amplified’s sensors are installed, they’ll immediately start detecting problems that were unknown to engineers and technicians in the field. To date, Amplified has prevented hundreds of thousands of gallons worth of brine water spills, which are particularly damaging to surrounding vegetation because of their high salt and sulfur content.

    Preventing those spills is only part of Amplified’s positive environmental impact; the company is now turning its attention toward the detection of methane leaks.

    Helping a changing industry

    The EPA’s proposed new Waste Emissions Charge for oil and gas companies would start at $900 per metric ton of reported methane emissions in 2024 and increase to $1,500 per metric ton in 2026 and beyond.
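
    Under that schedule, a facility’s annual charge scales linearly with its reported methane. A sketch (assuming the statutory rates, including the intermediate $1,200-per-ton step in 2025; the real rule also includes exemptions and reporting thresholds not modeled here):

    ```python
    def waste_emissions_charge(methane_tons, year):
        """Estimate the Waste Emissions Charge for reported methane,
        assuming statutory rates of $900/t (2024), $1,200/t (2025),
        and $1,500/t (2026 and beyond). Illustrative only."""
        if year < 2024:
            return 0.0  # no charge before the program starts
        rate = {2024: 900, 2025: 1200}.get(year, 1500)
        return methane_tons * rate

    # A facility reporting 200 metric tons of methane in each year:
    for year in (2024, 2025, 2026):
        print(year, waste_emissions_charge(200, year))
    ```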

    Mannai says Amplified is well-positioned to help companies comply with the new rules. Its equipment has already shown it can detect various kinds of leaks across the field, purely based on analytics of existing data.

    “Detecting methane leaks typically requires someone to walk around every valve and piece of piping with a thermal camera or sniffer, but these operators often have thousands of valves and hundreds of miles of pipes,” Mannai says. “What we see in the field is that a lot of times people don’t know where the pipes are because oil wells change owners so frequently, or they will miss an intermittent leak.”

    Ultimately, Mannai believes a strong data backend and modernized sensing equipment will become the backbone of the industry and a necessary prerequisite to both improving its efficiency and cleaning it up.

    “We’re selling a service that ensures your equipment is working optimally all the time,” Mannai says. “That means a lot fewer fines from the EPA, but it also means better-performing equipment. There’s a mindset change happening across the industry, and we’re helping make that transition as easy and affordable as possible.”

  • in

    Atmospheric observations in China show rise in emissions of a potent greenhouse gas

    Achieving the aspirational goal of the Paris Agreement on climate change — limiting the increase in global average surface temperature to 1.5 degrees Celsius above preindustrial levels — will require its 196 signatories to dramatically reduce their greenhouse gas (GHG) emissions. Those gases differ widely in their global warming potential (GWP), or ability to absorb radiative energy and thereby warm the Earth’s surface. For example, measured over a 100-year period, the GWP of methane is about 28 times that of carbon dioxide (CO2), and the GWP of sulfur hexafluoride (SF6) is 24,300 times that of CO2, according to the Intergovernmental Panel on Climate Change (IPCC) Sixth Assessment Report.

    Used primarily in high-voltage electrical switchgear in electric power grids, SF6 is one of the most potent greenhouse gases on Earth. In the 21st century, atmospheric concentrations of SF6 have risen sharply along with global electric power demand, threatening the world’s efforts to stabilize the climate. This heightened demand for electric power is particularly pronounced in China, which has dominated the expansion of the global power industry in the past decade. Quantifying China’s contribution to global SF6 emissions — and pinpointing its sources in the country — could lead that nation to implement new measures to reduce them, and thereby reduce, if not eliminate, an impediment to the Paris Agreement’s aspirational goal. 

    To that end, a new study by researchers at the MIT Joint Program on the Science and Policy of Global Change, Fudan University, Peking University, University of Bristol, and Meteorological Observation Center of China Meteorological Administration determined total SF6 emissions in China over 2011-21 from atmospheric observations collected from nine stations within a Chinese network, including one station from the Advanced Global Atmospheric Gases Experiment (AGAGE) network. For comparison, global total emissions were determined from five globally distributed, relatively unpolluted “background” AGAGE stations, involving additional researchers from the Scripps Institution of Oceanography and CSIRO, Australia’s National Science Agency.

    The researchers found that SF6 emissions in China almost doubled from 2.6 gigagrams (Gg) per year in 2011, when they accounted for 34 percent of global SF6 emissions, to 5.1 Gg per year in 2021, when they accounted for 57 percent of global total SF6 emissions. This increase from China over the 10-year period — some of it emerging from the country’s less-populated western regions — was larger than the global total SF6 emissions rise, highlighting the importance of lowering SF6 emissions from China in the future.
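
    For a rough sense of scale, those SF6 totals can be converted to CO2-equivalents using the 100-year GWP of 24,300 cited above (an illustrative conversion, not a figure reported by the study):

    ```python
    # Convert SF6 emissions to CO2-equivalent using the IPCC AR6
    # 100-year GWP of 24,300 cited earlier in the article.
    GWP_SF6 = 24_300

    def sf6_to_co2e_megatons(sf6_gigagrams):
        """1 Gg = 1,000 metric tons; result in million metric tons CO2-eq."""
        co2e_gigagrams = sf6_gigagrams * GWP_SF6
        return co2e_gigagrams / 1_000  # Gg -> million metric tons

    print(sf6_to_co2e_megatons(2.6))  # 2011: ~63 Mt CO2-eq per year
    print(sf6_to_co2e_megatons(5.1))  # 2021: ~124 Mt CO2-eq per year
    ```

    By this conversion, China’s 2021 SF6 emissions correspond to roughly 124 million metric tons of CO2-equivalent per year.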

    The open-access study, which appears in the journal Nature Communications, explores prospects for future SF6 emissions reduction in China.

    “Adopting maintenance practices that minimize SF6 leakage rates or using SF6-free equipment or SF6 substitutes in the electric power grid will benefit greenhouse-gas mitigation in China,” says Minde An, a postdoc at the MIT Center for Global Change Science (CGCS) and the study’s lead author. “We see our findings as a first step in quantifying the problem and identifying how it can be addressed.”

    Once emitted, SF6 is expected to persist in the atmosphere for more than 1,000 years, raising the stakes for policymakers in China and around the world.

    “Any increase in SF6 emissions this century will effectively alter our planet’s radiative budget — the balance between incoming energy from the sun and outgoing energy from the Earth — far beyond the multi-decadal time frame of current climate policies,” says MIT Joint Program and CGCS Director Ronald Prinn, a coauthor of the study. “So it’s imperative that China and all other nations take immediate action to reduce, and ultimately eliminate, their SF6 emissions.”

    The study was supported by the National Key Research and Development Program of China and Shanghai B&R Joint Laboratory Project, the U.S. National Aeronautics and Space Administration, and other funding agencies.

  • in

    MIT-derived algorithm helps forecast the frequency of extreme weather

    To assess a community’s risk of extreme weather, policymakers rely first on global climate models that can be run decades, and even centuries, forward in time, but only at a coarse resolution. These models might be used to gauge, for instance, future climate conditions for the northeastern U.S., but not specifically for Boston.

    To estimate Boston’s future risk of extreme weather such as flooding, policymakers can combine a coarse model’s large-scale predictions with a finer-resolution model, tuned to estimate how often Boston is likely to experience damaging floods as the climate warms. But this risk analysis is only as accurate as the predictions from that first, coarser climate model.

    “If you get those wrong for large-scale environments, then you miss everything in terms of what extreme events will look like at smaller scales, such as over individual cities,” says Themistoklis Sapsis, the William I. Koch Professor and director of the Center for Ocean Engineering in MIT’s Department of Mechanical Engineering.

    Sapsis and his colleagues have now developed a method to “correct” the predictions from coarse climate models. By combining machine learning with dynamical systems theory, the team’s approach “nudges” a climate model’s simulations into more realistic patterns over large scales. When paired with smaller-scale models to predict specific weather events such as tropical cyclones or floods, the team’s approach produced more accurate predictions for how often specific locations will experience those events over the next few decades, compared to predictions made without the correction scheme.

    This animation shows the evolution of storms around the northern hemisphere, as a result of a high-resolution storm model, combined with the MIT team’s corrected global climate model. The simulation improves the modeling of extreme values for wind, temperature, and humidity, which typically have significant errors in coarse scale models. Credit: Courtesy of Ruby Leung and Shixuan Zhang, PNNL

    Sapsis says the new correction scheme is general in form and can be applied to any global climate model. Once corrected, the models can help to determine where and how often extreme weather will strike as global temperatures rise over the coming years. 

    “Climate change will have an effect on every aspect of human life, and every type of life on the planet, from biodiversity to food security to the economy,” Sapsis says. “If we have capabilities to know accurately how extreme weather will change, especially over specific locations, it can make a lot of difference in terms of preparation and doing the right engineering to come up with solutions. This is the method that can open the way to do that.”

    The team’s results appear today in the Journal of Advances in Modeling Earth Systems. The study’s MIT co-authors include postdoc Benedikt Barthel Sorensen and Alexis-Tzianni Charalampopoulos SM ’19, PhD ’23, with Shixuan Zhang, Bryce Harrop, and Ruby Leung of the Pacific Northwest National Laboratory in Washington state.

    Over the hood

    Today’s large-scale climate models simulate weather features such as the average temperature, humidity, and precipitation around the world, on a grid-by-grid basis. Running simulations of these models takes enormous computing power, and in order to simulate how weather features will interact and evolve over periods of decades or longer, models average out features every 100 kilometers or so.

    “It’s a very heavy computation requiring supercomputers,” Sapsis notes. “But these models still do not resolve very important processes like clouds or storms, which occur over smaller scales of a kilometer or less.”

    To improve the resolution of these coarse climate models, scientists have typically gone under the hood to try to fix a model’s underlying dynamical equations, which describe how phenomena in the atmosphere and oceans should physically interact.

    “People have tried to dissect into climate model codes that have been developed over the last 20 to 30 years, which is a nightmare, because you can lose a lot of stability in your simulation,” Sapsis explains. “What we’re doing is a completely different approach, in that we’re not trying to correct the equations but instead correct the model’s output.”

    The team’s new approach takes a model’s output, or simulation, and overlays an algorithm that nudges the simulation toward something that more closely represents real-world conditions. The algorithm is based on a machine-learning scheme that takes in data, such as past information for temperature and humidity around the world, and learns associations within the data that represent fundamental dynamics among weather features. The algorithm then uses these learned associations to correct a model’s predictions.
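
    As a deliberately simplified illustration of this output-correction idea (the team’s actual scheme learns dynamical corrections with machine learning, not the pointwise linear map shown here), one can fit a correction from a coarse model’s historical output to observations and then apply it to new simulations:

    ```python
    def fit_linear_correction(model_out, observed):
        """Least-squares fit of observed = a * model_out + b.
        A stand-in for the learned correction operator; the real
        method corrects dynamics, not individual values."""
        n = len(model_out)
        mx = sum(model_out) / n
        my = sum(observed) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(model_out, observed))
        var = sum((x - mx) ** 2 for x in model_out)
        a = cov / var
        b = my - a * mx
        return lambda x: a * x + b

    # Training period: the coarse model runs systematically cool and damped.
    model_hist = [10.0, 12.0, 14.0, 16.0]
    obs_hist   = [11.0, 14.0, 17.0, 20.0]  # truth varies more strongly
    correct = fit_linear_correction(model_hist, obs_hist)

    # Apply the correction to a new (future) simulated value:
    print(correct(18.0))  # nudged toward the observed scale: 23.0
    ```

    The fitted map rescales and shifts the coarse output toward the observed range; the published method plays an analogous role but operates on the model’s dynamics rather than on single values.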

    “What we’re doing is trying to correct dynamics, as in how an extreme weather feature, such as the windspeeds during a Hurricane Sandy event, will look like in the coarse model, versus in reality,” Sapsis says. “The method learns dynamics, and dynamics are universal. Having the correct dynamics eventually leads to correct statistics, for example, frequency of rare extreme events.”

    Climate correction

    As a first test of their new approach, the team used the machine-learning scheme to correct simulations produced by the Energy Exascale Earth System Model (E3SM), a climate model run by the U.S. Department of Energy that simulates climate patterns around the world at a resolution of 110 kilometers. The researchers used eight years of past data for temperature, humidity, and wind speed to train their new algorithm, which learned dynamical associations between the measured weather features and the E3SM model. They then ran the climate model forward in time for about 36 years and applied the trained algorithm to the model’s simulations. They found that the corrected version produced climate patterns that more closely matched real-world observations from the last 36 years, data that had not been used for training.

    “We’re not talking about huge differences in absolute terms,” Sapsis says. “An extreme event in the uncorrected simulation might be 105 degrees Fahrenheit, versus 115 degrees with our corrections. But for humans experiencing this, that is a big difference.”

    When the team then paired the corrected coarse model with a specific, finer-resolution model of tropical cyclones, they found the approach accurately reproduced the frequency of extreme storms in specific locations around the world.

    “We now have a coarse model that can get you the right frequency of events, for the present climate. It’s much more improved,” Sapsis says. “Once we correct the dynamics, this is a relevant correction, even when you have a different average global temperature, and it can be used for understanding how forest fires, flooding events, and heat waves will look in a future climate. Our ongoing work is focusing on analyzing future climate scenarios.”

    “The results are particularly impressive as the method shows promising results on E3SM, a state-of-the-art climate model,” says Pedram Hassanzadeh, an associate professor who leads the Climate Extremes Theory and Data group at the University of Chicago and was not involved with the study. “It would be interesting to see what climate change projections this framework yields once future greenhouse-gas emission scenarios are incorporated.”

    This work was supported, in part, by the U.S. Defense Advanced Research Projects Agency.