More stories

  • Fieldwork class examines signs of climate change in Hawaii

    When Joy Domingo-Kameenui spent two weeks in her native Hawaii as part of MIT class 1.091 (Traveling Research Environmental eXperiences), she was surprised to learn about the number of invasive and endangered species. “I knew about Hawaiian ecology from middle and high school but wasn’t fully aware of the extent to which invasive species and diseases have resulted in many of Hawaii’s endemic species becoming threatened,” says Domingo-Kameenui.

    Domingo-Kameenui was part of a group of MIT students who conducted field research on the Big Island of Hawaii in the Traveling Research Environmental eXperiences (TREX) class offered by the Department of Civil and Environmental Engineering. The class provides undergraduates an opportunity to gain hands-on environmental fieldwork experience using Hawaii’s geology, chemistry, and biology to address two main topics of climate change concern: sulfur dioxide (SO2) emissions and forest health.

    “Hawaii is this great system for studying the effects of climate change,” says David Des Marais, the Cecil and Ida Green Career Development Professor of Civil and Environmental Engineering and lead instructor of TREX. “Historically, Hawaii has had occasional mild droughts that are related to El Niño, but the droughts are getting stronger and more frequent. And we know these types of extreme weather events are going to happen worldwide.”

    Climate change impacts on forests

    The frequency and intensity of extreme events are also becoming more of a problem for forests and plant life. Forests have a characteristic distribution of vegetation: as you climb in elevation, trees gradually give way to shrubs, and then to bare rock. Trees don’t grow above the timberline, where temperature and precipitation change dramatically. “But unlike the Sierra Nevada or the Rockies, where the trees gradually change as you go up the mountains, in Hawaii, they gradually change, and then they just stop,” says Des Marais.

    “Why this is an interesting model for climate change,” explains Des Marais, “is that line where trees stop [growing] is going to move, and it’s going to become more unstable as the trade winds are affected by global patterns of air circulation, which are changing because of climate change.”

    The research question that Des Marais asks students to explore — How is the Hawaiian forest going to be affected by climate change? — uses Hawaii as a model for broader patterns in climate change for forests.

    To dive deeper into this question, students trekked up the mountain taking ground-level measurements of canopy cover with a camera app on their cellphones, estimating how much tree coverage blankets the sky overhead and observing the canopy thin until no trees remained as they climbed higher. Drones flew above the forest to measure chlorophyll and how much plant matter remains. Satellite data products from NASA and the European Space Agency were then used to measure the distribution of chlorophyll, climate, and precipitation from space.
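    As a rough illustration of the ground-level measurement, the sketch below estimates canopy cover from a single upward-facing photo by thresholding bright sky pixels. The file name and threshold value are assumptions for illustration; real canopy apps use calibrated gap-fraction methods.

    ```python
    # A minimal sketch of canopy-cover estimation from an upward-facing photo.
    # Bright pixels are treated as open sky; everything else counts as canopy.
    # The threshold and file name are illustrative assumptions.
    import numpy as np
    from PIL import Image

    def canopy_cover_fraction(photo_path: str, sky_threshold: int = 180) -> float:
        """Return the fraction of the upward view occupied by canopy."""
        img = np.asarray(Image.open(photo_path).convert("L"), dtype=np.uint8)
        sky_pixels = (img >= sky_threshold).sum()  # bright pixels ~ open sky
        return 1.0 - sky_pixels / img.size         # the remainder ~ canopy

    print(f"canopy cover: {canopy_cover_fraction('upward_view.jpg'):.1%}")
    ```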

    They also worked directly with community stakeholders at three locations around the island to access the forests and use technology to assess the ecology and biodiversity challenges. One of those stakeholders was the Kamehameha Schools Natural and Cultural Ecosystems Division, whose mission is to preserve the land and manage it in a sustainable way. Students worked with their plant biologists to help address and think about what management decisions will support the future health of their forests.

    “Across the island, rising temperatures and abnormal precipitation patterns are the main drivers of drought, which really has significant impacts on biodiversity, and overall human health,” says Ava Gillikin, a senior in civil and environmental engineering.

    Gillikin adds that “a good proportion of the island’s water system relies on rainwater catchment, exposing vulnerabilities to fluctuations in rain patterns that impact many people’s lives.”

    Deadly threats to native plants

    The other threats to Hawaii’s forests are invasive species causing ecological harm, from the prevalence of non-indigenous mosquitoes leading to increases in avian malaria and native bird death that threaten the native ecosystem, to a plant called strawberry guava.

    Strawberry guava is taking over Hawaii’s native ʻōhiʻa trees, which Domingo-Kameenui says is also affecting Hawaii’s water supply. “The plants absorb water quickly so there’s less water runoff for groundwater systems.”

    A fungal pathogen is also infecting native ʻōhiʻa trees. The disease, called rapid ʻōhiʻa death (ROD), kills a tree within a few days to weeks. The pathogen, from the fungal genus Ceratocystis, was identified by researchers on the island in 2014. It was likely carried into the forests by humans on their shoes, or on contaminated tools, gear, and vehicles traveling from one location to another. The fungal disease is also transmitted by beetles that bore into trees and create a fine, powder-like dust. This dust from an infected tree mixes with the fungal spores and can easily spread to other trees by wind or contaminated soil.

    For Gillikin, seeing the effects of ROD in the field highlighted the impact improper care and preparation can have on native forests. “The ʻōhiʻa tree is one of the most prominent native trees, and ROD can kill the trees very rapidly by putting a strain on their vascular systems and preventing water from reaching all parts of the tree,” says Gillikin.

    Before entering the forests, students sprayed their shoes and gear with ethanol frequently to prevent the spread.

    Uncovering chemical and particle formation

    A second research project in TREX studied volcanic smog (vog), which plagues the air, making visibility problematic at times and causing health problems for people in Hawaii. The active Kilauea volcano releases SO2 into the atmosphere. When the SO2 mixes with other gases emitted from the volcano and interacts with sunlight and the atmosphere, particulate matter forms.

    Students in the Kroll Group, led by Jesse Kroll, professor of civil and environmental engineering and chemical engineering, have been studying SO2 and particulate matter for years, but had not directly observed the chemistry of how those transformations occur.

    “There’s a hypothesis that there is a functional connection between the SO2 and particulate matter, but that’s never been directly demonstrated,” says Des Marais.

    To test that hypothesis, the students measured two different sizes of particulate matter formed from the SO2 and developed a model showing how much vog is generated downwind of the volcano.

    They spent five days at two sites, from sunrise to late morning, measuring particulate matter formation as the sun came up and began creating new particles. Combining meteorological data sources such as UV index, wind speed, and humidity, the students built a model that lays out all the pieces of an equation for calculating when new particles form.

    “You can build what you think that equation is based on a first-principles understanding of the chemical composition, but what they did was measure it in real time with measurements of the chemical reagents,” says Des Marais.

    The students measured the factors that catalyze the chemical reaction forming particulate matter, such as sunlight and ozone, and then related those inputs to the measured outputs.
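    As a rough illustration of that kind of empirical fitting, the sketch below regresses a particle-formation rate on candidate meteorological drivers. The variable names, the linear form, and the synthetic data are all assumptions for illustration, not the students’ actual model.

    ```python
    # A minimal sketch: fit particle formation against meteorological drivers.
    # Synthetic data stand in for the real SO2 and particle measurements.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 120                                  # synthetic hourly observations
    uv = rng.uniform(0, 10, n)               # UV index
    wind = rng.uniform(0, 12, n)             # wind speed, m/s
    rh = rng.uniform(30, 95, n)              # relative humidity, %
    so2 = rng.uniform(5, 50, n)              # overnight-accumulated SO2, ppb

    # Toy "truth": formation scales with sunlight and reagent, diluted by wind.
    formation = 0.8 * uv * so2 / (1 + 0.3 * wind) + rng.normal(0, 5, n)

    # Least-squares fit of formation rate on the candidate drivers.
    X = np.column_stack([uv, wind, rh, so2, np.ones(n)])
    coef, *_ = np.linalg.lstsq(X, formation, rcond=None)
    print(dict(zip(["uv", "wind", "rh", "so2", "intercept"], coef.round(2))))
    ```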

    “What they found, and what seems to be happening, is that the chemical reagents are accumulating overnight,” says Des Marais. “Then as soon as the sun rises in the morning all the transformation happens in the atmosphere. A lot of the reagents are used up and the wind blows everything away, leaving the other side of the island with polluted air,” adds Des Marais.

    “I found the vog particle formation fieldwork a surprising research learning,” adds Domingo-Kameenui who did some atmospheric chemistry research in the Kroll Group. “I just thought particle formation happened in the air, but we found wind direction and wind speed at a certain time of the day was extremely important to particle formation. It’s not just chemistry you need to look at, but meteorology and sunlight,” she adds.

    Both Domingo-Kameenui and Gillikin found the fieldwork class an important and memorable experience with new insight that they will carry with them beyond MIT.  

    How Gillikin approaches fieldwork or any type of community engagement in another culture is what she will remember most. “When entering another country or culture, you are getting the privilege to be on their land, to learn about their history and experiences, and to connect with so many brilliant people,” says Gillikin. “Everyone we met in Hawaii had so much passion for their work, and approaching those environments with respect and openness to learn is what I experienced firsthand and will take with me throughout my career.”

  • Michael Howland gives wind energy a lift

    Michael Howland was in his office at MIT, watching real-time data from a wind farm 7,000 miles away in northwest India, when he noticed something odd: Some of the turbines weren’t producing the expected amount of electricity.

    Howland, the Esther and Harold E. Edgerton Assistant Professor of Civil and Environmental Engineering, studies the physics of the Earth’s atmosphere and how that information can optimize renewable energy systems. To accomplish this, he and his team develop and use predictive models, supercomputer simulations, and real-life data from wind farms, such as the one in India.

    The global wind power market is one of the most cost-competitive and resilient power sources across the world, the Global Wind Energy Council reported last year. The year 2020 saw record growth in wind power capacity, thanks to a surge of installations in China and the United States. Yet wind power needs to grow three times faster in the coming decade to address the worst impacts of climate change and achieve federal and state climate goals, the report says.

    “Optimal wind farm design and the resulting cost of energy are dependent on the wind,” Howland says. “But wind farms are often sited and designed based on short-term historical climate records.”

    In October 2021, Howland received a Seed Fund grant from the MIT Energy Initiative (MITEI) to account for how climate change might affect the wind of the future. “Our initial results suggest that considering the uncertainty in the winds in the design and operation of wind farms can lead to more reliable energy production,” he says.

    Most recently, Howland and his team came up with a model that predicts the power produced by each individual turbine based on the physics of the wind farm as a whole. The model can inform decisions that may boost a farm’s overall output.

    The state of the planet

    The son of neuroscientists, Howland grew up in a suburb of Philadelphia, and his childhood wasn’t especially outdoorsy. Later, he’d become an avid hiker with a deep appreciation for nature, but a ninth-grade class assignment made him think about the state of the planet, perhaps for the first time.

    A history teacher had asked the class to write a report on climate change. “I remember arguing with my high school classmates about whether humans were the leading cause of climate change, but the teacher didn’t want to get into that debate,” Howland recalls. “He said climate change was happening, whether or not you accept that it’s anthropogenic, and he wanted us to think about the impacts of global warming, and solutions. I was one of his vigorous defenders.”

    As part of a research internship after his first year of college, Howland visited a wind farm in Iowa, where wind produces more than half of the state’s electricity. “The turbines look tall from the highway, but when you’re underneath them, you’re really struck by their scale,” he says. “That’s where you get a sense of how colossal they really are.” (Not a fan of heights, Howland opted not to climb the turbine’s internal ladder to snap a photo from the top.)

    After receiving an undergraduate degree from Johns Hopkins University and master’s and PhD degrees in mechanical engineering from Stanford University, he joined MIT’s Department of Civil and Environmental Engineering to focus on the intersection of fluid mechanics, weather, climate, and energy modeling. His goal is to enhance renewable energy systems.

    An added bonus to being at MIT is the opportunity to inspire the next generation, much like his ninth-grade history teacher did for him. Howland’s graduate-level introduction to the atmospheric boundary layer is geared primarily to engineers and physicists, but as he sees it, climate change is such a multidisciplinary and complex challenge that “every skill set that exists in human society can be relevant to mitigating it.”

    “There are the physics and engineering questions that our lab primarily works on, but there are also questions related to social sciences, public acceptance, policymaking, and implementation,” he says. “Careers in renewable energy are rapidly growing. There are far more job openings than we can hire for right now. In many areas, we don’t yet have enough people to address the challenges in renewable energy and climate change mitigation that need to be solved.

    “I encourage my students — really, everyone I interact with — to find a way to impact the climate change problem,” he says.

    Unusual conditions

    In fall 2021, Howland was trying to explain the odd data coming in from India.

    Based on sensor feedback, wind turbines’ software-driven control systems constantly tweak the speed and the angle of the blades, and what’s known as yaw — the orientation of the giant blades in relation to the wind direction.

    Existing utility-scale turbines are controlled “greedily,” which means that every turbine in the farm automatically turns into the wind to maximize its own power production.

    Though the turbines in the front row of the Indian wind farm were reacting appropriately to the wind direction, their power output was all over the place. “Not what we would expect based on the existing models,” Howland says.

    These massive turbine towers stood 100 meters tall, about the length of a football field, with blades the length of an Olympic swimming pool. At their highest point, the blade tips reached almost 200 meters into the sky.

    Then there’s the speed of the blades themselves: The tips move many times faster than the wind, around 80 to 100 meters per second, roughly a quarter to a third of the speed of sound.

    Using a state-of-the-art sensor that measures the speed of incoming wind before it interacts with the massive rotors, Howland’s team saw an unexpectedly complex airflow effect. He covers the phenomenon in his class. The data coming in from India, he says, displayed “quite remarkable wind conditions stemming from the effects of Earth’s rotation and the physics of buoyancy that you don’t always see.”

    Traditionally, wind turbines operate in the lowest 10 percent of the atmospheric boundary layer — the so-called surface layer — which is affected primarily by ground conditions. The Indian turbines, Howland realized, were operating in regions of the atmosphere that turbines haven’t historically accessed.

    Trending taller

    Howland knew that airflow interactions can persist for kilometers. The interaction of high winds with the front-row turbines was generating wakes in the air similar to the way boats generate wakes in the water.

    To address this, Howland’s model trades off the efficiency of upwind turbines to benefit downwind ones. By misaligning some of the upwind turbines in certain conditions, the downwind units experience less wake turbulence, increasing the overall energy output of the wind farm by as much as 1 to 3 percent at no additional cost. If a 1.2 percent energy increase were applied to the world’s existing wind farms, it would be the equivalent of adding more than 3,600 new wind turbines — enough to power about 3 million homes.
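    A toy calculation conveys the trade-off. In the sketch below, yawing an upwind turbine costs it power by roughly a cosine law while weakening the wake that hits the turbine behind it; the power law, the wake model, and every coefficient are illustrative assumptions, not the actual MIT model.

    ```python
    # A toy two-turbine wake-steering trade-off, in rated-power units.
    import numpy as np

    def farm_power(yaw_deg: float) -> float:
        yaw = np.radians(yaw_deg)
        p_upwind = np.cos(yaw) ** 1.9          # yawed turbine loses power
        deficit = 0.30 * (1 - yaw_deg / 60.0)  # deflected wake recovers speed
        p_downwind = (1 - deficit) ** 3        # power scales with wind speed cubed
        return p_upwind + p_downwind

    for yaw in (0, 10, 20, 30):                # 0 degrees is the "greedy" baseline
        print(f"yaw={yaw:2d} deg  farm power={farm_power(yaw):.3f}")
    ```

    Under these toy numbers, a modest upwind misalignment yields a few percent more total power than the greedy baseline, which is the qualitative effect the model exploits.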

    Even a modest boost could mean fewer turbines generating the same output, or the ability to place more units into a smaller space, because negative interactions between the turbines can be diminished.

    Howland says the model can predict potential benefits in a variety of scenarios at different types of wind farms. “The part that’s important and exciting is that it’s not just particular to this wind farm. We can apply the collective control method across the wind farm fleet,” he says, which is growing taller and wider.

    By 2035, the average hub height for offshore turbines in the United States is projected to grow from 100 meters to around 150 meters — the height of the Washington Monument.

    “As we continue to build larger wind turbines and larger wind farms, we need to revisit the existing practice for their design and control,” Howland says. “We can use our predictive models to ensure that we build and operate the most efficient renewable generators possible.”

    Looking to the future

    Howland and other climate watchers have reason for optimism with the passage in August 2022 of the Inflation Reduction Act, which calls for a significant investment in domestic energy production and for reducing carbon emissions by roughly 40 percent by 2030.

    But Howland says the act itself isn’t sufficient. “We need to continue pushing the envelope in research and development as well as deployment,” he says. The model he created with his team can help, especially for offshore wind farms experiencing low wind turbulence and larger wake interactions.

    Offshore wind can face challenges of public acceptance. Howland believes that researchers, policymakers, and the energy industry need to do more to get the public on board by addressing concerns through open public dialogue, outreach, and education.

    Howland once wrote and illustrated a children’s book, inspired by Dr. Seuss’s “The Lorax,” that focused on renewable energy. Howland recalls his “really terrible illustrations,” but he believes he was onto something. “I was having some fun helping people interact with alternative energy in a more natural way at an earlier age,” he says, “and recognize that these are not nefarious technologies, but remarkable feats of human ingenuity.”

  • Helping the cause of environmental resilience

    Haruko Wainwright, the Norman C. Rasmussen Career Development Professor in Nuclear Science and Engineering (NSE) and assistant professor in civil and environmental engineering at MIT, grew up in rural Japan, where many nuclear facilities are located. She remembers worrying about the facilities as a child. Wainwright was only 6 at the time of the Chernobyl accident in 1986, but still recollects it vividly.

    Those early memories have contributed to Wainwright’s determination to research how technologies can mold environmental resilience — the capability of mitigating the consequences of accidents and recovering from contamination.

    Wainwright believes that environmental monitoring can help improve resilience. She co-leads the U.S. Department of Energy (DOE)’s Advanced Long-term Environmental Monitoring Systems (ALTEMIS) project, which integrates technologies such as in situ sensors, geophysics, remote sensing, simulations, and artificial intelligence to establish new paradigms for monitoring. The project focuses on soil and groundwater contamination at more than 100 U.S. sites that were used for nuclear weapons production.

    As part of this research, which was featured last year in Environmental Science & Technology, Wainwright is working on a machine learning framework for improving environmental monitoring strategies. She hopes the ALTEMIS project will enable the rapid detection of anomalies while ensuring the stability of residual contamination and waste disposal facilities.
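    As a rough sketch of what automated anomaly screening on such a network can look like, the example below flags sensor readings that deviate sharply from a rolling baseline. The synthetic data, window, and threshold are assumptions for illustration, not the ALTEMIS framework itself.

    ```python
    # A minimal rolling z-score screen for a groundwater monitoring series.
    import numpy as np

    rng = np.random.default_rng(1)
    readings = rng.normal(5.0, 0.2, 365)   # synthetic daily contaminant level
    readings[200] = 7.5                    # injected anomaly

    window = 30
    for t in range(window, len(readings)):
        base = readings[t - window:t]
        z = (readings[t] - base.mean()) / base.std()
        if abs(z) > 4:
            print(f"day {t}: reading {readings[t]:.2f} flagged (z={z:.1f})")
    ```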

    Childhood in rural Japan

    Even as a child, Wainwright was interested in physics, history, and a variety of other subjects.

    But growing up in a rural area was not ideal for someone interested in STEM. There were no engineers or scientists in the community and no science museums, either. “It was not so cool to be interested in science, and I never talked about my interest with anyone,” Wainwright recalls.

    Television and books were the only door to the world of science. “I did not study English until middle school and I had never been on a plane until college. I sometimes find it miraculous that I am now working in the U.S. and teaching at MIT,” she says.

    As she grew a little older, Wainwright heard a lot of discussions about nuclear facilities in the region and many stories about Hiroshima and Nagasaki.

    At the same time, giants like Marie Curie inspired her to pursue science. Nuclear physics was particularly fascinating. “At some point during high school, I started wondering ‘what are radiations, what is radioactivity, what is light,’” she recalls. Reading Richard Feynman’s books and trying to understand quantum mechanics made her want to study physics in college.

    Pursuing research in the United States

    Wainwright pursued an undergraduate degree in engineering physics at Kyoto University. After two research internships in the United States, Wainwright was impressed by the dynamic and fast-paced research environment in the country.

    And compared to Japan, there were “more women in science and engineering,” Wainwright says. She enrolled at the University of California at Berkeley in 2005, where she completed her doctorate in nuclear engineering with minors in statistics and civil and environmental engineering.

    Before moving to MIT NSE in 2022, Wainwright was a staff scientist in the Earth and Environmental Area at Lawrence Berkeley National Laboratory (LBNL). She worked on a variety of topics, including radioactive contamination, climate science, CO2 sequestration, precision agriculture, and watershed science. Her time at LBNL helped Wainwright build a solid foundation in a variety of environmental sensors and monitoring and simulation methods across different earth science disciplines.

    Empowering communities through monitoring

    One of the most compelling takeaways from Wainwright’s early research: People trust actual measurements and data as facts, even though they are skeptical about models and predictions. “I talked with many people living in Fukushima prefecture. Many of them have dosimeters and measure radiation levels on their own. They might not trust the government, but they trust their own data and are then convinced that it is safe to live there and to eat local food,” Wainwright says.

    She has been impressed that area citizens have gained significant knowledge about radiation and radioactivity through these efforts. “But they are often frustrated that people living far away, in cities like Tokyo, still avoid agricultural products from Fukushima,” Wainwright says.

    Wainwright thinks that data derived from environmental monitoring — through proper visualization and communication — can address misconceptions and fake news that often hurt people near contaminated sites.

    Wainwright is now interested in how these technologies — tested with real data at contaminated sites — can be proactively used for existing and future nuclear facilities “before contamination happens,” as she explored for Nuclear News. “I don’t think it is a good idea to simply dismiss someone’s concern as irrational. Showing credible data has been much more effective to provide assurance. Or a proper monitoring network would enable us to minimize contamination or support emergency responses when accidents happen,” she says.

    Educating communities and students

    Part of empowering communities involves improving their ability to process science-based information. “Potentially hazardous facilities always end up in rural regions; minorities’ concerns are often ignored. The problem is that these regions don’t produce so many scientists or policymakers; they don’t have a voice,” Wainwright says. “I am determined to dedicate my time to improving STEM education in rural regions and to increasing the voice of these regions.”

    In a project funded by the DOE, she collaborates with a team of researchers at the University of Alaska — the Alaska Center for Energy and Power and the Teaching Through Technology program — aiming to improve STEM education for rural and Indigenous communities. “Alaska is an important place for energy transition and environmental justice,” Wainwright says. Micro-nuclear reactors can potentially improve the lives of rural communities that bear the brunt of high fuel and transportation costs. However, there is a distrust of nuclear technologies, stemming from past nuclear weapons testing. At the same time, Alaska has vast metal mining resources for renewable energy and batteries. And there are concerns about environmental contamination from mining and various other sources. The team’s vision is much broader, she points out. “The focus is on broader environmental monitoring technologies and relevant STEM education, addressing general water and air quality,” Wainwright says.

    The issues also weave into the courses Wainwright teaches at MIT. “I think it is important for engineering students to be aware of environmental justice related to energy waste and mining as well as past contamination events and their recovery,” she says. “It is not OK just to send waste to, or develop mines in, rural regions, which could be a special place for some people. We need to make sure that these developments will not harm the environment and health of local communities.” Wainwright also hopes that this knowledge will ultimately encourage students to think creatively about engineering designs that minimize waste or recycle material.

    The last question of the final quiz of one of her recent courses was: Assume that you store high-level radioactive waste in your “backyard.” What technical strategies would make you and your family feel safe? “All students thought about this question seriously, and many suggested excellent points, including those addressing environmental monitoring,” Wainwright says. “That made me hopeful about the future.”

  • Tackling counterfeit seeds with “unclonable” labels

    Average crop yields in Africa are consistently far below those expected, and one significant reason is the prevalence of counterfeit seeds whose germination rates are far lower than those of the genuine ones. The World Bank estimates that as much as half of all seeds sold in some African countries are fake, which could help to account for crop production that is far below potential.

    There have been many attempts to prevent this counterfeiting through tracking labels, but none have proved effective; among other issues, such labels have been vulnerable to hacking because of the deterministic nature of their encoding systems. But now, a team of MIT researchers has come up with a kind of tiny, biodegradable tag that can be applied directly to the seeds themselves, and that provides a unique randomly created code that cannot be duplicated.

    The new system, which uses minuscule dots of silk-based material, each containing a unique combination of different chemical signatures, is described today in the journal Science Advances in a paper by MIT’s dean of engineering Anantha Chandrakasan, professor of civil and environmental engineering Benedetto Marelli, postdoc Hui Sun, and graduate student Saurav Maji.

    The problem of counterfeiting is an enormous one globally, the researchers point out, affecting everything from drugs to luxury goods, and many different systems have been developed to try to combat this. But there has been less attention to the problem in the area of agriculture, even though the consequences can be severe. In sub-Saharan Africa, for example, the World Bank estimates that counterfeit seeds are a significant factor in crop yields that average less than one-fifth of the potential for maize, and less than one-third for rice.

    Marelli explains that a key to the new system is creating a randomly produced physical object whose exact composition is virtually impossible to duplicate. The labels they create “leverage randomness and uncertainty in the process of application, to generate unique signature features that can be read, and that cannot be replicated,” he says.

    What they’re dealing with, Sun adds, “is the very old job of trying, basically, not to get your stuff stolen. And you can try as much as you can, but eventually somebody is always smart enough to figure out how to do it, so nothing is really unbreakable. But the idea is, it’s almost impossible, if not impossible, to replicate it, or it takes so much effort that it’s not worth it anymore.”

    The idea of an “unclonable” code was originally developed as a way of protecting the authenticity of computer chips, explains Chandrakasan, who is the Vannevar Bush Professor of Electrical Engineering and Computer Science. “In integrated circuits, individual transistors have slightly different properties coined device variations,” he explains, “and you could then use that variability and combine that variability with higher-level circuits to create a unique ID for the device. And once you have that, then you can use that unique ID as a part of a security protocol. Something like transistor variability is hard to replicate from device to device, so that’s what gives it its uniqueness, versus storing a particular fixed ID.” The concept is based on what are known as physically unclonable functions, or PUFs.

    The team decided to try to apply that PUF principle to the problem of fake seeds, and the use of silk proteins was a natural choice because the material is not only harmless to the environment but also classified by the Food and Drug Administration in the “generally recognized as safe” category, so it requires no special approval for use on food products.

    “You could coat it on top of seeds,” Maji says, “and if you synthesize silk in a certain way, it will also have natural random variations. So that’s the idea, that every seed or every bag could have a unique signature.”

    Developing effective secure system solutions has long been one of Chandrakasan’s specialties, while Marelli has spent many years developing systems for applying silk coatings to a variety of fruits, vegetables, and seeds, so their collaboration was a natural fit for developing such a silk-based coding system for enhanced security.

    “The challenge was what type of form factor to give to silk,” Sun says, “so that it can be fabricated very easily.” They developed a simple drop-casting approach that produces tags that are less than one-tenth of an inch in diameter. The second challenge was to develop “a way where we can read the uniqueness, in also a very high throughput and easy way.”

    For the unique silk-based codes, Marelli says, “eventually we found a way to add a color to these microparticles so that they assemble in random structures.” The resulting unique patterns can be read out not only by a spectrograph or a portable microscope, but even by an ordinary cellphone camera with a macro lens. This image can be processed locally to generate the PUF code and then sent to the cloud and compared with a secure database to ensure the authenticity of the product. “It’s random so that people cannot easily replicate it,” says Sun. “People cannot predict it without measuring it.”

    And the number of possible permutations that could result from the way they mix four basic types of colored silk nanoparticles is astronomical. “We were able to show that with a minimal amount of silk, we were able to generate 128 random bits of security,” Maji says. “So this gives rise to 2 to the power 128 possible combinations, which is extremely difficult to crack given the computational capabilities of the state-of-the-art computing systems.”
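    To make the idea concrete, the sketch below shows how a measured physical signature can be quantized into a 128-bit code and authenticated by Hamming distance. The feature extraction here is a stand-in assumption; the actual system derives its bits from the optical signatures of the silk tags.

    ```python
    # A minimal PUF-style enroll-and-verify sketch with 128-bit codes.
    import numpy as np

    rng = np.random.default_rng(2)
    enrolled_signal = rng.normal(size=128)        # measurement at enrollment

    def to_bits(signal: np.ndarray) -> np.ndarray:
        # Quantize: 1 where the signal exceeds its median, else 0.
        return (signal > np.median(signal)).astype(np.uint8)

    def hamming(a: np.ndarray, b: np.ndarray) -> int:
        return int((a != b).sum())

    enrolled = to_bits(enrolled_signal)
    probe = to_bits(enrolled_signal + rng.normal(0, 0.2, 128))  # noisy re-read
    fake = to_bits(rng.normal(size=128))                        # different tag

    print("genuine distance:", hamming(enrolled, probe))     # small
    print("counterfeit distance:", hamming(enrolled, fake))  # ~64 of 128 bits
    ```

    Authentication then reduces to accepting probes within a small Hamming radius of an enrolled code, which tolerates measurement noise while making a random match across 2^128 possibilities essentially impossible.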

    Marelli says that “for us, it’s a good test bed in order to think out-of-the-box, and how we can have a path that somehow is more democratic.” In this case, that means “something that you can literally read with your phone, and you can fabricate by simply drop casting a solution, without using any advanced manufacturing technique, without going in a clean room.”

    Some additional work will be needed to make this a practical commercial product, Chandrakasan says. “There will have to be a development for at-scale reading” via smartphones. “So, that’s clearly a future opportunity.” But the principle now shows a clear path to the day when “a farmer could at least, maybe not every seed, but could maybe take some random seeds in a particular batch and verify them,” he says.

    The research was partially supported by the U.S. Office of Naval Research and the National Science Foundation, Analog Devices Inc., an EECS MathWorks fellowship, and a Paul M. Cook Career Development Professorship.

  • Detailed images from space offer clearer picture of drought effects on plants

    “MIT is a place where dreams come true,” says César Terrer, an assistant professor in the Department of Civil and Environmental Engineering. Here at MIT, Terrer says he’s given the resources needed to explore ideas he finds most exciting, and at the top of his list is climate science. In particular, he is interested in plant-soil interactions, and how the two can mitigate impacts of climate change. In 2022, Terrer received seed grant funding from the Abdul Latif Jameel Water and Food Systems Lab (J-WAFS) to produce drought monitoring systems for farmers. The project is leveraging a new generation of remote sensing devices to provide high-resolution measurements of plant water stress at regional to global scales.

    Growing up in Granada, Spain, Terrer always had an aptitude and passion for science. He studied environmental science at the University of Murcia, where he interned in the Department of Ecology. Using computational analysis tools, he worked on modeling species distribution in response to human development. Early on in his undergraduate experience, Terrer says he regarded his professors as “superheroes” with a kind of scholarly prowess. He knew he wanted to follow in their footsteps by one day working as a faculty member in academia. Of course, there would be many steps along the way before achieving that dream. 

    Upon completing his undergraduate studies, Terrer set his sights on exciting and adventurous research roles. He thought perhaps he would conduct field work in the Amazon, engaging with native communities. But when the opportunity arose to work in Australia on a state-of-the-art climate change experiment that simulates future levels of carbon dioxide, he headed south to study how plants react to CO2 in a biome of native Australian eucalyptus trees. It was during this experience that Terrer started to take a keen interest in the carbon cycle and the capacity of ecosystems to buffer rising levels of CO2 caused by human activity.

    Around 2014, he began to delve deeper into the carbon cycle as he began his doctoral studies at Imperial College London. The primary question Terrer sought to answer during his PhD was “will plants be able to absorb predicted future levels of CO2 in the atmosphere?” To answer the question, Terrer became an early adopter of artificial intelligence, machine learning, and remote sensing to analyze data from real-life, global climate change experiments. His findings from these “ground truth” values and observations resulted in a paper in the journal Science. In it, he claimed that climate models most likely overestimated how much carbon plants will be able to absorb by the end of the century, by a factor of three. 

    After postdoctoral positions at Stanford University and the Universitat Autonoma de Barcelona, followed by a prestigious Lawrence Fellowship, Terrer says he had “too many ideas and not enough time to accomplish all those ideas.” He knew it was time to lead his own group. Not long after applying for faculty positions, he landed at MIT. 

    New ways to monitor drought

    Terrer is employing similar methods to those he used during his PhD to analyze data from all over the world for his J-WAFS project. He and postdoc Wenzhe Jiao collect data from remote sensing satellites and field experiments and use machine learning to come up with new ways to monitor drought. Terrer says Jiao is a “remote sensing wizard,” who fuses data from different satellite products to understand the water cycle. With Jiao’s hydrology expertise and Terrer’s knowledge of plants, soil, and the carbon cycle, the duo is a formidable team to tackle this project.

    According to the U.N. World Meteorological Organization, the number and duration of droughts have increased by 29 percent since 2000, as compared to the two previous decades. From the Horn of Africa to the Western United States, drought is devastating vegetation and severely stressing water supplies, compromising food production and spiking food insecurity. Drought monitoring can offer fundamental information on drought location, frequency, and severity, but assessing the impact of drought on vegetation is extremely challenging. This is because plants’ sensitivity to water deficits varies across species and ecosystems.

    Terrer and Jiao are able to obtain a clearer picture of how drought is affecting plants by employing the latest generation of remote sensing observations, which offer images of the planet with incredible spatial and temporal resolution. Satellite products such as Sentinel, Landsat, and Planet can provide daily images from space with such high resolution that individual trees can be discerned. Along with the images and datasets from satellites, the team is using ground-based observations from meteorological data. They are also using the MIT SuperCloud at MIT Lincoln Laboratory to process and analyze all of the data sets. The J-WAFS project is among the first to leverage high-resolution data to quantitatively measure plant drought impacts in the United States, with the hope of expanding to a global assessment in the future.

    Assisting farmers and resource managers 

    Every week, the U.S. Drought Monitor provides a map of drought conditions in the United States. The map offers little spatial resolution and is more of a drought recap or summary, unable to predict future drought scenarios. The lack of a comprehensive spatiotemporal evaluation of historic and future drought impacts on global vegetation productivity is detrimental to farmers both in the United States and worldwide.

    Terrer and Jiao plan to generate metrics for plant water stress at an unprecedented resolution of 10-30 meters. This means that they will be able to provide drought monitoring maps at the scale of a typical U.S. farm, giving farmers more precise, useful data every one to two days. The team will use the information from the satellites to monitor plant growth and soil moisture, as well as the time lag of plant growth response to soil moisture. In this way, Terrer and Jiao say they will eventually be able to create a kind of “plant water stress forecast” that may be able to predict adverse impacts of drought four weeks in advance. “According to the current soil moisture and lagged response time, we hope to predict plant water stress in the future,” says Jiao. 
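    A minimal sketch of that lag idea, assuming a fixed four-week response time and synthetic data, might look like the following; the team’s actual forecasting model is far more sophisticated.

    ```python
    # Toy lagged forecast: stress at t+lag is predicted from soil moisture at t.
    import numpy as np

    rng = np.random.default_rng(3)
    days = 365
    soil = np.clip(np.cumsum(rng.normal(0, 0.02, days)) + 0.3, 0.05, 0.5)

    lag = 28                                          # ~4-week response time
    # Synthetic "observed" stress that trails soil moisture by the lag.
    stress = 1 - soil[:-lag] / soil.max() + rng.normal(0, 0.02, days - lag)

    # Fit stress(t + lag) ~ a * soil(t) + b, then look 4 weeks ahead.
    a, b = np.polyfit(soil[:-lag], stress, 1)
    print(f"predicted water stress {lag} days out: {a * soil[-1] + b:.2f}")
    ```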

    The expected outcomes of this project will give farmers, land and water resource managers, and decision-makers more accurate data at the farm-specific level, allowing for better drought preparation, mitigation, and adaptation. “We expect to make our data open-access online, after we finish the project, so that farmers and other stakeholders can use the maps as tools,” says Jiao. 

    Terrer adds that the project “has the potential to help us better understand the future states of climate systems, and also identify the regional hot spots more likely to experience water crises at the national, state, local, and tribal government scales.” He also expects the project will enhance our understanding of global carbon-water-energy cycle responses to drought, with applications in determining climate change impacts on natural ecosystems as a whole.

  • Exploring the nanoworld of biogenic gems

    A new research collaboration with The Bahrain Institute for Pearls and Gemstones (DANAT) will seek to develop advanced characterization tools for the analysis of the properties of pearls and to explore technologies to assign unique identifiers to individual pearls.

    The three-year project will be led by Admir Mašić, associate professor of civil and environmental engineering, in collaboration with Vladimir Bulović, the Fariborz Maseeh Chair in Emerging Technology and professor of electrical engineering and computer science.

    “Pearls are extremely complex and fascinating hierarchically ordered biological materials that are formed by a wide range of different species,” says Mašić. “Working with DANAT provides us a unique opportunity to apply our lab’s multi-scale materials characterization tools to identify potentially species-specific pearl fingerprints, while simultaneously addressing scientific research questions regarding the underlying biomineralization processes that could inform advances in sustainable building materials.”

    DANAT is a gemological laboratory specializing in the testing and study of natural pearls as a reflection of Bahrain’s pearling history and desire to protect and advance Bahrain’s pearling heritage. DANAT’s gemologists support clients and students through pearl, gemstone, and diamond identification services, as well as educational courses.

    Like many other precious gemstones, pearls have been human-made through scientific experimentation, says Noora Jamsheer, chief executive officer at DANAT. Over a century ago, cultured pearls entered markets as a competitive product to natural pearls, similar in appearance but different in value.

    “Gemological labs have been innovating scientific testing methods to differentiate between natural pearls and all other pearls that exist because of direct or indirect human intervention. Today the world knows natural pearls and cultured pearls. However, there are also pearls that fall in between these two categories,” says Jamsheer. “DANAT has the responsibility, as the leading gemological laboratory for pearl testing, to take the initiative necessary to ensure that testing methods keep pace with advances in the science of pearl cultivation.”

    Titled “Exploring the Nanoworld of Biogenic Gems,” the project will aim to improve the process of testing and identifying pearls by identifying morphological, micro-structural, optical, and chemical features sufficient to distinguish a pearl’s area of origin, method of growth, or both. MIT.nano, MIT’s open-access center for nanoscience and nanoengineering, will be the organizational home for the project, where Mašić and his team will utilize the facility’s state-of-the-art characterization tools.

    In addition to discovering new methodologies for establishing a pearl’s origin, the project aims to utilize machine learning to automate pearl classification. Furthermore, researchers will investigate techniques to create a unique identifier associated with an individual pearl.

    The initial sponsored research project is expected to last three years, with potential for continued collaboration based on key findings or building upon the project’s success to open new avenues for research into the structure, properties, and growth of pearls.

  • Integrating humans with AI in structural design

    Modern fabrication tools such as 3D printers can make structural materials in shapes that would have been difficult or impossible using conventional tools. Meanwhile, new generative design systems can take great advantage of this flexibility to create innovative designs for parts of a new building, car, or virtually any other device.

    But such “black box” automated systems often fall short of producing designs that are fully optimized for their purpose, such as providing the greatest strength in proportion to weight or minimizing the amount of material needed to support a given load. Fully manual design, on the other hand, is time-consuming and labor-intensive.

    Now, researchers at MIT have found a way to achieve some of the best of both of these approaches. They used an automated design system but stopped the process periodically to allow human engineers to evaluate the work in progress and make tweaks or adjustments before letting the computer resume its design process. Introducing a few of these iterations produced results that performed better than those designed by the automated system alone, and the process was completed more quickly compared to the fully manual approach.

    The results are reported this week in the journal Structural and Multidisciplinary Optimization, in a paper by MIT doctoral student Dat Ha and assistant professor of civil and environmental engineering Josephine Carstensen.

    The basic approach can be applied to a broad range of scales and applications, Carstensen explains, for the design of everything from biomedical devices to nanoscale materials to structural support members of a skyscraper. Already, automated design systems have found many applications. “If we can make things in a better way, if we can make whatever we want, why not make it better?” she asks.

    “It’s a way to take advantage of how we can make things in much more complex ways than we could in the past,” says Ha, adding that automated design systems have already begun to be widely used over the last decade in automotive and aerospace industries, where reducing weight while maintaining structural strength is a key need.

    “You can take a lot of weight out of components, and in these two industries, everything is driven by weight,” he says. In some cases, such as internal components that aren’t visible, appearance is irrelevant, but for other structures aesthetics may be important as well. The new system makes it possible to optimize designs for visual as well as mechanical properties, and in such decisions the human touch is essential.

    As a demonstration of their process in action, the researchers designed a number of structural load-bearing beams, such as might be used in a building or a bridge. In their iterations, they saw that a design had an area that could fail prematurely, so they selected that feature and required the program to address it. The computer system then revised the design accordingly, removing the highlighted strut and strengthening some other struts to compensate, leading to an improved final design.

    The process, which they call Human-Informed Topology Optimization, begins by setting out the needed specifications — for example, a beam needs to be this length, supported on two points at its ends, and must support this much of a load. “As we’re seeing the structure evolve on the computer screen in response to initial specification,” Carstensen says, “we interrupt the design and ask the user to judge it. The user can select, say, ‘I’m not a fan of this region, I’d like you to beef up or beef down this feature size requirement.’ And then the algorithm takes into account the user input.”
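    The loop Carstensen describes can be sketched as follows, with a placeholder optimizer and a hard-coded “user” adjustment standing in for the interactive steps; none of this is the group’s actual code.

    ```python
    # Human-in-the-loop optimization skeleton: run, pause, adjust, resume.
    import numpy as np

    rng = np.random.default_rng(4)
    density = rng.uniform(0.3, 0.7, (40, 120))  # material layout on a 2D grid
    min_feature = np.full_like(density, 0.2)    # per-region feature-size floor

    def optimizer_step(rho, floor):
        # Placeholder update: push densities toward 0 or 1 and zero out cells
        # below the floor; a real method uses sensitivities and filtering.
        rho = np.clip(rho + 0.1 * (rho - 0.5), 0.0, 1.0)
        rho = np.where(rho < floor, 0.0, rho)
        return rho

    for it in range(30):
        density = optimizer_step(density, min_feature)
        if it % 10 == 9:                             # pause for human review
            region = (slice(10, 20), slice(40, 60))  # user-selected region
            min_feature[region] = 0.35               # "beef up" its feature size
            print(f"iteration {it + 1}: user tightened the floor in one region")
    ```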

    While the result is not as ideal as what might be produced by a fully rigorous yet significantly slower design algorithm that considers the underlying physics, she says it can be much better than a result generated by a rapid automated design system alone. “You don’t get something that’s quite as good, but that was not necessarily the goal. What we can show is that instead of using several hours to get something, we can use 10 minutes and get something much better than where we started off.”

    The system can be used to optimize a design based on any desired properties, not just strength and weight. For example, it can be used to minimize fracture or buckling, or to reduce stresses in the material by softening corners.

    Carstensen says, “We’re not looking to replace the seven-hour solution. If you have all the time and all the resources in the world, obviously you can run these and it’s going to give you the best solution.” But for many situations, such as designing replacement parts for equipment in a war zone or a disaster-relief area with limited computational power available, “then this kind of solution that catered directly to your needs would prevail.”

    Similarly, for smaller companies manufacturing equipment in essentially “mom and pop” businesses, such a simplified system might be just the ticket. The new system they developed is not only simple and efficient to run on smaller computers, but it also requires far less training to produce useful results, Carstensen says. A basic two-dimensional version of the software, suitable for designing basic beams and structural parts, is freely available now online, she says, as the team continues to develop a full 3D version.

    “The potential applications of Prof Carstensen’s research and tools are quite extraordinary,” says Christian Málaga-Chuquitaype, a professor of civil and environmental engineering at Imperial College London, who was not associated with this work. “With this work, her group is paving the way toward a truly synergistic human-machine design interaction.”

    “By integrating engineering ‘intuition’ (or engineering ‘judgement’) into a rigorous yet computationally efficient topology optimization process, the human engineer is offered the possibility of guiding the creation of optimal structural configurations in a way that was not available to us before,” he adds. “Her findings have the potential to change the way engineers tackle ‘day-to-day’ design tasks.”

  • Study: Carbon-neutral pavements are possible by 2050, but rapid policy and industry action are needed

    Almost 2.8 million lane-miles, or about 4.6 million lane-kilometers, of roads in the United States are paved.

    Roads and streets form the backbone of our built environment. They take us to work or school, take goods to their destinations, and much more.

    However, a new study by MIT Concrete Sustainability Hub (CSHub) researchers shows that the annual greenhouse gas (GHG) emissions of all construction materials used in the U.S. pavement network are 11.9 to 13.3 megatons. This is equivalent to the emissions of a gasoline-powered passenger vehicle driving about 30 billion miles in a year.
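    That equivalence is easy to sanity-check, assuming a typical gasoline passenger vehicle emits roughly 400 grams of CO2-equivalent per mile (an EPA-style figure, not a number from the study):

    ```python
    # Back-of-the-envelope check of the vehicle-miles equivalence above.
    mid_emissions_tons = (11.9e6 + 13.3e6) / 2      # megatons -> metric tons
    grams_per_mile = 400                            # assumed passenger-car rate
    miles = mid_emissions_tons * 1e6 / grams_per_mile
    print(f"{miles / 1e9:.1f} billion vehicle-miles")  # ~31.5 billion
    ```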

    As roads are built, repaved, and expanded, new approaches and thoughtful material choices are necessary to dampen their carbon footprint. 

    The CSHub researchers found that, by 2050, mixtures for pavements can be made carbon-neutral if industry and governmental actors help to apply a range of solutions — like carbon capture — to reduce, avoid, and neutralize embodied impacts. (A neutralization solution is any compensation mechanism in the value chain of a product that permanently removes the global warming impact of the processes after avoiding and reducing the emissions.) Furthermore, nearly half of pavement-related greenhouse gas (GHG) savings can be achieved in the short term with a negative or nearly net-zero cost.

    The research team, led by Hessam AzariJafari, MIT CSHub’s deputy director, closed gaps in our understanding of the impacts of pavement decisions by developing a dynamic model quantifying the embodied impact of future pavement materials demand for the U.S. road network.

    The team first split the U.S. road network into 10-mile (about 16 kilometer) segments, forecasting the condition and performance of each. They then developed a pavement management system model to create benchmarks helping to understand the current level of emissions and the efficacy of different decarbonization strategies. 

    This model considered factors such as annual traffic volume and surface conditions, budget constraints, regional variation in pavement treatment choices, and pavement deterioration. The researchers also used a life-cycle assessment to calculate annual state-level emissions from acquiring pavement construction materials, considering future energy supply and materials procurement.

    The team considered three scenarios for the U.S. pavement network: A business-as-usual scenario in which technology remains static, a projected improvement scenario aligned with stated industry and national goals, and an ambitious improvement scenario that intensifies or accelerates projected strategies to achieve carbon neutrality. 

    If no steps are taken to decarbonize pavement mixtures, the team projected that GHG emissions of construction materials used in the U.S. pavement network would increase by 19.5 percent by 2050. Under the projected scenario, there was an estimated 38 percent embodied impact reduction for concrete and 14 percent embodied impact reduction for asphalt by 2050.
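    As a back-of-the-envelope illustration of how such scenarios diverge, the sketch below compounds assumed annual rates of change from the baseline out to 2050. The rates are chosen only to echo the study’s headline figures, not taken from its model, and even the ambitious path leaves a remainder that the study assigns to neutralization.

    ```python
    # Compound assumed annual rates of change in materials emissions to 2050.
    baseline_mt = 12.6                  # midpoint of the 11.9-13.3 Mt CO2e/yr range
    years = 2050 - 2023                 # assumed baseline year

    scenarios = {
        "business-as-usual": +0.0066,   # compounds to ~+19.5% by 2050
        "projected": -0.010,            # stated industry and national goals
        "ambitious": -0.035,            # accelerated strategies, pre-neutralization
    }
    for name, rate in scenarios.items():
        emissions_2050 = baseline_mt * (1 + rate) ** years
        print(f"{name:18s} 2050 emissions ~ {emissions_2050:4.1f} Mt CO2e")
    ```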

    The keys to making the pavement network carbon neutral by 2050 lie in multiple places. Fully renewable energy sources should be used for pavement materials production, transportation, and other processes. The federal government must contribute to the development of these low-carbon energy sources and carbon capture technologies, as it would be nearly impossible to achieve carbon neutrality for pavements without them. 

    Additionally, increasing pavements’ recycled content and improving their design and production efficiency can lower GHG emissions to an extent. Still, neutralization is needed to achieve carbon neutrality.

    Making the right pavement construction and repair choices would also contribute to the carbon neutrality of the network. For instance, concrete pavements can offer GHG savings across the whole life cycle as they are stiffer and stay smoother for longer, meaning they require less maintenance and have a lesser impact on the fuel efficiency of vehicles. 

    Concrete pavements have other use-phase benefits, including a cooling effect through an intrinsically high albedo, meaning they reflect more sunlight than regular pavements. Therefore, they can help combat extreme heat and benefit the earth’s energy balance through negative radiative forcing, making albedo a potential neutralization mechanism.

    At the same time, a mix of fixes, including using concrete and asphalt in different contexts and proportions, could produce significant GHG savings for the pavement network; decision-makers must consider scenarios on a case-by-case basis to identify optimal solutions. 

    In addition, it may appear as though the GHG emissions of materials used in local roads are dwarfed by the emissions of interstate highway materials. However, the study found that the two road types have a similar impact. In fact, all road types contribute heavily to the total GHG emissions of pavement materials in general. Therefore, stakeholders at the federal, state, and local levels must be involved if our roads are to become carbon neutral. 

    The path to pavement network carbon-neutrality is, therefore, somewhat of a winding road. It demands regionally specific policies and widespread investment to help implement decarbonization solutions, just as renewable energy initiatives have been supported. Providing subsidies and covering the costs of premiums, too, are vital to avoid shifts in the market that would derail environmental savings.

    When planning for these shifts, we must recall that pavements have impacts not just in their production, but across their entire life cycle. As pavements are used, maintained, and eventually decommissioned, they have significant impacts on the surrounding environment.

    If we are to meet climate goals such as the Paris Agreement, which demands that we reach carbon-neutrality by 2050 to avoid the worst impacts of climate change, we — as well as industry and governmental stakeholders — must come together to take a hard look at the roads we use every day and work to reduce their life cycle emissions. 

    The study was published in the International Journal of Life Cycle Assessment. In addition to AzariJafari, the authors include Fengdi Guo of the MIT Department of Civil and Environmental Engineering; Jeremy Gregory, executive director of the MIT Climate and Sustainability Consortium; and Randolph Kirchain, director of the MIT CSHub.