More stories

  • New visions for better transportation

    We typically experience transportation problems from the ground up: waiting for a delayed bus, packing ourselves into a subway car, or crawling along in traffic, we see such systems struggling at close range.

    Yet sometimes transportation solutions come from a high-level, top-down approach. That was the theme of the final talk in MIT’s Mobility Forum series, delivered on Friday by MIT Professor Thomas Magnanti, which centered on applying to transportation the same overarching analytical framework used in other domains, such as bioengineering.

    Magnanti’s remarks focused on a structured approach to problem-solving known as the 4M method — which stands for measuring, mining, modeling, and manipulating. In urban transportation planning, for instance, measuring and mining might involve understanding traffic flows. Modeling might simulate those traffic flows, and manipulating would mean engineering interventions: tolls, one-way streets, or other changes.

    “These are four things that interact quite a bit with each other,” said Magnanti, who is an Institute Professor — MIT’s highest faculty distinction — and a professor of operations research at the MIT Sloan School of Management. “And they provide us with a sense of how you can gather data and understand a system, but also how you can improve it.”

    Magnanti, a leading expert in operations research, pointed out that the 4M method can be applied to systems from physics to biomedical research. He outlined how it might be used to analyze transportation-related systems such as supply chains and warehouse movements.

    In all cases, he noted, applying the 4M concept to a system is an iterative process: Making changes to a system will likely produce new flows — of traffic and goods — and thus be subject to a new set of measurements.

    “One thing to notice here, once you manipulate the system, it changes the data,” Magnanti observed. “You’re doing this so you can hopefully improve operations, but it creates new data. So, you want to measure that new data again, you want to mine it, you want to model it again, and then manipulate it. … This is a continuing loop that we use in these systems.”
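
    As a toy illustration of that loop, one pass of the cycle might look like the Python sketch below. Everything here, from the demand numbers to the linear toll model, is an invented example for illustration, not anything presented in the talk.

    ```python
    import random

    # Toy 4M loop on an invented congested road: "measure" noisy travel
    # times, "mine" them into a summary statistic, "model" how a toll
    # shifts demand, and "manipulate" the toll, which changes the data
    # the next iteration sees.

    random.seed(0)
    demand = 100.0   # cars per hour entering the road (invented)
    toll = 0.0       # current toll in dollars

    for step in range(4):
        # Measure: noisy observations of travel time, which grows with demand.
        samples = [demand * 0.3 + random.gauss(0, 1) for _ in range(50)]
        # Mine: reduce the raw data to a summary statistic.
        avg_travel_time = sum(samples) / len(samples)
        # Model + manipulate: if travel times are too high, raise the toll,
        # using an assumed linear model of how tolls suppress demand.
        if avg_travel_time > 20:
            toll += 1.0
        demand = 100.0 * (1 - 0.05 * toll)  # the intervention changes the data
        print(f"step {step}: toll=${toll:.0f}, avg travel time={avg_travel_time:.1f} min")
    ```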

    Magnanti’s talk, “Understanding and Improving Transportation Systems,” was delivered online to a public audience of about 175 people. It was the 12th and final event of the MIT Mobility Forum in the fall 2021 semester. The event series is organized by the MIT Mobility Initiative, an Institute-wide effort to research and accelerate the evolution of transportation, at a time when decarbonization in the sector is critical.

    Other MIT Mobility Forum talks have focused on topics such as zero-environmental-impact aviation, measuring pedestrian flows in cities, autonomous vehicles, the impact of high-speed rail and subways on cities, values and equity in mobility design, and more.

    Overall, the forum “offers an opportunity to showcase the groundbreaking transportation research occurring across the Institute,” says Jinhua Zhao, an associate professor of transportation and city planning in MIT’s Department of Urban Studies and Planning, and director of the MIT Mobility Initiative.

    The initiative has held 39 such talks since it launched in 2020, and the series will continue again in the spring semester of 2022.

    One of the principal features of the forum, like the MIT Mobility Initiative in general, is that it “facilitates cross-disciplinary exchanges both within MIT and without,” Zhao says. Faculty and students from every school at MIT have participated in the forum, lending intellectual and methodological diversity to a broad field.

    For his part, Magnanti, who is both an engineer and an operations researcher by training, embraced that interdisciplinary approach in his remarks, fielding a variety of audience questions about research methods and other issues after his talk. Magnanti, who served from 2009 to 2017 as the founding president of the Singapore University of Technology and Design (with which MIT has had research collaborations), noted that the setting can heavily influence transportation research and progress.

    In Singapore, he noted, “They measure everything. They measure how people access the subway … and they use their data.” Of course, Singapore’s status as a city-state of modest size, among other factors, makes comprehensive transportation planning more feasible there. Still, Magnanti also noted that the infrastructure bill recently passed by the U.S. federal government is “going to provide lots of opportunities” for transportation improvements.

    And in general, Magnanti added, one of the best things academic leaders and research communities can do is to “continue to create a sense of excitement. Even when things are tough, the problems are going to be interesting.”

  • A tool to speed development of new solar cells

    In the ongoing race to develop ever-better materials and configurations for solar cells, there are many variables that can be adjusted to try to improve performance, including material type, thickness, and geometric arrangement. Developing new solar cells has generally been a tedious process of making small changes to one of these parameters at a time. While computational simulators have made it possible to evaluate such changes without having to actually build each new variation for testing, the process remains slow.

    Now, researchers at MIT and Google Brain have developed a system that makes it possible not just to evaluate one proposed design at a time, but to provide information about which changes will provide the desired improvements. This could greatly increase the rate of discovery of new, improved configurations.

    The new system, called a differentiable solar cell simulator, is described in a paper published today in the journal Computer Physics Communications, written by MIT junior Sean Mann, research scientist Giuseppe Romano of MIT’s Institute for Soldier Nanotechnologies, and four others at MIT and at Google Brain.

    Traditional solar cell simulators, Romano explains, take the details of a solar cell configuration and produce as their output a predicted efficiency — that is, what percentage of the energy of incoming sunlight actually gets converted to an electric current. But this new simulator both predicts the efficiency and shows how much that output is affected by any one of the input parameters. “It tells you directly what happens to the efficiency if we make this layer a little bit thicker, or what happens to the efficiency if we for example change the property of the material,” he says.

    In short, he says, “we didn’t discover a new device, but we developed a tool that will enable others to discover more quickly other higher performance devices.” Using this system, “we are decreasing the number of times that we need to run a simulator to give quicker access to a wider space of optimized structures.” In addition, he says, “our tool can identify a unique set of material parameters that has been hidden so far because it’s very complex to run those simulations.”

    While traditional approaches use essentially a random search of possible variations, Mann says, with his tool “we can follow a trajectory of change because the simulator tells you what direction you want to be changing your device. That makes the process much faster because instead of exploring the entire space of opportunities, you can just follow a single path” that leads directly to improved performance.
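
    For intuition, here is a minimal sketch of that gradient-following idea in Python with JAX-style automatic differentiation. The smooth “efficiency” function, its parameters, and the step size are all invented stand-ins for the real device physics, not the paper’s actual simulator.

    ```python
    import jax
    import jax.numpy as jnp

    # Made-up smooth "efficiency" function of two layer thicknesses (microns),
    # standing in for the real device physics. A differentiable simulator
    # exposes d(efficiency)/d(parameter) for every input at once, so the
    # search can follow a gradient trajectory instead of sampling at random.

    def efficiency(params):
        absorber, window = params
        # Hypothetical trade-off: a thicker absorber captures more light but
        # adds losses; the window layer has an optimum near 0.5 microns.
        return (0.25 * (1 - jnp.exp(-2.0 * absorber))
                - 0.02 * absorber
                - 0.1 * (window - 0.5) ** 2)

    grad_eff = jax.grad(efficiency)  # sensitivities with respect to all inputs

    params = jnp.array([0.2, 1.0])   # initial thicknesses in microns
    for _ in range(200):
        params = params + 0.1 * grad_eff(params)  # gradient ascent on efficiency

    print("optimized thicknesses (um):", params)
    print("predicted efficiency:", efficiency(params))
    ```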

    Since advanced solar cells often are composed of multiple layers interlaced with conductive materials to carry electric charge from one to the other, this computational tool reveals how changing the relative thicknesses of these different layers will affect the device’s output. “This is very important because the thickness is critical. There is a strong interplay between light propagation and the thickness of each layer and the absorption of each layer,” Mann explains.

    Other variables that can be evaluated include the amount of doping (the introduction of atoms of another element) that each layer receives, or the dielectric constant of insulating layers, or the bandgap, a measure of the energy levels of photons of light that can be captured by different materials used in the layers.

    This simulator is now available as an open-source tool that can be used immediately to help guide research in this field, Romano says. “It is ready, and can be taken up by industry experts.” To make use of it, researchers would couple the simulator’s computations with an optimization algorithm, or even a machine learning system, to rapidly assess a wide variety of possible changes and home in quickly on the most promising alternatives.

    At this point, the simulator is based on just a one-dimensional version of the solar cell, so the next step will be to expand its capabilities to include two- and three-dimensional configurations. But even this 1D version “can cover the majority of cells that are currently under production,” Romano says. Certain variations, such as so-called tandem cells using different materials, cannot yet be simulated directly by this tool, but “there are ways to approximate a tandem solar cell by simulating each of the individual cells,” Mann says.

    The simulator is “end-to-end,” Romano says, meaning it computes the sensitivity of the efficiency, also taking into account light absorption. He adds: “An appealing future direction is composing our simulator with advanced existing differentiable light-propagation simulators, to achieve enhanced accuracy.”

    Moving forward, Romano says, because this is an open-source code, “that means that once it’s up there, the community can contribute to it. And that’s why we are really excited.” Although this research group is “just a handful of people,” he says, now anyone working in the field can make their own enhancements and improvements to the code and introduce new capabilities.

    “Differentiable physics is going to provide new capabilities for the simulations of engineered systems,” says Venkat Viswanathan, an associate professor of mechanical engineering at Carnegie Mellon University, who was not associated with this work. “The differentiable solar cell simulator is an incredible example of differentiable physics, that can now provide new capabilities to optimize solar cell device performance,” he says, calling the study “an exciting step forward.”

    In addition to Mann and Romano, the team included Eric Fadel and Steven Johnson at MIT, and Samuel Schoenholz and Ekin Cubuk at Google Brain. The work was supported in part by Eni S.p.A., the MIT Energy Initiative, and the MIT Quest for Intelligence.

  • Q&A: More-sustainable concrete with machine learning

    As a building material, concrete withstands the test of time. Its use dates back to early civilizations, and today it is the most widely used composite material in the world. However, it’s not without its faults. Production of its key ingredient, cement, contributes 8 to 9 percent of global anthropogenic CO2 emissions and 2 to 3 percent of energy consumption, shares that are only projected to increase in the coming years. With United States infrastructure aging, the federal government recently passed a milestone bill to revitalize and upgrade it, along with a push to reduce greenhouse gas emissions where possible; that puts concrete in the crosshairs for modernization, too.

    Elsa Olivetti, the Esther and Harold E. Edgerton Associate Professor in the MIT Department of Materials Science and Engineering, and Jie Chen, MIT-IBM Watson AI Lab research scientist and manager, think artificial intelligence can help meet this need by designing and formulating new, more sustainable concrete mixtures with lower costs and carbon dioxide emissions, while improving material performance and reusing manufacturing byproducts in the material itself. Olivetti’s research improves the environmental and economic sustainability of materials, and Chen develops and optimizes machine learning and computational techniques, which he can apply to materials reformulation. Olivetti and Chen, along with their collaborators, have recently teamed up on an MIT-IBM Watson AI Lab project to make concrete more sustainable for the benefit of society, the climate, and the economy.

    Q: What applications does concrete have, and what properties make it a preferred building material?

    Olivetti: Concrete is the dominant building material globally, with an annual consumption of 30 billion metric tons. That is over 20 times the next most produced material, steel, and the scale of its use leads to considerable environmental impact, approximately 5 to 8 percent of global greenhouse gas (GHG) emissions. It can be made locally, has a broad range of structural applications, and is cost-effective. Concrete is a mixture of fine and coarse aggregate, water, cement binder (the glue), and other additives.

    Q: Why isn’t it sustainable, and what research problems are you trying to tackle with this project?

    Olivetti: The community is working on several ways to reduce the impact of this material, including the use of alternative fuels for heating the cement mixture and increasing energy and materials efficiency and carbon sequestration at production facilities, but one important opportunity is to develop an alternative to the cement binder.

    While cement is only 10 percent of the concrete mass, it accounts for 80 percent of the GHG footprint. This impact derives from the fuel burned to heat and run the chemical reaction required in manufacturing, and from the chemical reaction itself, which releases CO2 during the calcination of limestone. Therefore, partially replacing the input ingredients to cement (traditionally ordinary Portland cement, or OPC) with alternative materials from waste and byproducts can reduce the GHG footprint. But use of these alternatives is not inherently more sustainable, because wastes might have to travel long distances, which adds to fuel emissions and cost, or might require pretreatment processes. The optimal way to make use of these alternate materials will be situation-dependent. And because of the vast scale, we also need solutions that account for the huge volumes of concrete needed. This project is trying to develop novel concrete mixtures that will decrease the GHG impact of cement and concrete, moving away from trial-and-error processes toward ones that are more predictive.

    Chen: If we want to fight climate change and make our environment better, are there alternative ingredients or a reformulation we could use so that less greenhouse gas is emitted? We hope that through this project using machine learning we’ll be able to find a good answer.

    Q: Why is this problem important to address now, at this point in history?

    Olivetti: There is urgent need to address greenhouse gas emissions as aggressively as possible, and the road to doing so isn’t necessarily straightforward for all areas of industry. For transportation and electricity generation, paths to decarbonize those sectors have been identified, and the technological approaches to achieve them are relatively clear; we need to move much more aggressively to realize them in the time required. However, for tough-to-decarbonize sectors, such as industrial materials production, the pathways to decarbonization are not as well mapped out.

    Q: How are you planning to address this problem to produce better concrete?

    Olivetti: The goal is to predict mixtures that meet performance criteria, such as strength and durability, while also balancing economic and environmental impact. A key to this is to use industrial wastes in blended cements and concretes. To do this, we need to understand the glass and mineral reactivity of the constituent materials. This reactivity not only determines the limit of their possible use in cement systems but also controls concrete processing and the development of strength and pore structure, which ultimately control concrete durability and life-cycle CO2 emissions.

    Chen: We investigate using waste materials to replace part of the cement component. This is something we have hypothesized would be more sustainable and economical: waste materials are common, and they cost less. Because of the reduction in the use of cement, the final concrete product would be responsible for much less carbon dioxide production. Figuring out the right mixture proportions that yield durable concrete while achieving other goals is a very challenging problem. Machine learning gives us an opportunity to apply advances in predictive modeling, uncertainty quantification, and optimization to solve it. We are exploring options using deep learning as well as multi-objective optimization techniques to find an answer. These efforts are now more feasible to carry out, and they will produce results with the reliability estimates we need to understand what makes a good concrete.

    Q: What kinds of AI and computational techniques are you employing for this?

    Olivetti: We use AI techniques to collect data on individual concrete ingredients, mix proportions, and concrete performance from the literature through natural language processing. We also add data obtained from industry and/or high-throughput atomistic modeling and experiments to optimize the design of concrete mixtures. Then we use this information to develop insight into the reactivity of possible waste and byproduct materials as alternatives to cement materials for low-CO2 concrete. By incorporating generic information on concrete ingredients, the resulting concrete performance predictors are expected to be more reliable and transformative than existing AI models.
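
    As a cartoon of that data-collection step, the sketch below pulls mix-proportion pairs out of an invented sentence. Real literature-mining pipelines use trained natural-language-processing models rather than a regular expression; this only makes the idea concrete.

    ```python
    import re

    # Toy literature-mining sketch: extract (amount, ingredient) pairs for
    # mix proportions from a sentence. The sentence and pattern are invented
    # for illustration; real pipelines use trained NLP models.

    text = ("Mixtures contained 350 kg/m3 cement, 60 kg/m3 fly ash, "
            "and 175 kg/m3 water.")

    pairs = re.findall(r"(\d+(?:\.\d+)?)\s*kg/m3\s+([a-z ]+?)(?=,|\.| and)", text)
    print(pairs)  # [('350', 'cement'), ('60', 'fly ash'), ('175', 'water')]
    ```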

    Chen: The final objective is to figure out what constituents, and how much of each, to put into the recipe for producing concrete that optimizes the various factors: strength, cost, environmental impact, performance, and so on. For each of the objectives, we need certain models: a model to predict the performance of the concrete (how long does it last, and how much weight can it sustain?), a model to estimate the cost, and a model to estimate how much carbon dioxide is generated. We will need to build these models by using data from the literature, from industry, and from lab experiments.

    We are exploring Gaussian process models to predict concrete strength days and weeks ahead. This kind of model can also give us an uncertainty estimate for the prediction. Such a model needs its parameters specified, and we will use another model to calculate them. At the same time, we are exploring neural network models, because we can inject domain knowledge from human experience into them. Some models are as simple as multilayer perceptrons, while others are more complex, like graph neural networks. The goal is a model that is not only accurate but also robust: the input data are noisy, and the model must embrace the noise so that its predictions remain accurate and reliable for the multi-objective optimization.
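
    A minimal sketch of such a Gaussian process predictor, using scikit-learn on synthetic data rather than the project’s real measurements, might look like the following; the strength curve and noise levels are invented for illustration.

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    # Predict concrete compressive strength from curing age, with an
    # uncertainty estimate. The WhiteKernel term lets the model "embrace
    # the noise" in the measurements.

    rng = np.random.default_rng(0)
    age_days = rng.uniform(1, 28, size=40).reshape(-1, 1)
    # Hypothetical strength curve (MPa) plus measurement noise.
    strength = 40 * (1 - np.exp(-0.15 * age_days[:, 0])) + rng.normal(0, 2, 40)

    kernel = RBF(length_scale=10.0) + WhiteKernel(noise_level=4.0)
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(age_days, strength)

    # Predict strength days and weeks ahead, with uncertainty.
    query = np.array([[3.0], [7.0], [14.0], [28.0]])
    mean, std = gp.predict(query, return_std=True)
    for day, m, s in zip(query[:, 0], mean, std):
        print(f"day {day:4.0f}: predicted strength {m:5.1f} +/- {s:4.1f} MPa")
    ```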

    Once we have built models that we are confident with, we will inject their predictions and uncertainty estimates into the optimization of multiple objectives, under constraints and under uncertainties.

    Q: How do you balance cost-benefit trade-offs?

    Chen: The multiple objectives we consider are not necessarily consistent, and sometimes they are at odds with each other. The goal is to identify scenarios where the values of our objectives cannot all be improved simultaneously without compromising one or a few. For example, if you want to further reduce the cost, you probably have to sacrifice performance or accept greater environmental impact. Eventually, we will give the results to policymakers, who will look into them and weigh the options. For example, they may be able to tolerate a slightly higher cost in exchange for a significant reduction in greenhouse gas emissions. Alternatively, if the cost varies little but the concrete performance changes drastically, say, doubling or tripling, then that is definitely a favorable outcome.
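
    The scenarios Chen describes are what optimization calls a Pareto front. As a rough sketch, the snippet below filters invented candidate mixtures down to the non-dominated ones; the cost, CO2, and strength numbers are placeholders, not project results.

    ```python
    import numpy as np

    # Columns: cost ($/m^3), CO2 (kg/m^3), negative strength (MPa, negated
    # so that every column reads "lower is better").
    candidates = np.array([
        [100.0, 300.0, -40.0],
        [110.0, 250.0, -42.0],
        [ 95.0, 320.0, -35.0],
        [120.0, 240.0, -38.0],
        [105.0, 260.0, -44.0],
        [115.0, 310.0, -36.0],   # dominated: mix 0 is better on every count
    ])

    def pareto_front(points):
        keep = []
        for i, p in enumerate(points):
            # p is dominated if some other point is at least as good
            # everywhere and strictly better somewhere.
            dominated = any(
                np.all(q <= p) and np.any(q < p)
                for j, q in enumerate(points) if j != i
            )
            if not dominated:
                keep.append(i)
        return keep

    print("non-dominated mixtures:", pareto_front(candidates))
    ```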

    Q: What kinds of challenges do you face in this work?

    Chen: The data we get, whether from industry or from the literature, are very noisy; concrete measurements can vary a lot depending on where and when they are taken. There are also substantial amounts of missing data when we integrate records from different sources, so we need to spend a lot of effort organizing the data and making them usable for building and training machine learning models. We also explore imputation techniques that substitute for missing features, as well as models that tolerate missing features, in our predictive modeling and uncertainty estimation.
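
    A minimal example of the simplest such imputation, on made-up records with missing fields (the project also explores more sophisticated techniques and models that tolerate missing features directly):

    ```python
    import numpy as np
    from sklearn.impute import SimpleImputer

    # Invented records: rows are concrete mixes from different sources,
    # columns are (cement kg/m^3, water kg/m^3, 28-day strength MPa);
    # np.nan marks fields a source did not report.
    X = np.array([
        [350.0, 180.0,  42.0],
        [300.0, np.nan, 35.0],
        [np.nan, 170.0, 40.0],
        [320.0, 175.0, np.nan],
    ])

    # Simplest imputation: substitute the column mean for each missing feature.
    X_filled = SimpleImputer(strategy="mean").fit_transform(X)
    print(X_filled)
    ```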

    Q: What do you hope to achieve through this work?

    Chen: In the end, we are suggesting either one or a few concrete recipes, or a continuum of recipes, to manufacturers and policymakers. We hope that this will provide invaluable information both for the construction industry and for the effort of protecting our beloved Earth.

    Olivetti: We’d like to develop a robust way to design cements that make use of waste materials to lower their CO2 footprint. Nobody is trying to make waste, so we can’t rely on one stream as a feedstock if we want this to be massively scalable. We have to be flexible and robust enough to shift with changes in feedstocks, and for that we need improved understanding. Our approach to developing local, dynamic, and flexible alternatives is to learn what makes these wastes reactive, so we know how to optimize their use and do so as broadly as possible. We do that through predictive models developed with software built in my group to automatically extract data from over 5 million texts and patents on various topics. We link this to the creative capabilities of our IBM collaborators to design methods that predict the final impact of new cements. If we are successful, we can lower the emissions of this ubiquitous material and play our part in achieving carbon emissions mitigation goals.

    Other researchers involved with this project include Stefanie Jegelka, the X-Window Consortium Career Development Associate Professor in the MIT Department of Electrical Engineering and Computer Science; Richard Goodwin, IBM principal researcher; Soumya Ghosh, MIT-IBM Watson AI Lab research staff member; and Kristen Severson, former research staff member. Collaborators included Nghia Hoang, former research staff member with MIT-IBM Watson AI Lab and IBM Research; and Jeremy Gregory, research scientist in the MIT Department of Civil and Environmental Engineering and executive director of the MIT Concrete Sustainability Hub.

    This research is supported by the MIT-IBM Watson AI Lab.

  • 3 Questions: Tolga Durak on building a safety culture at MIT

    Environment, Health, and Safety Managing Director Tolga Durak heads a team working to build a strong safety culture at the Institute and to implement systems that lead to successful lab and makerspace operations. EHS is also pursuing new opportunities in the areas of safe and sustainable labs and applied makerspace research. 

    Durak holds a BS in mechanical engineering, an MS in industrial and systems engineering, and a PhD in building construction/environmental design and planning. He has over 20 years of experience in engineering and EHS in higher education, having served in roles such as authority having jurisdiction, responsible official, fire marshal, risk manager, radiation safety officer, laser safety officer, safety engineer, project manager, and emergency manager for government agencies, as well as for universities with extensive health-care and research facilities.

    Q: What “words of wisdom” regarding lab/shop health and safety would you like to share with the research community? 

    A: EHS staff always strive to help maintain the safety and well-being of the MIT community. When it comes to lab/shop safety or any areas with hazards, first and foremost, we encourage wearing the appropriate personal protective equipment (PPE) when handling potentially hazardous materials. While PPE needs depend on the hazards and the space, common PPE includes safety glasses, lab coats, gloves, clothes that cover your skin, and closed-toe shoes. Shorts and open-toe shoes have no place in the lab/shop setting when hazardous materials are stored or used. Accidents will and do happen. The severity of injuries due to accidental exposures can be minimized when researchers are wearing PPE. Remember, there is only one you!   

    Overall, be aware of your surroundings, be knowledgeable about the hazards of the materials and equipment you are using, and be prepared for the unexpected. Ask yourself, “What’s the worst thing that can happen during this experiment or procedure?” Prepare by doing a thorough risk assessment, ask others who may be knowledgeable for their ideas and help, and standardize procedures where possible. Be prepared to respond appropriately when an emergency arises. 

    Safety in our classrooms, labs, and makerspaces is paramount and requires a collaborative effort. 

    Q: What are the established programs within EHS that students and researchers should be aware of, and what opportunities and challenges do you face trying to advance a healthy safety culture at MIT? 

    A: The EHS program staff in Biosafety, Industrial Hygiene, Environmental Management, Occupational and Construction Safety, and Radiation Protection are ready to assist with risk assessments, chemical safety, physical hazards, hazard-specific training, materials management, and hazardous waste disposal and reuse/recycling. Locally, each department, laboratory, and center has an EHS coordinator, as well as an assigned EHS team, to assist in the implementation of required EHS programs. Each lab/shop also has a designated EHS representative — someone who has local knowledge of your lab/shop and can help you with safety requirements specific to your work area.  

    One of the biggest challenges we face is that due to the decentralized nature of the Institute, no one size fits all when it comes to implementing successful safety practices. We also view this as an opportunity to enhance our safety culture. A strong safety culture is reflected at MIT when all lab and makerspace members are willing to look out for each other, challenge the status quo when necessary, and do the right thing even when no one is looking. In labs/shops with a strong safety culture, faculty and researchers discuss safety topics at group meetings, group members remind each other to wear the appropriate PPE (lab coats, safety glasses, etc.), more experienced team members mentor the newcomers, and riskier operations are reviewed and assessed to make them as safe as possible.  

    Q: Can you describe the new Safe and Sustainable Laboratories (S2L) efforts and the makerspace operational research programs envisioned for the future? 

    A: The MIT EHS Office has a plan for renewing its dedication to sustainability and climate action. We are dedicated to doing our part to promote a research environment that assures the highest level of health and safety while also striving to reduce energy, water, and waste through educating and supporting faculty, students, and researchers. With the goal of integrating sustainability across the lab sector of campus and bridging it with the Institute’s climate action goals, EHS has partnered with the MIT Office of Sustainability, the Department of Facilities, the vice president for finance, and the vice president for campus services and stewardship to relaunch the “green” labs sustainability efforts under a new Safe and Sustainable Labs program.

    Part of that plan is to implement a Sustainable Labs Certification program. The process is designed to be as easy as possible for the lab groups. We are starting with simple actions like promoting the use of equipment timers in certain locations to conserve energy, fume hood/ventilation management, preventative maintenance for ultra-low-temperature freezers, increasing recycling, and helping labs update their central chemical inventory system, which can help forecast MIT’s potential waste streams. 

    Through our new Applied Makerspace Research Initiative, EHS has also partnered with Project Manus to build a test-bed lab to study potential health and environmental exposures present in makerspaces as a result of specialized equipment and processes.

  • Climate modeling confirms historical records showing rise in hurricane activity

    When forecasting how storms may change in the future, it helps to know something about their past. Judging from historical records dating back to the 1850s, hurricanes in the North Atlantic have become more frequent over the last 150 years.

    However, scientists have questioned whether this upward trend is a reflection of reality, or simply an artifact of lopsided record-keeping. If 19th-century storm trackers had access to 21st-century technology, would they have recorded more storms? This inherent uncertainty has kept scientists from relying on storm records, and the patterns within them, for clues to how climate influences storms.

    A new MIT study published today in Nature Communications has used climate modeling, rather than storm records, to reconstruct the history of hurricanes and tropical cyclones around the world. The study finds that North Atlantic hurricanes have indeed increased in frequency over the last 150 years, similar to what historical records have shown.

    In particular, major hurricanes, and hurricanes in general, are more frequent today than in the past. And those that make landfall appear to have grown more powerful, carrying more destructive potential.

    Curiously, while the North Atlantic has seen an overall increase in storm activity, the same trend was not observed in the rest of the world. The study found that the frequency of tropical cyclones globally has not changed significantly in the last 150 years.

    “The evidence does point, as the original historical record did, to long-term increases in North Atlantic hurricane activity, but no significant changes in global hurricane activity,” says study author Kerry Emanuel, the Cecil and Ida Green Professor of Atmospheric Science in MIT’s Department of Earth, Atmospheric, and Planetary Sciences. “It certainly will change the interpretation of climate’s effects on hurricanes — that it’s really the regionality of the climate, and that something happened to the North Atlantic that’s different from the rest of the globe. It may have been caused by global warming, which is not necessarily globally uniform.”

    Chance encounters

    The most comprehensive record of tropical cyclones is compiled in a database known as the International Best Track Archive for Climate Stewardship (IBTrACS). This historical record includes modern measurements from satellites and aircraft that date back to the 1940s. The database’s older records are based on reports from ships and islands that happened to be in a storm’s path. These earlier records date back to 1851, and overall the database shows an increase in North Atlantic storm activity over the last 150 years.

    “Nobody disagrees that that’s what the historical record shows,” Emanuel says. “On the other hand, most sensible people don’t really trust the historical record that far back in time.”

    Recently, scientists have used a statistical approach to identify storms that the historical record may have missed. To do so, they consulted all the digitally reconstructed shipping routes in the Atlantic over the last 150 years and mapped these routes over modern-day hurricane tracks. They then estimated the chance that a ship would encounter or entirely miss a hurricane’s presence. This analysis found a significant number of early storms were likely missed in the historical record. Accounting for these missed storms, they concluded that there was a chance that storm activity had not changed over the last 150 years.
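
    A toy Monte Carlo version of that idea, with invented geometry and numbers, shows the flavor of the calculation: scatter ships along a lane, draw random storm tracks, and count how many pass close enough to be recorded. This is only a sketch of the concept, not the published statistical method.

    ```python
    import numpy as np

    # Ships are points along a hypothetical shipping lane crossing a
    # 1,000 x 1,000 km box; a storm counts as "recorded" if its track
    # passes within a detection radius of any ship.

    rng = np.random.default_rng(1)
    ships = np.column_stack([np.linspace(0, 1000, 25),
                             np.linspace(200, 800, 25)])
    radius = 50.0  # km within which a ship would report the storm (assumed)

    detected, trials = 0, 2000
    for _ in range(trials):
        # Straight-line storm track crossing the box at a random angle.
        y0, y1 = rng.uniform(0, 1000, size=2)
        xs = np.linspace(0, 1000, 200)
        track = np.column_stack([xs, np.linspace(y0, y1, 200)])
        # Minimum distance from any track point to any ship.
        d = np.linalg.norm(track[:, None, :] - ships[None, :, :], axis=2)
        if d.min() < radius:
            detected += 1

    print(f"fraction of storms a ship would have recorded: {detected / trials:.2f}")
    ```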

    But Emanuel points out that hurricane paths in the 19th century may have looked different from today’s tracks. What’s more, the scientists may have missed key shipping routes in their analysis, as older routes have not yet been digitized.

    “All we know is, if there had been a change (in storm activity), it would not have been detectable using digitized ship records,” Emanuel says. “So I thought, there’s an opportunity to do better, by not using historical data at all.”

    Seeding storms

    Instead, he estimated past hurricane activity using dynamical downscaling — a technique that his group developed and has applied over the last 15 years to study climate’s effect on hurricanes. The technique starts with a coarse global climate simulation and embeds within this model a finer-resolution model that simulates features as small as hurricanes. The combined models are then fed with real-world measurements of atmospheric and ocean conditions. Emanuel then scatters the realistic simulation with hurricane “seeds” and runs the simulation forward in time to see which seeds bloom into full-blown storms.
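
    As a cartoon of the seeding step only (not Emanuel’s actual physics), the sketch below scatters weak proto-storms into random environments and applies an invented intensification rule to see which grow past hurricane strength. All parameters are assumptions made for illustration.

    ```python
    import numpy as np

    # Scatter weak "seeds" into environments with random potential intensity
    # and wind shear, step a toy intensification rule forward, and count
    # which seeds bloom into hurricanes.

    rng = np.random.default_rng(42)
    n_seeds = 1000
    v = np.full(n_seeds, 10.0)              # initial winds, m/s
    v_pot = rng.uniform(40, 80, n_seeds)    # environmental potential intensity
    shear = rng.uniform(0, 15, n_seeds)     # wind shear, which hinders growth

    for _ in range(100):                    # toy time steps
        growth = 0.05 * (v_pot - v) - 0.2 * shear
        v = np.maximum(v + growth, 0.0)

    hurricanes = np.sum(v >= 33.0)          # 33 m/s is roughly hurricane strength
    print(f"{hurricanes} of {n_seeds} seeds bloomed into hurricanes")
    ```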

    For the new study, Emanuel embedded a hurricane model into a climate “reanalysis” — a type of climate model that combines observations from the past with climate simulations to generate accurate reconstructions of past weather patterns and climate conditions. He used a particular subset of climate reanalyses that only accounts for observations collected from the surface — for instance from ships, which have recorded weather conditions and sea surface temperatures consistently since the 1850s, as opposed to from satellites, which only began systematic monitoring in the 1970s.

    “We chose to use this approach to avoid any artificial trends brought about by the introduction of progressively different observations,” Emanuel explains.

    He ran an embedded hurricane model on three different climate reanalyses, simulating tropical cyclones around the world over the past 150 years. Across all three models, he observed “unequivocal increases” in North Atlantic hurricane activity.

    “There’s been this quite large increase in activity in the Atlantic since the mid-19th century, which I didn’t expect to see,” Emanuel says.

    Within this overall rise in storm activity, he also observed a “hurricane drought” — a period during the 1970s and 80s when the number of yearly hurricanes momentarily dropped. This pause in storm activity can also be seen in historical records, and Emanuel’s group proposes a cause: sulfate aerosols, which were byproducts of fossil fuel combustion, likely set off a cascade of climate effects that cooled the North Atlantic and temporarily suppressed hurricane formation.

    “The general trend over the last 150 years was increasing storm activity, interrupted by this hurricane drought,” Emanuel notes. “And at this point, we’re more confident of why there was a hurricane drought than why there is an ongoing, long-term increase in activity that began in the 19th century. That is still a mystery, and it bears on the question of how global warming might affect future Atlantic hurricanes.”

    This research was supported, in part, by the National Science Foundation.

  • Scientists and musicians tackle climate change together

    Audiences may travel long distances to see their favorite musical acts in concert or to attend large music festivals, which can add to their personal carbon footprint of emissions that are steadily warming the planet. But these same audiences, and the performers they follow, are often quite aware of the dangers of climate change and eager to contribute to ways of curbing those emissions.

    How should the industry reconcile these two perspectives, and how should it harness the enormous influence that musicians have on their fans to help promote action on climate change?

    That was the focus of a wide-ranging discussion on Monday hosted by MIT’s Environmental Solutions Initiative, titled “Artists and scientists together on climate solutions.” The event, which was held live at the Media Lab’s Bartos Theater and streamed online, featured John Fernandez, director of ESI; Dava Newman, director of the Media Lab; Tony McGuinness, a musician with the group Above and Beyond; and Anna Johnson, the sustainability and environment officer at Involved Group, an organization dedicated to embedding sustainability in business operations in the arts and culture fields.

    Fernandez pointed out in opening the discussion that when it comes to influencing people’s attitudes and behavior, changes tend to come about not just through information from a particular field, but rather from a whole culture. “We started thinking about how we might work with artists, how to have scientists and engineers, inventors, and designers working with artists on the challenges that we really need to face,” he said.

    Dealing with the climate change issue, he said, “is not about 2050 or 2100. This is about 2030. This is about this decade. This is about the next two or three years, really shifting that curve” to lowering the world’s greenhouse gas emissions. “It’s not going to be done just with science and engineering,” he added. “It’s got to be done with artists and business and everyone else. It’s not only ‘all of the above’ solutions, it’s ‘all of the above’ people, coming together to solve this problem.”

    Newman, who is also a professor in MIT’s Department of Aeronautics and Astronautics and has served as a NASA deputy administrator, said that while scientists and engineers can produce vast amounts of useful data that clearly demonstrate the dramatic changes the Earth’s climate is undergoing, communicating that information effectively is often a challenge for these specialists. “That data is just the data, but that doesn’t change the hearts and minds,” she said.

    “As scientists, having the data from our satellites, looking down, but also flying airplanes into the atmosphere, … we have the sensors, and then what can we do with it all? … How do we change human behavior? That’s the part I don’t know how to do,” Newman said. “I can have the technology, I can get precision measurements, I can study it, but really at the end of the day, we have to change human behavior, and that is so hard.”

    And that’s where the world of art and music can play a part, she said. “The best way that I know how to do it is with artistic experiences. You can have one moving experience and when you wake up tomorrow, maybe you’re going to do something a little different.” To help generate the compassion and empathy needed to affect behavior positively, she said, “that’s where we turn to the storytellers. We turn to the visionaries.”

    McGuinness, whose electronic music trio has performed for millions of people around the world, said that his own awareness of the urgency of the climate issue came from his passion for scuba diving, and the dramatic changes he has seen over the last two decades. Diving at a coral reef off Palau in the South Pacific, he returned to what had been a lush, brightly colored ecosystem and found that “immediately when you put your face under the water, you’re looking at the surface of the moon. It was a horrible shock to see this.”

    After this and other similar diving experiences, he said, “I just came away shocked and stunned,” realizing that the kinds of underwater experiences he had enjoyed would no longer exist for his children. Reading more on the subject of global warming, he said, “really sort of tipped me over the edge. And I was like, this is probably the most important thing for living beings now. And that’s sort of where I’ve remained ever since.”

    While his group Above and Beyond has performed one song specifically related to global warming, he doesn’t expect that to be the most impactful way of using their influence. Rather, they’re trying to lead by example, he said, by paying more attention to everything from the supply chains of the merchandise sold at concerts to the emissions generated by travel to the concerts. They’re also being selective about concert venues and making an effort to find performance spaces that are making a significant effort to curb their emissions.

    “If people start voting with their wallets,” McGuinness said, “and there are companies that are doing better than others and are doing the right thing, maybe it’ll catch on. I guess that’s what we can hope for.”

    Understanding these kinds of issues, involving the supply chains, transportation, and facilities associated with the music industry, has been the focus of much of Johnson’s work through the organization Involved Group, which has entered into a collaboration with MIT through the Environmental Solutions Initiative. “It’s these kinds of novel partnerships that have so much potential to catalyze the change that we need to see at an incredible pace,” she said. Already, her group has worked with MIT on mapping out where emissions occur throughout the various aspects of the music industry.

    At a recent music festival in London, she said, the group interviewed hundreds of participants, including audience members, band members, and the crew. “We explored people’s level of awareness of the issues around climate change and environmental degradation,” she said. “And what was really interesting was that there was clearly a lot of awareness of the issue across those different stakeholders, and what felt like a real, genuine level of concern and also of motivation, to want to deepen their understanding of what their contribution on a personal level really meant.”

    Working together across the boundaries of different disciplines and areas of expertise could be crucial to winning the battle against global warming, Newman said. “That’s usually how breakthroughs work,” she said. “If we’re really looking to have impact, it’s going to be from teams of people who are trained across the disciplines.” She pointed out that 90 percent of MIT students are also musicians: “It does go together!” she said. “I think going forward, we have to create new academia, new opportunities that are truly multidisciplinary.”

  • SMART researchers develop method for early detection of bacterial infection in crops

    Researchers from the Disruptive and Sustainable Technologies for Agricultural Precision (DiSTAP) Interdisciplinary Research Group (IRG) of the Singapore-MIT Alliance for Research and Technology (SMART), MIT’s research enterprise in Singapore, and their local collaborators from Temasek Life Sciences Laboratory (TLL) have developed a rapid Raman spectroscopy-based method for detecting and quantifying early bacterial infection in crops. The Raman spectral biomarkers and diagnostic algorithm enable noninvasive and early diagnosis of bacterial infections in crop plants, which can be critical for plant disease management and agricultural productivity.

    Due to the increasing demand for global food supply and security, there is a growing need to improve agricultural production systems and increase crop productivity. Globally, bacterial pathogen infection in crop plants is one of the major contributors to agricultural yield losses. Climate change also adds to the problem by accelerating the spread of plant diseases. Hence, developing methods for rapid and early detection of pathogen-infected crops is important to improve plant disease management and reduce crop loss.

    The breakthrough by SMART and TLL researchers offers a faster and more accurate method to detect bacterial infection in crop plants at an earlier stage than existing techniques allow. The new results appear in a paper titled “Rapid detection and quantification of plant innate immunity response using Raman spectroscopy,” published in the journal Frontiers in Plant Science.

    “The early detection of pathogen-infected crop plants is a significant step to improve plant disease management,” says Chua Nam Hai, DiSTAP co-lead principal investigator, professor, TLL deputy chair, and co-corresponding author. “It will allow the fast and selective removal of pathogen load and curb the further spread of disease to other neighboring crops.”

    Traditionally, plant disease diagnosis involves a simple visual inspection of plants for disease symptoms and severity. “Visual inspection methods are often ineffective, as disease symptoms usually manifest only at relatively later stages of infection, when the pathogen load is already high and reparative measures are limited. Hence, new methods are required for rapid and early detection of bacterial infection. The idea would be akin to having medical tests to identify human diseases at an early stage, instead of waiting for visual symptoms to show, so that early intervention or treatment can be applied,” says MIT Professor Rajeev Ram, who is a DiSTAP principal investigator and co-corresponding author on the paper.

    While existing techniques, such as current molecular detection methods, can detect bacterial infection in plants, they are often limited in their use. Molecular detection methods largely depend on the availability of pathogen-specific gene sequences or antibodies to identify bacterial infection in crops; the implementation is also time-consuming and nonadaptable for on-site field application due to the high cost and bulky equipment required, making it impractical for use in agricultural farms.

    “At DiSTAP, we have developed a quantitative Raman spectroscopy-based algorithm that can help farmers to identify bacterial infection rapidly. The developed diagnostic algorithm makes use of Raman spectral biomarkers and can be easily implemented in cloud-based computing and prediction platforms. It is more effective than existing techniques as it enables accurate identification and early detection of bacterial infection, both of which are crucial to saving crop plants that would otherwise be destroyed,” explains Gajendra Pratap Singh, scientific director and principal investigator at DiSTAP and co-lead author.
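
    In the same spirit, a minimal sketch of a spectral-biomarker classifier, trained on synthetic data, could look like the following. The real diagnostic algorithm uses specific Raman biomarker peaks; the single feature and all numbers here are invented for illustration.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Feature: intensity at a hypothetical biomarker wavenumber, assumed to
    # drop when the plant's innate immune response consumes a metabolite.
    rng = np.random.default_rng(0)
    n = 200
    healthy = rng.normal(1.0, 0.1, n)
    infected = rng.normal(0.7, 0.1, n)

    X = np.concatenate([healthy, infected]).reshape(-1, 1)
    y = np.array([0] * n + [1] * n)  # 0 = healthy, 1 = infected

    clf = LogisticRegression().fit(X, y)

    # A portable system could then return a simple yes-or-no answer:
    new_leaf = np.array([[0.75]])
    print("infected" if clf.predict(new_leaf)[0] else "healthy")
    print("probability of infection:", clf.predict_proba(new_leaf)[0, 1].round(2))
    ```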

    A portable Raman system can be used on farms and provides farmers with an accurate and simple yes-or-no response when used to test for the presence of bacterial infections in crops. The development of this rapid and noninvasive method could improve plant disease management and have a transformative impact on agricultural farms by efficiently reducing agricultural yield loss and increasing productivity.

    “Using the diagnostic algorithm method, we experimented on several edible plants such as choy sum,” says DiSTAP and TLL principal investigator and co-corresponding author Rajani Sarojam. “The results showed that the Raman spectroscopy-based method can swiftly detect and quantify innate immunity response in plants infected with bacterial pathogens. We believe that this technology will be beneficial for agricultural farms to increase their productivity by reducing their yield loss due to plant diseases.”

    The researchers are currently working on the development of high-throughput, custom-made portable or hand-held Raman spectrometers that will allow Raman spectral analysis to be quickly and easily performed on field-grown crops.

    SMART and TLL developed and discovered the diagnostic algorithm and Raman spectral biomarkers. TLL also confirmed and validated the detection method using mutant plants. The research is carried out by SMART and supported by the National Research Foundation (NRF) of Singapore under its Campus for Research Excellence And Technological Enterprise (CREATE) program.

    SMART was established by MIT and the NRF in 2007. The first entity in CREATE developed by NRF, SMART serves as an intellectual and innovation hub for research interactions between MIT and Singapore, undertaking cutting-edge research projects in areas of interest to both Singapore and MIT. SMART currently comprises an Innovation Center and five IRGs: Antimicrobial Resistance, Critical Analytics for Manufacturing Personalized-Medicine, DiSTAP, Future Urban Mobility, and Low Energy Electronic Systems. SMART research is funded by the NRF under the CREATE program.

    Led by Professor Michael Strano of MIT and Professor Chua Nam Hai of Temasek Life Sciences Laboratory, the DiSTAP program addresses deep problems in food production in Singapore and the world by developing a suite of impactful and novel analytical, genetic, and biomaterial technologies. The goal is to fundamentally change how plant biosynthetic pathways are discovered, monitored, engineered, and ultimately translated to meet the global demand for food and nutrients. Scientists from MIT, TLL, Nanyang Technological University, and the National University of Singapore are collaboratively developing new tools for the continuous measurement of important plant metabolites and hormones, enabling novel discovery and a deeper understanding and control of plant biosynthetic pathways in ways not yet possible, especially in the context of green leafy vegetables; leveraging these new techniques to engineer plants with highly desirable properties for global food security, including high-yield density production and drought and pathogen resistance; and applying these technologies to improve urban farming.

  • Energy hackers give a glimpse of a postpandemic future

    After going virtual in 2020, the MIT EnergyHack was back on campus last weekend in a brand-new hybrid format that saw teams participate both in person and virtually from across the globe. While the hybrid format presented new challenges to the organizing team, it also allowed for one of the most diverse and inspiring iterations of the event to date.

    “Organizing a hybrid event was a challenging but important goal in 2021 as we slowly come out of the pandemic, but it was great to realize the benefits of the format this year,” says Kailin Graham, a graduate student in MIT’s Technology and Policy Program and one of the EnergyHack communications directors. “Not only were we able to get students back on campus and taking advantage of those important in-person interactions, but preserving a virtual avenue meant that we were still able to hear brilliant ideas from those around the world who might not have had the opportunity to contribute otherwise, and that’s what the EnergyHack is really about.”

    In fact, of the over 300 participants registered for the event, more than a third participated online, and two of the three grand prize winners participated entirely virtually. Teams of students at any degree level from any institution were welcome, and the event saw participants with an incredible range of backgrounds and expertise, from undergraduates to MBAs, put their heads together to create innovative solutions.

    This year’s event was supported by a host of energy partners both in industry and within MIT. The MIT Energy and Climate Club worked with sponsoring organizations Smartflower, Chargepoint, Edison Energy, Line Vision, Chevron, Shell, and Sterlite Power to develop seven problem statements for hackers, with each judged by representatives from the respective organization. The challenges ranged from envisioning the future of electric vehicle fueling to quantifying the social and environmental benefits of renewable energy projects.

    Hackers had 36 hours to come up with a solution to one challenge, and teams then presented these solutions in a short pitch to a judging panel. Finalists from each challenge progressed to the final judging round to pitch against each other in pursuit of three grand prizes. Team COPrs came in third, receiving $1,000 for their solution to the Line Vision challenge; Crown Joules snagged second place and $1,500 for their approach to the Chargepoint problem; and Feel AMPowered took first place and $2,000 for their innovative solution to the Smartflower challenge.

    In addition to a new format, this year’s EnergyHack also featured a new emphasis on climate change impacts and the energy transition. According to Arina Khotimsky, co-managing director of EnergyHack 2021, “Moving forward after this year’s rebranding of the MIT Energy and Climate Club, we were hoping to carry this aim to EnergyHack. It was incredibly exciting to have ChargePoint and SmartFlower leading as our Sustainability Circle-tier sponsors and bringing their impactful innovations to the conversations at EnergyHack 2021.”

    To the organizing team, whose members range from sophomores to MBAs, this aspect of the event was especially important. Their hope was for the event to inspire a generation of young energy and climate leaders, a hope that, by their account, seems to have been fulfilled.

    “I was floored by the positive feedback we received from hackers, both in-person and virtual, about how much they enjoyed the hackathon,” says Graham. “It’s all thanks to our team of incredibly hardworking organizing directors who made EnergyHack 2021 what it was. It was incredibly rewarding seeing everyone’s impact on the event, and we are looking forward to seeing how it evolves in the future.”