More stories

  • Engineers enlist AI to help scale up advanced solar cell manufacturing

Perovskites are a family of materials that is currently the leading contender to replace today’s silicon-based solar photovoltaics. They hold the promise of panels that are far thinner and lighter, that could be made with ultra-high throughput at room temperature instead of at hundreds of degrees, and that are cheaper and easier to transport and install. But bringing these materials from controlled laboratory experiments into a product that can be manufactured competitively has been a long struggle.

    Manufacturing perovskite-based solar cells involves optimizing at least a dozen or so variables at once, even within one particular manufacturing approach among many possibilities. But a new system based on a novel approach to machine learning could speed up the development of optimized production methods and help make the next generation of solar power a reality.

    The system, developed by researchers at MIT and Stanford University over the last few years, makes it possible to integrate data from prior experiments, and information based on personal observations by experienced workers, into the machine learning process. This makes the outcomes more accurate and has already led to the manufacturing of perovskite cells with an energy conversion efficiency of 18.5 percent, a competitive level for today’s market.

    The research is reported today in the journal Joule, in a paper by MIT professor of mechanical engineering Tonio Buonassisi, Stanford professor of materials science and engineering Reinhold Dauskardt, recent MIT research assistant Zhe Liu, Stanford doctoral graduate Nicholas Rolston, and three others.

    Perovskites are a group of layered crystalline compounds defined by the configuration of the atoms in their crystal lattice. There are thousands of such possible compounds and many different ways of making them. While most lab-scale development of perovskite materials uses a spin-coating technique, that’s not practical for larger-scale manufacturing, so companies and labs around the world have been searching for ways of translating these lab materials into a practical, manufacturable product.

    “There’s always a big challenge when you’re trying to take a lab-scale process and then transfer it to something like a startup or a manufacturing line,” says Rolston, who is now an assistant professor at Arizona State University. The team looked at a process that they felt had the greatest potential, a method called rapid spray plasma processing, or RSPP.

    The manufacturing process would involve a moving roll-to-roll surface, or series of sheets, on which the precursor solutions for the perovskite compound would be sprayed or ink-jetted as the sheet rolled by. The material would then move on to a curing stage, providing a rapid and continuous output “with throughputs that are higher than for any other photovoltaic technology,” Rolston says.

    “The real breakthrough with this platform is that it would allow us to scale in a way that no other material has allowed us to do,” he adds. “Even materials like silicon require a much longer timeframe because of the processing that’s done. Whereas you can think of [this approach as more] like spray painting.”

    Within that process, at least a dozen variables may affect the outcome, some of them more controllable than others. These include the composition of the starting materials, the temperature, the humidity, the speed of the processing path, the distance of the nozzle used to spray the material onto a substrate, and the methods of curing the material. Many of these factors can interact with each other, and if the process is in open air, then humidity, for example, may be uncontrolled. Evaluating all possible combinations of these variables through experimentation is impossible, so machine learning was needed to help guide the experimental process.
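To make the scale of that search concrete, consider even a coarse grid over those knobs (the five-settings-per-variable figure here is an illustrative assumption, not a number from the study):

```python
# A dozen interacting process variables, even at only five settings each,
# already rule out exhaustive experimentation:
levels, variables = 5, 12
combinations = levels ** variables
print(f"{combinations:,}")  # 244,140,625 runs for one full sweep
```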

But while most machine-learning systems use raw data, such as measurements of the electrical and other properties of test samples, they don’t typically incorporate human experience: the experimenters’ qualitative observations of the visual and other properties of the test samples, or findings reported by other research groups. So the team found a way to incorporate such outside information into the machine-learning model, using a probability factor based on a mathematical technique called Bayesian optimization.
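The following minimal sketch shows the general shape of Bayesian optimization seeded with prior observations. Everything here is an illustrative assumption — the single normalized "knob," the Gaussian-process surrogate, and the made-up efficiency curve — not the team's actual code or data:

```python
import math

import numpy as np

def rbf_kernel(a, b, length=0.2, var=50.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(x_obs, y_obs, x_query, noise=1e-3):
    """Gaussian-process posterior mean and standard deviation at x_query."""
    K_inv = np.linalg.inv(rbf_kernel(x_obs, x_obs) + noise * np.eye(len(x_obs)))
    Ks = rbf_kernel(x_obs, x_query)
    mu = Ks.T @ K_inv @ y_obs
    var = np.diag(rbf_kernel(x_query, x_query) - Ks.T @ K_inv @ Ks)
    return mu, np.sqrt(np.clip(var, 0.0, None))

def expected_improvement(mu, sigma, best):
    """How much each candidate is expected to beat the best result so far."""
    ei = np.zeros_like(mu)
    for i, (m, s) in enumerate(zip(mu, sigma)):
        if s > 1e-9:
            z = (m - best) / s
            cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
            pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
            ei[i] = (m - best) * cdf + s * pdf
    return ei

def efficiency(x):
    """Hypothetical response: conversion efficiency peaking near x = 0.7."""
    return 18.5 * np.exp(-((x - 0.7) ** 2) / 0.05)

# Seed the model with three "prior" experiments instead of starting cold.
x_obs = np.array([0.1, 0.5, 0.9])
y_obs = efficiency(x_obs)
grid = np.linspace(0.0, 1.0, 101)

for _ in range(10):  # each iteration suggests one new experiment to run
    mu, sigma = gp_posterior(x_obs, y_obs, grid)
    x_next = grid[np.argmax(expected_improvement(mu, sigma, y_obs.max()))]
    x_obs = np.append(x_obs, x_next)
    y_obs = np.append(y_obs, efficiency(x_next))

print(round(float(y_obs.max()), 2))  # best efficiency found after ten guided runs
```

Note how prior experiments and outside observations enter simply as extra training points for the surrogate model before optimization begins — that seeding is what makes the search start warm rather than cold.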

Using the system, Rolston says, “having a model that comes from experimental data, we can find out trends that we weren’t able to see before.” For example, they initially had trouble adjusting for uncontrolled variations in humidity in their ambient setting. But the model showed them “that we could overcome our humidity challenges by changing the temperature, for instance, and by changing some of the other knobs.”

    The system now allows experimenters to much more rapidly guide their process in order to optimize it for a given set of conditions or required outcomes. In their experiments, the team focused on optimizing the power output, but the system could also be used to simultaneously incorporate other criteria, such as cost and durability — something members of the team are continuing to work on, Buonassisi says.

    The researchers were encouraged by the Department of Energy, which sponsored the work, to commercialize the technology, and they’re currently focusing on tech transfer to existing perovskite manufacturers. “We are reaching out to companies now,” Buonassisi says, and the code they developed has been made freely available through an open-source server. “It’s now on GitHub, anyone can download it, anyone can run it,” he says. “We’re happy to help companies get started in using our code.”

    Already, several companies are gearing up to produce perovskite-based solar panels, even though they are still working out the details of how to produce them, says Liu, who is now at the Northwestern Polytechnical University in Xi’an, China. He says companies there are not yet doing large-scale manufacturing, but instead starting with smaller, high-value applications such as building-integrated solar tiles where appearance is important. Three of these companies “are on track or are being pushed by investors to manufacture 1 meter by 2-meter rectangular modules [comparable to today’s most common solar panels], within two years,” he says.

“The problem is, they don’t have a consensus on what manufacturing technology to use,” Liu says. The RSPP method, developed at Stanford, “still has a good chance” to be competitive, he says. And the machine learning system the team developed could prove to be important in guiding the optimization of whatever process ends up being used.

    “The primary goal was to accelerate the process, so it required less time, less experiments, and less human hours to develop something that is usable right away, for free, for industry,” he says.

    “Existing work on machine-learning-driven perovskite PV fabrication largely focuses on spin-coating, a lab-scale technique,” says Ted Sargent, University Professor at the University of Toronto, who was not associated with this work, which he says demonstrates “a workflow that is readily adapted to the deposition techniques that dominate the thin-film industry. Only a handful of groups have the simultaneous expertise in engineering and computation to drive such advances.” Sargent adds that this approach “could be an exciting advance for the manufacture of a broader family of materials” including LEDs, other PV technologies, and graphene, “in short, any industry that uses some form of vapor or vacuum deposition.” 

The team also included Austin Flick and Thomas Colburn at Stanford and Zekun Ren at the Singapore-MIT Alliance for Science and Technology (SMART). In addition to the Department of Energy, the work was supported by a fellowship from the MIT Energy Initiative, the Graduate Research Fellowship Program from the National Science Foundation, and the SMART program.

  • Computing our climate future

    On Monday, MIT announced five multiyear flagship projects in the first-ever Climate Grand Challenges, a new initiative to tackle complex climate problems and deliver breakthrough solutions to the world as quickly as possible. This article is the first in a five-part series highlighting the most promising concepts to emerge from the competition, and the interdisciplinary research teams behind them.

    With improvements to computer processing power and an increased understanding of the physical equations governing the Earth’s climate, scientists are continually working to refine climate models and improve their predictive power. But the tools they’re refining were originally conceived decades ago with only scientists in mind. When it comes to developing tangible climate action plans, these models remain inscrutable to the policymakers, public safety officials, civil engineers, and community organizers who need their predictive insight most.

    “What you end up having is a gap between what’s typically used in practice, and the real cutting-edge science,” says Noelle Selin, a professor in the Institute for Data, Systems, and Society and the Department of Earth, Atmospheric and Planetary Sciences (EAPS), and co-lead with Professor Raffaele Ferrari on the MIT Climate Grand Challenges flagship project “Bringing Computation to the Climate Crisis.” “How can we use new computational techniques, new understandings, new ways of thinking about modeling, to really bridge that gap between state-of-the-art scientific advances and modeling, and people who are actually needing to use these models?”

    Using this as a driving question, the team won’t just be trying to refine current climate models; they’re building a new one from the ground up.

    This kind of game-changing advancement is exactly what the MIT Climate Grand Challenges competition is looking for, which is why the proposal has been named one of the five flagship projects in the ambitious Institute-wide program aimed at tackling the climate crisis. The proposal, which was selected from 100 submissions and was among 27 finalists, will receive additional funding and support to further the team’s goal of reimagining the climate modeling system. It also brings together contributors from across the Institute, including the MIT Schwarzman College of Computing, the School of Engineering, and the Sloan School of Management.

    When it comes to pursuing high-impact climate solutions that communities around the world can use, “it’s great to do it at MIT,” says Ferrari, EAPS Cecil and Ida Green Professor of Oceanography. “You’re not going to find many places in the world where you have the cutting-edge climate science, the cutting-edge computer science, and the cutting-edge policy science experts that we need to work together.”

    The climate model of the future

    The proposal builds on work that Ferrari began three years ago as part of a joint project with Caltech, the Naval Postgraduate School, and NASA’s Jet Propulsion Lab. Called the Climate Modeling Alliance (CliMA), the consortium of scientists, engineers, and applied mathematicians is constructing a climate model capable of more accurately projecting future changes in critical variables, such as clouds in the atmosphere and turbulence in the ocean, with uncertainties at least half the size of those in existing models.

    To do this, however, requires a new approach. For one thing, current models are too coarse in resolution — at the 100-to-200-kilometer scale — to resolve small-scale processes like cloud cover, rainfall, and sea ice extent. But also, explains Ferrari, part of this limitation in resolution is due to the fundamental architecture of the models themselves. The languages most global climate models are coded in were first created back in the 1960s and ’70s, largely by scientists for scientists. Since then, advances in computing driven by the corporate world and computer gaming have given rise to dynamic new computer languages, powerful graphics processing units, and machine learning.

    For climate models to take full advantage of these advancements, there’s only one option: starting over with a modern, more flexible language. Written in Julia, a part of the Julia Lab’s Scientific Machine Learning technology, and spearheaded by Alan Edelman, a professor of applied mathematics in MIT’s Department of Mathematics, CliMA will be able to harness far more data than current models can handle.

    “It’s been real fun finally working with people in computer science here at MIT,” Ferrari says. “Before it was impossible, because traditional climate models are in a language their students can’t even read.”

    The result is what’s being called the “Earth digital twin,” a climate model that can simulate global conditions on a large scale. This on its own is an impressive feat, but the team wants to take this a step further with their proposal.

    “We want to take this large-scale model and create what we call an ‘emulator’ that is only predicting a set of variables of interest, but it’s been trained on the large-scale model,” Ferrari explains. Emulators are not new technology, but what is new is that these emulators, being referred to as the “Earth digital cousins,” will take advantage of machine learning.

    “Now we know how to train a model if we have enough data to train them on,” says Ferrari. Machine learning for projects like this has only become possible in recent years as more observational data become available, along with improved computer processing power. The goal is to create smaller, more localized models by training them using the Earth digital twin. Doing so will save time and money, which is key if the digital cousins are going to be usable for stakeholders, like local governments and private-sector developers.
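The emulator idea can be sketched in a few lines: sample the expensive model a limited number of times, then fit a cheap surrogate to its outputs so later queries are effectively instant. The `digital_twin` function below is a toy stand-in invented for this sketch; the real "digital cousins" are trained with machine learning on a full climate model:

```python
import numpy as np

def digital_twin(forcing):
    """Pretend expensive simulation: regional warming vs. a forcing knob."""
    return 1.5 * forcing + 0.3 * np.sin(3.0 * forcing)

# 1. Run the expensive model a modest number of times to build a training set.
forcings = np.linspace(0.0, 4.0, 40)
warming = digital_twin(forcings)

# 2. Train the emulator -- here just a cubic polynomial fit.
emulator = np.poly1d(np.polyfit(forcings, warming, deg=3))

# 3. The emulator now answers new queries immediately, approximating the twin.
query = 2.5
print(round(float(emulator(query)), 2), round(float(digital_twin(query)), 2))
```

The design trade-off is exactly the one described above: the emulator is far cheaper to evaluate than the twin, at the cost of only predicting the variables it was trained on.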

    Adaptable predictions for average stakeholders

    When it comes to setting climate-informed policy, stakeholders need to understand the probability of an outcome within their own regions — in the same way that you would prepare for a hike differently if there’s a 10 percent chance of rain versus a 90 percent chance. The smaller Earth digital cousin models will be able to do things the larger model can’t do, like simulate local regions in real time and provide a wider range of probabilistic scenarios.

    “Right now, if you wanted to use output from a global climate model, you usually would have to use output that’s designed for general use,” says Selin, who is also the director of the MIT Technology and Policy Program. With the project, the team can take end-user needs into account from the very beginning while also incorporating their feedback and suggestions into the models, helping to “democratize the idea of running these climate models,” as she puts it. Doing so means building an interactive interface that eventually will give users the ability to change input values and run the new simulations in real time. The team hopes that, eventually, the Earth digital cousins could run on something as ubiquitous as a smartphone, although developments like that are currently beyond the scope of the project.

    The next thing the team will work on is building connections with stakeholders. Through participation of other MIT groups, such as the Joint Program on the Science and Policy of Global Change and the Climate and Sustainability Consortium, they hope to work closely with policymakers, public safety officials, and urban planners to give them predictive tools tailored to their needs that can provide actionable outputs important for planning. Faced with rising sea levels, for example, coastal cities could better visualize the threat and make informed decisions about infrastructure development and disaster preparedness; communities in drought-prone regions could develop long-term civil planning with an emphasis on water conservation and wildfire resistance.

    “We want to make the modeling and analysis process faster so people can get more direct and useful feedback for near-term decisions,” she says.

    The final piece of the challenge is to incentivize students now so that they can join the project and make a difference. Ferrari has already had luck garnering student interest after co-teaching a class with Edelman and seeing the enthusiasm students have about computer science and climate solutions.

    “We’re intending in this project to build a climate model of the future,” says Selin. “So it seems really appropriate that we would also train the builders of that climate model.”

  • Embracing ancient materials and 21st-century challenges

    When Sophia Mittman was 10 years old, she wanted to be an artist. But instead of using paint, she preferred the mud in her backyard. She sculpted it into pots and bowls like the ones she had seen at the archaeological museums, transforming the earthly material into something beautiful.

    Now an MIT senior studying materials science and engineering, Mittman seeks modern applications for sustainable materials in ways that benefit the community around her.

    Growing up in San Diego, California, Mittman was homeschooled and enjoyed the process of teaching herself new things. After taking a pottery class in seventh grade, she became interested in sculpture, teaching herself how to make fused glass. From there, Mittman began making pottery and jewelry. This passion for creating new things out of sustainable materials led her to pursue materials science, a subject she didn’t originally know was even offered at the Institute.

    “I didn’t know the science behind why those materials had the properties they did. And materials science explained it,” she says.

    During her first year at MIT, Mittman took 2.00b (Toy Product Design), which she considers one of her most memorable classes at the Institute. She remembers learning about the mechanical side of building, using drill presses and sanding machines to create things. However, her favorite part was the seminars on the weekends, where she learned how to make things such as stuffed animals or rolling wooden toys. She appreciated the opportunity to learn how to use everyday materials like wood to construct new and exciting gadgets.

    From there, Mittman got involved in the Glass Club, using blowtorches to melt rods of glass into marbles and little fish decorations. She also took a few pottery and ceramics classes on campus, continuing to hone her craft. Embracing MIT’s hands-on approach to learning, she was excited to apply the skills she had gained in the various workshops on campus to the real world.

    In the summer after her first year, Mittman became an undergraduate field and conservation science researcher for the Department of Civil and Environmental Engineering. She traveled to cities across Italy to collaborate with international art restorers, conservation scientists, and museum curators to study archaeological materials and their applications to modern sustainability. One of her favorite parts was restoring the Roman baths and studying the mosaics on the ground. She did a research project on Egyptian Blue, one of the first synthetic pigments, whose infrared luminescence has modern applications such as detecting fingerprints at crime scenes. The experience was eye-opening for Mittman: she got to experience directly what she had been learning in the classroom about sustainable materials and how to preserve and use them for modern applications.

    The next year, upon returning to campus, Mittman joined Incredible Foods as a polymeric food science and technology intern. She learned how to create and apply a polymer coating to natural fruit snacks to replicate real berries. “It was fun to see the breadth of material science because I had learned about polymers in my material science classes, but then never thought that it could be applied to making something as fun as fruit snacks,” she says.

    Venturing into yet another new area of materials science, Mittman last year pursued an internship with Phoenix Tailings, which aims to be the world’s first “clean” mining company. In the lab, she helped develop and analyze chemical reactions to physically and chemically extract rare earth metals and oxides from mining waste. She also worked to engineer bright-colored, high-performance pigments using nontoxic chemicals. Mittman enjoyed the opportunity to explore a mineralogically sustainable method for mining, something she hadn’t previously explored as a branch of materials science research.

    “I’m still able to contribute to environmental sustainability and to try to make a greener world, but it doesn’t solely have to be through energy because I’m dealing with dirt and mud,” she says.

    Outside of her academic work, Mittman is involved with the Tech Catholic Community (TCC) on campus. She has held roles as the music director, prayer chair, and social committee chair, organizing and managing social events for over 150 club members. She says the TCC is the most supportive community in her campus life, where she can meet people who share her interests even though they are in different majors. “There are a lot of emotional aspects of being at MIT, and there’s a spiritual part that so many students wrestle with. The TCC is where I’ve been able to find so much comfort, support, and encouragement; the closest friends I have are in the Tech Catholic Community,” she says.

    Mittman is also passionate about teaching, which allows her to connect to students and teach them material in new and exciting ways. In the fall of her junior and senior years, she was a teaching assistant for 3.091 (Introduction to Solid State Chemistry), where she taught two recitations of 20 students and offered weekly private tutoring. She enjoyed helping students tackle difficult course material in ways that are enthusiastic and encouraging, as she appreciated receiving the same help in her introductory courses.

    Looking ahead, Mittman plans to work full time at Phoenix Tailings as a materials scientist following her graduation. In this way, she feels she has come full circle: from playing in the mud as a kid to working with it as a materials scientist, extracting materials to help build a sustainable future for nearby and international communities.

    “I want to be able to apply what I’m enthusiastic about, which is materials science, by way of mineralogical sustainability, so that it can help mines here in America but also mines in Brazil, Austria, Jamaica — all over the world, because ultimately, I think that will help more people live better lives,” she says.

  • MIT announces five flagship projects in first-ever Climate Grand Challenges competition

    MIT today announced the five flagship projects selected in its first-ever Climate Grand Challenges competition. These multiyear projects will define a dynamic research agenda focused on unraveling some of the toughest unsolved climate problems and bringing high-impact, science-based solutions to the world on an accelerated basis.

    Representing the most promising concepts to emerge from the two-year competition, the five flagship projects will receive additional funding and resources from MIT and others to develop their ideas and swiftly transform them into practical solutions at scale.

    “Climate Grand Challenges represents a whole-of-MIT drive to develop game-changing advances to confront the escalating climate crisis, in time to make a difference,” says MIT President L. Rafael Reif. “We are inspired by the creativity and boldness of the flagship ideas and by their potential to make a significant contribution to the global climate response. But given the planet-wide scale of the challenge, success depends on partnership. We are eager to work with visionary leaders in every sector to accelerate this impact-oriented research, implement serious solutions at scale, and inspire others to join us in confronting this urgent challenge for humankind.”

    Brief descriptions of the five Climate Grand Challenges flagship projects are provided below.

    Bringing Computation to the Climate Challenge

    This project leverages advances in artificial intelligence, machine learning, and data sciences to improve the accuracy of climate models and make them more useful to a variety of stakeholders — from communities to industry. The team is developing a digital twin of the Earth that harnesses more data than ever before to reduce and quantify uncertainties in climate projections.

    Research leads: Raffaele Ferrari, the Cecil and Ida Green Professor of Oceanography in the Department of Earth, Atmospheric and Planetary Sciences, and director of the Program in Atmospheres, Oceans, and Climate; and Noelle Eckley Selin, director of the Technology and Policy Program and professor with a joint appointment in the Institute for Data, Systems, and Society and the Department of Earth, Atmospheric and Planetary Sciences

    Center for Electrification and Decarbonization of Industry

    This project seeks to reinvent and electrify the processes and materials behind hard-to-decarbonize industries like steel, cement, ammonia, and ethylene production. A new innovation hub will perform targeted fundamental research and engineering with urgency, pushing the technological envelope on electricity-driven chemical transformations.

    Research leads: Yet-Ming Chiang, the Kyocera Professor of Materials Science and Engineering, and Bilge Yıldız, the Breene M. Kerr Professor in the Department of Nuclear Science and Engineering and professor in the Department of Materials Science and Engineering

    Preparing for a new world of weather and climate extremes

    This project addresses key gaps in knowledge about intensifying extreme events such as floods, hurricanes, and heat waves, and quantifies their long-term risk in a changing climate. The team is developing a scalable climate-change adaptation toolkit to help vulnerable communities and low-carbon energy providers prepare for these extreme weather events.

    Research leads: Kerry Emanuel, the Cecil and Ida Green Professor of Atmospheric Science in the Department of Earth, Atmospheric and Planetary Sciences and co-director of the MIT Lorenz Center; Miho Mazereeuw, associate professor of architecture and urbanism in the Department of Architecture and director of the Urban Risk Lab; and Paul O’Gorman, professor in the Program in Atmospheres, Oceans, and Climate in the Department of Earth, Atmospheric and Planetary Sciences

    The Climate Resilience Early Warning System

    The CREWSnet project seeks to reinvent climate change adaptation with a novel forecasting system that empowers underserved communities to interpret local climate risk, proactively plan for their futures incorporating resilience strategies, and minimize losses. CREWSnet will initially be demonstrated in southwestern Bangladesh, serving as a model for similarly threatened regions around the world.

    Research leads: John Aldridge, assistant leader of the Humanitarian Assistance and Disaster Relief Systems Group at MIT Lincoln Laboratory, and Elfatih Eltahir, the H.M. King Bhumibol Professor of Hydrology and Climate in the Department of Civil and Environmental Engineering

    Revolutionizing agriculture with low-emissions, resilient crops

    This project works to revolutionize the agricultural sector with climate-resilient crops and fertilizers that have the ability to dramatically reduce greenhouse gas emissions from food production.

    Research lead: Christopher Voigt, the Daniel I.C. Wang Professor in the Department of Biological Engineering

    “As one of the world’s leading institutions of research and innovation, it is incumbent upon MIT to draw on our depth of knowledge, ingenuity, and ambition to tackle the hard climate problems now confronting the world,” says Richard Lester, MIT associate provost for international activities. “Together with collaborators across industry, finance, community, and government, the Climate Grand Challenges teams are looking to develop and implement high-impact, path-breaking climate solutions rapidly and at a grand scale.”

    The initial call for ideas in 2020 yielded nearly 100 letters of interest from almost 400 faculty members and senior researchers, representing 90 percent of MIT departments. After an extensive evaluation, 27 finalist teams received a total of $2.7 million to develop comprehensive research and innovation plans addressing four broad research themes.

    To select the winning projects, research plans were reviewed by panels of international experts representing relevant scientific and technical domains as well as experts in processes and policies for innovation and scalability.

    “In response to climate change, the world really needs to do two things quickly: deploy the solutions we already have much more widely, and develop new solutions that are urgently needed to tackle this intensifying threat,” says Maria Zuber, MIT vice president for research. “These five flagship projects exemplify MIT’s strong determination to bring its knowledge and expertise to bear in generating new ideas and solutions that will help solve the climate problem.”

    “The Climate Grand Challenges flagship projects set a new standard for inclusive climate solutions that can be adapted and implemented across the globe,” says MIT Chancellor Melissa Nobles. “This competition propels the entire MIT research community — faculty, students, postdocs, and staff — to act with urgency around a worsening climate crisis, and I look forward to seeing the difference these projects can make.”

    “MIT’s efforts on climate research amid the climate crisis was a primary reason that I chose to attend MIT, and remains a reason that I view the Institute favorably. MIT has a clear opportunity to be a thought leader in the climate space in our own MIT way, which is why CGC fits in so well,” says senior Megan Xu, who served on the Climate Grand Challenges student committee and is studying ways to make the food system more sustainable.

    The Climate Grand Challenges competition is a key initiative of “Fast Forward: MIT’s Climate Action Plan for the Decade,” which the Institute published in May 2021. Fast Forward outlines MIT’s comprehensive plan for helping the world address the climate crisis. It consists of five broad areas of action: sparking innovation, educating future generations, informing and leveraging government action, reducing MIT’s own climate impact, and uniting and coordinating all of MIT’s climate efforts.

  • Finding the questions that guide MIT fusion research

    “One of the things I learned was, doing good science isn’t so much about finding the answers as figuring out what the important questions are.”

    As Martin Greenwald retires from the responsibilities of senior scientist and deputy director of the MIT Plasma Science and Fusion Center (PSFC), he reflects on almost 50 years in science, 43 of them as a researcher at MIT, pursuing the question of how to make the carbon-free energy of fusion a reality.

    Most of Greenwald’s important questions about fusion began after graduating from MIT with a BS in both physics and chemistry. Beginning graduate work at the University of California at Berkeley, he felt compelled to learn more about fusion as an energy source that could have “a real societal impact.” At the time, researchers were exploring new ideas for devices that could create and confine fusion plasmas. Greenwald worked on Berkeley’s “alternate concept” TORMAC, a Toroidal Magnetic Cusp. “It didn’t work out very well,” he laughs. “The first thing I was known for was making the measurements that shut down the program.”

    Believing the temperature of the plasma generated by the device would not be as high as his group leader expected, Greenwald developed hardware that could measure the low temperatures predicted by his own “back of the envelope calculations.” As he anticipated, his measurements showed that “this was not a fusion plasma; this was hardly a confined plasma at all.”

    With a PhD from Berkeley, Greenwald returned to MIT for a research position at the PSFC, attracted by the center’s “esprit de corps.”

    He arrived in time to participate in the final experiments on Alcator A, the first in a series of tokamaks built at MIT, all characterized by compact size and featuring high-field magnets. The tokamak design was then becoming favored as the most effective route to fusion: its doughnut-shaped vacuum chamber, surrounded by electromagnets, could confine the turbulent plasma long enough, while increasing its heat and density, to make fusion occur.

    Alcator A showed that the energy confinement time improves in relation to increasing plasma density. MIT’s succeeding device, Alcator C, was designed to use higher magnetic fields, boosting expectations that it would reach higher densities and better confinement. To attain these goals, however, Greenwald had to pursue a new technique that increased density by injecting pellets of frozen fuel into the plasma, a method he likens to throwing “snowballs in hell.” This work was notable for the creation of a new regime of enhanced plasma confinement on Alcator C. In those experiments, a confined plasma surpassed for the first time one of the two Lawson criteria — the minimum required value for the product of the plasma density and confinement time — for making net power from fusion. Reaching this threshold had been a milestone goal for fusion research ever since the criteria were published by John Lawson in 1957.
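The Lawson criterion referenced above is simply a threshold on the product of plasma density and energy confinement time. A minimal sketch, with the caveat that the threshold used here is an approximate figure for deuterium-tritium fuel near its optimal temperature (a commonly quoted textbook value, not a number from this article):

```python
# Illustrative check of the Lawson product n * tau_E for a D-T plasma.
# The threshold below (~1.5e20 s/m^3) is an approximate figure near the
# optimal fuel temperature; real assessments also require the temperature
# criterion to be met.

LAWSON_NTAU_MIN = 1.5e20  # s/m^3, approximate D-T minimum

def lawson_product(density_m3: float, confinement_time_s: float) -> float:
    """Return the Lawson product n * tau_E."""
    return density_m3 * confinement_time_s

def meets_lawson(density_m3: float, confinement_time_s: float) -> bool:
    """True if the density-times-confinement-time criterion is satisfied."""
    return lawson_product(density_m3, confinement_time_s) >= LAWSON_NTAU_MIN

# Hypothetical numbers for a high-density, pellet-fueled plasma:
print(meets_lawson(2e20, 1.0))   # 2e20 s/m^3 exceeds the threshold
print(meets_lawson(5e19, 0.5))   # 2.5e19 s/m^3 falls short
```

This is why the pellet-injection experiments mattered: raising density directly raises the product, even at fixed confinement time.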

    Greenwald continued to make a name for himself as part of a larger study into the physics of the Compact Ignition Tokamak — a high-field burning plasma experiment that the U.S. program was proposing to build in the late 1980s. The result, unexpectedly, was a new scaling law, later known as the “Greenwald Density Limit,” and a new theory for the mechanism of the limit. The scaling law has been used to accurately predict performance on much larger machines built since.
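The Greenwald density limit itself is a compact formula: the maximum achievable line-averaged density, in units of 10^20 particles per cubic meter, equals the plasma current in megaamperes divided by π times the square of the plasma’s minor radius in meters. A small sketch (the machine parameters in the example are hypothetical, not measurements from any MIT device):

```python
import math

def greenwald_density_limit(plasma_current_MA: float, minor_radius_m: float) -> float:
    """Greenwald density limit n_G = I_p / (pi * a^2),
    in units of 10^20 particles per cubic meter,
    with I_p in megaamperes and minor radius a in meters."""
    return plasma_current_MA / (math.pi * minor_radius_m ** 2)

# Hypothetical compact-tokamak parameters for illustration only:
n_g = greenwald_density_limit(plasma_current_MA=1.0, minor_radius_m=0.22)
print(f"n_G ~ {n_g:.1f} x 10^20 m^-3")  # about 6.6 x 10^20 m^-3
```

Because the achievable plasma current grows with magnetic field strength at a given size, the limit favors high-field, compact machines, which is the design logic behind the Alcator line and SPARC.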

    The center’s next tokamak, Alcator C-Mod, started operation in 1993 and ran for more than 20 years, with Greenwald as the chair of its Experimental Program Committee. Larger than Alcator C, the new device supported a highly shaped plasma, strong radiofrequency heating, and an all-metal plasma-facing first wall. All of these would eventually be required in a fusion power system.

    C-Mod proved to be MIT’s most enduring fusion experiment to date, producing important results for 20 years. During that time Greenwald contributed not only to the experiments, but to mentoring the next generation. Research scientist Ryan Sweeney notes that “Martin quickly gained my trust as a mentor, in part due to his often casual dress and slightly untamed hair, which are embodiments of his transparency and his focus on what matters. He can quiet a room of PhDs and demand attention not by intimidation, but rather by his calmness and his ability to bring clarity to complicated problems, be they scientific or human in nature.”

    Greenwald worked closely with the group of students who, in PSFC Director Dennis Whyte’s class, came up with the tokamak concept that evolved into SPARC. MIT is now pursuing this compact, high-field tokamak with Commonwealth Fusion Systems, a startup that grew out of the collective enthusiasm for this concept, and the growing realization it could work. Greenwald now heads the Physics Group for the SPARC project at MIT. He has helped confirm the device’s physics basis in order to predict performance and guide engineering decisions.

    “Martin’s multifaceted talents are thoroughly embodied by, and imprinted on, SPARC,” says Whyte. “First, his leadership in its plasma confinement physics validation and publication place SPARC on a firm scientific footing. Secondly, the impact of the density limit he discovered, which shows that fuel density increases with magnetic field and with decreasing tokamak size, is critical in obtaining high fusion power density not just in SPARC, but in future power plants. Third, and perhaps most impressive, is Martin’s mentorship of the SPARC generation of leadership.”

    Greenwald’s expertise and easygoing personality have made him an asset as head of the PSFC Office for Computer Services and group leader for data acquisition and computing, and a sought-after member of many professional committees. He has been an APS Fellow since 2000, and was an APS Distinguished Lecturer in Plasma Physics (2001-02). He was also presented in 2014 with a Leadership Award from Fusion Power Associates. He is currently an associate editor for Physics of Plasmas and a member of the Lawrence Livermore National Laboratory Physical Sciences Directorate External Review Committee.

    Although leaving his full-time responsibilities, Greenwald will remain at MIT as a visiting scientist, a role he says will allow him to “stick my nose into everything without being responsible for anything.”

    “At some point in the race you have to hand off the baton,” he says. “And it doesn’t mean you’re not interested in the outcome; and it doesn’t mean you’re just going to walk away into the stands. I want to be there at the end when we succeed.”


    Leveraging science and technology against the world’s top problems

    Looking back on nearly a half-century at MIT, Richard K. Lester, associate provost and Japan Steel Industry Professor, sees a “somewhat eccentric professional trajectory.”

    But while his path has been irregular, there has been a clearly defined through line, Lester says: the emergence of new science and new technologies, the potential of these developments to shake up the status quo and address some of society’s most consequential problems, and what the outcomes might mean for America’s place in the world.

    Perhaps no assignment in Lester’s portfolio better captures this theme than the new MIT Climate Grand Challenges competition. Spearheaded by Lester and Maria Zuber, MIT vice president for research, and launched at the height of the pandemic in summer 2020, this initiative is designed to mobilize the entire MIT research community around tackling “the really hard, challenging problems currently standing in the way of an effective global response to the climate emergency,” says Lester. “The focus is on those problems where progress requires developing and applying frontier knowledge in the natural and social sciences and cutting-edge technologies. This is the MIT community swinging for the fences in areas where we have a comparative advantage.”

    This is a passion project for him, not least because it has engaged colleagues from nearly all of MIT’s departments. After nearly 100 initial ideas were submitted by more than 300 faculty, 27 teams were named finalists and received funding to develop comprehensive research and innovation plans in such areas as decarbonizing complex industries; risk forecasting and adaptation; advancing climate equity; and carbon removal, management, and storage. In April, a small subset of this group will become multiyear flagship projects, augmenting the work of existing MIT units that are pursuing climate research. Lester is sunny in the face of these extraordinarily complex problems. “This is a bottom-up effort with exciting proposals, and where the Institute is collectively committed — it’s MIT at its best.”

    Nuclear to the core

    This initiative carries a particular resonance for Lester, who remains deeply engaged in nuclear engineering. “The role of nuclear energy is central and will need to become even more central if we’re to succeed in addressing the climate challenge,” he says. He also acknowledges that for nuclear energy technologies — both fission and fusion — to play a vital role in decarbonizing the economy, they must not just win “in the court of public opinion, but in the marketplace,” he says. “Over the years, my research has sought to elucidate what needs to be done to overcome these obstacles.”

    In fact, Lester has been campaigning for much of his career for a U.S. nuclear innovation agenda, a commitment that takes on increased urgency as the contours of the climate crisis sharpen. He argues for the rapid development and testing of nuclear technologies that can complement the renewable but intermittent energy sources of sun and wind. Whether powerful, large-scale, molten-salt-cooled reactors or small, modular, light water reactors, nuclear batteries or promising new fusion projects, U.S. energy policy must embrace nuclear innovation, says Lester, or risk losing the high-stakes race for a sustainable future.

    Chancing into a discipline

    Lester’s introduction to nuclear science was pure happenstance.

    Born in the English industrial city of Leeds, he grew up in a musical family and played piano, violin, and then viola. “It was a big part of my life,” he says, and for a time, music beckoned as a career. He tumbled into a chemical engineering concentration at Imperial College, London, after taking a job in a chemical factory following high school. “There’s a certain randomness to life, and in my case, it’s reflected in my choice of major, which had a very large impact on my ultimate career.”

    In his second year, Lester talked his way into running a small experiment in the university’s research reactor, on radiation effects in materials. “I got hooked, and began thinking of studying nuclear engineering.” But there were few graduate programs in British universities at the time. Then serendipity struck again. The instructor of Lester’s single humanities course at Imperial had previously taught at MIT, and suggested Lester take a look at the nuclear program there. “I will always be grateful to him (and, indirectly, to MIT’s Humanities program) for opening my eyes to the existence of this institution where I’ve spent my whole adult life,” says Lester.

    He arrived at MIT with the notion of mitigating the harms of nuclear weapons. It was a time when the nuclear arms race “was an existential threat in everyone’s life,” he recalls. He targeted his graduate studies on nuclear proliferation. But he also encountered an electrifying study by MIT meteorologist Jule Charney. “Professor Charney produced one of the first scientific assessments of the effects on climate of increasing CO2 concentrations in the atmosphere, with quantitative estimates that have not fundamentally changed in 40 years.”

    Lester shifted directions. “I came to MIT to work on nuclear security, but stayed in the nuclear field because of the contributions that it can and must make in addressing climate change,” he says.

    Research and policy

    His path forward, Lester believed, would involve applying his science and technology expertise to critical policy problems, grounded in immediate, real-world concerns, and aiming for broad policy impacts. Even as a member of MIT’s Department of Nuclear Science and Engineering (NSE), he joined with colleagues from many MIT departments to study American industrial practices and what was required to make them globally competitive, and then founded MIT’s Industrial Performance Center (IPC). Working at the IPC with interdisciplinary teams of faculty and students on the sources of productivity and innovation, his research took him to many countries at different stages of industrialization, including China, Taiwan, Japan, and Brazil.

    Lester’s wide-ranging work yielded books (including the MIT Press bestseller “Made in America”), advisory positions with governments, corporations, and foundations, and unexpected collaborations. “My interests were always fairly broad, and being at MIT made it possible to team up with world-leading scholars and extraordinary students not just in nuclear engineering, but in many other fields such as political science, economics, and management,” he says.

    Forging cross-disciplinary ties and bringing creative people together around a common goal proved a valuable skill as Lester stepped into positions of ever-greater responsibility at the Institute. He didn’t exactly relish the prospect of a desk job, though. “I religiously avoided administrative roles until I felt I couldn’t keep avoiding them,” he says.

    Today, as associate provost, he tends to MIT’s international activities — a daunting task given increasing scrutiny of research universities’ globe-spanning research partnerships and education of foreign students. But even in the midst of these consuming chores, Lester remains devoted to his home department. “Being a nuclear engineer is a central part of my identity,” he says.

    To students entering the nuclear field nearly 50 years after he did, who are understandably “eager to fix everything that seems wrong immediately,” he has a message: “Be patient. The hard things, the ones that are really worth doing, will take a long time to do.” Putting the climate crisis behind us will take two generations, Lester believes. Current students will start the job, but it will also take the efforts of their children’s generation before it is done. “So we need you to be energetic and creative, of course, but whatever you do we also need you to be patient and to have ‘stick-to-itiveness’ — and maybe also a moral compass that our generation has lacked.”


    Q&A: Climate Grand Challenges finalists on using data and science to forecast climate-related risk

    Note: This is the final article in a four-part interview series featuring the work of the 27 MIT Climate Grand Challenges finalist teams, which received a total of $2.7 million in startup funding to advance their projects. This month, the Institute will name a subset of the finalists as multiyear flagship projects.

    Advances in computation, artificial intelligence, robotics, and data science are enabling a new generation of observational tools and scientific modeling with the potential to produce timely, reliable, and quantitative analysis of future climate risks at a local scale. These projections can increase the accuracy and efficacy of early warning systems, improve emergency planning, and provide actionable information for climate mitigation and adaptation efforts, as human actions continue to change planetary conditions.

    In conversations prepared for MIT News, faculty from four Climate Grand Challenges teams with projects in the competition’s “Using data and science to forecast climate-related risk” category describe the promising new technologies that can help scientists understand the Earth’s climate system on a finer scale than ever before. (The other Climate Grand Challenges research themes include building equity and fairness into climate solutions, removing, managing, and storing greenhouse gases, and decarbonizing complex industries and processes.) The following responses have been edited for length and clarity.

    An observational system that can initiate a climate risk forecasting revolution

    Despite recent technological advances and massive volumes of data, climate forecasts remain highly uncertain. Gaps in observational capabilities create substantial challenges to predicting extreme weather events and establishing effective mitigation and adaptation strategies. R. John Hansman, the T. Wilson Professor of Aeronautics and Astronautics and director of the MIT International Center for Air Transportation, discusses the Stratospheric Airborne Climate Observatory System (SACOS) being developed together with Brent Minchew, the Cecil and Ida Green Career Development Professor in the Department of Earth, Atmospheric and Planetary Sciences (EAPS), and a team that includes researchers from MIT Lincoln Laboratory and Harvard University.

    Q: How does SACOS reduce uncertainty in climate risk forecasting?

    A: There is a critical need for higher spatial and temporal resolution observations of the climate system than are currently available through remote (satellite or airborne) and surface (in-situ) sensing. We are developing an ensemble of high-endurance, solar-powered aircraft with instrument systems capable of performing months-long climate observing missions that satellites or aircraft alone cannot fulfill. Summer months are ideal for SACOS operations, as many key climate phenomena are active and short night periods reduce the battery mass, vehicle size, and technical risks. These observations hold the potential to inform and predict, allowing emergency planners, policymakers, and the rest of society to better prepare for the changes to come.

    Q: Describe the types of observing missions where SACOS could provide critical improvements.

    A: The demise of the Antarctic Ice Sheet, which is leading to rising sea levels around the world and threatening the displacement of millions of people, is one example. Current sea level forecasts struggle to account for giant fissures that create massive icebergs and cause the Antarctic Ice Sheet to flow more rapidly into the ocean. SACOS can track these fissures to accurately forecast ice slippage and give impacted populations enough time to prepare or evacuate. Elsewhere, widespread droughts cause rampant wildfires and water shortages. SACOS has the ability to monitor soil moisture and humidity in critically dry regions to identify where and when wildfires and droughts are imminent. SACOS also offers the most effective method to measure, track, and predict local ozone depletion over North America, which has resulted from increasingly severe summer thunderstorms.

    Quantifying and managing the risks of sea-level rise

    Prevailing estimates of sea-level rise range from approximately 20 centimeters to 2 meters by the end of the century, with the associated costs on the order of trillions of dollars. The instability of certain portions of the world’s ice sheets creates vast uncertainties, complicating how the world prepares for and responds to these potential changes. EAPS Professor Brent Minchew is leading another Climate Grand Challenges finalist team working on an integrated, multidisciplinary effort to improve the scientific understanding of sea-level rise and provide actionable information and tools to manage the risks it poses.

    Q: What have been the most significant challenges to understanding the potential rates of sea-level rise?

    A: West Antarctica is one of the most remote, inaccessible, and hostile places on Earth — to people and equipment. Thus, opportunities to observe the collapse of the West Antarctic Ice Sheet, which contains enough ice to raise global sea levels by about 3 meters, are limited and current observations crudely resolved. It is essential that we understand how the floating edge of the ice sheets, often called ice shelves, fracture and collapse because they provide critical forces that govern the rate of ice mass loss and can stabilize the West Antarctic Ice Sheet.

    Q: How will your project advance what is currently known about sea-level rise?

    A: We aim to advance global-scale projections of sea-level rise through novel observational technologies and computational models of ice sheet change and to link those predictions to region- to neighborhood-scale estimates of costs and adaptation strategies. To do this, we propose two novel instruments: a first-of-its-kind drone that can fly for months at a time over Antarctica making continuous observations of critical areas and an airdropped seismometer and GPS bundle that can be deployed to vulnerable and hard-to-reach areas of the ice sheet. This technology will provide greater data quality and density and will observe the ice sheet at frequencies that are currently inaccessible — elements that are essential for understanding the physics governing the evolution of the ice sheet and sea-level rise.

    Changing flood risk for coastal communities in the developing world

    Globally, more than 600 million people live in low-elevation coastal areas that face an increasing risk of flooding from sea-level rise. This includes two-thirds of cities with populations of more than 5 million and regions that conduct the vast majority of global trade. Dara Entekhabi, the Bacardi and Stockholm Water Foundations Professor in the Department of Civil and Environmental Engineering and professor in the Department of Earth, Atmospheric, and Planetary Sciences, outlines an interdisciplinary partnership that leverages data and technology to guide short-term and chart long-term adaptation pathways with Miho Mazereeuw, associate professor of architecture and urbanism and director of the Urban Risk Lab in the School of Architecture and Planning, and Danielle Wood, assistant professor in the Program in Media Arts and Sciences and the Department of Aeronautics and Astronautics.

    Q: What is the key problem this program seeks to address?

    A: The accumulated heating of the Earth system due to fossil fuel burning is largely absorbed by the oceans, and the stored heat expands the ocean volume leading to increased base height for tides. When the high tides inundate a city, the condition is referred to as “sunny day” flooding, but the saline waters corrode infrastructure and wreak havoc on daily routines. The danger ahead for many coastal cities in the developing world is the combination of increasing high tide intrusions, coupled with heavy precipitation storm events.

    Q: How will your proposed solutions impact flood risk management?

    A: We are producing detailed risk maps for coastal cities in developing countries using newly available, very high-resolution remote-sensing data from space-borne instruments, as well as historical tides records and regional storm characteristics. Using these datasets, we aim to produce street-by-street risk maps that provide local decision-makers and stakeholders with a way to estimate present and future flood risks. With the model of future tides and probabilistic precipitation events, we can forecast future inundation by a flooding event, decadal changes with various climate-change and sea-level rise projections, and an increase in the likelihood of sunny-day flooding. Working closely with local partners, we will develop toolkits to explore short-term emergency response, as well as long-term mitigation and adaptation techniques in six pilot locations in South and Southeast Asia, Africa, and South America.
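The street-by-street screening described above can be caricatured in a few lines: compare a combined water level (tide plus storm surge plus sea-level rise) against ground elevation. Everything in this sketch, including the street names, elevations, and water levels, is hypothetical; a real risk map would use probabilistic tide and precipitation inputs rather than single numbers:

```python
def inundation_depth_m(ground_elevation_m, high_tide_m, surge_m=0.0, slr_m=0.0):
    """Water depth above ground for a given tide, storm surge, and sea-level rise.
    Returns 0.0 when the location stays dry."""
    water_level = high_tide_m + surge_m + slr_m
    return max(0.0, water_level - ground_elevation_m)

# Hypothetical street segments: (name, elevation above mean sea level, meters)
streets = [("Harbor Rd", 1.2), ("Market St", 2.5), ("Hill Ave", 6.0)]

# Today's king tide vs. the same tide with 0.5 m of sea-level rise plus a storm:
for name, elev in streets:
    today = inundation_depth_m(elev, high_tide_m=1.5)
    future = inundation_depth_m(elev, high_tide_m=1.5, surge_m=0.8, slr_m=0.5)
    print(f"{name}: today {today:.1f} m, future storm {future:.1f} m")
```

Even this toy version shows why decadal projections matter: a street that is dry at today’s high tide can sit under a meter of water once surge and sea-level rise stack on top of it.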

    Ocean vital signs

    On average, every person on Earth generates fossil fuel emissions equivalent to an 8-pound bag of carbon every day. Much of this is absorbed by the ocean, but there is wide variability in the estimates of oceanic absorption, which translates into differences of trillions of dollars in the required cost of mitigation. In the Department of Earth, Atmospheric and Planetary Sciences, Christopher Hill, a principal research engineer specializing in Earth and planetary computational science, works with Ryan Woosley, a principal research scientist focusing on the carbon cycle and ocean acidification. Hill explains that they hope to use artificial intelligence and machine learning to help resolve this uncertainty.

    Q: What is the current state of knowledge on air-sea interactions?

    A: Obtaining specific, accurate field measurements of critical physical, chemical, and biological exchanges between the ocean and the planet have historically entailed expensive science missions with large ship-based infrastructure that leave gaps in real-time data about significant ocean climate processes. Recent advances in highly scalable in-situ autonomous observing and navigation combined with airborne, remote sensing, and machine learning innovations have the potential to transform data gathering, provide more accurate information, and address fundamental scientific questions around air-sea interaction.

    Q: How will your approach accelerate real-time, autonomous surface ocean observing from an experimental research endeavor to a permanent and impactful solution?

    A: Our project seeks to demonstrate how a scalable surface ocean observing network can be launched and operated, and to illustrate how this can reduce uncertainties in estimates of air-sea carbon dioxide exchange. With an initial high-impact goal of substantially eliminating the vast uncertainties that plague our understanding of ocean uptake of carbon dioxide, we will gather critical measurements for improving extended weather and climate forecast models and reducing climate impact uncertainty. The results have the potential to more accurately identify trillions of dollars worth of economic activity.


    Ocean vital signs

    Without the ocean, the climate crisis would be even worse than it is. Each year, the ocean absorbs billions of tons of carbon from the atmosphere, preventing the warming that this greenhouse gas would otherwise cause. Scientists estimate about 25 to 30 percent of all carbon released into the atmosphere by both human and natural sources is absorbed by the ocean.

    “But there’s a lot of uncertainty in that number,” says Ryan Woosley, a marine chemist and a principal research scientist in the Department of Earth, Atmospheric and Planetary Sciences (EAPS) at MIT. Different parts of the ocean take in different amounts of carbon depending on many factors, such as the season and the amount of mixing from storms. Current models of the carbon cycle don’t adequately capture this variation.

    To close the gap, Woosley and a team of other MIT scientists developed a research proposal for the MIT Climate Grand Challenges competition — an Institute-wide campaign to catalyze and fund innovative research addressing the climate crisis. The team’s proposal, “Ocean Vital Signs,” involves sending a fleet of sailing drones to cruise the oceans taking detailed measurements of how much carbon the ocean is really absorbing. Those data would be used to improve the precision of global carbon cycle models and improve researchers’ ability to verify emissions reductions claimed by countries.

    “If we start to enact mitigation strategies — either through removing CO2 from the atmosphere or reducing emissions — we need to know where CO2 is going in order to know how effective they are,” says Woosley. Without more precise models there’s no way to confirm whether observed carbon reductions were thanks to policy and people, or thanks to the ocean.

    “So that’s the trillion-dollar question,” says Woosley. “If countries are spending all this money to reduce emissions, is it enough to matter?”

    In February, the team’s Climate Grand Challenges proposal was named one of 27 finalists out of the almost 100 entries submitted. From among this list of finalists, MIT will announce in April the selection of five flagship projects to receive further funding and support.

    Woosley is leading the team along with Christopher Hill, a principal research engineer in EAPS. The team includes physical and chemical oceanographers, marine microbiologists, biogeochemists, and experts in computational modeling from across the department, in addition to collaborators from the Media Lab and the departments of Mathematics, Aeronautics and Astronautics, and Electrical Engineering and Computer Science.

    Today, data on the flux of carbon dioxide between the air and the oceans are collected in a piecemeal way. Research ships intermittently cruise out to gather data. Some commercial ships are also fitted with sensors. But these present a limited view of the entire ocean, and include biases. For instance, commercial ships usually avoid storms, which can increase the turnover of water exposed to the atmosphere and cause a substantial increase in the amount of carbon absorbed by the ocean.

    “It’s very difficult for us to get to it and measure that,” says Woosley. “But these drones can.”

    If funded, the team’s project would begin by deploying a few drones in a small area to test the technology. The wind-powered drones — made by a California-based company called Saildrone — would autonomously navigate through an area, collecting data on air-sea carbon dioxide flux continuously with solar-powered sensors. This would then scale up to more than 5,000 drone-days’ worth of observations, spread over five years, and in all five ocean basins.

    Those data would be used to feed neural networks to create more precise maps of how much carbon is absorbed by the oceans, shrinking the uncertainties involved in the models. These models would continue to be verified and improved by new data. “The better the models are, the more we can rely on them,” says Woosley. “But we will always need measurements to verify the models.”

    Improved carbon cycle models are relevant beyond climate warming as well. “CO2 is involved in so much of how the world works,” says Woosley. “We’re made of carbon, and all the other organisms and ecosystems are as well. What does the perturbation to the carbon cycle do to these ecosystems?”

    One of the best understood impacts is ocean acidification. Carbon absorbed by the ocean reacts to form an acid. A more acidic ocean can have dire impacts on marine organisms like coral and oysters, whose calcium carbonate shells and skeletons can dissolve at lower pH. Since the Industrial Revolution, the ocean has become about 30 percent more acidic on average.
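Because pH is a logarithmic scale, that “30 percent more acidic” figure corresponds to a drop of only about 0.1 pH units. A quick conversion, assuming the standard definition pH = −log10 of the hydrogen-ion concentration:

```python
def acidity_increase_pct(delta_pH: float) -> float:
    """Percent increase in hydrogen-ion concentration for a drop of delta_pH units.
    Since pH = -log10[H+], the concentration grows by a factor of 10**delta_pH."""
    return (10 ** delta_pH - 1) * 100

# The roughly 0.1-unit surface-ocean pH drop since the Industrial Revolution:
print(f"{acidity_increase_pct(0.1):.0f}% more acidic")   # about 26%
print(f"{acidity_increase_pct(0.11):.0f}% more acidic")  # about 29%, close to the cited 30%
```

The logarithm is what makes a seemingly tiny pH shift consequential for shell-building organisms.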

    “So while it’s great for us that the oceans have been taking up the CO2, it’s not great for the oceans,” says Woosley. “Knowing how this uptake affects the health of the ocean is important as well.”