More stories


    MIT in the media: 2023 in review

    It was an eventful trip around the sun for MIT this year, from President Sally Kornbluth’s inauguration and Mark Rober’s Commencement address to Professor Moungi Bawendi winning the Nobel Prize in Chemistry. In 2023 MIT researchers made key advances, detecting a dying star swallowing a planet, exploring the frontiers of artificial intelligence, creating clean energy solutions, inventing tools aimed at earlier detection and diagnosis of cancer, and even exploring the science of spreading kindness. Below are highlights of some of the uplifting people, breakthroughs, and ideas from MIT that made headlines in 2023.

    The gift: Kindness goes viral with Steve Hartman
    Steve Hartman visited Professor Anette “Peko” Hosoi to explore the science behind whether a single act of kindness can change the world. (Full story via CBS News)

    Trio wins Nobel Prize in chemistry for work on quantum dots, used in electronics and medical imaging
    “The motivation really is the basic science. A basic understanding, the curiosity of ‘how does the world work?’” said Professor Moungi Bawendi of the inspiration for his research on quantum dots, for which he was co-awarded the 2023 Nobel Prize in Chemistry. (Full story via the Associated Press)

    How MIT’s all-women leadership team plans to change science for the better
    President Sally Kornbluth, Provost Cynthia Barnhart, and Chancellor Melissa Nobles emphasized the importance of representation for women and underrepresented groups in STEM. (Full story via Radio Boston)

    MIT via community college? Transfer students find a new path to a degree
    Undergraduate Subin Kim shared his experience transferring from community college to MIT through the Transfer Scholars Network, which helps community college students find a path to four-year universities. (Full story via the Christian Science Monitor)

    MIT president Sally Kornbluth doesn’t think we can hit the pause button on AI
    President Kornbluth discussed the future of AI, ethics in science, and climate change with columnist Shirley Leung on her new “Say More” podcast. “I view [the climate crisis] as an existential issue to the extent that if we don’t take action there, all of the many, many other things that we’re working on, not that they’ll be irrelevant, but they’ll pale in comparison,” Kornbluth said. (Full story via The Boston Globe)

    It’s the end of a world as we know it
    Astronomers from MIT, Harvard University, Caltech, and elsewhere spotted a dying star swallowing a large planet. “Finding an event like this really puts all of the theories that have been out there to the most stringent tests possible,” explained postdoc Kishalay De. “It really opens up this entire new field of research.” (Full story via The New York Times)

    Frontiers of AI

    Hey, Alexa, what should students learn about AI?
    The Day of AI is a program developed by the MIT RAISE initiative aimed at introducing and teaching K-12 students about AI. “We want students to be informed, responsible users and informed, responsible designers of these technologies,” said Professor Cynthia Breazeal, dean of digital learning at MIT. (Full story via The New York Times)

    AI tipping point
    Four faculty members from across MIT — Professors Song Han, Simon Johnson, Yoon Kim and Rosalind Picard — described the opportunities and risks posed by the rapid advancements in the field of AI. (Full story via Curiosity Stream)

    A look into the future of AI at MIT’s robotics laboratory
    Professor Daniela Rus, director of MIT’s Computer Science and Artificial Intelligence Laboratory, discussed the future of artificial intelligence, robotics, and machine learning, emphasizing the importance of balancing the development of new technologies with the need to ensure they are deployed in a way that benefits humanity. (Full story via Mashable)

    Health care providers say artificial intelligence could transform medicine
    Professor Regina Barzilay spoke about her work developing new AI systems that could be used to help diagnose breast and lung cancer before the cancers are detectable to the human eye. (Full story via Chronicle)

    Is AI coming for your job? Tech experts weigh in: “They don’t replace human labor”
    Professor David Autor discussed how the rise of artificial intelligence could change the quality of jobs available. (Full story via CBS News)

    Big tech is bad. Big AI will be worse.
    Institute Professor Daron Acemoglu and Professor Simon Johnson made the case that “rather than machine intelligence, what we need is ‘machine usefulness,’ which emphasizes the ability of computers to augment human capabilities.” (Full story via The New York Times)

    Engineering excitement

    MIT’s 3D-printed hearts could pump new life into customized treatments
    MIT engineers developed a technique for 3D printing a soft, flexible, custom-designed replica of a patient’s heart. (Full story via WBUR)

    Mystery of why Roman buildings have survived so long has been unraveled, scientists say
    Scientists from MIT and other institutions discovered that ancient Romans used lime clasts when manufacturing concrete, giving the material self-healing properties. (Full story via CNN)

    The most interesting startup in America is in Massachusetts. You’ve probably never heard of it.
    VulcanForms, an MIT startup, is at the “leading edge of a push to transform 3D printing from a niche technology — best known for new-product prototyping and art-class experimentation — into an industrial force.” (Full story via The Boston Globe)

    Catalyzing climate innovations

    Can Boston’s energy innovators save the world?
    Boston Magazine reporter Rowan Jacobsen spotlighted how MIT faculty, students, and alumni are leading the charge in clean energy startups. “When it comes to game-changing breakthroughs in energy, three letters keep surfacing again and again: MIT,” writes Jacobsen. (Full story via Boston Magazine)

    MIT research could be game changer in combating water shortages
    MIT researchers discovered that a common hydrogel used in cosmetic creams, industrial coatings, and pharmaceutical capsules can absorb moisture from the atmosphere even as the temperature rises. “For a planet that’s getting hotter, this could be a game-changing discovery.” (Full story via NBC Boston)

    Energy-storing concrete could form foundations for solar-powered homes
    MIT engineers uncovered a new way of creating an energy supercapacitor by combining cement, carbon black, and water that could one day be used to power homes or electric vehicles. (Full story via New Scientist)

    MIT researchers tackle key question of EV adoption: When to charge?
    MIT scientists found that delayed charging and strategic placement of EV charging stations could help reduce additional energy demands caused by more widespread EV adoption. (Full story via Fast Company)

    Building better buildings
    Professor John Fernández examined how to reduce the climate footprints of homes and office buildings, recommending creating airtight structures, switching to cleaner heating sources, using more environmentally friendly building materials, and retrofitting existing homes and offices. (Full story via The New York Times)

    They’re building an “ice penetrator” on a hillside in Westford
    Researchers from MIT’s Haystack Observatory built an “ice penetrator,” a device designed to monitor the changing conditions of sea ice. (Full story via The Boston Globe)

    Healing health solutions

    How Boston is beating cancer
    MIT researchers are developing drug-delivery nanoparticles aimed at targeting cancer cells without disturbing healthy cells. Essentially, the nanoparticles are “engineered for selectivity,” explained Professor Paula Hammond, head of MIT’s Department of Chemical Engineering. (Full story via Boston Magazine)

    A new antibiotic, discovered with artificial intelligence, may defeat a dangerous superbug
    Using a machine-learning algorithm, researchers from MIT discovered a type of antibiotic that’s effective against a particular strain of drug-resistant bacteria. (Full story via CNN)

    To detect breast cancer sooner, an MIT professor designs an ultrasound bra
    MIT researchers designed a wearable ultrasound device that attaches to a bra and could be used to detect early-stage breast tumors. (Full story via STAT)

    The quest for a switch to turn on hunger
    An ingestible pill developed by MIT scientists can raise levels of hormones to help increase appetite and decrease nausea in patients with gastroparesis. (Full story via Wired)

    Here’s how to use dreams for creative inspiration
    MIT scientists found that the earlier stages of sleep are key to sparking creativity and that people can be guided to dream about specific topics, further boosting creativity. (Full story via Scientific American)

    Astounding art

    An AI opera from 1987 reboots for a new generation
    Professor Tod Machover discussed the restaging of his opera “VALIS” at MIT, which featured an artificial intelligence-assisted musical instrument developed by Nina Masuelli ’23. (Full story via The Boston Globe)

    Surfacing the stories hidden in migration data
    Associate Professor Sarah Williams discussed the Civic Data Design Lab’s “Motivational Tapestry,” a large woven art piece that uses data from the United Nations World Food Program to visually represent the individual motivations of 1,624 Central Americans who have migrated to the U.S. (Full story via Metropolis)

    Augmented reality-infused production of Wagner’s “Parsifal” opens Bayreuth Festival
    Professor Jay Scheib’s augmented reality-infused production of Richard Wagner’s “Parsifal” brought “fantastical images” to audience members. (Full story via the Associated Press)

    Understanding our universe

    New image reveals violent events near a supermassive black hole
    Scientists captured a new image of M87*, the black hole at the center of the Messier 87 galaxy, showing the “launching point of a colossal jet of high-energy particles shooting outward into space.” (Full story via Reuters)

    Gravitational waves: A new universe
    MIT researchers Lisa Barsotti, Deep Chatterjee, and Victoria Xu explored how advances in gravitational wave detection are enabling a better understanding of the universe. (Full story via Curiosity Stream)

    Nergis Mavalvala helped detect the first gravitational wave. Her work doesn’t stop there
    Professor Nergis Mavalvala, dean of the School of Science, discussed her work searching for gravitational waves, the importance of skepticism in scientific research, and why she enjoys working with young people. (Full story via Wired)

    Hitting the books

    “The Transcendent Brain” review: Beyond ones and zeroes
    In his book “The Transcendent Brain: Spirituality in the Age of Science,” Alan Lightman, a professor of the practice of humanities, displayed his gift for “distilling complex ideas and emotions to their bright essence.” (Full story via The Wall Street Journal)

    What happens when CEOs treat workers better? Companies (and workers) win.
    Professor of the practice Zeynep Ton published a book, “The Case for Good Jobs,” and is “on a mission to change how company leaders think, and how they treat their employees.” (Full story via The Boston Globe)

    How to wage war on conspiracy theories
    Professor Adam Berinsky’s book, “Political Rumors: Why We Accept Misinformation and How to Fight it,” examined “attitudes toward both politics and health, both of which are undermined by distrust and misinformation in ways that cause harm to both individuals and society.” (Full story via Politico)

    What it takes for Mexican coders to cross the cultural border with Silicon Valley
    Assistant Professor Héctor Beltrán discussed his new book, “Code Work: Hacking across the U.S./México Techno-Borderlands,” which explores the culture of hackathons and entrepreneurship in Mexico. (Full story via Marketplace)

    Cultivating community

    The Indigenous rocketeer
    Nicole McGaa, a fourth-year student at MIT, discussed her work leading MIT’s all-Indigenous rocket team at the 2023 First Nations Launch National Rocket Competition. (Full story via Nature)

    “You totally got this,” YouTube star and former NASA engineer Mark Rober tells MIT graduates
    During his Commencement address at MIT, Mark Rober urged graduates to embrace their accomplishments and boldly face any challenges they encounter. (Full story via The Boston Globe)

    MIT Juggling Club going strong after half century
    After almost 50 years, the MIT Juggling Club, which was founded in 1975 and then merged with a unicycle club, is the oldest drop-in juggling club in continuous operation and still welcomes any aspiring jugglers to come toss a ball (or three) into the air. (Full story via Cambridge Day)

    Volpe Transportation Center opens as part of $750 million deal between MIT and feds
    The John A. Volpe National Transportation Systems Center in Kendall Square was the first building to open in MIT’s redevelopment of the 14-acre Volpe site that will ultimately include “research labs, retail, affordable housing, and open space, with the goal of not only encouraging innovation, but also enhancing the surrounding community.” (Full story via The Boston Globe)

    Sparking conversation

    The future of AI innovation and the role of academics in shaping it
    Professor Daniela Rus emphasized the central role universities play in fostering innovation and the importance of ensuring universities have the computing resources necessary to help tackle major global challenges. (Full story via The Boston Globe)

    Moving the needle on supply chain sustainability
    Professor Yossi Sheffi examined several strategies companies could use to help improve supply chain sustainability, including redesigning last-mile deliveries, influencing consumer choices and incentivizing returnable containers. (Full story via The Hill)

    Expelled from the mountain top?
    Sylvester James Gates Jr. ’73, PhD ’77 made the case that “diverse learning environments expose students to a broader range of perspectives, enhance education, and inculcate creativity and innovative habits of mind.” (Full story via Science)

    Marketing magic of “Barbie” movie has lessons for women’s sports
    MIT Sloan Lecturer Shira Springer explored how the success of the “Barbie” movie could be applied to women’s sports. (Full story via Sports Business Journal)

    We’re already paying for universal health care. Why don’t we have it?
    Professor Amy Finkelstein asserted that the solution to health insurance reform in the U.S. is “universal coverage that is automatic, free and basic.” (Full story via The New York Times)

    The internet could be so good. Really.
    Professor Deb Roy described how “new kinds of social networks can be designed for constructive communication — for listening, dialogue, deliberation, and mediation — and they can actually work.” (Full story via The Atlantic)

    Fostering educational excellence

    MIT students give legendary linear algebra professor standing ovation in last lecture
    After 63 years of teaching and over 10 million views of his online lectures, Professor Gilbert Strang received a standing ovation after his last lecture on linear algebra. “I am so grateful to everyone who likes linear algebra and sees its importance. So many universities (and even high schools) now appreciate how beautiful it is and how valuable it is,” said Strang. (Full story via USA Today)

    “Brave Behind Bars”: Reshaping the lives of inmates through coding classes
    Graduate students Martin Nisser and Marisa Gaetz co-founded Brave Behind Bars, a program designed to provide incarcerated individuals with coding and digital literacy skills to better prepare them for life after prison. (Full story via MSNBC)

    Melrose TikTok user “Ms. Nuclear Energy” teaching about nuclear power through social media
    Graduate student Kaylee Cunningham discussed her work using social media to help educate and inform the public about nuclear energy. (Full story via CBS Boston)


    New tools are available to help reduce the energy that AI models devour

    When searching for flights on Google, you may have noticed that each flight’s carbon-emission estimate is now presented next to its cost. It’s a way to inform customers about their environmental impact, and to let them factor this information into their decision-making.

    A similar kind of transparency doesn’t yet exist for the computing industry, despite its carbon emissions exceeding those of the entire airline industry. Artificial intelligence models are escalating this energy demand. Huge, popular models like ChatGPT signal a trend of large-scale artificial intelligence, boosting forecasts that predict data centers will draw up to 21 percent of the world’s electricity supply by 2030.

    The MIT Lincoln Laboratory Supercomputing Center (LLSC) is developing techniques to help data centers rein in energy use. These range from simple but effective changes, like power-capping hardware, to novel tools that can stop AI training early. Crucially, the center has found that these techniques have a minimal impact on model performance.

    In the wider picture, their work is mobilizing green-computing research and promoting a culture of transparency. “Energy-aware computing is not really a research area, because everyone’s been holding on to their data,” says Vijay Gadepally, senior staff in the LLSC who leads energy-aware research efforts. “Somebody has to start, and we’re hoping others will follow.”

    Curbing power and cooling down

    Like many data centers, the LLSC has seen a significant uptick in the number of AI jobs running on its hardware. Noticing an increase in energy usage, computer scientists at the LLSC were curious about ways to run jobs more efficiently. Green computing is a principle of the center, which is powered entirely by carbon-free energy.

    Training an AI model — the process by which it learns patterns from huge datasets — requires using graphics processing units (GPUs), which are power-hungry hardware. As one example, the GPUs that trained GPT-3 (the precursor to ChatGPT) are estimated to have consumed 1,300 megawatt-hours of electricity, roughly equal to that used by 1,450 average U.S. households per month.

    While most people seek out GPUs because of their computational power, manufacturers offer ways to limit the amount of power a GPU is allowed to draw. “We studied the effects of capping power and found that we could reduce energy consumption by about 12 percent to 15 percent, depending on the model,” Siddharth Samsi, a researcher within the LLSC, says.

    The trade-off for capping power is a longer task time — GPUs take about 3 percent longer to complete a task, an increase Gadepally says is “barely noticeable” considering that models are often trained over days or even months. In one experiment, in which the team trained the popular BERT language model, capping GPU power at 150 watts added two hours of training time (from 80 to 82 hours) but saved the equivalent of a U.S. household’s weekly energy use.
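
    The arithmetic behind that trade-off is simple: energy is power multiplied by time, so a cap pays off whenever the fractional power reduction exceeds the fractional slowdown. A minimal sketch using the article's 150 W cap and 80-to-82-hour figures; the 180 W average uncapped draw is an assumed, illustrative baseline (the article does not report it), chosen to land in the reported 12 to 15 percent savings range:

```python
def training_energy_kwh(power_watts, hours, n_gpus=1):
    """Energy drawn over a training run, in kilowatt-hours."""
    return power_watts * hours * n_gpus / 1000.0

# BERT run from the article: capped at 150 W, training stretched from 80 to 82 hours.
# The 180 W uncapped average draw is a hypothetical figure for illustration only.
uncapped = training_energy_kwh(180, 80)  # 14.4 kWh per GPU
capped = training_energy_kwh(150, 82)    # 12.3 kWh per GPU
savings_pct = 100 * (uncapped - capped) / uncapped

print(f"per-GPU energy: {uncapped:.1f} -> {capped:.1f} kWh "
      f"({savings_pct:.1f}% less energy for 2.5% more time)")
```

    Scaled across the many GPUs of a real training job, a per-GPU saving of this size is how a run's total saving reaches a household-scale quantity of energy.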

    The team then built software that plugs this power-capping capability into the widely used scheduler system, Slurm. The software lets data center owners set limits across their system or on a job-by-job basis.
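
    The LLSC software itself is not reproduced here, but the per-job plumbing such a scheduler plugin needs is easy to sketch: translate a job's requested cap into the `nvidia-smi --power-limit` calls a Slurm prolog would run on each allocated GPU. The function name and the site-wide default below are illustrative assumptions, not the LLSC implementation:

```python
DEFAULT_CAP_WATTS = 250  # hypothetical site-wide default cap

def power_cap_commands(gpu_indices, cap_watts=None):
    """Build one `nvidia-smi --power-limit` invocation per allocated GPU.
    A real Slurm prolog would execute these via subprocess with root privileges."""
    watts = cap_watts if cap_watts is not None else DEFAULT_CAP_WATTS
    return [["nvidia-smi", "-i", str(i), "--power-limit", str(watts)]
            for i in gpu_indices]

# A job that requested GPUs 0 and 1 with a 150 W cap:
for cmd in power_cap_commands([0, 1], cap_watts=150):
    print(" ".join(cmd))
```

    The commands are only constructed here, not executed, since setting power limits requires the NVIDIA driver and elevated privileges.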

    “We can deploy this intervention today, and we’ve done so across all our systems,” Gadepally says.

    Side benefits have arisen, too. Since putting power constraints in place, the GPUs on LLSC supercomputers have been running about 30 degrees Fahrenheit cooler and at a more consistent temperature, reducing stress on the cooling system. Running the hardware cooler can potentially also increase reliability and service lifetime. They can now consider delaying the purchase of new hardware — reducing the center’s “embodied carbon,” or the emissions created through the manufacturing of equipment — until the efficiencies gained by using new hardware offset this aspect of the carbon footprint. They’re also finding ways to cut down on cooling needs by strategically scheduling jobs to run at night and during the winter months.

    “Data centers can use these easy-to-implement approaches today to increase efficiencies, without requiring modifications to code or infrastructure,” Gadepally says.

    Taking this holistic look at a data center’s operations to find opportunities to cut down can be time-intensive. To make this process easier for others, the team — in collaboration with Professor Devesh Tiwari and Baolin Li at Northeastern University — recently developed and published a comprehensive framework for analyzing the carbon footprint of high-performance computing systems. System practitioners can use this analysis framework to gain a better understanding of how sustainable their current system is and consider changes for next-generation systems.  

    Adjusting how models are trained and used

    On top of making adjustments to data center operations, the team is devising ways to make AI-model development more efficient.

    When training models, AI developers often focus on improving accuracy, building on previous models as a starting point. To achieve the desired output, they have to figure out which parameters to use, and getting it right can require testing thousands of configurations. This process, called hyperparameter optimization, is one area LLSC researchers have found ripe for cutting energy waste.

    “We’ve developed a model that basically looks at the rate at which a given configuration is learning,” Gadepally says. Given that rate, their model predicts the likely performance. Underperforming models are stopped early. “We can give you a very accurate estimate early on that the best model will be in this top 10 of 100 models running,” he says.
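
    The idea can be illustrated with a toy version of such a predictor. The sketch below is not the LLSC model: it simulates 100 hypothetical hyperparameter configurations as noiseless exponential learning curves, extrapolates each run's final loss from its first few steps, and lets only the predicted top 10 run to completion:

```python
import math
import random

random.seed(0)

def loss_curve(floor, rate, steps=50):
    """Synthetic learning curve: loss decays toward a config-specific floor."""
    return [floor + (1.0 - floor) * math.exp(-rate * t) for t in range(steps)]

# 100 hypothetical hyperparameter configurations: (achievable loss, learning speed).
configs = [(random.uniform(0.05, 0.5), random.uniform(0.05, 0.4))
           for _ in range(100)]
curves = [loss_curve(floor, rate) for floor, rate in configs]

def predicted_floor(curve, k=6):
    """Extrapolate the final loss from the first k+1 points by estimating the
    per-step decay ratio (a crude stand-in for a learned performance predictor)."""
    d0 = curve[k - 2] - curve[k - 1]
    d1 = curve[k - 1] - curve[k]
    q = d1 / d0                      # estimated per-step decay
    return curve[k] - d1 * q / (1.0 - q)

ranked = sorted(range(len(curves)), key=lambda i: predicted_floor(curves[i]))
keep = set(ranked[:10])              # only these runs continue past step 6

# Energy is proportional to GPU-steps: everyone runs 6 steps, survivors run all 50.
full_cost = len(curves) * 50
early_cost = len(curves) * 6 + len(keep) * (50 - 6)
saved = 1 - early_cost / full_cost
print(f"energy saved by early stopping: {saved:.0%}")
```

    On these noiseless synthetic curves the extrapolation is exact, so the predicted top 10 matches the true top 10; a real predictor has to cope with noisy losses, which is where the modeling effort lies.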

    In their studies, this early stopping led to dramatic savings: an 80 percent reduction in the energy used for model training. They’ve applied this technique to models developed for computer vision, natural language processing, and material design applications.

    “In my opinion, this technique has the biggest potential for advancing the way AI models are trained,” Gadepally says.

    Training is just one part of an AI model’s emissions. The largest contributor to emissions over time is model inference, or the process of running the model live, like when a user chats with ChatGPT. To respond quickly, these models use redundant hardware, running all the time, waiting for a user to ask a question.

    One way to improve inference efficiency is to use the most appropriate hardware. Also with Northeastern University, the team created an optimizer that matches a model with the most carbon-efficient mix of hardware, such as high-power GPUs for the computationally intense parts of inference and low-power central processing units (CPUs) for the less-demanding aspects. This work recently won the best paper award at the International ACM Symposium on High-Performance Parallel and Distributed Computing.

    Using this optimizer can decrease energy use by 10-20 percent while still meeting the same “quality-of-service target” (how quickly the model can respond).
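
    A minimal version of that matching problem can be written as a small constrained search. The phase names and per-device numbers below are invented for illustration, not measurements from the paper: each phase of inference is assigned to the device that minimizes total energy while end-to-end latency stays within the quality-of-service target:

```python
from itertools import product

# Hypothetical per-request cost of each inference phase on each device type.
profiles = {
    "encode": {"gpu": {"ms": 5,  "joules": 60}, "cpu": {"ms": 40, "joules": 25}},
    "decode": {"gpu": {"ms": 10, "joules": 80}, "cpu": {"ms": 30, "joules": 20}},
}

def best_assignment(qos_ms):
    """Brute-force the lowest-energy phase-to-device assignment that still
    meets the latency (quality-of-service) target."""
    phases = list(profiles)
    best = None
    for devices in product(["gpu", "cpu"], repeat=len(phases)):
        ms = sum(profiles[p][d]["ms"] for p, d in zip(phases, devices))
        joules = sum(profiles[p][d]["joules"] for p, d in zip(phases, devices))
        if ms <= qos_ms and (best is None or joules < best[0]):
            best = (joules, dict(zip(phases, devices)))
    return best

print(best_assignment(50))   # tight latency budget: GPU for the heavy phase
print(best_assignment(100))  # loose budget: everything fits on low-power CPUs
```

    Real systems search over many device types and batch sizes, but the structure is the same: energy as the objective, responsiveness as the constraint.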

    This tool is especially helpful for cloud customers, who lease systems from data centers and must select hardware from among thousands of options. “Most customers overestimate what they need; they choose over-capable hardware just because they don’t know any better,” Gadepally says.

    Growing green-computing awareness

    The energy saved by implementing these interventions also reduces the associated costs of developing AI, often by a one-to-one ratio. In fact, cost is usually used as a proxy for energy consumption. Given these savings, why aren’t more data centers investing in green techniques?

    “I think it’s a bit of an incentive-misalignment problem,” Samsi says. “There’s been such a race to build bigger and better models that almost every secondary consideration has been put aside.”

    They point out that while some data centers buy renewable-energy credits, these renewables aren’t enough to cover the growing energy demands. The majority of electricity powering data centers comes from fossil fuels, and water used for cooling is contributing to stressed watersheds. 

    Hesitancy may also exist because systematic studies on energy-saving techniques haven’t been conducted. That’s why the team has been pushing their research in peer-reviewed venues in addition to open-source repositories. Some big industry players, like Google DeepMind, have applied machine learning to increase data center efficiency but have not made their work available for others to deploy or replicate. 

    Top AI conferences are now pushing for ethics statements that consider how AI could be misused. The team sees the climate aspect as an AI ethics topic that has not yet been given much attention, but this also appears to be slowly changing. Some researchers are now disclosing the carbon footprint of training the latest models, and industry is showing a shift in energy transparency too, as in this recent report from Meta AI.

    They also acknowledge that transparency is difficult without tools that can show AI developers their consumption. Reporting is on the LLSC roadmap for this year. They want to be able to show every LLSC user, for every job, how much energy they consume and how this amount compares to others, similar to home energy reports.
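
    The reporting side of such a tool needs little more than each job's measured energy and a peer distribution to compare against. A sketch with invented numbers (the LLSC tool itself is still on the roadmap):

```python
def job_energy_report(job_kwh, peer_kwh):
    """Home-energy-style feedback: where does this job sit among comparable jobs?"""
    peers = sorted(peer_kwh)
    below = sum(1 for p in peers if p < job_kwh)
    percentile = 100 * below / len(peers)
    median = peers[len(peers) // 2]
    return (f"This job used {job_kwh:.1f} kWh, more than {percentile:.0f}% of "
            f"comparable jobs (median: {median:.1f} kWh).")

# Invented example: one user's job against seven comparable jobs.
print(job_energy_report(12.3, [4.0, 6.5, 8.0, 9.2, 11.0, 14.8, 20.1]))
```

    The hard part, as the team notes, is not the report but getting trustworthy per-job energy measurements off the hardware in the first place.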

    Part of this effort requires working more closely with hardware manufacturers to make getting these data off hardware easier and more accurate. If manufacturers can standardize the way the data are read out, then energy-saving and reporting tools can be applied across different hardware platforms. A collaboration is underway between the LLSC researchers and Intel to work on this very problem.

    Even AI developers who are aware of AI’s intense energy needs can’t do much on their own to curb that use. The LLSC team wants to help other data centers apply these interventions and provide users with energy-aware options. Their first partnership is with the U.S. Air Force, a sponsor of this research, which operates thousands of data centers. Applying these techniques can make a significant dent in their energy consumption and cost.

    “We’re putting control into the hands of AI developers who want to lessen their footprint,” Gadepally says. “Do I really need to gratuitously train unpromising models? Am I willing to run my GPUs slower to save energy? To our knowledge, no other supercomputing center is letting you consider these options. Using our tools, today, you get to decide.”

    Visit this webpage to see the group’s publications related to energy-aware computing and findings described in this article.


    Ms. Nuclear Energy is winning over nuclear skeptics

    First-year MIT nuclear science and engineering (NSE) doctoral student Kaylee Cunningham is not the first person to notice that nuclear energy has a public relations problem. But her commitment to dispel myths about the alternative power source has earned her the moniker “Ms. Nuclear Energy” on TikTok and a devoted fan base on the social media platform.

    Cunningham’s activism kicked into gear shortly after a week-long trip to Iceland to study geothermal energy. During a discussion about how the country was going to achieve its net-zero energy goals, a representative from the University of Reykjavik balked at Cunningham’s suggestion of including a nuclear option in the alternative energy mix. “The response I got was that we’re a peace-loving nation, we don’t do that,” Cunningham remembers. “I was appalled by the reaction. I mean, we’re talking energy, not weapons here, right?” she asks. Incredulous, Cunningham made a TikTok targeting the misinformation. Overnight she garnered 10,000 followers, and “Ms. Nuclear Energy,” now her TikTok handle, was off to the races.


    A theater and science nerd

    TikTok is a fitting platform for a theater nerd like Cunningham. Born in Melrose, Massachusetts, Cunningham’s childhood was punctuated by moves to places where her roofer father’s work took the family. She moved to North Carolina shortly after fifth grade and fell in love with theater. “I was doing theater classes, the spring musical, it was my entire world,” Cunningham remembers. When she moved again, this time to Florida halfway through her first year of high school, she found the spring musical had already been cast. But she could help behind the scenes. Through that work, Cunningham gained her first real exposure to hands-on tech. She was hooked.

    Soon Cunningham was part of a team that represented her high school at the student Astronaut Challenge, an aerospace competition run by Florida State University. Statewide winners got to fly a space shuttle simulator at the Kennedy Space Center and participate in additional engineering challenges. Cunningham’s team helped create a proposal for NASA’s Asteroid Redirect Mission, designed to help the agency gather a large boulder from a near-Earth asteroid. The task was Cunningham’s introduction to radiation and “anything nuclear.” Her high school engineering teacher, Nirmala Arunachalam, encouraged Cunningham’s interest in the subject.

    The Astronaut Challenge might have been the end of Cunningham’s path in nuclear engineering had it not been for her mother. In high school, Cunningham had also enrolled in computer science classes, and her love of the subject earned her a scholarship to Norwich University in Vermont, where she had attended a cybersecurity camp. Cunningham had already put down the college deposit for Norwich.

    But Cunningham’s mother persuaded her daughter to pay another visit to the University of Florida, where she had expressed interest in pursuing nuclear engineering. To her pleasant surprise, the department chair, Professor James Baciak, pulled out all the stops, bringing mother and daughter on a tour of the on-campus nuclear reactor and promising Cunningham a paid research position. Cunningham was sold, and Baciak has been a mentor throughout her research career.

    Merging nuclear engineering and computer science

    Undergraduate research internships, including one at Oak Ridge National Laboratory, where she could combine her two loves, nuclear engineering and computer science, convinced Cunningham she wanted to pursue a similar path in graduate school.

    Cunningham’s undergraduate application to MIT had been rejected but that didn’t deter her from applying to NSE for graduate school. Having spent her early years in an elementary school barely 20 minutes from campus, she had grown up hearing that “the smartest people in the world go to MIT.” Cunningham figured that if she got into MIT, it would be “like going back home to Massachusetts” and that she could fit right in.

    Under the advisement of Professor Michael Short, Cunningham is looking to pursue her passions in both computer science and nuclear engineering in her doctoral studies.

    The activism continues

    Simultaneously, Cunningham is determined to keep her activism going.

    Her ability to digest “complex topics into something understandable to people who have no connection to academia” has helped Cunningham on TikTok. “It’s been something I’ve been doing all my life with my parents and siblings and extended family,” she says.

    Punctuating her video snippets with humor — a Simpsons reference is par for the course — helps Cunningham break through to her audience who love her goofy and tongue-in-cheek approach to the subject matter without compromising accuracy. “Sometimes I do stupid dances and make a total fool of myself, but I’ve really found my niche by being willing to engage and entertain people and educate them at the same time.”

    Such education needs to be an important part of an industry that’s weathered its share of misunderstandings, Cunningham says. “Technical people trying to communicate in a way that the general people don’t understand is such a concerning thing,” she adds. Case in point: the response in the wake of the Three Mile Island accident, where safety systems prevented massive contamination leaks. It was a perfect example of how well our safety regulations actually work, Cunningham says, “but you’d never guess from the PR fallout from it all.”

    As Ms. Nuclear Energy, Cunningham receives her share of skepticism. One viewer questioned the safety of nuclear reactors if “tons of pollution” were spewing out from them. Cunningham produced a TikTok that addressed this misconception. Pointing to the “pollution” in a photo, Cunningham clarifies that it’s just water vapor. The TikTok has garnered over a million views. “It really goes to show how starving for accurate information the public really is,” Cunningham says. “In this age of having all the information we could ever want at our fingertips, it’s hard to sift through and decide what’s real and accurate and what isn’t.”

    Another reason for her advocacy: doing her part to encourage young people toward a nuclear science or engineering career. “If we’re going to start putting up tons of small modular reactors around the country, we need people to build them, people to run them, and we need regulatory bodies to inspect and keep them safe,” Cunningham points out. “And we don’t have enough people entering the workforce in comparison to those that are retiring from the workforce,” she adds. “I’m able to engage those younger audiences and put nuclear engineering on their radar.” The advocacy has been paying off: Cunningham regularly receives — and responds to — inquiries from high school junior girls looking for advice on pursuing nuclear engineering.

    All the activism is in service toward a clear end goal. “At the end of the day, the fight is to save the planet,” Cunningham says. “I honestly believe that nuclear power is the best chance we’ve got to fight climate change and keep our planet alive.”


    An interdisciplinary approach to fighting climate change through clean energy solutions

    In early 2021, the U.S. government set an ambitious goal: to decarbonize its power grid, the system that generates and transmits electricity throughout the country, by 2035. It’s an important goal in the fight against climate change, and it will require a switch from current greenhouse-gas-producing energy sources (such as coal and natural gas) to predominantly renewable ones (such as wind and solar).

    Getting the power grid to zero carbon will be a challenging undertaking, as Audun Botterud, a principal research scientist at the MIT Laboratory for Information and Decision Systems (LIDS) who has long been interested in the problem, knows well. It will require building lots of renewable energy generators and new infrastructure; designing better technology to capture, store, and carry electricity; creating the right regulatory and economic incentives; and more. Decarbonizing the grid also presents many computational challenges, which is where Botterud’s focus lies. Botterud has modeled different aspects of the grid — the mechanics of energy supply, demand, and storage, and electricity markets — where economic factors can have a huge effect on how quickly renewable solutions get adopted.

    On again, off again

    A major challenge of decarbonization is that the grid must be designed and operated to reliably meet demand. Using renewable energy sources complicates this, as wind and solar power depend on an infamously volatile system: the weather. A sunny day becomes gray and blustery, and wind turbines get a boost but solar farms go idle. This will make the grid’s energy supply variable and hard to predict. Additional resources, including batteries and backup power generators, will need to be incorporated to regulate supply. Extreme weather events, which are becoming more common with climate change, can further strain both supply and demand. Managing a renewables-driven grid will require algorithms that can minimize uncertainty in the face of constant, sometimes random fluctuations to make better predictions of supply and demand, guide how resources are added to the grid, and inform how those resources are committed and dispatched across the entire United States.

    “The problem of managing supply and demand in the grid has to happen every second throughout the year, and given how much we rely on electricity in society, we need to get this right,” Botterud says. “You cannot let the reliability drop as you increase the amount of renewables, especially because I think that will lead to resistance towards adopting renewables.”

    That is why Botterud feels fortunate to be working on the decarbonization problem at LIDS — even though a career here is not something he had originally planned. Botterud’s first experience with MIT came during his time as a graduate student in his home country of Norway, when he spent a year as a visiting student with what is now called the MIT Energy Initiative. He might never have returned, except that while at MIT, Botterud met his future wife, Bilge Yildiz. The pair both ended up working at the Argonne National Laboratory outside of Chicago, with Botterud focusing on challenges related to power systems and electricity markets. Then Yildiz got a faculty position at MIT, where she is a professor of nuclear and materials science and engineering. Botterud moved back to the Cambridge area with her and continued to work for Argonne remotely, but he also kept an eye on local opportunities. Eventually, a position at LIDS became available, and Botterud took it, while maintaining his connections to Argonne.

    “At first glance, it may not be an obvious fit,” Botterud says. “My work is very focused on a specific application, power system challenges, and LIDS tends to be more focused on fundamental methods to use across many different application areas. However, being at LIDS, my lab [the Energy Analytics Group] has access to the most recent advances in these fundamental methods, and we can apply them to power and energy problems. Other people at LIDS are working on energy too, so there is growing momentum to address these important problems.”

    Weather, space, and time

    Much of Botterud’s research involves optimization, using mathematical programming to compare alternatives and find the best solution. Common computational challenges include dealing with large geographical areas that contain regions with different weather, different types and quantities of renewable energy available, and different infrastructure and consumer needs — such as the entire United States. Another challenge is the need for granular time resolution, sometimes even down to the sub-second level, to account for changes in energy supply and demand.

    Often, Botterud’s group will use decomposition to solve such large problems piecemeal and then stitch together solutions. However, it’s also important to consider systems as a whole. For example, in a recent paper, Botterud’s lab looked at the effect of building new transmission lines as part of national decarbonization. They modeled solutions assuming coordination at the state, regional, or national level, and found that the more regions coordinate to build transmission infrastructure and distribute electricity, the less they will need to spend to reach zero carbon.
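    The intuition behind that coordination finding can be sketched in a toy dispatch model. Everything below is illustrative (the function, numbers, and prices are not from the paper): a region with surplus cheap generation can export to a neighbor only up to the transmission line's capacity, so a larger, coordinated line displaces expensive local generation and lowers total cost.

```python
# Toy two-region dispatch: region A hosts a cheap (e.g. wind-rich) plant;
# region B must otherwise run expensive local generation. A transmission
# line of capacity `line_cap` lets A export its surplus to B.

def dispatch_cost(demand_a, demand_b, cheap_cap, line_cap,
                  cheap_price=20.0, expensive_price=80.0):
    """Serve both regions' demand at minimum cost in this simple setting."""
    # Cheap generation serves region A's own demand first.
    cheap_to_a = min(demand_a, cheap_cap)
    leftover_cheap = cheap_cap - cheap_to_a
    # Exports to region B are capped by the transmission line.
    cheap_to_b = min(demand_b, leftover_cheap, line_cap)
    # Whatever remains is met by expensive generation.
    expensive = (demand_a - cheap_to_a) + (demand_b - cheap_to_b)
    return (cheap_to_a + cheap_to_b) * cheap_price + expensive * expensive_price

isolated = dispatch_cost(demand_a=50, demand_b=70, cheap_cap=100, line_cap=0)
coordinated = dispatch_cost(demand_a=50, demand_b=70, cheap_cap=100, line_cap=60)
print(isolated, coordinated)  # 6600.0 3600.0 — coordination is cheaper
```

With no line, region B burns expensive fuel for all 70 units; with a 60-unit line, region A's idle cheap capacity covers most of it, echoing the paper's finding that broader coordination reduces the cost of reaching zero carbon.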

    In other projects, Botterud uses game theory approaches to study strategic interactions in electricity markets. For example, he has designed agent-based models to analyze electricity markets. These assume each actor will make strategic decisions in their own best interest and then simulate interactions between them. Interested parties can use the models to see what would happen under different conditions and market rules, which may lead companies to make different investment decisions, or governing bodies to issue different regulations and incentives. These choices can shape how quickly the grid gets decarbonized.
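    A heavily simplified sketch of the market-clearing mechanics underlying such models (hypothetical bids, not Botterud's actual agent-based models): each generator submits a price-quantity bid, and a uniform-price auction accepts the cheapest bids first, with the marginal accepted bid setting the price every dispatched agent receives.

```python
# Uniform-price auction: sort bids cheapest-first, accept until demand is
# met; the last accepted (marginal) bid sets the clearing price.

def clear_market(bids, demand):
    """bids: list of (price, quantity) pairs. Returns (clearing_price, dispatch)."""
    dispatch = {}
    remaining = demand
    clearing_price = 0.0
    for price, qty in sorted(bids):
        if remaining <= 0:
            break
        take = min(qty, remaining)
        dispatch[price] = dispatch.get(price, 0) + take
        remaining -= take
        clearing_price = price  # marginal bid sets the uniform price
    return clearing_price, dispatch

# Three generators bid; the mid-cost plant ends up marginal.
price, dispatch = clear_market([(20, 40), (35, 40), (60, 40)], demand=70)
print(price, dispatch)  # 35 {20: 40, 35: 30}
```

Agent-based models layer strategy on top of this: each agent adjusts its bids in its own interest, and simulating those interactions under different market rules shows how regulations and incentives shape investment and, ultimately, decarbonization speed.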

    Botterud is also collaborating with researchers in MIT’s chemical engineering department who are working on improving battery storage technologies. Batteries will help manage variable renewable energy supply by capturing surplus energy during periods of high generation to release during periods of insufficient generation. Botterud’s group models the sort of charge cycles that batteries are likely to experience in the power grid, so that chemical engineers in the lab can test their batteries’ abilities in more realistic scenarios. In turn, this also leads to a more realistic representation of batteries in power system optimization models.
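    As a rough illustration of the charge cycles involved (illustrative numbers, not the group's actual models), a grid battery absorbs energy whenever renewable supply exceeds demand and releases it when supply falls short:

```python
# Track a battery's state of charge (SoC) over an hourly supply/demand
# series: charge on surplus, discharge on shortfall, bounded by capacity.

def simulate_soc(supply, demand, capacity=100.0, soc=50.0):
    """Return the hour-by-hour SoC trace for the given series."""
    trace = []
    for s, d in zip(supply, demand):
        surplus = s - d
        if surplus >= 0:
            soc = min(capacity, soc + surplus)   # absorb surplus generation
        else:
            soc = max(0.0, soc + surplus)        # cover the shortfall
        trace.append(soc)
    return trace

supply = [80, 90, 40, 30, 70]   # e.g. variable solar output over 5 hours
demand = [60, 60, 60, 60, 60]
print(simulate_soc(supply, demand))  # [70.0, 100.0, 80.0, 50.0, 60.0]
```

Traces like this, generated from realistic grid conditions, are what give the chemical engineers representative charge/discharge profiles to test batteries against.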

    These are only some of the problems that Botterud works on. He enjoys the challenge of tackling a spectrum of different projects, collaborating with everyone from engineers to architects to economists. He also believes that such collaboration leads to better solutions. The problems created by climate change are myriad and complex, and solving them will require researchers to cooperate and explore.

    “In order to have a real impact on interdisciplinary problems like energy and climate,” Botterud says, “you need to get outside of your research sweet spot and broaden your approach.”


    Tackling counterfeit seeds with “unclonable” labels

    Average crop yields in Africa are consistently far below those expected, and one significant reason is the prevalence of counterfeit seeds whose germination rates are far lower than those of the genuine ones. The World Bank estimates that as much as half of all seeds sold in some African countries are fake, which could help to account for crop production that is far below potential.

    There have been many attempts to prevent this counterfeiting through tracking labels, but none have proved effective; among other issues, such labels have been vulnerable to hacking because of the deterministic nature of their encoding systems. But now, a team of MIT researchers has come up with a kind of tiny, biodegradable tag that can be applied directly to the seeds themselves, and that provides a unique randomly created code that cannot be duplicated.

    The new system, which uses minuscule dots of silk-based material, each containing a unique combination of different chemical signatures, is described today in the journal Science Advances in a paper by MIT’s dean of engineering Anantha Chandrakasan, professor of civil and environmental engineering Benedetto Marelli, postdoc Hui Sun, and graduate student Saurav Maji.

    The problem of counterfeiting is an enormous one globally, the researchers point out, affecting everything from drugs to luxury goods, and many different systems have been developed to try to combat this. But there has been less attention to the problem in the area of agriculture, even though the consequences can be severe. In sub-Saharan Africa, for example, the World Bank estimates that counterfeit seeds are a significant factor in crop yields that average less than one-fifth of the potential for maize, and less than one-third for rice.

    Marelli explains that a key to the new system is creating a randomly-produced physical object whose exact composition is virtually impossible to duplicate. The labels they create “leverage randomness and uncertainty in the process of application, to generate unique signature features that can be read, and that cannot be replicated,” he says.

    What they’re dealing with, Sun adds, “is the very old job of trying, basically, not to get your stuff stolen. And you can try as much as you can, but eventually somebody is always smart enough to figure out how to do it, so nothing is really unbreakable. But the idea is, it’s almost impossible, if not impossible, to replicate it, or it takes so much effort that it’s not worth it anymore.”

    The idea of an “unclonable” code was originally developed as a way of protecting the authenticity of computer chips, explains Chandrakasan, who is the Vannevar Bush Professor of Electrical Engineering and Computer Science. “In integrated circuits, individual transistors have slightly different properties, coined device variations,” he explains, “and you could then use that variability and combine that variability with higher-level circuits to create a unique ID for the device. And once you have that, then you can use that unique ID as a part of a security protocol. Something like transistor variability is hard to replicate from device to device, so that’s what gives it its uniqueness, versus storing a particular fixed ID.” The concept is based on what are known as physically unclonable functions, or PUFs.

    The team decided to try to apply that PUF principle to the problem of fake seeds, and the use of silk proteins was a natural choice because the material is not only harmless to the environment but also classified by the Food and Drug Administration in the “generally recognized as safe” category, so it requires no special approval for use on food products.

    “You could coat it on top of seeds,” Maji says, “and if you synthesize silk in a certain way, it will also have natural random variations. So that’s the idea, that every seed or every bag could have a unique signature.”

    Developing effective secure system solutions has long been one of Chandrakasan’s specialties, while Marelli has spent many years developing systems for applying silk coatings to a variety of fruits, vegetables, and seeds, so their collaboration was a natural fit for developing such a silk-based coding system for enhanced security.

    “The challenge was what type of form factor to give to silk,” Sun says, “so that it can be fabricated very easily.” They developed a simple drop-casting approach that produces tags that are less than one-tenth of an inch in diameter. The second challenge was to develop “a way where we can read the uniqueness, in also a very high throughput and easy way.”

    For the unique silk-based codes, Marelli says, “eventually we found a way to add a color to these microparticles so that they assemble in random structures.” The resulting unique patterns can be read out not only by a spectrograph or a portable microscope, but even by an ordinary cellphone camera with a macro lens. This image can be processed locally to generate the PUF code and then sent to the cloud and compared with a secure database to ensure the authenticity of the product. “It’s random so that people cannot easily replicate it,” says Sun. “People cannot predict it without measuring it.”

    And the number of possible permutations that could result from the way they mix four basic types of colored silk nanoparticles is astronomical. “We were able to show that with a minimal amount of silk, we were able to generate 128 random bits of security,” Maji says. “So this gives rise to 2 to the power 128 possible combinations, which is extremely difficult to crack given the computational capabilities of the state-of-the-art computing systems.”
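    Conceptually, enrollment and verification for such a tag resemble the following sketch (a generic PUF-style flow, not the paper's actual pipeline; the tag ID and byte sizes are hypothetical): the measured random pattern is reduced to a code, registered in a database, and later re-measured and compared.

```python
import hashlib
import secrets

# Enrollment: reduce a tag's measured random pattern to a fixed-length
# code and store it. Verification: re-measure and compare.

def pattern_to_code(measurement: bytes) -> str:
    """Hash a measured pattern down to a fixed-length code."""
    return hashlib.sha256(measurement).hexdigest()

database = {}
tag_pattern = secrets.token_bytes(16)   # stand-in for 128 random bits of tag signature
database["seed_bag_001"] = pattern_to_code(tag_pattern)

def verify(tag_id, measurement):
    """True only if the re-measured pattern matches the registered code."""
    return database.get(tag_id) == pattern_to_code(measurement)

print(verify("seed_bag_001", tag_pattern))   # True  — genuine tag
print(verify("seed_bag_001", bytes(16)))     # False — a counterfeiter's guess
```

With 128 random bits, a counterfeiter guessing a pattern has roughly a 1-in-2^128 chance per attempt, which is what makes brute-force cloning computationally hopeless.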

    Marelli says that “for us, it’s a good test bed in order to think out-of-the-box, and how we can have a path that somehow is more democratic.” In this case, that means “something that you can literally read with your phone, and you can fabricate by simply drop casting a solution, without using any advanced manufacturing technique, without going in a clean room.”

    Some additional work will be needed to make this a practical commercial product, Chandrakasan says. “There will have to be a development for at-scale reading” via smartphones. “So, that’s clearly a future opportunity.” But the principle now shows a clear path to the day when “a farmer could at least, maybe not every seed, but could maybe take some random seeds in a particular batch and verify them,” he says.

    The research was partially supported by the U.S. Office of Naval Research, the National Science Foundation, Analog Devices Inc., an EECS MathWorks fellowship, and a Paul M. Cook Career Development Professorship.


    Detailed images from space offer clearer picture of drought effects on plants

    “MIT is a place where dreams come true,” says César Terrer, an assistant professor in the Department of Civil and Environmental Engineering. Here at MIT, Terrer says he’s given the resources needed to explore ideas he finds most exciting, and at the top of his list is climate science. In particular, he is interested in plant-soil interactions, and how the two can mitigate impacts of climate change. In 2022, Terrer received seed grant funding from the Abdul Latif Jameel Water and Food Systems Lab (J-WAFS) to produce drought monitoring systems for farmers. The project leverages a new generation of remote sensing devices to provide high-resolution measurements of plant water stress at regional to global scales.

    Growing up in Granada, Spain, Terrer always had an aptitude and passion for science. He studied environmental science at the University of Murcia, where he interned in the Department of Ecology. Using computational analysis tools, he worked on modeling species distribution in response to human development. Early on in his undergraduate experience, Terrer says he regarded his professors as “superheroes” with a kind of scholarly prowess. He knew he wanted to follow in their footsteps by one day working as a faculty member in academia. Of course, there would be many steps along the way before achieving that dream. 

    Upon completing his undergraduate studies, Terrer set his sights on exciting and adventurous research roles. He thought perhaps he would conduct field work in the Amazon, engaging with native communities. But when the opportunity arose to work in Australia on a state-of-the-art climate change experiment that simulates future levels of carbon dioxide, he headed south to study how plants react to CO2 in a biome of native Australian eucalyptus trees. It was during this experience that Terrer started to take a keen interest in the carbon cycle and the capacity of ecosystems to buffer rising levels of CO2 caused by human activity.

    Around 2014, he began to delve deeper into the carbon cycle as he began his doctoral studies at Imperial College London. The primary question Terrer sought to answer during his PhD was “will plants be able to absorb predicted future levels of CO2 in the atmosphere?” To answer the question, Terrer became an early adopter of artificial intelligence, machine learning, and remote sensing to analyze data from real-life, global climate change experiments. His findings from these “ground truth” values and observations resulted in a paper in the journal Science. In it, he claimed that climate models most likely overestimated how much carbon plants will be able to absorb by the end of the century, by a factor of three. 

    After postdoctoral positions at Stanford University and the Universitat Autonoma de Barcelona, followed by a prestigious Lawrence Fellowship, Terrer says he had “too many ideas and not enough time to accomplish all those ideas.” He knew it was time to lead his own group. Not long after applying for faculty positions, he landed at MIT. 

    New ways to monitor drought

    Terrer is employing similar methods to those he used during his PhD to analyze data from all over the world for his J-WAFS project. He and postdoc Wenzhe Jiao collect data from remote sensing satellites and field experiments and use machine learning to come up with new ways to monitor drought. Terrer says Jiao is a “remote sensing wizard,” who fuses data from different satellite products to understand the water cycle. With Jiao’s hydrology expertise and Terrer’s knowledge of plants, soil, and the carbon cycle, the duo is a formidable team to tackle this project.

    According to the U.N. World Meteorological Organization, the number and duration of droughts have increased by 29 percent since 2000, as compared to the two previous decades. From the Horn of Africa to the Western United States, drought is devastating vegetation and severely stressing water supplies, compromising food production and spiking food insecurity. Drought monitoring can offer fundamental information on drought location, frequency, and severity, but assessing the impact of drought on vegetation is extremely challenging. This is because plants’ sensitivity to water deficits varies across species and ecosystems.

    Terrer and Jiao are able to obtain a clearer picture of how drought is affecting plants by employing the latest generation of remote sensing observations, which offer images of the planet with incredible spatial and temporal resolution. Satellite products such as Sentinel, Landsat, and Planet can provide daily images from space with such high resolution that individual trees can be discerned. Along with the images and datasets from satellites, the team is using ground-based observations from meteorological data. They are also using the MIT SuperCloud at MIT Lincoln Laboratory to process and analyze all of the data sets. The J-WAFS project is among the first to leverage high-resolution data to quantitatively measure plant drought impacts in the United States, with the hope of expanding to a global assessment in the future.

    Assisting farmers and resource managers 

    Every week, the U.S. Drought Monitor provides a map of drought conditions in the United States. The map is coarse in resolution and serves more as a drought recap or summary, unable to predict future drought scenarios. The lack of a comprehensive spatiotemporal evaluation of historic and future drought impacts on global vegetation productivity is detrimental to farmers both in the United States and worldwide.

    Terrer and Jiao plan to generate metrics for plant water stress at an unprecedented resolution of 10-30 meters. This means that they will be able to provide drought monitoring maps at the scale of a typical U.S. farm, giving farmers more precise, useful data every one to two days. The team will use the information from the satellites to monitor plant growth and soil moisture, as well as the time lag of plant growth response to soil moisture. In this way, Terrer and Jiao say they will eventually be able to create a kind of “plant water stress forecast” that may be able to predict adverse impacts of drought four weeks in advance. “According to the current soil moisture and lagged response time, we hope to predict plant water stress in the future,” says Jiao. 
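    The lag analysis behind such a forecast can be illustrated with a toy calculation (made-up data, not the project's): slide the vegetation series against the soil moisture series and pick the lag that maximizes their correlation, which estimates how far in advance soil moisture foreshadows plant stress.

```python
from math import sqrt

def pearson(x, y):
    """Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def best_lag(soil_moisture, vegetation, max_lag=5):
    """Return the lag (in time steps) at which the series correlate most strongly."""
    scores = {lag: pearson(soil_moisture[:-lag], vegetation[lag:])
              for lag in range(1, max_lag + 1)}
    return max(scores, key=scores.get)

soil = [5, 3, 8, 1, 9, 2, 7, 4, 6, 0, 5, 8]   # illustrative moisture readings
veg = [0, 0, 0] + soil[:-3]                    # vegetation trails moisture by 3 steps
print(best_lag(soil, veg))  # 3
```

In the real system the series would be satellite-derived soil moisture and vegetation indices at 10-30 meter resolution, and the recovered lag is what lets current conditions be projected forward into a water stress forecast.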

    The expected outcomes of this project will give farmers, land and water resource managers, and decision-makers more accurate data at the farm-specific level, allowing for better drought preparation, mitigation, and adaptation. “We expect to make our data open-access online, after we finish the project, so that farmers and other stakeholders can use the maps as tools,” says Jiao. 

    Terrer adds that the project “has the potential to help us better understand the future states of climate systems, and also identify the regional hot spots more likely to experience water crises at the national, state, local, and tribal government scales.” He also expects the project will enhance our understanding of global carbon-water-energy cycle responses to drought, with applications in determining climate change impacts on natural ecosystems as a whole.


    Integrating humans with AI in structural design

    Modern fabrication tools such as 3D printers can make structural materials in shapes that would have been difficult or impossible using conventional tools. Meanwhile, new generative design systems can take great advantage of this flexibility to create innovative designs for parts of a new building, car, or virtually any other device.

    But such “black box” automated systems often fall short of producing designs that are fully optimized for their purpose, such as providing the greatest strength in proportion to weight or minimizing the amount of material needed to support a given load. Fully manual design, on the other hand, is time-consuming and labor-intensive.

    Now, researchers at MIT have found a way to achieve some of the best of both of these approaches. They used an automated design system but stopped the process periodically to allow human engineers to evaluate the work in progress and make tweaks or adjustments before letting the computer resume its design process. Introducing a few of these iterations produced results that performed better than those designed by the automated system alone, and the process was completed more quickly than a fully manual approach would have been.

    The results are reported this week in the journal Structural and Multidisciplinary Optimization, in a paper by MIT doctoral student Dat Ha and assistant professor of civil and environmental engineering Josephine Carstensen.

    The basic approach can be applied to a broad range of scales and applications, Carstensen explains, for the design of everything from biomedical devices to nanoscale materials to structural support members of a skyscraper. Already, automated design systems have found many applications. “If we can make things in a better way, if we can make whatever we want, why not make it better?” she asks.

    “It’s a way to take advantage of how we can make things in much more complex ways than we could in the past,” says Ha, adding that automated design systems have already begun to be widely used over the last decade in automotive and aerospace industries, where reducing weight while maintaining structural strength is a key need.

    “You can take a lot of weight out of components, and in these two industries, everything is driven by weight,” he says. In some cases, such as internal components that aren’t visible, appearance is irrelevant, but for other structures aesthetics may be important as well. The new system makes it possible to optimize designs for visual as well as mechanical properties, and in such decisions the human touch is essential.

    As a demonstration of their process in action, the researchers designed a number of structural load-bearing beams, such as might be used in a building or a bridge. In their iterations, they saw that the design had an area that could fail prematurely, so they selected that feature and required the program to address it. The computer system then revised the design accordingly, removing the highlighted strut and strengthening some other struts to compensate, leading to an improved final design.

    The process, which they call Human-Informed Topology Optimization, begins by setting out the needed specifications — for example, a beam needs to be this length, supported on two points at its ends, and must support this much of a load. “As we’re seeing the structure evolve on the computer screen in response to initial specification,” Carstensen says, “we interrupt the design and ask the user to judge it. The user can select, say, ‘I’m not a fan of this region, I’d like you to beef up or beef down this feature size requirement.’ And then the algorithm takes into account the user input.”
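    The interaction loop described above can be sketched schematically. This is a toy stand-in under loose assumptions: the real method optimizes a topology density field with physics in the loop, while here the "design" is just a list of member thicknesses and the optimizer, feedback function, and constraint format are all hypothetical.

```python
# Human-in-the-loop pattern: run an automated optimizer, pause periodically
# so a human can inspect the design and impose constraints, then resume.

def human_informed_optimize(design, optimize_step, user_feedback,
                            total_iters=30, pause_every=10):
    """Alternate automated optimization with periodic human input."""
    constraints = {}
    for i in range(total_iters):
        design = optimize_step(design, constraints)
        if (i + 1) % pause_every == 0:
            # Pause: the engineer may "beef up" a region that looks too thin.
            constraints.update(user_feedback(design))
    return design

def optimize_step(design, constraints):
    # Automated step: thin every member to save material, but never below
    # a human-imposed minimum size for that member.
    return [max(t - 0.1, constraints.get(i, 0.0)) for i, t in enumerate(design)]

def user_feedback(design):
    # Hypothetical human judgment: member 2 must stay at least 1.0 thick.
    return {2: 1.0}

final = human_informed_optimize([2.0] * 4, optimize_step, user_feedback)
print(final)  # member 2 is held at its human-imposed minimum; the rest shrink away
```

The key design choice mirrored here is that the human never edits the design directly; they only add constraints, and the algorithm re-optimizes around them.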

    While the result is not as ideal as what might be produced by a fully rigorous yet significantly slower design algorithm that considers the underlying physics, she says it can be much better than a result generated by a rapid automated design system alone. “You don’t get something that’s quite as good, but that was not necessarily the goal. What we can show is that instead of using several hours to get something, we can use 10 minutes and get something much better than where we started off.”

    The system can be used to optimize a design based on any desired properties, not just strength and weight. For example, it can be used to minimize fracture or buckling, or to reduce stresses in the material by softening corners.

    Carstensen says, “We’re not looking to replace the seven-hour solution. If you have all the time and all the resources in the world, obviously you can run these and it’s going to give you the best solution.” But for many situations, such as designing replacement parts for equipment in a war zone or a disaster-relief area with limited computational power available, “then this kind of solution that catered directly to your needs would prevail.”

    Similarly, for smaller companies manufacturing equipment in essentially “mom and pop” businesses, such a simplified system might be just the ticket. The new system they developed is not only simple and efficient to run on smaller computers, but it also requires far less training to produce useful results, Carstensen says. A basic two-dimensional version of the software, suitable for designing basic beams and structural parts, is freely available now online, she says, as the team continues to develop a full 3D version.

    “The potential applications of Prof Carstensen’s research and tools are quite extraordinary,” says Christian Málaga-Chuquitaype, a professor of civil and environmental engineering at Imperial College London, who was not associated with this work. “With this work, her group is paving the way toward a truly synergistic human-machine design interaction.”

    “By integrating engineering ‘intuition’ (or engineering ‘judgement’) into a rigorous yet computationally efficient topology optimization process, the human engineer is offered the possibility of guiding the creation of optimal structural configurations in a way that was not available to us before,” he adds. “Her findings have the potential to change the way engineers tackle ‘day-to-day’ design tasks.”


    Machine learning facilitates “turbulence tracking” in fusion reactors

    Fusion, which promises practically unlimited, carbon-free energy using the same processes that power the sun, is at the heart of a worldwide research effort that could help mitigate climate change.

    A multidisciplinary team of researchers is now bringing tools and insights from machine learning to aid this effort. Scientists from MIT and elsewhere have used computer-vision models to identify and track turbulent structures that appear under the conditions needed to facilitate fusion reactions.

    Monitoring the formation and movements of these structures, called filaments or “blobs,” is important for understanding the heat and particle flows exiting from the reacting fuel, which ultimately determines the engineering requirements for the reactor walls to meet those flows. However, scientists typically study blobs using averaging techniques, which trade details of individual structures in favor of aggregate statistics. Tracking individual blobs requires marking them manually in video data.

    The researchers built a synthetic video dataset of plasma turbulence to make this process more effective and efficient. They used it to train four computer vision models, each of which identifies and tracks blobs. They trained the models to pinpoint blobs in the same ways that humans would.

    When the researchers tested the trained models using real video clips, the models could identify blobs with high accuracy — more than 80 percent in some cases. The models were also able to effectively estimate the size of blobs and the speeds at which they moved.
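    The underlying identification task can be illustrated with a classical baseline (thresholding plus flood fill on a toy frame; the paper's models are trained computer-vision networks, not this): group bright pixels into connected, labeled structures, each of which can then be sized and tracked across frames.

```python
# Label connected regions of above-threshold pixels in a 2D intensity grid,
# a simple proxy for picking out "blobs" at the turbulent plasma edge.

def find_blobs(frame, threshold):
    """Return (blob_count, label_grid) for pixels above `threshold`."""
    rows, cols = len(frame), len(frame[0])
    labels = [[0] * cols for _ in range(rows)]
    blob_id = 0
    for r in range(rows):
        for c in range(cols):
            if frame[r][c] > threshold and labels[r][c] == 0:
                blob_id += 1
                stack = [(r, c)]            # flood-fill this new blob
                while stack:
                    y, x = stack.pop()
                    if (0 <= y < rows and 0 <= x < cols
                            and frame[y][x] > threshold and labels[y][x] == 0):
                        labels[y][x] = blob_id
                        stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return blob_id, labels

frame = [
    [0, 9, 9, 0, 0],
    [0, 9, 0, 0, 8],
    [0, 0, 0, 8, 8],
]
count, labels = find_blobs(frame, threshold=5)
print(count)  # 2 — two separate bright structures in this toy frame
```

Learned models earn their keep over a baseline like this when blobs overlap, deform, or sit in noisy backgrounds, which is exactly the regime of real plasma footage.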

    Because millions of video frames are captured during just one fusion experiment, using machine-learning models to track blobs could give scientists much more detailed information.

    “Before, we could get a macroscopic picture of what these structures are doing on average. Now, we have a microscope and the computational power to analyze one event at a time. If we take a step back, what this reveals is the power available from these machine-learning techniques, and ways to use these computational resources to make progress,” says Theodore Golfinopoulos, a research scientist at the MIT Plasma Science and Fusion Center and co-author of a paper detailing these approaches.

    His fellow co-authors include lead author Woonghee “Harry” Han, a physics PhD candidate; senior author Iddo Drori, a visiting professor in the Computer Science and Artificial Intelligence Laboratory (CSAIL), faculty associate professor at Boston University, and adjunct at Columbia University; as well as others from the MIT Plasma Science and Fusion Center, the MIT Department of Civil and Environmental Engineering, and the Swiss Federal Institute of Technology at Lausanne in Switzerland. The research appears today in Nature Scientific Reports.

    Heating things up

    For more than 70 years, scientists have sought to use controlled thermonuclear fusion reactions to develop an energy source. To reach the conditions necessary for a fusion reaction, fuel must be heated to temperatures above 100 million degrees Celsius. (The core of the sun is about 15 million degrees Celsius.)

    A common method for containing this super-hot fuel, called plasma, is to use a tokamak. These devices utilize extremely powerful magnetic fields to hold the plasma in place and control the interaction between the exhaust heat from the plasma and the reactor walls.

    However, blobs appear like filaments that fall out of the plasma at its very edge, in the region between the plasma and the reactor walls. These random, turbulent structures affect how energy flows between the plasma and the reactor.

    “Knowing what the blobs are doing strongly constrains the engineering performance that your tokamak power plant needs at the edge,” adds Golfinopoulos.

    Researchers use a unique imaging technique to capture video of the plasma’s turbulent edge during experiments. An experimental campaign may last months; a typical day will produce about 30 seconds of data, corresponding to roughly 60 million video frames, with thousands of blobs appearing each second. This makes it impossible to track all blobs manually, so researchers rely on averaging techniques that provide only broad characteristics of blob size, speed, and frequency.

    “On the other hand, machine learning provides a solution to this by blob-by-blob tracking for every frame, not just average quantities. This gives us much more knowledge about what is happening at the boundary of the plasma,” Han says.

    He and his co-authors took four well-established computer vision models, which are commonly used for applications like autonomous driving, and trained them to tackle this problem.

    Simulating blobs

    To train these models, they created a vast dataset of synthetic video clips that captured the blobs’ random and unpredictable nature.

    “Sometimes they change direction or speed, sometimes multiple blobs merge, or they split apart. These kinds of events were not considered before with traditional approaches, but we could freely simulate those behaviors in the synthetic data,” Han says.

    Creating synthetic data also allowed them to label each blob, which made the training process more effective, Drori adds.

    Using these synthetic data, they trained the models to draw boundaries around blobs, teaching them to closely mimic what a human scientist would draw.

    Then they tested the models using real video data from experiments. First, they measured how closely the boundaries the models drew matched up with actual blob contours.

    But they also wanted to see if the models predicted objects that humans would identify. They asked three human experts to pinpoint the centers of blobs in video frames and checked to see if the models predicted blobs in those same locations.
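    This comparison against human annotations can be framed as a matching problem: each expert-marked center is paired with the nearest model detection within some tolerance. A hedged sketch of one simple greedy way to score it (the tolerance and greedy pairing are illustrative assumptions, not the paper's exact protocol):

```python
import numpy as np

def match_centers(predicted, expert, tol_px=3.0):
    """Fraction of expert-marked blob centers that a model also detected.

    Greedily pairs each expert center with the nearest unused prediction
    within tol_px pixels, mimicking a check against human annotations.
    """
    expert = [np.asarray(e, float) for e in expert]
    unused = [np.asarray(p, float) for p in predicted]
    hits = 0
    for e in expert:
        if not unused:
            break
        dists = [np.linalg.norm(e - p) for p in unused]
        i = int(np.argmin(dists))
        if dists[i] <= tol_px:
            hits += 1
            unused.pop(i)          # each prediction may match only once
    return hits / len(expert) if expert else 1.0

# One of two expert centers has a prediction within tolerance:
rate = match_centers(predicted=[(10, 10), (40, 12)],
                     expert=[(11, 10), (50, 50)])
```
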

    The models were able to draw accurate blob boundaries, overlapping with brightness contours, which are considered ground truth, about 80 percent of the time. The models' identifications were also similar to those of human experts, and the models successfully predicted the theory-defined regime of each blob, agreeing with the results from a traditional method.
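    A standard way to quantify how well a predicted blob boundary overlaps a ground-truth contour (not necessarily the metric used in the paper) is intersection-over-union on the enclosed regions:

```python
import numpy as np

def mask_iou(pred, truth):
    """Intersection-over-union between two boolean blob masks.

    A common overlap score for comparing a model's predicted blob region
    against the region enclosed by a ground-truth brightness contour.
    """
    pred, truth = np.asarray(pred, bool), np.asarray(truth, bool)
    inter = np.logical_and(pred, truth).sum()
    union = np.logical_or(pred, truth).sum()
    return inter / union if union else 1.0  # two empty masks agree trivially

pred = np.zeros((8, 8), bool); pred[2:6, 2:6] = True    # predicted region
truth = np.zeros((8, 8), bool); truth[3:7, 3:7] = True  # contour region
score = mask_iou(pred, truth)
```

    A score of 1.0 means the regions coincide exactly; partially offset regions, as in the example above, score between 0 and 1.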

    Now that they have shown the success of using synthetic data and computer vision models for tracking blobs, the researchers plan to apply these techniques to other problems in fusion research, such as estimating particle transport at the boundary of a plasma, Han says.

    They also made the dataset and models publicly available, and look forward to seeing how other research groups apply these tools to study the dynamics of blobs, says Drori.

    “Prior to this, there was a barrier to entry that mostly the only people working on this problem were plasma physicists, who had the datasets and were using their methods. There is a huge machine-learning and computer-vision community. One goal of this work is to encourage participation in fusion research from the broader machine-learning community toward the broader goal of helping solve the critical problem of climate change,” he adds.

    This research is supported, in part, by the U.S. Department of Energy and the Swiss National Science Foundation.