More stories

  • How to increase the rate of plastics recycling

    While recycling systems and bottle deposits have become increasingly widespread in the U.S., actual rates of recycling are “abysmal,” according to a team of MIT researchers who studied the rates for recycling of PET, the plastic commonly used in beverage bottles. However, their findings suggest some ways to change this.

    The present rate of recycling for PET, or polyethylene terephthalate, bottles nationwide is about 24 percent and has remained stagnant for a decade, the researchers say. But their study indicates that with a nationwide bottle deposit program, the rate could increase to 82 percent, with nearly two-thirds of all PET bottles being recycled into new bottles, at a net cost of just a penny a bottle when demand is robust. At the same time, they say, policies would be needed to ensure sufficient demand for the recycled material.

    The findings are being published today in the Journal of Industrial Ecology, in a paper by MIT professor of materials science and engineering Elsa Olivetti, graduate students Basuhi Ravi and Karan Bhuwalka, and research scientist Richard Roth.

    The team looked at PET bottle collection and recycling rates in different states and in other nations, with and without bottle deposit policies and curbside recycling programs, as well as at the inputs and outputs of various recycling companies and methods. The researchers say this study is the first to look in detail at the interplay between public policies and the end-to-end realities of the packaging production and recycling market.

    They found that bottle deposit programs are highly effective where they are in place, but at present there is not nearly enough collection of used bottles to meet the targets set by the packaging industry. Their analysis suggests that a uniform nationwide bottle deposit policy could achieve the levels of recycling mandated by proposed legislation and corporate commitments.

    The recycling of PET is highly successful in terms of quality, with new products made from all-recycled material virtually matching the qualities of virgin material, and brands have shown that new bottles can be safely made with 100 percent postconsumer waste. But the team found that collection of the material is a crucial bottleneck that leaves processing plants unable to meet their needs. However, with the right policies in place, “one can be optimistic,” says Olivetti, who is the Jerry McAfee Professor in Engineering and the associate dean of the School of Engineering.

    “A message that we have found in a number of cases in the recycling space is that if you do the right work to support policies that think about both the demand but also the supply,” then significant improvements are possible, she says. “You have to think about the response and the behavior of multiple actors in the system holistically to be viable. We are optimistic, but there are many ways to be pessimistic if we’re not thinking about that in a holistic way.”

    For example, the study found that it is important to consider the needs of existing municipal waste-recovery facilities. While expanded bottle deposit programs are essential to increase recycling rates and provide feedstock to companies recycling PET into new products, the facilities that currently process material from curbside recycling programs would lose revenue from PET bottles, a relatively high-value product compared to the other materials in the recycled waste stream. If the bottles are collected through deposit programs instead, these companies would be left with only the lower-value mixed plastics.

    The researchers developed economic models based on the collection rates found in states with deposit programs, recycled-content requirements, and other policies, and used these models to extrapolate to the nation as a whole. Overall, they found that the supply needs of packaging producers could be met through a nationwide bottle deposit system with a 10-cent deposit per bottle, at a net cost of about 1 cent per bottle produced when demand is strong. This need not be a federal program, but rather one whose implementation would be left up to the individual states, Olivetti says.

    Other countries have been much more successful in implementing deposit systems with very high participation rates. Several European countries manage to collect more than 90 percent of PET bottles for recycling, for example. In the U.S., by contrast, less than 29 percent of bottles are collected, and after losses in the recycling chain only about 24 percent actually get recycled, the researchers found. And while 73 percent of Americans have access to curbside recycling, presently only 10 states have bottle deposit systems in place.

    Yet so far, the demand is there. “There is a market for this material,” says Olivetti. While bottles collected through mixed-waste collection can still be recycled to some extent, those collected through deposit systems tend to be much cleaner and require less processing, so they are more economical to recycle into new bottles or into textiles.

    To be effective, Olivetti says, policies need to focus not just on increasing rates of recycling but on the whole cycle of supply and demand and the different players involved. Safeguards would need to be in place to protect existing recycling facilities from the revenue they would lose as a result of bottle deposits, perhaps in the form of subsidies funded by fees on the bottle producers, to avoid putting these essential parts of the processing chain out of business. And other policies may be needed to ensure a continued market for the material that gets collected, including recycled-content requirements and extended producer responsibility regulations, the team found.

    At this stage, it’s important to focus on the specific waste streams that can most effectively be recycled, and PET, along with many metals, clearly fits that category. “When we start to think about mixed plastic streams, that’s much more challenging from an environmental perspective,” she says. “Recycling systems need to be pursuing extended producer responsibility, or specifically thinking about materials designed more effectively toward recycled content.”

    It’s also important to address “what the right metrics are to design for sustainably managed materials streams,” she says. “It could be energy use, could be circularity [for example, making old bottles into new bottles], could be around waste reduction, and making sure those are all aligned. That’s another kind of policy coordination that’s needed.”
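    The study’s headline economics can be illustrated with a toy calculation. The sketch below is not the researchers’ model: apart from the 10-cent deposit and the 82 percent collection rate reported above, every input (handling cost, scrap revenue) is a hypothetical placeholder.

```python
# Toy per-bottle economics of a deposit system. Only the 10-cent deposit and
# the 82 percent redemption rate come from the article; the handling cost and
# recovered-PET revenue are invented for illustration.

def net_cost_per_bottle(deposit=0.10, redemption_rate=0.82,
                        handling_cost=0.04, scrap_revenue=0.012):
    """Net system cost per bottle produced, in dollars.

    Unredeemed deposits stay in the system as income, collecting and
    handling bottles costs money, and the recovered PET earns revenue
    when demand for recycled material is strong.
    """
    unredeemed_income = deposit * (1 - redemption_rate)  # deposits never claimed
    scrap_income = scrap_revenue * redemption_rate       # PET sold, per bottle made
    return handling_cost - unredeemed_income - scrap_income

print(round(net_cost_per_bottle(), 4))  # roughly a penny per bottle
```

    The placeholder inputs were chosen so the result lands near the study’s reported 1 cent per bottle; the actual analysis derives its costs from state-level collection data.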

  • Q&A: The power of tiny gardens and their role in addressing climate change

    To address the climate crisis, one must understand environmental history. MIT Professor Kate Brown’s research has typically focused on environmental catastrophes. More recently, Brown has been exploring a more hopeful topic: tiny gardens.

    Brown is the Thomas M. Siebel Distinguished Professor in History of Science in the MIT Program in Science, Technology, and Society. In this Q&A, Brown discusses her research, and how she believes her current project could help put power into the hands of everyday people. This is part of an ongoing series exploring how the MIT School of Humanities, Arts, and Social Sciences is addressing the climate crisis.

    Q: You have created an unusual niche for yourself as a historian of environmental catastrophes. What drew you to such a dismal beat?

    A: Historians often study New York, Warsaw, Moscow, Berlin, but if you go to the little towns that nobody’s ever heard of, that’s where you see the destruction in the wake of progress. This interest likely comes from growing up in a manufacturing town in the Midwestern Rust Belt, watching stores go bankrupt and houses sit empty. I became very interested in the people who were the last to turn off the lights.

    Q: Did this interest in places devastated by technological and economic change eventually lead to your investigation of Chernobyl?

    A: I first studied the health and environmental consequences of radioactive waste on communities near nuclear weapons facilities in the U.S. and Russia, and then decided to focus on the health and environmental impacts of fallout from the Chernobyl nuclear energy plant disaster. After gaining access to the KGB records in Kiev, I realized that there was a Klondike of records describing what Soviet officials at the time called a “public health disaster.” People on the ground recognized the saturation of radioactivity into environments and food supplies not with any sensitive devices, but by noticing the changes in ecologies and on human bodies. I documented how Moscow leaders engaged in a coverup, both at the time and for decades afterward, and how even international bodies charged with examining nuclear issues were reluctant to acknowledge this ongoing public health disaster because of liabilities in their own countries from the production and testing of nuclear weapons during the Cold War.

    Q: Why did you turn from detailed studies of what you call “modernist wastelands” to the subject of climate change?

    A: Journalists and scholars have worked hard in the last two decades to get people to understand the scope and the scale of climate change. And that’s great, but some of these catastrophic stories we tell don’t make people feel very safe or secure. They have a paralyzing effect on us. Climate change is one of many problems that are too big for any one person to tackle, or any one entity, whether it’s a huge nation like the United States or an international body like the U.N.

    So I thought I would start to work on something very small scale that puts action in the hands of regular people, to try to tell a more hopeful story. I am finishing a new book about working-class people who got pushed off their farms in the 19th century and ended up in megacities like London, Berlin, Amsterdam, and Washington, D.C. They found land on the periphery of the cities and started digging, growing their own food, and cooperating together. They basically recreated forms of the commons in cities. And in so doing, they generated the most productive agriculture in recorded history.

    Q: What are some highlights of this extraordinary city-based food generation?

    A: In Paris circa 1900, 5,000 urban farmers grew fruits, vegetables, and fresh produce for 2 million Parisians, with a surplus left over to sell to London. They would plant three to six crops a year on one tract of land, using horse manure to heat the soil from below to push the season and grow spring crops in winter and summer crops in spring.

    An agricultural economist looked at the inputs and the outputs of these Parisian farms. He found there was no comparison to the Green Revolution fields of the 1970s: these urban gardeners were producing far more per acre, with no petroleum-based fertilizers.

    Q: What is the connection between little gardens like these and the global climate crisis, where individuals can feel at a loss facing the scale of the problems?

    A: You can think of a tiny city garden like a coral reef, where one little worm comes and builds its cave. And then another one attaches itself to the first, and so on. Pretty soon you have a great coral reef with a platform to support hundreds of different species — a rich biodiversity. Tiny gardens work that way in cities, which is one reason cities are now surprising hotspots of biodiversity.

    Transforming urban green space into tiny gardens doesn’t take an act of God, the U.N., or the U.S. Congress. You could just go to your municipality and say, “Listen, right now we have a zoning code that says every time there’s a new condo, you have to have one or two parking spaces, but we’d rather see one or two garden spaces.”

    And if you don’t want a garden, you’ll have a neighbor who does. So people are outside and they have their hands in the soil, and then they start to exchange produce with one another. As they share carrots and zucchini, they exchange soil and human microbes as well. We know that when people share microbiomes, they get along better and have more in common. It comes as no surprise that humans have organized societies around shaking hands, kissing on the cheek, producing food together, and sharing meals. That’s what I think we’ve lost in our remote worlds.

    Q: So can we address or mitigate the impacts of climate change on a community-by-community basis?

    A: I believe that’s probably the best way to do it. When we think of energy we often imagine deposits of oil or gas, but, as our grad student Turner Adornetto points out, every environment has energy running through it. Every environment has its own best solution. If it’s a community that lives along a river, tap into hydropower; if it’s a community that has tons of organic waste, maybe you want to use microbial power; and if it’s a community that has lots of sun, then use different kinds of solar power. The legacy of midcentury modernism is that engineers came up with one-size-fits-all solutions to plug in anywhere in the world, regardless of local culture, traditions, or environment. That is one of the problems that got us into this fix in the first place.

    Politically, it’s a good idea to avoid making people feel they’re being pushed around by one set of codes, one set of laws, in terms of coming up with solutions that work. There are ways of deriving energy and nutrients that enrich the environment, ways that don’t drain and deplete. You see that so clearly with a plant, which does nothing but grow and contribute and give, whether in life or in death. It’s just constantly improving its environment.

    Q: How do you unleash creativity and propagate widespread local responses to climate change?

    A: One of the important things we are trying to accomplish in the humanities is communicating in the most down-to-earth ways possible to our students and the public, so that anybody — from a fourth grader to a retired person — can get engaged.

    There’s “TECHNOLOGY” in uppercase letters, the kind that is invented and patented in places like MIT. And then there’s technology in lowercase letters, where people are working with things readily at hand. That is the kind of creativity we don’t often pay enough attention to.

    Keep in mind that at the end of the 19th century, scientists were sure the earth was cooling and would be entirely under ice by 2020. In the 1950s, many people feared nuclear warfare. In the 1960s the threat was the “population bomb.” Every generation seems to have its apocalyptic sense of doom. It is helpful to take climate change and the Anthropocene and put them in perspective. These are problems we can solve.

  • An AI dataset carves new paths to tornado detection

    The return of spring in the Northern Hemisphere touches off tornado season. A tornado’s twisting funnel of dust and debris seems an unmistakable sight. But that sight can be obscured to radar, the tool of meteorologists. It’s hard to know exactly when a tornado has formed, or even why.

    A new dataset could hold answers. It contains radar returns from thousands of tornadoes that have hit the United States in the past 10 years. Alongside the storms that spawned tornadoes are other severe storms, some with nearly identical conditions, that never did. MIT Lincoln Laboratory researchers who curated the dataset, called TorNet, have now released it open source. They hope to enable breakthroughs in detecting one of nature’s most mysterious and violent phenomena.

    “A lot of progress is driven by easily available, benchmark datasets. We hope TorNet will lay a foundation for machine learning algorithms to both detect and predict tornadoes,” says Mark Veillette, the project’s co-principal investigator with James Kurdzo. Both researchers work in the Air Traffic Control Systems Group. 

    Along with the dataset, the team is releasing models trained on it. The models show promise for machine learning’s ability to spot a twister. Building on this work could open new frontiers for forecasters, helping them provide more accurate warnings that might save lives. 

    Swirling uncertainty

    About 1,200 tornadoes occur in the United States every year, causing millions to billions of dollars in economic damage and claiming 71 lives on average. Last year, one unusually long-lasting tornado killed 17 people and injured at least 165 others along a 59-mile path in Mississippi.  

    Yet tornadoes are notoriously difficult to forecast because scientists don’t have a clear picture of why they form. “We can see two storms that look identical, and one will produce a tornado and one won’t. We don’t fully understand it,” Kurdzo says.

    A tornado’s basic ingredients are thunderstorms with instability caused by rapidly rising warm air, and wind shear that causes rotation. Weather radar is the primary tool used to monitor these conditions. But tornadoes lie too low to be detected, even when moderately close to the radar. As a radar beam with a given tilt angle travels farther from the antenna, it gets higher above the ground, mostly seeing reflections from rain and hail carried in the “mesocyclone,” the storm’s broad, rotating updraft. A mesocyclone doesn’t always produce a tornado.

    With this limited view, forecasters must decide whether or not to issue a tornado warning. They often err on the side of caution. As a result, the rate of false alarms for tornado warnings is more than 70 percent. “That can lead to boy-who-cried-wolf syndrome,” Kurdzo says.  
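    The “more than 70 percent” figure is a false-alarm ratio: of all tornado warnings issued, the fraction for which no tornado occurred. A minimal sketch of that metric, with invented counts:

```python
# False-alarm ratio (FAR) for tornado warnings. The counts below are
# invented for illustration, not actual National Weather Service statistics.

def false_alarm_ratio(warnings_issued, warnings_verified):
    """Fraction of issued warnings that turned out to have no tornado."""
    return (warnings_issued - warnings_verified) / warnings_issued

print(false_alarm_ratio(1000, 280))  # 0.72, i.e., a FAR above 70 percent
```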

    In recent years, researchers have turned to machine learning to better detect and predict tornadoes. However, raw datasets and models have not always been accessible to the broader community, stifling progress. TorNet is filling this gap.

    The dataset contains more than 200,000 radar images, 13,587 of which depict tornadoes. The rest of the images are non-tornadic, taken from storms in one of two categories: randomly selected severe storms or false-alarm storms (those that led a forecaster to issue a warning but that didn’t produce a tornado).

    Each sample of a storm or tornado comprises two sets of six radar images. The two sets correspond to different radar sweep angles. The six images portray different radar data products, such as reflectivity (showing precipitation intensity) or radial velocity (indicating if winds are moving toward or away from the radar).
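    Based on that description, one sample can be pictured as a small stack of images. The sketch below is only an illustration of the layout described above; the field names, image dimensions, and exact list of six products are assumptions, not TorNet’s actual API.

```python
# Hypothetical layout of one TorNet-style sample: two sweep angles, each with
# six radar data products. Shapes and names are illustrative assumptions.
import numpy as np

PRODUCTS = ["reflectivity", "radial_velocity", "spectrum_width",
            "correlation_coefficient", "differential_reflectivity", "kdp"]
SWEEPS = 2        # two radar tilt angles per sample
H, W = 120, 240   # hypothetical image dimensions

def make_sample(tornadic):
    """One storm sample: a (sweeps, products, H, W) image stack plus a label."""
    return {
        "radar": np.zeros((SWEEPS, len(PRODUCTS), H, W), dtype=np.float32),
        "tornadic": tornadic,  # True for the 13,587 tornado cases
    }

sample = make_sample(tornadic=False)
assert sample["radar"].shape == (2, 6, 120, 240)
```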

    A challenge in curating the dataset was first finding tornadoes. Within the corpus of weather radar data, tornadoes are extremely rare events. The team then had to balance those tornado samples with difficult non-tornado samples. If the dataset were too easy, say by comparing tornadoes to snowstorms, an algorithm trained on the data would likely over-classify storms as tornadic.

    “What’s beautiful about a true benchmark dataset is that we’re all working with the same data, with the same level of difficulty, and can compare results,” Veillette says. “It also makes meteorology more accessible to data scientists, and vice versa. It becomes easier for these two parties to work on a common problem.”

    Both researchers represent the progress that can come from cross-collaboration. Veillette is a mathematician and algorithm developer who has long been fascinated by tornadoes. Kurdzo is a meteorologist by training and a signal processing expert. In grad school, he chased tornadoes with custom-built mobile radars, collecting data to analyze in new ways.

    “This dataset also means that a grad student doesn’t have to spend a year or two building a dataset. They can jump right into their research,” Kurdzo says.

    This project was funded by Lincoln Laboratory’s Climate Change Initiative, which aims to leverage the laboratory’s diverse technical strengths to help address climate problems threatening human health and global security.

    Chasing answers with deep learning

    Using the dataset, the researchers developed baseline artificial intelligence (AI) models. They were particularly eager to apply deep learning, a form of machine learning that excels at processing visual data. On its own, deep learning can extract features (key observations that an algorithm uses to make a decision) from images across a dataset. Other machine learning approaches require humans to first manually label features. 

    “We wanted to see if deep learning could rediscover what people normally look for in tornadoes and even identify new things that typically aren’t searched for by forecasters,” Veillette says.

    The results are promising. Their deep learning model performed similarly to or better than all tornado-detecting algorithms known in the literature. The trained algorithm correctly classified 50 percent of weaker EF-1 tornadoes and over 85 percent of tornadoes rated EF-2 or higher, which make up the most devastating and costly of these storms.
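    Those per-intensity numbers amount to recall grouped by EF rating. A small sketch of that bookkeeping, with invented case counts chosen to mirror the reported 50 and 85 percent figures:

```python
# Recall (detection rate) per tornado intensity rating. The case list is
# fabricated so the totals match the percentages quoted in the article.

def recall_by_rating(cases):
    """cases: list of (ef_rating, detected) pairs -> {rating: recall}."""
    totals, hits = {}, {}
    for rating, detected in cases:
        totals[rating] = totals.get(rating, 0) + 1
        hits[rating] = hits.get(rating, 0) + int(detected)
    return {r: hits[r] / totals[r] for r in totals}

cases = ([("EF-1", True)] * 5 + [("EF-1", False)] * 5
         + [("EF-2+", True)] * 17 + [("EF-2+", False)] * 3)
print(recall_by_rating(cases))  # {'EF-1': 0.5, 'EF-2+': 0.85}
```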

    They also evaluated two other types of machine-learning models, and one traditional model to compare against. The source code and parameters of all these models are freely available. The models and dataset are also described in a paper submitted to a journal of the American Meteorological Society (AMS). Veillette presented this work at the AMS Annual Meeting in January.

    “The biggest reason for putting our models out there is for the community to improve upon them and do other great things,” Kurdzo says. “The best solution could be a deep learning model, or someone might find that a non-deep learning model is actually better.”

    TorNet could be useful in the weather community for other uses too, such as conducting large-scale case studies on storms. It could also be augmented with other data sources, like satellite imagery or lightning maps. Fusing multiple types of data could improve the accuracy of machine learning models.

    Taking steps toward operations

    On top of detecting tornadoes, Kurdzo hopes that models might help unravel the science of why they form.

    “As scientists, we see all these precursors to tornadoes — an increase in low-level rotation, a hook echo in reflectivity data, specific differential phase (KDP) foot and differential reflectivity (ZDR) arcs. But how do they all go together? And are there physical manifestations we don’t know about?” he asks.

    Teasing out those answers might be possible with explainable AI. Explainable AI refers to methods that allow a model to provide its reasoning, in a format understandable to humans, of why it came to a certain decision. In this case, these explanations might reveal physical processes that happen before tornadoes. This knowledge could help train forecasters, and models, to recognize the signs sooner. 

    “None of this technology is ever meant to replace a forecaster. But perhaps someday it could guide forecasters’ eyes in complex situations, and give a visual warning to an area predicted to have tornadic activity,” Kurdzo says.

    Such assistance could be especially useful as radar technology improves and future networks potentially grow denser. Data refresh rates in a next-generation radar network are expected to increase from every five minutes to approximately one minute, perhaps faster than forecasters can interpret the new information. Because deep learning can process huge amounts of data quickly, it could be well-suited for monitoring radar returns in real time, alongside humans. Tornadoes can form and disappear in minutes.

    But the path to an operational algorithm is a long road, especially in safety-critical situations, Veillette says. “I think the forecaster community is still, understandably, skeptical of machine learning. One way to establish trust and transparency is to have public benchmark datasets like this one. It’s a first step.”

    The next steps, the team hopes, will be taken by researchers across the world who are inspired by the dataset and energized to build their own algorithms. Those algorithms will in turn go into test beds, where they’ll eventually be shown to forecasters, to start a process of transitioning into operations.

    In the end, the path could circle back to trust.

    “We may never get more than a 10- to 15-minute tornado warning using these tools. But if we could lower the false-alarm rate, we could start to make headway with public perception,” Kurdzo says. “People are going to use those warnings to take the action they need to save their lives.”

  • New study shows how universities are critical to emerging fusion industry

    A new study suggests that universities have an essential role to fulfill in the continued growth and success of any modern high-tech industry, and especially the nascent fusion industry; however, the importance of that role is not reflected in the number of fusion-oriented faculty and educational channels currently available. Academia’s responsiveness to the birth of other modern scientific fields, such as aeronautics and nuclear fission, provides a template for the steps universities can take to enable a robust fusion industry.

    Authored by Dennis Whyte, the Hitachi America Professor of Engineering and director of the Plasma Science and Fusion Center at MIT; Carlos Paz-Soldan, associate professor of applied physics and applied mathematics at Columbia University; and Brian D. Wirth, the Governor’s Chair Professor of Computational Nuclear Engineering at the University of Tennessee, the paper was recently published in the journal Physics of Plasmas as part of a special collection titled “Private Fusion Research: Opportunities and Challenges in Plasma Science.”

    With contributions from authors in academia, government, and private industry, the collection outlines a framework for public-private partnerships that will be essential for the success of the fusion industry.

    Now being seen as a potential source of unlimited green energy, fusion is the same process that powers the sun — hydrogen atoms combine to form helium, releasing vast amounts of clean energy in the form of light and heat.

    The excitement surrounding fusion’s arrival has resulted in the proliferation of dozens of for-profit companies positioning themselves at the forefront of the commercial fusion energy industry. In the near future, those companies will require a significant network of fusion-fluent workers to take on varied tasks requiring a range of skills.

    While the authors acknowledge the role of private industry, especially as an increasingly dominant source of research funding, they also show that academia is and will continue to be critical to the industry’s development and cannot be decoupled from private industry’s growth. Despite this burgeoning interest, however, the field’s academic network at U.S.-based universities remains sparse.

    According to Whyte, “Diversifying the [fusion] field by adding more tracks for master’s students and undergraduates who can transition into industry more quickly is an important step.”

    An analysis found that while there are 57 universities in the United States active in plasma and fusion research, the average number of tenured or tenure-track plasma/fusion faculty at each institution is only two. By comparison, a sampling of U.S. News & World Report’s top 10 programs for nuclear fission and aeronautics/astronautics found an average of nearly 20 faculty devoted to fission and 32 to aero/astro.

    “University programs in fusion and their sponsors need to up their game and hire additional faculty if they want to provide the necessary workforce to support a growing U.S. fusion industry,” adds Paz-Soldan.

    The growth and proliferation of those fields and others, such as computing and biotechnology, were historically in lockstep with the creation of academic programs that helped drive the fields’ progress and widespread acceptance. Creating a similar path for fusion is essential to ensuring its sustainable growth, and as Wirth notes, “that this growth should be pursued in a way that is interdisciplinary across numerous engineering and science disciplines.”

    At MIT, an example of that path is seen at the Plasma Science and Fusion Center.

    The center has deep historical ties to government research programs, and the largest fusion company in the world, Commonwealth Fusion Systems (CFS), was spun out of the PSFC by Whyte’s former students and an MIT postdoc. Whyte also serves as the principal investigator in collaborative research with CFS on SPARC, a proof-of-concept fusion platform for advancing tokamak science that is scheduled for completion in 2025.

    “Public and private roles in the fusion community are rapidly evolving in response to the growth of privately funded commercial product development,” says Michael Segal, head of open innovation at CFS. “The fusion industry will increasingly rely on its university partners to train students, work across diverse disciplines, and execute small and midsize programs at speed.”

    According to the authors, another key reason academia will remain essential to the continued growth and development of fusion is because it is unconflicted. Whyte comments, “Our mandate is sharing information and education, which means we have no competitive conflict and innovation can flow freely.” Furthermore, fusion science is inherently multidisciplinary: “[It] requires physicists, computer scientists, engineers, chemists, etc. and it’s easy to tap into all those disciplines in an academic environment where they’re all naturally rubbing elbows and collaborating.”

    Creating a new energy industry, however, will also require a workforce skilled in disciplines other than STEM, say the authors. As fusion companies continue to grow, they will need expertise in finance, safety, licensing, and market analysis. Any successful fusion enterprise will also have major geopolitical, societal, and economic impacts, all of which must be managed.

    Ultimately, there are several steps the authors identify to help build the connections between academia and industry that will be important going forward: The first is for universities to acknowledge the rapidly changing fusion landscape and begin to adapt. “Universities need to embrace the growth of the private sector in fusion, recognize the opportunities it provides, and seek out mutually beneficial partnerships,” says Paz-Soldan.

    The second step is to reconcile the mission of educational institutions — unconflicted open access — with condensed timelines and proprietary outputs that come with private partnerships. At the same time, the authors note that private fusion companies should embrace the transparency of academia by publishing and sharing the findings they can through peer-reviewed journals, which will be a necessary part of building the industry’s credibility.

    The last step, the authors say, is for universities to become more flexible and creative in their technology licensing strategies to ensure ideas and innovations find their way from the lab into industry.

    “As an industry, we’re in a unique position because everything is brand new,” Whyte says. “But we’re enough students of history that we can see what’s needed to succeed; quantifying the status of the private and academic landscape is an important strategic touchstone. By drawing attention to the current trajectory, hopefully we’ll be in a better position to work with our colleagues in the public and private sector and make better-informed choices about how to proceed.”

  • Celebrating five years of MIT.nano

    There is vast opportunity for nanoscale innovation to transform the world in positive ways, MIT.nano Director Vladimir Bulović told attendees at the start of the inaugural Nano Summit, before posing two questions: “Where are we heading? And what is the next big thing we can develop?”

    “The answer to that puts into perspective our main purpose — and that is to change the world,” Bulović, the Fariborz Maseeh Professor of Emerging Technologies, told an audience of more than 325 in-person and 150 virtual participants gathered for an exploration of nano-related research at MIT and a celebration of MIT.nano’s fifth anniversary.

    Over a decade ago, MIT embarked on a massive project for the ultra-small — building an advanced facility to support research at the nanoscale. Construction of MIT.nano in the heart of MIT’s campus, a process compared to assembling a ship in a bottle, began in 2015, and the facility launched in October 2018.

    Fast forward five years: MIT.nano now contains nearly 170 tools and instruments serving more than 1,200 trained researchers. These individuals come from over 300 principal investigator labs, representing more than 50 MIT departments, labs, and centers. The facility also serves external users from industry, other academic institutions, and over 130 startup and multinational companies.

    A cross section of these faculty and researchers joined industry partners and MIT community members to kick off the first Nano Summit, which is expected to become an annual flagship event for MIT.nano and its industry consortium. Held on Oct. 24, the inaugural conference was co-hosted by the MIT Industrial Liaison Program.

    Six topical sessions highlighted recent developments in quantum science and engineering, materials, advanced electronics, energy, biology, and immersive data technology. The Nano Summit also featured startup ventures and an art exhibition.


    Seeing and manipulating at the nanoscale — and beyond

    “We need to develop new ways of building the next generation of materials,” said Frances Ross, the TDK Professor in Materials Science and Engineering (DMSE). “We need to use electron microscopy to help us understand not only what the structure is after it’s built, but how it came to be. I think the next few years in this piece of the nano realm are going to be really amazing.”

Speakers in the session “The Next Materials Revolution,” chaired by James LeBeau, MIT.nano co-director for Characterization.nano and associate professor in DMSE, highlighted areas in which cutting-edge microscopy provides insights into the behavior of functional materials at the nanoscale, from anti-ferroelectrics to thin-film photovoltaics and 2D materials. They shared images and videos collected using the instruments in MIT.nano’s characterization suites, which were specifically designed and constructed to minimize mechanical vibration and electromagnetic interference.

    Later, in the “Biology and Human Health” session chaired by Boris Magasanik Professor of Biology Thomas Schwartz, biologists echoed the materials scientists, stressing the importance of the ultra-quiet, low-vibration environment in Characterization.nano to obtain high-resolution images of biological structures.

“Why is MIT.nano important for us?” asked Schwartz. “An important element of biology is to understand the structure of biological macromolecules. We want to get to an atomic resolution of these structures. CryoEM (cryo-electron microscopy) is an excellent method for this. In order to enable the resolution revolution, we had to get these instruments to MIT. For that, MIT.nano was fantastic.”

Seychelle Vos, the Robert A. Swanson (1969) Career Development Professor of Life Sciences, shared CryoEM images from her lab’s work, followed by biology Associate Professor Joey Davis, who spoke about image processing. When asked about the next stage for CryoEM, Davis said he’s most excited about in-situ tomography, noting that new instruments now being designed will improve the current labor-intensive process.

To chart the future of energy, chemistry associate professor Yogi Surendranath is also using MIT.nano to observe what happens at the nanoscale in his research on using renewable electricity to convert carbon dioxide into fuel.

    “MIT.nano has played an immense role, not only in facilitating our ability to make nanostructures, but also to understand nanostructures through advanced imaging capabilities,” said Surendranath. “I see a lot of the future of MIT.nano around the question of how nanostructures evolve and change under the conditions that are relevant to their function. The tools at MIT.nano can help us sort that out.”

    Tech transfer and quantum computing

    The “Advanced Electronics” session chaired by Jesús del Alamo, the Donner Professor of Science in the Department of Electrical Engineering and Computer Science (EECS), brought together industry partners and MIT faculty for a panel discussion on the future of semiconductors and microelectronics. “Excellence in innovation is not enough, we also need to be excellent in transferring these to the marketplace,” said del Alamo. On this point, panelists spoke about strengthening the industry-university connection, as well as the importance of collaborative research environments and of access to advanced facilities, such as MIT.nano, for these environments to thrive.

    The session came on the heels of a startup exhibit in which eleven START.nano companies presented their technologies in health, energy, climate, and virtual reality, among other topics. START.nano, MIT.nano’s hard-tech accelerator, provides participants use of MIT.nano’s facilities at a discounted rate and access to MIT’s startup ecosystem. The program aims to ease hard-tech startups’ transition from the lab to the marketplace, surviving common “valleys of death” as they move from idea to prototype to scaling up.

When asked about the state of quantum computing in the “Quantum Science and Engineering” session, physics professor Aram Harrow related his response to these startup challenges. “There are quite a few valleys to cross — there are the technical valleys, and then also the commercial valleys.” He spoke about scaling superconducting qubits and trapped-ion qubits, and about the need for more scalable architectures; the ingredients exist, he said, but putting everything together is quite challenging.

    Throughout the session, William Oliver, professor of physics and the Henry Ellis Warren (1894) Professor of Electrical Engineering and Computer Science, asked the panelists how MIT.nano can address challenges in assembly and scalability in quantum science.

“To harness the power of students to innovate, you really need to allow them to get their hands dirty, try new things, try all their crazy ideas, before this goes into a foundry-level process,” responded Kevin O’Brien, associate professor in EECS. “That’s what my group has been working on at MIT.nano, building these superconducting quantum processors using the state-of-the-art fabrication techniques in MIT.nano.”

    Connecting the digital to the physical

In his reflections on the semiconductor industry, Douglas Carlson, senior vice president for technology at MACOM, stressed the importance of connecting the digital world to real-world applications. Later, in the “Immersive Data Technology” session, MIT.nano associate director Brian Anthony explained how, at the MIT.nano Immersion Lab, researchers are doing just that.

    “We think about and facilitate work that has the human immersed between hardware, data, and experience,” said Anthony, principal research scientist in mechanical engineering. He spoke about using the capabilities of the Immersion Lab to apply immersive technologies to different areas — health, sports, performance, manufacturing, and education, among others. Speakers in this session gave specific examples in hardware, pediatric health, and opera.

    Anthony connected this third pillar of MIT.nano to the fab and characterization facilities, highlighting how the Immersion Lab supports work conducted in other parts of the building. The Immersion Lab’s strength, he said, is taking novel work being developed inside MIT.nano and bringing it up to the human scale to think about applications and uses.

    Artworks that are scientifically inspired

    The Nano Summit closed with a reception at MIT.nano where guests could explore the facility and gaze through the cleanroom windows, where users were actively conducting research. Attendees were encouraged to visit an exhibition on MIT.nano’s first- and second-floor galleries featuring work by students from the MIT Program in Art, Culture, and Technology (ACT) who were invited to utilize MIT.nano’s tool sets and environments as inspiration for art.

In his closing remarks, Bulović reflected on the community of people who keep MIT.nano running and who are using the tools to advance their research. “Today we are celebrating the facility and all the work that has been done over the last five years to bring it to where it is today. It is there to function not just as a space, but as an essential part of MIT’s mission in research, innovation, and education. I hope that all of us here today take away a deep appreciation and admiration for those who are leading the journey into the nano age.”


    Merging science and systems thinking to make materials more sustainable

    For Professor Elsa Olivetti, tackling a problem as large and complex as climate change requires not only lab research but also understanding the systems of production that power the global economy.

    Her career path reflects a quest to investigate materials at scales ranging from the microscopic to the mass-manufactured.

    “I’ve always known what questions I wanted to ask, and then set out to build the tools to help me ask those questions,” says Olivetti, the Jerry McAfee Professor in Engineering.

    Olivetti, who earned tenure in 2022 and was recently appointed associate dean of engineering, has sought to equip students with similar skills, whether in the classroom, in her lab group, or through the interdisciplinary programs she leads at MIT. Those efforts have earned her accolades including the Bose Award for Excellence in Teaching, a MacVicar Faculty Fellowship in 2021, and the McDonald Award for Excellence in Mentoring and Advising in 2023.

    “I think to make real progress in sustainability, materials scientists need to think in interdisciplinary, systems-level ways, but at a deep technical level,” Olivetti says. “Supporting my students so that’s something that a lot more people can do is very rewarding for me.”

Her mission to make materials more sustainable also makes Olivetti grateful she’s at MIT, which has a long tradition of both interdisciplinary collaboration and technical know-how.

    “MIT’s core competencies are well-positioned for bold achievements in climate and sustainability — the deep expertise on the economics side, the frontier knowledge in science, the computational creativity,” Olivetti says. “It’s a really exciting time and place where the key ingredients for progress are simmering in transformative ways.”

    Answering the call

    The moment that set Olivetti on her life’s journey began when she was 8, with a knock at her door. Her parents were in the other room, so Olivetti opened the door and met an organizer for Greenpeace, a nonprofit that works to raise awareness of environmental issues.

    “I had a chat with that guy and got hooked on environmental concerns,” Olivetti says. “I still remember that conversation.”

The interaction changed the way Olivetti thought about her place in the world, and her new perspective manifested itself in some unique ways. Her elementary school science fair projects became elaborate pursuits of environmental solutions, involving burying various items in the backyard to test for biodegradability. There was also an awkward attempt at natural pesticide development, which led to a worm hatching in her bedroom.

    As an undergraduate at the University of Virginia, Olivetti gravitated toward classes in environmentalism and materials science.

    “There was a link between materials science and a broader, systems way of framing design for environment, and that just clicked for me in terms of the way I wanted to think about environmental problems — from the atom to the system,” Olivetti recalls.

    That interest led Olivetti to MIT for a PhD in 2001, where she studied the feasibility of new materials for lithium-ion batteries.

    “I really wanted to be thinking of things at a systems level, but I wanted to ground that in lab-based research,” Olivetti says. “I wanted an experiential experience in grad school, and that’s why I chose MIT’s program.”

    Whether it was her undergraduate studies, her PhD, or her ensuing postdoc work at MIT, Olivetti sought to learn new skills to continue bridging the gap between materials science and environmental systems thinking.

    “I think of it as, ‘Here’s how I can build up the ways I ask questions,’” Olivetti explains. “How do we design these materials while thinking about their implications as early as possible?”

    Since joining MIT’s faculty in 2014, Olivetti has developed computational models to measure the cost and environmental impact of new materials, explored ways to adopt more sustainable and circular supply chains, and evaluated potential materials limitations as lithium-ion battery production is scaled. That work helps companies increase their use of greener, recyclable materials and more sustainably dispose of waste.

    Olivetti believes the wide scope of her research gives the students in her lab a more holistic understanding of the life cycle of materials.

    “When the group started, each student was working on a different aspect of the problem — like on the natural language processing pipeline, or on recycling technology assessment, or beneficial use of waste — and now each student can link each of those pieces in their research,” Olivetti explains.

Beyond her research, Olivetti also co-directs the MIT Climate and Sustainability Consortium (MCSC), which organizes coalitions around eight focus areas in sustainability. Each coalition brings together technical leaders at companies and researchers at MIT, who work to accelerate the impact of MIT’s research by helping companies adopt innovative and more sustainable technologies.

    “Climate change mitigation and resilience is such a complex problem, and at MIT we have practice in working together across disciplines on many challenges,” Olivetti says. “It’s been exciting to lean on that culture and unlock ways to move forward more effectively.”

    Bridging divides

Today, Olivetti tries to maximize the impact of her and her students’ research in materials industrial ecology by maintaining close ties to applications. In her research, this means working directly with aluminum companies to design alloys that could incorporate more scrap material, or with nongovernmental organizations to incorporate agricultural residues in building products. In the classroom, it means bringing in people from companies to explain how they think about concepts like heat exchange or fluid flow in their products.

    “I enjoy trying to ground what students are learning in the classroom with what’s happening in the world,” Olivetti explains.

    Exposing students to industry is also a great way to help them think about their own careers. In her research lab, she’s started using the last 30 minutes of meetings to host talks from people working in national labs, startups, and larger companies to show students what they can do after their PhDs. The talks are similar to the Industry Seminar series Olivetti started that pairs undergraduate students with people working in areas like 3D printing, environmental consulting, and manufacturing.

    “It’s about helping students learn what they’re excited about,” Olivetti says.

    Whether in the classroom, lab, or at events held by organizations like MCSC, Olivetti believes collaboration is humanity’s most potent tool to combat climate change.

“I just really enjoy building links between people,” Olivetti says. “Learning about people and meeting them where they are is a way that one can create effective links. It’s about creating the right playgrounds for people to think and learn.”


    Celebrating Kendall Square’s past and shaping its future

    Kendall Square’s community took a deep dive into the history and future of the region at the Kendall Square Association’s 15th annual meeting on Oct. 19.

    It’s no secret that Kendall Square, located in Cambridge, Massachusetts, moves fast. The event, titled “Looking Back, Looking Ahead,” gave community members a chance to pause and reflect on how far the region has come and to discuss efforts to shape where it’s going next.

    “The impact of the last 15 years of working together with a purposeful commitment to make the world a better place was on display this evening,” KSA Executive Director Beth O’Neill Maloney told the audience toward the end of the evening. “It also shows how Kendall Square can continue contributing to the world.”

    The gathering took place at the Microsoft NERD Center on Memorial Drive, on a floor that also featured music from the Kendall Square Orchestra and, judging by the piles of empty trays at the end of the night, an exceedingly popular selection of food from Kendall Square restaurants. Attendees came from across Cambridge’s prolific innovation ecosystem — not just entrepreneurs and life science workers but also high school and college students, restaurant and retail shop owners, workers at local cleantech and robotics companies, and leaders of nonprofits.

    KSA itself is a nonprofit made up of over 150 organizations across Kendall Square, from major companies to universities like MIT to research organizations like the Broad Institute of MIT and Harvard and the independent shops and restaurants that give Kendall Square its distinct character.

    The night’s programming included talks about recent funding achievements in the region, a panel discussion on the implications of artificial intelligence, and a highly entertaining, whirlwind history lesson led by Daniel Berger-Jones of Cambridge Historical Tours.

    “Our vision for the state is to be the best, and Kendall really represents that,” said Yvonne Hao, Massachusetts secretary of economic development. “When I went to DC to talk to folks about why Massachusetts should win some of these grants, they said, ‘You already have Kendall, that’s what we’re trying to get the whole country to be like!’”

    Hao started her talk by noting her personal connection to Kendall Square. She moved to Cambridge with her family in 2010 and has watched the neighborhood transform, with her kids frequenting the old and new restaurants and shops around town.

    The crux of Hao’s talk was to remind attendees they had more to celebrate than KSA’s anniversary. Massachusetts was recently named the recipient of two major federal grants that will fuel the state’s innovation work. One of those grants, from the Advanced Research Projects Agency for Health (ARPA-H), designated the state an “Investor Catalyst Hub” to accelerate innovation around health care. The other, which came through the federal CHIPS and Science Act, will allow the state to establish the Northeast Microelectronics Coalition Hub to advance microelectronics jobs, workforce training opportunities, and investment in the region’s advanced manufacturing.

    Hao recalled making the pitch for the grants, which could collectively amount to hundreds of millions of dollars in funding over time.

    “The pitch happened in Kendall Square because Kendall highlights everything magical about Massachusetts — we have our universities, MIT, we have our research institutions, nonprofits, small businesses, and great community members,” Hao said. “We were hoping for good weather because we wanted to walk with government officials, because when you walk around Kendall, you see the art, you see the coffee shops, you see the people bumping into each other and talking, and you see why it’s so important that this one square mile of geography become the hub they were looking for.”

    Hao is also part of work to put together the state’s newest economic development plan. She said the group’s tier one priorities are transportation and housing, but listed a number of other areas where she hopes Massachusetts can improve.

    “We can be an amazing, strong economy that’s mission-driven and innovation-driven with all kinds of jobs for all kinds of people, and at the same time an awesome community that loves each other and has great food and small businesses and looks out for each other, that looks diverse just like this room,” Hao said. “That’s the story we want to tell.”

    After the historical tour and the debut of a video explaining the origins of the KSA, attendees fast-forwarded into the future with a panel discussion on the impact and implications of generative AI.

    “I think the paradigm shift we’re seeing with generative AI is going to be as transformative as the internet, perhaps even more so because the pace of adoption is much faster now,” said Microsoft’s Soundar Srinivasan.

    The panel also featured Jennat Jounaidi, a student at Cambridge Rindge and Latin School and member of Innovators for Purpose, a nonprofit that seeks to empower young people from historically marginalized groups to become innovators.

    “I’m interested to see how generative AI shapes my upbringing as well as the lives of future generations, and I think it’s a pivotal moment to decide how we can best develop and incorporate AI into all of our lives,” Jounaidi said.

Panelists acknowledged important concerns around today’s AI, such as its potential to perpetuate inequality and amplify misinformation. But they also discussed the technology’s potential to drive advances in areas like sustainability and health care.

“I came to Kendall Square to do my PhD in AI at MIT back when the internet was called the ARPANET… so a while ago,” said Jeremy Wertheimer SM ’89, PhD ’96. “One of the dreams I had back then was to create a program to read all biology papers. We’re not quite there yet, but I think we’re on the cusp, and it’s very exciting.”

    Above all else, the panelists characterized AI as an opportunity. Despite all that’s been accomplished in Kendall Square to date, the prevailing feeling at the event was excitement for the future.

“Generative AI is giving us a chance to stop working in siloes,” Jounaidi said. “Many people in this room go back to their companies and think about corporate responsibility, and I want to expand that to creating shared value in companies by seeking out the community and the people here. I think that’s important, and I’m excited to see what comes next.”


    Solve Challenge Finals 2023: Action in service to the world

In a celebratory convergence of innovation and global impact, the 2023 Solve Challenge Finals, hosted by MIT Solve, welcomed the 2023 Solver Class. These teams, resolute in their commitment to addressing Solve’s 2023 Global Challenges and rooted in advancing the United Nations’ Sustainable Development Goals, exemplify the impact technology can have when directed toward social good.

    To set the tone of the day, Cynthia Barnhart, MIT provost, called for bold action in service to the world, and Hala Hanna, MIT Solve executive director, urged the new Solver teams and attendees to harness the power of technology for benevolent purposes. “Humans have lived with the dichotomy of technology since the dawn of time. Today we find ourselves at another juncture with generative AI, and we have choices to make. So, what if we choose that every line of code heals, and every algorithm uplifts, and every device includes?” she said during the opening plenary, Tech-Powered and Locally-Led: Solutions for Global Progress.

    Global, intergenerational, and contextual change for good

This year’s Solve Challenge Finals served as a global platform for reflection. Majid Al Suwaidi, director-general of COP28, shared the experiences that have shaped his approach to climate negotiation. He recounted a poignant visit to a refugee camp facilitated by the United Nations High Commissioner for Refugees that housed 300,000 climate migrants, where he met a mother and her nine children. In the sprawling camp, scarcity was evident, with just one toilet for every 100 residents. “There are people who contribute nothing to the problem but are impacted the most,” Al Suwaidi emphasized, stressing the need to prioritize those most affected by climate change when crafting solutions.

    Moderator Lysa John, secretary-general of CIVICUS, steered the conversation toward Africa’s growing influence during her fireside chat with David Sengeh SM ’12, PhD ’16, chief minister of Sierra Leone, and Toyin Saraki, president of the Wellbeing Foundation. The African Union was recently named a permanent member of the G20. Saraki passionately advocated for Africa to assert itself: “I would like this to be more than just the North recognizing the South. This is the time now for us to bring African intelligence to the forefront. We have to bring our own people, our own data, our own resources.” She also called for an intergenerational shift, recognizing the readiness of the younger generation to lead.

    Sengeh, who is 36 himself, emphasized that young people are natural leaders, especially in a nation where 70 percent of the population is youth. He challenged the status quo, urging society to entrust leadership roles to the younger generation.

    Saraki praised Solve as a vital incubation hub, satisfying the need for contextual innovation while contributing to global progress. She views Solve as a marketplace of solutions to systemic weaknesses, drawing upon the diverse approaches of innovators both young and old. “That is the generation of intelligence that needs to grow, not just in Africa. Solve is amazing for that, it’s an investor’s delight,” she said.

Henrietta Fore, managing partner of Radiate Capital and chair and CEO of Holsman International, shared an example of entrepreneurship catalyzed by country-level leaders, referencing India’s Swachh Bharat program aimed at promoting cleaner environments. The government initiative led to a burst of entrepreneurial activity, with women opening shops selling toilets and bathroom supplies. Fore highlighted the potential for companies to collaborate with countries on such programs, creating momentum and innovation.

    Trust as capital

    Trust was a prevalent theme throughout the event, from personal to business levels.

Johanna Mair, academic editor of the Stanford Social Innovation Review, asked Sarah Chandler, vice president of environment and supply chain innovation at Apple, for advice she might have for corporations and startups thinking about their holistic climate goals. Chandler emphasized that businesses must trust that environmental goals can align with business goals, highlighting Apple’s 45 percent reduction in carbon footprint since 2015 alongside a 65 percent increase in revenue.

    Neela Montgomery, board partner at Greycroft, discussed her initial skepticism around collaborating with large entities, seeking advice from Ilan Goldfajn, president of the Inter-American Development Bank. “Don’t be shy to come … take advantage of a multilateral bank … think about multilateral organizations as the ones to make connections. We can be your support commercially and financially, we could be your clients, and we could be your promoters,” said Goldfajn.

During a fireside chat between Janti Soeripto, president and CEO of Save the Children USA, and Imran Ahmed, founder and CEO of the Center for Countering Digital Hate, Soeripto shared her belief that the most effective change comes from the country and local community level. As an example, she pointed to Save the Children’s investment in scaling Library for All, a small Australian ed-tech startup. The partnership positively impacted literacy in communities around the world by making literature more accessible.

Major hurdles still exist for small enterprises entering the global market. Ahmed pointed to institutional sclerosis and a hesitancy to trust small-scale innovation as roadblocks to meaningful change.

    The final discussion of the closing plenary, Funding the Future: Scaling up Inclusive Impact, featured Fore; Mohammed Nanabhay, managing partner of Mozilla Ventures; and Alfred Ironside, vice president of communications at MIT, who asked the two panelists, “What do you [look for] when thinking about putting money into leaders and organizations who are on this mission to create impact and achieve scale?”

Beyond aligning principles with organizations, Nanabhay said that he looks for tenacity and, most importantly, trust in oneself. “Entrepreneurship is a long journey, it’s a hard journey — whether you’re on the for-profit side or the nonprofit side. It’s easy to say people should have grit, everyone says this. When the time comes and you’re struggling … you need to have the fundamental belief that what you’re working on is meaningful and that it’s going to make the world better.”