More stories

  • Computers that power self-driving cars could be a huge driver of global carbon emissions

    In the future, the energy needed to run the powerful computers on board a global fleet of autonomous vehicles could generate as many greenhouse gas emissions as all the data centers in the world today.

    That is one key finding of a new study from MIT researchers that explored the potential energy consumption and related carbon emissions if autonomous vehicles are widely adopted.

    The data centers that house the physical computing infrastructure used for running applications are widely known for their large carbon footprint: They currently account for about 0.3 percent of global greenhouse gas emissions, or about as much carbon as the country of Argentina produces annually, according to the International Energy Agency. Realizing that less attention has been paid to the potential footprint of autonomous vehicles, the MIT researchers built a statistical model to study the problem. They determined that 1 billion autonomous vehicles, each driving for one hour per day with a computer consuming 840 watts, would consume enough energy to generate about the same amount of emissions as data centers currently do.

    The researchers also found that in over 90 percent of modeled scenarios, to keep autonomous vehicle emissions from zooming past current data center emissions, each vehicle must use less than 1.2 kilowatts of power for computing, which would require more efficient hardware. In one scenario — where 95 percent of the global fleet of vehicles is autonomous in 2050, computational workloads double every three years, and the world continues to decarbonize at the current rate — they found that hardware efficiency would need to double faster than every 1.1 years to keep emissions under those levels.

    “If we just keep the business-as-usual trends in decarbonization and the current rate of hardware efficiency improvements, it doesn’t seem like it is going to be enough to constrain the emissions from computing onboard autonomous vehicles. This has the potential to become an enormous problem. But if we get ahead of it, we could design more efficient autonomous vehicles that have a smaller carbon footprint from the start,” says first author Soumya Sudhakar, a graduate student in aeronautics and astronautics.

    Sudhakar wrote the paper with her co-advisors Vivienne Sze, associate professor in the Department of Electrical Engineering and Computer Science (EECS) and a member of the Research Laboratory of Electronics (RLE); and Sertac Karaman, associate professor of aeronautics and astronautics and director of the Laboratory for Information and Decision Systems (LIDS). The research appears today in the January-February issue of IEEE Micro.

    Modeling emissions

    The researchers built a framework to explore the operational emissions from computers on board a global fleet of electric vehicles that are fully autonomous, meaning they don’t require a back-up human driver.

    The model is a function of the number of vehicles in the global fleet, the power of each computer on each vehicle, the hours driven by each vehicle, and the carbon intensity of the electricity powering each computer.
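
    As a rough illustration of how those four quantities combine (a minimal sketch, not the researchers' statistical model; the 840-watt and one-hour figures come from the study, while the grid carbon intensity is an assumed global average):

    ```python
    # Back-of-the-envelope version of the emissions relationship described above.
    FLEET_SIZE = 1_000_000_000             # 1 billion autonomous vehicles
    COMPUTER_POWER_KW = 0.84               # 840 watts of onboard computing per vehicle
    HOURS_PER_DAY = 1.0                    # driving hours per vehicle per day
    GRID_KG_CO2_PER_KWH = 0.475            # assumed global-average carbon intensity

    energy_kwh_per_day = FLEET_SIZE * COMPUTER_POWER_KW * HOURS_PER_DAY
    tonnes_co2_per_year = energy_kwh_per_day * 365 * GRID_KG_CO2_PER_KWH / 1000

    print(f"{tonnes_co2_per_year:,.0f} tonnes of CO2 per year")  # on the order of 150 million
    ```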

    “On its own, that looks like a deceptively simple equation. But each of those variables contains a lot of uncertainty because we are considering an emerging application that is not here yet,” Sudhakar says.

    For instance, some research suggests that the amount of time driven in autonomous vehicles might increase because people can multitask while driving and the young and the elderly could drive more. But other research suggests that time spent driving might decrease because algorithms could find optimal routes that get people to their destinations faster.

    In addition to considering these uncertainties, the researchers also needed to model advanced computing hardware and software that doesn’t exist yet.

    To accomplish that, they modeled the workload of a popular algorithm for autonomous vehicles, known as a multitask deep neural network because it can perform many tasks at once. They explored how much energy this deep neural network would consume if it were processing many high-resolution inputs from many cameras with high frame rates, simultaneously.

    When they used the probabilistic model to explore different scenarios, Sudhakar was surprised by how quickly the algorithms’ workload added up.

    For example, if an autonomous vehicle has 10 deep neural networks processing images from 10 cameras, and that vehicle drives for one hour a day, it will make 21.6 million inferences each day. One billion vehicles would make 21.6 quadrillion inferences. To put that into perspective, all of Facebook’s data centers worldwide make a few trillion inferences each day (1 quadrillion is 1,000 trillion).
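
    The arithmetic behind that figure is simple to check; one plausible accounting, assuming each of the 10 networks processes every camera stream at 60 frames per second (the frame rate is not stated in the article), is:

    ```python
    # Per-vehicle inference count for one hour of driving per day.
    networks = 10
    cameras = 10
    fps = 60                      # assumed frames per second per camera
    seconds_driven = 3600         # one hour per day

    per_vehicle_per_day = networks * cameras * fps * seconds_driven
    print(per_vehicle_per_day)                      # 21,600,000
    print(per_vehicle_per_day * 1_000_000_000)      # 2.16e16, i.e., 21.6 quadrillion
    ```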

    “After seeing the results, this makes a lot of sense, but it is not something that is on a lot of people’s radar. These vehicles could actually be using a ton of computer power. They have a 360-degree view of the world, so while we have two eyes, they may have 20 eyes, looking all over the place and trying to understand all the things that are happening at the same time,” Karaman says.

    Autonomous vehicles would be used for moving goods, as well as people, so there could be a massive amount of computing power distributed along global supply chains, he says. And their model only considers computing — it doesn’t take into account the energy consumed by vehicle sensors or the emissions generated during manufacturing.

    Keeping emissions in check

    To keep emissions from spiraling out of control, the researchers found that each autonomous vehicle needs to use less than 1.2 kilowatts of power for computing. For that to be possible, computing hardware must become more efficient at a significantly faster pace, doubling in efficiency about every 1.1 years.

    One way to boost that efficiency could be to use more specialized hardware, which is designed to run specific driving algorithms. Because researchers know the navigation and perception tasks required for autonomous driving, it could be easier to design specialized hardware for those tasks, Sudhakar says. But vehicles tend to have 10- or 20-year lifespans, so one challenge in developing specialized hardware would be to “future-proof” it so it can run new algorithms.

    In the future, researchers could also make the algorithms more efficient, so they would need less computing power. However, this is also challenging because trading off some accuracy for more efficiency could hamper vehicle safety.

    Now that they have demonstrated this framework, the researchers want to continue exploring hardware efficiency and algorithm improvements. In addition, they say their model can be enhanced by characterizing embodied carbon from autonomous vehicles — the carbon emissions generated when a car is manufactured — and emissions from a vehicle’s sensors.

    While there are still many scenarios to explore, the researchers hope that this work sheds light on a potential problem people may not have considered.

    “We are hoping that people will think of emissions and carbon efficiency as important metrics to consider in their designs. The energy consumption of an autonomous vehicle is really critical, not just for extending the battery life, but also for sustainability,” says Sze.

    This research was funded, in part, by the National Science Foundation and the MIT-Accenture Fellowship.

  • Moving water and earth

    As a river cuts through a landscape, it can operate like a conveyor belt, moving truckloads of sediment over time. Knowing how quickly or slowly this sediment flows can help engineers plan for the downstream impact of restoring a river or removing a dam. But the models currently used to estimate sediment flow can be off by a wide margin.

    An MIT team has come up with a better formula to calculate how much sediment a fluid can push across a granular bed — a process known as bed load transport. The key to the new formula comes down to the shape of the sediment grains.

    It may seem intuitive: A smooth, round stone should skip across a river bed faster than an angular pebble. But flowing water also pushes harder on the angular pebble, which could erase the round stone’s advantage. Which effect wins? Existing sediment transport models surprisingly don’t offer an answer, mainly because the problem of measuring grain shape is too unwieldy: How do you quantify a pebble’s contours?

    The MIT researchers found that instead of considering a grain’s exact shape, they could boil the concept of shape down to two related properties: friction and drag. A grain’s drag, or resistance to fluid flow, relative to its internal friction, the resistance to sliding past other grains, can provide an easy way to gauge the effects of a grain’s shape.

    When they incorporated this new mathematical measure of grain shape into a standard model for bed load transport, the new formula made predictions that matched experiments that the team performed in the lab.

    “Sediment transport is a part of life on Earth’s surface, from the impact of storms on beaches to the gravel nests in mountain streams where salmon lay their eggs,” the team writes of their new study, appearing today in Nature. “Damming and sea level rise have already impacted many such terrains and pose ongoing threats. A good understanding of bed load transport is crucial to our ability to maintain these landscapes or restore them to their natural states.”

    The study’s authors are Eric Deal, Santiago Benavides, Qiong Zhang, Ken Kamrin, and Taylor Perron of MIT, and Jeremy Venditti and Ryan Bradley of Simon Fraser University in Canada.

    Figuring flow

    Video of glass spheres (top) and natural river gravel (bottom) undergoing bed load transport in a laboratory flume, slowed down 17x relative to real time. Average grain diameter is about 5 mm. This video shows how rolling and tumbling natural grains interact with one another in a way that is not possible for spheres. What can’t be seen so easily is that natural grains also experience higher drag forces from the flowing water than spheres do.

    Credit: Courtesy of the researchers

    Bed load transport is the process by which a fluid such as air or water drags grains across a bed of sediment, causing the grains to hop, skip, and roll along the surface as the fluid flows over the bed. This movement of sediment in a current is what drives rocks to migrate down a river and sand grains to skip across a desert.

    Being able to estimate bed load transport can help scientists prepare for situations such as urban flooding and coastal erosion. Since the 1930s, one formula has been the go-to model for calculating bed load transport; it’s based on a quantity known as the Shields parameter, after the American engineer who originally derived it. This formula sets a relationship between the force of a fluid pushing on a bed of sediment, and how fast the sediment moves in response. Albert Shields incorporated certain variables into this formula, including the average size and density of a sediment’s grains — but not their shape.
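
    For reference, the classical Shields parameter compares the fluid's stress on the bed to the submerged weight of a typical grain; the notation below is the standard textbook form (grain shape appears nowhere in it):

    ```latex
    % Classical Shields parameter: bed shear stress relative to the submerged
    % weight of a grain of diameter D.
    \tau^{*} = \frac{\tau_{b}}{(\rho_{s} - \rho_{f})\, g\, D}
    ```

    Here \(\tau_b\) is the shear stress of the flow on the bed, \(\rho_s\) and \(\rho_f\) are the grain and fluid densities, \(g\) is gravitational acceleration, and \(D\) is the average grain diameter. The MIT modification described below folds a drag-to-friction ratio into this relationship to account for shape.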

    “People may have backed away from accounting for shape because it’s one of these very scary degrees of freedom,” says Kamrin, a professor of mechanical engineering at MIT. “Shape is not a single number.”

    And yet, the existing model has been known to be off by a factor of 10 in its predictions of sediment flow. The team wondered whether grain shape could be a missing ingredient, and if so, how the nebulous property could be mathematically represented.

    “The trick was to focus on characterizing the effect that shape has on sediment transport dynamics, rather than on characterizing the shape itself,” says Deal.

    “It took some thinking to figure that out,” says Perron, a professor of geology in MIT’s Department of Earth, Atmospheric and Planetary Sciences. “But we went back to derive the Shields parameter, and when you do the math, this ratio of drag to friction falls out.”

    Drag and drop

    Their work showed that the Shields parameter — which predicts how much sediment is transported — can be modified to include not just size and density, but also grain shape, and furthermore, that a grain’s shape can be simply represented by a measure of the grain’s drag and its internal friction. The math seemed to make sense. But could the new formula predict how sediment actually flows?

    To answer this, the researchers ran a series of flume experiments, in which they pumped a current of water through an inclined tank with a floor covered in sediment. They ran tests with sediment of various grain shapes, including beds of round glass beads, smooth glass chips, rectangular prisms, and natural gravel. They measured the amount of sediment that was transported through the tank in a fixed amount of time. They then determined the effect of each sediment type’s grain shape by measuring the grains’ drag and friction.

    For drag, the researchers simply dropped individual grains down through a tank of water and gathered statistics for the time it took the grains of each sediment type to reach the bottom. For instance, a flatter grain type takes a longer time on average, and therefore has greater drag, than a round grain type of the same size and density.

    To measure friction, the team poured grains through a funnel and onto a circular tray, then measured the resulting pile’s angle, or slope — an indication of the grains’ friction, or ability to grip onto each other.
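
    A minimal sketch of how those two bench measurements could be reduced to numbers, assuming a terminal-velocity force balance for drag and the tangent of the angle of repose for friction (standard approximations, not the authors' exact procedure):

    ```python
    import math

    def drag_coefficient(settling_velocity, grain_diameter, grain_density,
                         fluid_density=1000.0, g=9.81):
        """Drag coefficient inferred from a settling test: at terminal velocity,
        drag balances the submerged weight (sphere-like area/volume factors)."""
        weight_term = (grain_density - fluid_density) * g * grain_diameter
        return (4.0 / 3.0) * weight_term / (fluid_density * settling_velocity ** 2)

    def friction_coefficient(pile_angle_degrees):
        """Internal friction inferred from the angle of repose of a poured pile."""
        return math.tan(math.radians(pile_angle_degrees))

    # Hypothetical example: a 5 mm gravel grain settling at 0.3 m/s, pile angle 35 degrees.
    c_d = drag_coefficient(settling_velocity=0.3, grain_diameter=0.005, grain_density=2650.0)
    mu = friction_coefficient(35.0)
    print(c_d, mu, c_d / mu)   # the drag-to-friction ratio captures the effect of shape
    ```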

    For each sediment type, they then worked the corresponding shape’s drag and friction into the new formula, and found that it could indeed predict the bed load transport, or the amount of moving sediment that the researchers measured in their experiments.

    The team says the new model more accurately represents sediment flow. Going forward, scientists and engineers can use the model to better gauge how a river bed will respond to scenarios such as sudden flooding from severe weather or the removal of a dam.

    “If you were trying to make a prediction of how fast all that sediment will get evacuated after taking a dam out, and you’re wrong by a factor of three or five, that’s pretty bad,” Perron says. “Now we can do a lot better.”

    This research was supported, in part, by the U.S. Army Research Laboratory.

  • Looking to the past to prepare for an uncertain future

    Aviva Intveld, an MIT senior majoring in Earth, atmospheric, and planetary sciences, is accustomed to city life. But despite hailing from metropolitan Los Angeles, she has always maintained a love for the outdoors.

    “Growing up in L.A., you just have a wealth of resources when it comes to beautiful environments,” she says, “but you’re also constantly living connected to the environment.” She developed a profound respect for the natural world and its effects on people, from the earthquakes that shook the ground to the wildfires that displaced inhabitants.

    “I liked the lifestyle that environmental science afforded,” Intveld recalls. “I liked the idea that you can make a career out of spending a huge amount of time in the field and exploring different parts of the world.”

    From the moment she arrived at MIT, Intveld threw herself into research on and off campus. During her first semester, she joined Terrascope, a program that encourages first-year students to tackle complex, real-world problems. Intveld and her cohort developed proposals to make recovery from major storms in Puerto Rico faster, more sustainable, and more equitable.

    Intveld also spent a semester studying drought stress in the lab of Assistant Professor David Des Marais, worked as a research assistant at a mineral sciences research lab back in L.A., and interned at the World Wildlife Fund. Most of her work focused on contemporary issues like food insecurity and climate change. “I was really interested in questions about today,” Intveld says.

    Her focus began to shift to the past when she interned as a research assistant at the Marine Geoarchaeology and Micropaleontology Lab at the University of Haifa. For weeks, she would spend eight hours a day hunched over a microscope, using a paintbrush to sort through grains of sand from the coastal town of Caesarea. She was looking for tiny spiral-shaped fossils of foraminifera, an organism that resides in seafloor sediments.

    These microfossils can reveal a lot about the environment in which they originated, including extreme weather events. By cataloging diverse species of foraminifera, Intveld was helping to settle a rather niche debate in the field of geoarchaeology: Did tsunamis destroy the harbor of Caesarea during the time of the ancient Romans?

    But in addition to figuring out if and when these natural disasters occurred, Intveld was interested in understanding how ancient communities prepared for and recovered from them. What methods did they use? Could those same methods be used today?

    Intveld’s research at the University of Haifa was part of the Onward Israel program, which offers young Jewish people the chance to participate in internships, academic study, and fellowships in Israel. Intveld describes the experience as a great opportunity to learn about the culture, history, and diversity of the Israeli community. The trip was also an excellent lesson in dealing with challenging situations.

    Intveld suffers from claustrophobia, but she overcame her fears to climb through the Bar Kokhba caves, and despite a cat allergy, she grew to adore the many stray cats that roam the streets of Haifa. “Sometimes you can’t let your physical limitations stop you from doing what you love,” she quips.

    Over the course of her research, Intveld has often found herself in difficult and even downright dangerous situations, all of which she looks back on with good humor. As part of an internship with the National Oceanic and Atmospheric Administration, she spent three months investigating groundwater in Homer, Alaska. While she was there, she learned to avoid poisonous plants out in the field, got lost bushwhacking, and was twice charged by a moose.

    These days, Intveld spends less time in the field and more time thinking about the ancient past. She works in the lab of Associate Professor David McGee, where her undergraduate thesis research focuses on reconstructing the paleoclimate and paleoecology of northeastern Mexico during the Early Holocene. To get an idea of what the Mexican climate looked like thousands of years ago, Intveld analyzes stable isotopes and trace elements in stalagmites taken from Mexican caves. By analyzing the isotopes of carbon and oxygen present in these stalagmites, which were formed over thousands of years from countless droplets of mineral-rich rainwater, Intveld can estimate the amount of rainfall and average temperature in a given time period.

    Intveld is primarily interested in how the area’s climate may have influenced human migration. “It’s very interesting to learn about the history of human motivation, what drives us to do what we do,” she explains. “What causes humans to move, and what causes us to stay?” So far, it seems the Mexican climate during the Early Holocene was quite inconsistent, with oscillating periods of wet and dry, but Intveld needs to conduct more research before drawing any definitive conclusions.

    Recent research has linked periods of drought in the geological record to periods of violence in the archaeological one, suggesting ancient humans often fought over access to water. “I think you can easily see the connections to stuff that we deal with today,” Intveld says, pointing out the parallels between paleolithic migration and today’s climate refugees. “We have to answer a lot of difficult questions, and one way that we can do so is by looking to see what earlier human communities did and what we can learn from them.”

    Intveld recognizes the impact of the past on our present and future in many other areas. She works as a tour guide for the List Visual Arts Center, where she educates people about public art on the MIT campus. “[Art] interested me as a way to experience history and learn about the story of different communities and people over time,” she says.

    Intveld is also unafraid to acknowledge the history of discrimination and exclusion in science. “Earth science has a big problem when it comes to inclusion and diversity,” she says. As a member of the EAPS Diversity, Equity and Inclusion Committee, she aims to make earth science more accessible.

    “Aviva has a clear drive to be at the front lines of geoscience research, connecting her work to the urgent environmental issues we’re all facing,” says McGee. “She also understands the critical need for our field to include more voices, more perspectives — ultimately making for better science.”

    After MIT, Intveld hopes to pursue an advanced degree in the field of sustainable mining. This past spring, she studied abroad at Imperial College London, where she took courses within the Royal School of Mines. As Intveld explains, mining is becoming crucial to sustainable energy. The rise of electric vehicles in places like California has increased the need for energy-critical elements like lithium and cobalt, but mining for these elements often does more harm than good. “The current mining complex is very environmentally destructive,” Intveld says.

    But Intveld hopes to take the same approach to mining she does with her other endeavors — acknowledging the destructive past to make way for a better future.

  • A new way to assess radiation damage in reactors

    A new method could greatly reduce the time and expense needed for certain important safety checks in nuclear power reactors. The approach could save money and increase total power output in the short run, and it might increase plants’ safe operating lifetimes in the long run.

    One of the most effective ways to control greenhouse gas emissions, many analysts argue, is to prolong the lifetimes of existing nuclear power plants. But extending these plants beyond their originally permitted operating lifetimes requires monitoring the condition of many of their critical components to ensure that damage from heat and radiation has not led, and will not lead, to unsafe cracking or embrittlement.

    Today, testing of a reactor’s stainless steel components — which make up much of the plumbing systems that prevent heat buildup, as well as many other parts — requires removing test pieces, known as coupons, of the same kind of steel that are left adjacent to the actual components so they experience the same conditions. Or, it requires the removal of a tiny piece of the actual operating component. Both approaches are done during costly shutdowns of the reactor, prolonging these scheduled outages and costing millions of dollars per day.

    Now, researchers at MIT and elsewhere have come up with a new, inexpensive, hands-off test that can produce similar information about the condition of these reactor components, with far less time required during a shutdown. The findings are reported today in the journal Acta Materialia in a paper by MIT professor of nuclear science and engineering Michael Short; Saleem Al Dajani ’19 SM ’20, who did his master’s work at MIT on this project and is now a doctoral student at the King Abdullah University of Science and Technology (KAUST) in Saudi Arabia; and 13 others at MIT and other institutions.

    The test involves aiming laser beams at the stainless steel material, which generates surface acoustic waves (SAWs) on the surface. Another set of laser beams is then used to detect and measure the frequencies of these SAWs. Tests on material aged to replicate the conditions inside operating nuclear power plants showed that the waves produced a distinctive double-peaked spectral signature when the material was degraded.

    Short and Al Dajani embarked on the process in 2018, looking for a more rapid way to detect a specific kind of degradation, called spinodal decomposition, that can take place in austenitic stainless steel, which is used for components such as the 2- to 3-foot wide pipes that carry coolant water to and from the reactor core. This process can lead to embrittlement, cracking, and potential failure in the event of an emergency.

    While spinodal decomposition is not the only type of degradation that can occur in reactor components, it is a primary concern for the lifetime and sustainability of nuclear reactors, Short says.

    “We were looking for a signal that can link material embrittlement with properties we can measure, that can be used to estimate lifetimes of structural materials,” Al Dajani says.

    They decided to try a technique Short and his students and collaborators had expanded upon, called transient grating spectroscopy, or TGS, on samples of reactor materials known to have experienced spinodal decomposition as a result of their reactor-like thermal aging history. The method uses laser beams to stimulate, and then measure, SAWs on a material. The idea was that the decomposition should slow down the rate of heat flow through the material, and that slowdown would be detectable by the TGS method.

    However, it turns out there was no such slowdown. “We went in with a hypothesis about what we would see, and we were wrong,” Short says.

    That’s often the way things work out in science, he says. “You go in guns blazing, looking for a certain thing, for a great reason, and you turn out to be wrong. But if you look carefully, you find other patterns in the data that reveal what nature actually has to say.”

    Instead, what showed up in the data was that, while a material would usually produce a single frequency peak for the material’s SAWs, in the degraded samples there was a splitting into two peaks.
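
    As an illustration, a split spectrum can be flagged simply by counting well-separated peaks in the measured SAW response (a sketch with synthetic data and hypothetical thresholds, not the statistical analysis used in the paper):

    ```python
    import numpy as np
    from scipy.signal import find_peaks

    def count_saw_peaks(frequencies_hz, amplitudes, prominence_fraction=0.1):
        """Count prominent peaks in a SAW spectrum: one suggests pristine material,
        two suggests the splitting associated with spinodal decomposition."""
        prominence = prominence_fraction * amplitudes.max()
        peaks, _ = find_peaks(amplitudes, prominence=prominence)
        return len(peaks), frequencies_hz[peaks]

    # Synthetic degraded-sample spectrum with two nearby peaks (frequencies are made up).
    f = np.linspace(400e6, 600e6, 2000)
    degraded = np.exp(-((f - 480e6) / 8e6) ** 2) + 0.8 * np.exp(-((f - 520e6) / 8e6) ** 2)

    n_peaks, peak_freqs = count_saw_peaks(f, degraded)
    print(n_peaks, peak_freqs)   # 2 peaks -> candidate for spinodal decomposition
    ```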

    “It was a very clear pattern in the data,” Short recalls. “We just didn’t expect it, but it was right there screaming at us in the measurements.”

    Cast austenitic stainless steels like those used in reactor components are what’s known as duplex steels, actually a mixture of two different crystal structures in the same material by design. But while one of the two types is quite impervious to spinodal decomposition, the other is quite vulnerable to it. When the material starts to degrade, the difference shows up in the different frequency responses of the material, which is what the team found in their data.

    That finding was a total surprise, though. “Some of my current and former students didn’t believe it was happening,” Short says. “We were unable to convince our own team this was happening, with the initial statistics we had.” So, they went back and carried out further tests, which continued to strengthen the significance of the results. They reached a point where the confidence level was 99.9 percent that spinodal decomposition was indeed coincident with the wave peak separation.

    “Our discussions with those who opposed our initial hypotheses ended up taking our work to the next level,” Al Dajani says.

    The tests they did used large lab-based lasers and optical systems, so the next step, which the researchers are hard at work on, is miniaturizing the whole system into something that can be an easily portable test kit to use to check reactor components on-site, reducing the length of shutdowns. “We’re making great strides, but we still have some way to go,” he says.

    But when they achieve that next step, he says, it could make a significant difference. “Every day that your nuclear plant goes down, for a typical gigawatt-scale reactor, you lose about $2 million a day in lost electricity,” Al Dajani says, “so shortening outages is a huge thing in the industry right now.”

    He adds that the team’s goal was to find ways to enable existing plants to operate longer: “Let them be down for less time and be as safe or safer than they are right now — not cutting corners, but using smart science to get us the same information with far less effort.” And that’s what this new technique seems to offer.

    Short hopes that this could help to enable the extension of power plant operating licenses for some additional decades without compromising safety, by enabling frequent, simple and inexpensive testing of the key components. Existing, large-scale plants “generate just shy of a billion dollars in carbon-free electricity per plant each year,” he says, whereas bringing a new plant online can take more than a decade. “To bridge that gap, keeping our current nukes online is the single biggest thing we can do to fight climate change.”

    The team included researchers at MIT, Idaho National Laboratory, Manchester University and Imperial College London in the UK, Oak Ridge National Laboratory, the Electric Power Research Institute, Northeastern University, the University of California at Berkeley, and KAUST. The work was supported by the International Design Center at MIT and the Singapore University of Technology and Design, the U.S. Nuclear Regulatory Commission, and the U.S. National Science Foundation.

  • New MIT internships expand research opportunities in Africa

    With new support from the Office of the Associate Provost for International Activities, MIT International Science and Technology Initiatives (MISTI) and the MIT-Africa program are expanding internship opportunities for MIT students at universities and leading academic research centers in Africa. This past summer, MISTI supported 10 MIT student interns at African universities, significantly more than in any previous year.

    “These internships are an opportunity to better merge the research ecosystem of MIT with academia-based research systems in Africa,” says Evan Lieberman, the Total Professor of Political Science and Contemporary Africa and faculty director for MISTI.

    For decades, MISTI has helped MIT students to learn and explore through international experiential learning opportunities and internships in industries like health care, education, agriculture, and energy. MISTI’s MIT-Africa Seed Fund supports collaborative research between MIT faculty and Africa-based researchers, and the new student research internship opportunities are part of a broader vision for deeper engagement between MIT and research institutions across the African continent.

    While Africa is home to 12.5 percent of the world’s population, it generates less than 1 percent of scientific research output in the form of academic journal publications, according to the African Academy of Sciences. Research internships are one way that MIT can build mutually beneficial partnerships across Africa’s research ecosystem, to advance knowledge and spawn innovation in fields important to MIT and its African counterparts, including health care, biotechnology, urban planning, sustainable energy, and education.

    Ari Jacobovits, managing director of MIT-Africa, notes that the new internships provide additional funding to the lab hosting the MIT intern, enabling them to hire a counterpart student research intern from the local university. This support can make the internships more financially feasible for host institutions and helps to grow the research pipeline.

    With the support of MIT, State University of Zanzibar (SUZA) lecturers Raya Ahmada and Abubakar Bakar were able to hire local students to work alongside MIT graduate students Mel Isidor and Rajan Hoyle. Together the students collaborated over a summer on a mapping project designed to plan and protect Zanzibar’s coastal economy.

    “It’s been really exciting to work with research peers in a setting where we can all learn alongside one another and develop this project together,” says Hoyle.

    Using low-cost drone technology, the students and their local counterparts worked to create detailed maps of Zanzibar to support community planning around resilience projects designed to combat coastal flooding and deforestation and assess climate-related impacts to seaweed farming activities. 

    “I really appreciated learning about how engagement happens in this particular context and how community members understand local environmental challenges and conditions based on research and lived experience,” says Isidor. “This is beneficial for us whether we’re working in an international context or in the United States.”

    For biology major Shaida Nishat, an internship at the University of Cape Town offered the chance to work in a vital area of public health alongside a diverse, international team headed by Associate Professor Salome Maswime, head of the global surgery division and a widely renowned expert in global surgery, a multidisciplinary field within global health focused on improved and equitable surgical outcomes.

    “It broadened my perspective as to how an effort like global surgery ties so many nations together through a common goal that would benefit them all,” says Nishat, who plans to pursue a career in public health.

    For computer science sophomore Antonio L. Ortiz Bigio, the MISTI research internship in Africa was an incomparable experience, culturally and professionally. Bigio interned at the Robotics Autonomous Intelligence and Learning Laboratory at the University of the Witwatersrand in Johannesburg, led by Professor Benjamin Rosman, where he developed software to enable a robot to play chess. The experience has inspired Bigio to continue to pursue robotics and machine learning.

    Participating faculty at the host institutions welcomed their MIT interns, and were impressed by their capabilities. Both Rosman and Maswime described their MIT interns as hard-working and valued team members, who had helped to advance their own work.  

    Building strong global partnerships, whether through faculty research, student internships, or other initiatives, takes time and cultivation, explains Jacobovits. Each successful collaboration helps to seed future exchanges and builds interest at MIT and peer institutions in creative partnerships. As MIT continues to deepen its connections to institutions and researchers across Africa, says Jacobovits, “students like Shaida, Rajan, Mel, and Antonio are really effective ambassadors in building those networks.”

  • Ian Hutchinson: A lifetime probing plasma, on Earth and in space

    Ordinary folks gazing at the night sky can readily spot Earth’s close neighbors and the light of distant stars. But when Ian Hutchinson scans the cosmos, he takes in a great deal more. There is, for instance, the constant rush of plasma — highly charged ionized gases — from the sun. As this plasma flows by solid bodies such as the moon, it interacts with them electromagnetically, sometimes generating a phenomenon called an electron hole — a perturbation in the gaseous solar tide that forms a solitary, long-lived wave. Hutchinson, a professor in the MIT Department of Nuclear Science and Engineering (NSE), knows they exist because he found a way to measure them.

    “When I look up at the moon with my sweetheart, my wife of 48 years, I imagine that streaming from its dark side are electron holes that my students and I predicted, and that we then discovered,” he says. “It’s quite sentimental to me.”

    Hutchinson’s studies of these wave phenomena, summed up in a paper, “Electron holes in phase space: What they are and why they matter,” recently earned the 2022 Ronald C. Davidson Award for Plasma Physics presented by the American Physical Society’s Division of Plasma Physics.

    Measuring perturbations in plasma

    Hutchinson’s exploration of electron holes was sparked by his work over many decades in fusion energy, another branch of plasma physics. He has made many contributions to the design, operation, and experimental investigation of tokamaks — toroidal magnetic confinement devices — intended to replicate and harness the fiery thermonuclear reactions in the plasma of stars for carbon-free energy on Earth. Hutchinson took a particular interest in how to measure the plasma, notably the flow at the edges of tokamaks.

    Heat generated from fusion reactions may escape magnetic confinement and build up along these edges, leading to potential temperature spikes that impact the performance of the confinement device. Hutchinson discovered how to interpret signals from small probes to measure and track plasma velocity at the tokamak’s edge.

    “My theoretical work also showed that these probes quite likely induce electron holes,” he says. But proving this contention required experiments at resolutions in time and space beyond what tokamaks allow. That’s when Hutchinson had an important insight.

    “I realized that the phenomena we were trying to investigate can actually be measured with exquisite accuracy by satellites that travel through plasma surrounding Earth and other solid bodies,” he says. Although plasmas in space are at a much larger scale than the plasmas generated in the laboratory, measuring these gases with a satellite is analogous “to a situation where we fly a tiny micron-sized spacecraft through the wakes of probes at the edge of tokamaks,” says Hutchinson.

    Using satellite data provided by NASA, Hutchinson set about analyzing solar plasma as it whips by the moon. “We predicted instabilities and the generation of electron holes,” he recounts. “Our theory passed with flying colors: We saw lots of holes in the wake of the moon, and few elsewhere.”

    Developing tokamaks

    Hutchinson grew up in the English midlands and attended Cambridge University, where he became “intrigued by plasma physics in a course taught by an entertaining and effective teacher,” he says.

    Hutchinson headed to the Australian National University for doctoral studies on a fellowship. The experience afforded him his first opportunity for research on plasma confinement. “There I was at the ends of the Earth, and I was one of very few scientists worldwide with a tokamak almost to myself,” he says. “It was a device that had risen to the top of everyone’s agenda in fusion research as something we really needed to understand.”

    His dissertation, which examined instabilities in plasma, and his hands-on experience with the device, brought him to the attention of Ronald Parker SM ’63, PhD ’67, now emeritus professor of nuclear science and engineering and electrical engineering and computer science, who was building MIT’s Alcator tokamak program.

    In 1976, Hutchinson joined this group, spending three years as a research scientist. After an interval in Britain, he returned to MIT with a faculty position in NSE, and soon, a leadership role in developing the next phase of the Institute’s fusion experiment, the Alcator-C Mod tokamak.

    “This was a major development of the high-magnetic field approach to fusion,” says Hutchinson. Powerful magnets are essential for containing the superhot plasma; the MIT group developed an experiment with a magnetic field more than 150,000 times the strength of the Earth’s magnetic field. “We were in the business of determining whether tokamaks had sufficiently good confinement to function as fusion reactors,” he says.

    Hutchinson oversaw the nearly six-year construction of the device, which was funded by the U.S. Department of Energy. He then led its operation starting in 1993, creating a national facility for experiments that drew scientists and students from around the world. At the time, it was the largest research group on the MIT campus.

    In their studies, scientists employed novel heating and sustainment techniques using radio waves and microwaves. They also discovered new methods for performing diagnostics inside the tokamak. “Alcator C-Mod demonstrated excellent confinement in a more compact and cost-effective device,” says Hutchinson. “It was unique in the world.”

    Hutchinson is proud of Alcator C-Mod’s technological achievements, including its record for highest plasma pressure for a magnetic confinement device. But this large-scale project holds even greater significance for him. “Alcator C-Mod helped beat a new path in fusion research, and has become the basis for the SPARC tokamak now under construction,” he says.

    SPARC is a compact, high-magnetic-field fusion energy device under development through a collaboration between MIT’s Plasma Science and Fusion Center and the startup Commonwealth Fusion Systems. Its goal is to demonstrate net energy gain from fusion, prove the viability of fusion as a source of carbon-free energy, and tip the scales in the race against climate change. A number of SPARC’s leaders are former students of Hutchinson’s. “This is a source of considerable satisfaction,” he says. “Some of their down-to-earth realism comes from me, and perhaps some of their aspirations have been molded by their work with me.”

    A new phase

    After leading Alcator C-Mod for 15 years and generating hundreds of journal articles, Hutchinson served as NSE’s department head from 2003 to 2009. He wrote the standard textbook on measuring plasmas, and has more recently written “A Student’s Guide to Numerical Methods” (2015), which evolved from a course he taught to introduce graduate students to computational problem-solving in physics and engineering.

    After this, his 40th year on the MIT faculty, Hutchinson will be stepping back from teaching. “It’s important for new generations of students to be taught by people at the pinnacle of their mental and intellectual capacity, and when you reach my age, you’re aware of the fact that you’re slowing down,” he says.

    Hutchinson’s at no loss for ways to spend his time. As a devout Christian, he speaks and writes about the relationship between religion and science, trying to help skeptics on both sides find common ground. He sings in two choral groups, and is very busy grandparenting four grandsons. For a complete change of pace, Hutchinson goes fly fishing.

    But he still has plans to explore new frontiers in plasma physics. “I’m gratified to say I still do important research,” he says. “I’ve solved most of the problems in electron holes, and now I need to say something about ion holes!”

  • Sustainable supply chains put the customer first

    When we consider the supply chain, we typically think of factories, ships, trucks, and warehouses. Yet, the customer side is equally important, especially in efforts to make our distribution networks more sustainable. Customers are an untapped resource in building sustainability, says Josué C. Velázquez Martínez, a research scientist at the MIT Center for Transportation and Logistics.

    Velázquez Martínez, who is director of MIT’s Sustainable Supply Chain Lab, investigates how customer-facing supply chains can be made more environmentally and socially sustainable. One effort is the Green Button project, which explores how to optimize e-commerce delivery schedules to reduce carbon emissions and persuade customers to choose less carbon-intensive four- or five-day shipping options instead of one- or two-day delivery. Velázquez Martínez has also launched the MIT Low Income Firms Transformation (LIFT) Lab, which is researching ways to improve micro-retailer supply chains in the developing world and provide owners with the tools they need to survive.

    “The definition of sustainable supply chain keeps evolving because things that were sustainable 20 to 30 years ago are not as sustainable now,” says Velázquez Martínez. “Today, there are more companies that are capturing information to build strategies for environmental, economic, and social sustainability. They are investing in alternative energy and other solutions to make the supply chain more environmentally friendly and are tracking their suppliers and identifying key vulnerabilities. A big part of this is an attempt to create fairer conditions for people who work in supply chains or are dependent on them.”

    The move toward sustainable supply chains is being driven as much by people as by companies, whether those people are acting as selective consumers or as voting citizens. The consumer aspect is often overlooked, says Velázquez Martínez. “Consumers are the ones who move the supply chain. We are looking at how companies can provide transparency to involve customers in their sustainability strategy.”

    Proposed solutions for sustainability are not always as effective as promised. Some fashion rental schemes fall into this category, says Velázquez Martínez. “There are many new rental companies that are trying to get more use out of clothes to offset the emissions associated with production. We recently researched the environmental impact of monthly subscription models where consumers pay a fee to receive clothes for a month before returning them, as well as peer-to-peer sharing models.” 

    The researchers found that while rental services generally have a lower carbon footprint than retail sales, hidden emissions from logistics played a surprisingly large role. “First, you need to deliver the clothes and pick them up, and there are high return rates,” says Velázquez Martínez. “When you factor in dry cleaning and packaging emissions, the rental models in some cases have a worse carbon footprint than buying new clothes.” Peer-to-peer sharing could be better, he adds, but that depends on how far the consumers travel to meet-up points. 

    Typically, says Velázquez Martínez, garment types that are frequently used are not well suited to rental models. “But for specialty clothes such as wedding dresses or prom dresses, it is better to rent.” 

    Waiting a few days to save the planet 

    Even before the pandemic, online retailing gained a second wind due to low-cost same- and next-day delivery options. While e-commerce may have its drawbacks as a contributor to social isolation and reduced competition, it has proven itself to be far more eco-friendly than brick-and-mortar shopping, not to mention a lot more convenient. Yet rapid deliveries are cutting into online-shopping’s carbon-cutting advantage.

    In 2019, MIT’s Sustainable Supply Chain Lab launched the Green Button project to study the rapid delivery phenomenon. The project has been “testing whether consumers would be willing to delay their e-commerce deliveries to reduce the environmental impact of fast shipping,” says Velázquez Martínez. “Many companies such as Walmart and Target have followed Amazon’s 2019 strategy of moving from two-day to same-day delivery. Instead of sending a fully loaded truck to a neighborhood every few days, they now send multiple trucks to that neighborhood every day, and there are more days when trucks are targeting each neighborhood. All this increases carbon emissions and makes it hard for shippers to consolidate.”

    Working with Coppel, one of Mexico’s largest retailers, the Green Button project inspired a related Consolidation Ecommerce Project that built a large-scale mathematical model to provide a strategy for consolidation. The model determined what delivery time window each neighborhood demands and then calculated the best day to deliver to each neighborhood to meet the desired window while minimizing carbon emissions. 

    No matter what mixture of delivery times was used, the consolidation model helped retailers schedule deliveries more efficiently. Yet, the biggest cuts in emissions emerged when customers were willing to wait several days.
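
    The intuition is easy to reproduce with a toy schedule: if a neighborhood's orders can wait a few days, one truck visit can clear several days of accumulated demand (a simplified sketch with made-up order volumes, not the Coppel model):

    ```python
    def truck_trips(order_days, window_days):
        """Greedy schedule: dispatch a truck only when the oldest pending order
        reaches the end of its promised delivery window."""
        trips = 0
        oldest_pending = None
        for day in order_days:
            if oldest_pending is None:
                oldest_pending = day
            elif day - oldest_pending >= window_days:
                trips += 1                 # truck goes out and clears the backlog
                oldest_pending = day
        return trips + (1 if oldest_pending is not None else 0)

    orders = list(range(30))               # one order per day for a month
    print(truck_trips(orders, 1))          # next-day promise   -> about 30 trips
    print(truck_trips(orders, 5))          # five-day promise   -> about 6 trips
    ```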

    “When we ran a month-long simulation comparing our model for four-to-five-day delivery with Coppel’s existing model for one- or two-day delivery, we saw savings in fuel consumption of over 50 percent on certain routes,” says Velázquez Martínez. “This is huge compared to other strategies for squeezing more efficiency from the last-mile supply chain, such as routing optimization, where savings are close to 5 percent. The optimal solution depends on factors such as the capacity for consolidation, the frequency of delivery, the store capacity, and the impact on inbound operations.”

    The researchers next set out to determine if customers could be persuaded to wait longer for deliveries. Considering that the price differential is low or nonexistent, this was a considerable challenge. Yet, the same-day habit is only a few years old, and some consumers have come to realize they don’t always need rapid deliveries. “Some consumers who order by rapid delivery find they are too busy to open the packages right away,” says Velázquez Martínez.

    Trees beat kilograms of CO2

    The researchers set out to find if consumers would be willing to sacrifice a bit of convenience if they knew they were helping to reduce climate change. The Green Button project tested different public outreach strategies. For one test group, they reported the carbon impact of delivery times in kilograms of carbon dioxide (CO2). Another group received the information expressed in terms of the energy required to recycle a certain amount of garbage. A third group learned about emissions in terms of the number of trees required to trap the carbon. “Explaining the impact in terms of trees led to almost 90 percent willing to wait another day or two,” says Velázquez Martínez. “This is compared to less than 40 percent for the group that received the data in kilograms of CO2.” 
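
    The tree framing is simply a unit conversion applied to the same underlying number; a minimal sketch (the per-tree uptake figure is a commonly cited approximation, not a value from the study):

    ```python
    KG_CO2_PER_TREE_PER_YEAR = 21.0   # assumed annual uptake of one mature tree

    def as_tree_years(kg_co2):
        """Express an emissions figure as the number of tree-years needed to absorb it."""
        return kg_co2 / KG_CO2_PER_TREE_PER_YEAR

    print(round(as_tree_years(42.0), 1))   # 42 kg of CO2 is about 2 tree-years
    ```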

    Another surprise was that there was no difference in response based on income, gender, or age. “Most studies of green consumers suggest they are predominantly high income, female, highly educated, or younger,” says Velázquez Martínez. “However, our results show that the responses were essentially the same for low and high income, women and men, and younger and older people. We have shown that disclosing emissions transparently and making the consumer a part of the strategy can be a new opportunity for more consumer-driven logistics sustainability.”

    The researchers are now developing similar models for business-to-business (B2B) e-commerce. “We found that B2B supply chain emissions are often high because many shipping companies require strict delivery windows,” says Velázquez Martínez.  

    The B2B models drill down to examine the Corporate Value Chain (Scope 3) emissions of suppliers. “Although some shipping companies are now asking their suppliers to review emissions, it is a challenge to create a transparent supply chain,” says Velázquez Martínez.  “Technological innovations have made it easier, starting with RFID [radio frequency identification], and then real-time GPS mapping and blockchain. But these technologies need to be more accessible and affordable, and we need more companies willing to use them.” 

    Some companies have been hesitant to dig too deeply into their supply chains, fearing they might uncover a scandal that could damage their reputation, says Velázquez Martínez. Other organizations are forced to look at the issue when nongovernmental organizations investigate problems such as social injustice in sweatshops and conflict mineral mines.

    One challenge to building a transparent supply chain is that “in many companies, the sustainability teams are separate from the rest of the company,” says Velázquez Martínez. “Even if the CEOs receive information on sustainability issues, it often doesn’t filter down because the information does not belong to the planners or managers. We are pushing companies to not only account for sustainability factors in supply chain network design but also examine daily operations that affect sustainability. This is a big topic now: How can we translate sustainability information into something that everybody can understand and use?” 

    LIFT Lab lifts micro-retailers  

    In 2016, Velázquez Martínez launched the MIT GeneSys project to gain insights into micro and small enterprises (MSEs) in developing countries. The project released a GeneSys mobile app, which was used by more than 500 students throughout Latin America to collect data on more than 800 microfirms. In 2022, he launched the LIFT Lab, which focuses more specifically on studying and improving the supply chain for MSEs.  

    Worldwide, some 90 percent of companies have fewer than 10 employees. In Latin America and the Caribbean, companies with fewer than 50 employees represent 99 percent of all companies and 47 percent of employment. 

    Although MSEs represent much of the world’s economy, they are poorly understood, notes Velázquez Martínez. “Those tiny businesses are driving a lot of the economy and serve as important customers for the large companies working in developing countries. They range from small businesses down to people trying to get some money to eat by selling cakes or tacos through their windows.”  

    The MIT LIFT Lab researchers investigated whether MSE supply chain issues could help shed light on why many Latin American countries have been limited to marginal increases in gross domestic product. “Large companies from the developed world that are operating in Latin America, such as Unilever, Walmart, and Coca-Cola, have huge growth there, in some cases higher than they have in the developed world,” says Velázquez Martínez. “Yet, the countries are not developing as fast as we would expect.” 

    The LIFT Lab data showed that while the multinationals are thriving in Latin America, the local MSEs are decreasing in productivity. The study also found the trend has worsened with Covid-19.  

    The LIFT Lab’s first big project, which is sponsored by Mexican beverage and retail company FEMSA, is studying supply chains in Mexico. The study spans 200,000 micro-retailers and 300,000 consumers. In a collaboration with Tecnológico de Monterrey, hundreds of students are helping with a field study.  

    “We are looking at supply chain management and business capabilities and identifying the challenges to adoption of technology and digitalization,” says Velázquez Martínez. “We want to find the best ways for micro-firms to work with suppliers and consumers by identifying the consumers who access this market, as well as the products and services that can best help the micro-firms drive growth.” 

    Based on the earlier research by GeneSys, Velázquez Martínez has developed some hypotheses for potential improvements for micro-retailer supply chain, starting with payment terms. “We found that the micro-firms often get the worst purchasing deals. Owners without credit cards and with limited cash often buy in smaller amounts at much higher prices than retailers like Walmart. The big suppliers are squeezing them.” 

    While large retailers usually get 60 to 120 days to pay, micro-retailers “either pay at the moment of the transaction or in advance,” says Velázquez Martínez. “In a study of 500 micro-retailers in five countries in Latin America, we found the average payment time was minus seven days, meaning payment a week in advance. These terms reduce cash availability and often lead to bankruptcy.”

    LIFT Lab is working with suppliers to persuade them to offer a minimum payment time of two weeks. “We can show the suppliers that the change in terms will let them move more product and increase sales,” says Velázquez Martínez. “Meanwhile, the micro-retailers gain higher profits and become more stable, even if they may pay a bit more.” 

    LIFT Lab is also looking at ways that micro-retailers can leverage smartphones for digitalization and planning. “Some of these companies are keeping records on napkins,” says Velázquez Martínez. “By using a cellphone, they can charge orders to suppliers and communicate with consumers. We are testing different dashboards for mobile apps to help with planning and financial performance. We are also recommending services the stores can provide, such as paying electricity or water bills. The idea is to build more capabilities and knowledge and increase business competencies for the supply chain that are tailored for micro-retailers.” 

    From a financial perspective, micro-retailers are not always the most efficient way to move products. Yet they also play an important role in building social cohesion within neighborhoods. By offering more services, the corner bodega can bring people together in ways that are impossible with e-commerce and big-box stores.  

    Whether the consumers are micro-firms buying from suppliers or e-commerce customers waiting for packages, “transparency is key to building a sustainable supply chain,” says Velázquez Martínez. “To change consumer habits, consumers need to be better educated on the impacts of their behaviors. With consumer-facing logistics, ‘The last shall be first, and the first last.’”

  • Strengthening electron-triggered light emission

    The way electrons interact with photons of light is a key part of many modern technologies, from lasers to solar panels to LEDs. But the interaction is inherently a weak one because of a major mismatch in scale: A wavelength of visible light is about 1,000 times larger than an electron, so the way the two things affect each other is limited by that disparity.

    Now, researchers at MIT and elsewhere have come up with an innovative way to make much stronger interactions between photons and electrons possible, in the process producing a hundredfold increase in the emission of light from a phenomenon called Smith-Purcell radiation. The finding has potential implications for both commercial applications and fundamental scientific research, although it will require more years of research to make it practical.

    The findings are reported today in the journal Nature, in a paper by MIT postdocs Yi Yang (now an assistant professor at the University of Hong Kong) and Charles Roques-Carmes, MIT professors Marin Soljačić and John Joannopoulos, and five others at MIT, Harvard University, and Technion-Israel Institute of Technology.

    In a combination of computer simulations and laboratory experiments, the team found that by using a beam of electrons in combination with a specially designed photonic crystal — a slab of silicon on an insulator, etched with an array of nanometer-scale holes — they could theoretically predict emission many orders of magnitude stronger than would ordinarily be possible in conventional Smith-Purcell radiation. They also experimentally recorded a hundredfold increase in radiation in their proof-of-concept measurements.

    Unlike other approaches to producing sources of light or other electromagnetic radiation, the free-electron-based method is fully tunable — it can produce emissions of any desired wavelength, simply by adjusting the size of the photonic structure and the speed of the electrons. This may make it especially valuable for making sources of emission at wavelengths that are difficult to produce efficiently, including terahertz waves, ultraviolet light, and X-rays.
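
    That tunability follows from the textbook Smith-Purcell relation, in which the emitted wavelength depends only on the period of the structure, the electron speed, and the observation angle (the standard formula, not one specific to the new device):

    ```latex
    % Smith-Purcell relation: wavelength emitted at order n for a structure of
    % period l, electron speed v = beta * c, and observation angle theta.
    \lambda_{n} = \frac{l}{n}\left(\frac{1}{\beta} - \cos\theta\right),
    \qquad \beta = \frac{v}{c}
    ```

    Shrinking the period or speeding up the electrons (larger \(\beta\)) pushes the emission toward shorter wavelengths, which is why the same mechanism can in principle reach the terahertz, ultraviolet, and X-ray regimes mentioned above.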

    The team has so far demonstrated the hundredfold enhancement in emission using a repurposed electron microscope to function as an electron beam source. But they say that the basic principle involved could potentially enable far greater enhancements using devices specifically adapted for this function.

    The approach is based on a concept called flatbands, which have been widely explored in recent years for condensed matter physics and photonics but have never been applied to affecting the basic interaction of photons and free electrons. The underlying principle involves the transfer of momentum from the electron to a group of photons, or vice versa. Whereas conventional light-electron interactions rely on producing light at a single angle, the photonic crystal is tuned in such a way that it enables the production of a whole range of angles.

    The same process could also be used in the opposite direction, using resonant light waves to propel electrons, increasing their velocity in a way that could potentially be harnessed to build miniaturized particle accelerators on a chip. These might ultimately be able to perform some functions that currently require giant underground tunnels, such as the 27-kilometer ring of the Large Hadron Collider in Switzerland.

    “If you could actually build electron accelerators on a chip,” Soljačić says, “you could make much more compact accelerators for some of the applications of interest, which would still produce very energetic electrons. That obviously would be huge. For many applications, you wouldn’t have to build these huge facilities.”

    The new system could also potentially provide a highly controllable X-ray beam for radiotherapy purposes, Roques-Carmes says.

    And the system could be used to generate multiple entangled photons, a quantum effect that could be useful in the creation of quantum-based computational and communications systems, the researchers say. “You can use electrons to couple many photons together, which is a considerably harder problem using a purely optical approach,” says Yang. “That is one of the most exciting future directions of our work.”

    Much work remains to translate these new findings into practical devices, Soljačić cautions. It may take some years to develop the necessary interfaces between the optical and electronic components, to work out how to connect them on a single chip, and to develop an on-chip electron source that produces a continuous wavefront, among other challenges.

    “The reason this is exciting,” Roques-Carmes adds, “is because this is quite a different type of source.” Most technologies for generating light are restricted to very specific ranges of color or wavelength, and “it’s usually difficult to move that emission frequency. Here it’s completely tunable. Simply by changing the velocity of the electrons, you can change the emission frequency. … That excites us about the potential of these sources. Because they’re different, they offer new types of opportunities.”

    But, Soljačić concludes, “in order for them to become truly competitive with other types of sources, I think it will require some more years of research. I would say that with some serious effort, in two to five years they might start competing in at least some areas of radiation.”

    The research team also included Steven Kooi at MIT’s Institute for Soldier Nanotechnologies, Haoning Tang and Eric Mazur at Harvard University, Justin Beroz at MIT, and Ido Kaminer at Technion-Israel Institute of Technology. The work was supported by the U.S. Army Research Office through the Institute for Soldier Nanotechnologies, the U.S. Air Force Office of Scientific Research, and the U.S. Office of Naval Research.