More stories

  • Machinery of the state

    In Mai Hassan’s studies of Kenya, she documented the emergence of a sprawling administrative network officially billed as encouraging economic development, overseeing the population, and bolstering democracy. But Hassan’s field interviews and archival research revealed a more sinister purpose for the hundreds of administrative and security offices dotting the nation: “They were there to do the presidents’ bidding, which often involved coercing their own countrymen.”

    This research served as a catalyst for Hassan, who joined MIT as an associate professor of political science in July, to investigate what she calls the “politicized management of bureaucracy and the state.” She set out to “understand the motivations, capacities, and roles of people administering state programs and social functions,” she says. “I realized the state is not a faceless being, but instead comprised of bureaucrats carrying out functions on behalf of the state and the regime that runs it.”

    Today, Hassan’s portfolio encompasses not just the bureaucratic state but democratization efforts in Kenya and elsewhere in East Africa, including her native Sudan. Her research highlights the difficulties of democratization. “I’m finding that the conditions under which people come together for overthrowing an autocratic regime really matter, because those conditions may actually impede a nation from achieving democracy,” she says.

    A coordinated bureaucracy

    Hassan’s academic engagement with the state’s administrative machinery began during graduate school at Harvard University, where she earned her master’s and doctorate in government. While working with a trash and sanitation program in Kenyan Maasai communities, Hassan recalls “shepherding myself from office to office, meeting different bureaucrats to obtain the same approvals but for different jurisdictions.” The Kenyan state had recently set up hundreds of new local administrative units, motivated by what it claimed was the need for greater efficiency. But to Hassan’s eyes, “the administrative network was not well organized, seemed costly to maintain, and seemed to hinder — not bolster — development,” she says. What then, she wondered, was “the political logic behind such state restructuring?”

    Hassan began researching this bureaucratic transformation of Kenya, speaking with administrators in communities large and small who were charged with handling the business of the state. These studies yielded a wealth of findings for her dissertation and for articles in multiple journals.

    But upon finishing this tranche of research, Hassan realized that it was insufficient simply to study the structure of the state. “Understanding the role of new administrative structures for politics, development, and governance fundamentally requires that we understand who the government has put in charge of them,” she says. Among her insights:

    “The president’s office knows a lot of these administrators, and thinks about their strengths, limitations, and fit within a community,” says Hassan. Some administrators served the purposes of the central government by setting up water irrigation projects or building a new school. But in other villages, the state chose administrators who could act “much more coercively, ignoring development needs, throwing youth who supported the opposition into jail, and spending resources exclusively on policing.”

    Hassan’s work showed that in communities characterized by strong political opposition, “the local administration was always more coercive, regardless of an elected or autocratic president,” she says. Notably, the tenures of such officials proved shorter than those of their peers. “Once administrators get to know a community — going to church and the market with residents — it’s hard to coerce them,” explains Hassan.

    These short tenures come with costs, she notes: “Spending significant time in a station is useful for development, because you know exactly whom to hire if you want to build a school or get something done efficiently.” Politicizing these assignments undermines efforts at delivery of services and, more broadly, economic improvement nationwide. “Regimes that are more invested in retaining power must devote resources to establishing and maintaining control, resources that could otherwise be used for development and the welfare of citizens,” she says.

    Hassan wove together her research, covering three presidents over a 50-year period, in the book “Regime Threats and State Solutions: Bureaucratic Loyalty and Embeddedness in Kenya” (Cambridge University Press, 2020), which was named a Foreign Affairs Best Book of 2020.

    Sudanese roots

    The role of the state in fulfilling the needs of its citizens has long fascinated Hassan. Her grandfather, who had served as Sudan’s ambassador to the USSR, talked to her about the advantages of a centralized government “that allocated resources to reduce inequality,” she says.

    Politics often dominated the conversation in gatherings of Hassan’s family and friends. Her parents immigrated to Northern Virginia when she was very young, and many relatives joined them, part of a steady flow of Sudanese fleeing political turmoil and oppression.

    “A lot of people had expected more from the Sudanese state after independence and didn’t get it,” she says. “People had hopes for what the government could and should do.”

    Hassan’s Sudanese roots and ongoing connection to the Sudanese community have shaped her academic interests and goals. At the University of Virginia, she gravitated toward history and economics classes. But it was her time at the Ralph Bunche Summer Institute that perhaps proved most pivotal in her journey. This five-week intensive program is offered by the American Political Science Association to introduce underrepresented undergraduate students to doctoral studies. “It was really compelling in this program to think rigorously about all the political ideas I’d heard as I was growing up, and find ways to challenge some assertions empirically,” she says.

    Regime change and civil society

    At Harvard, Hassan first set out to focus on Sudan for her doctoral program. “There wasn’t much scholarship on the country, and what there was lacked rigor,” she says. “That was something that needed to change.” But she decided to postpone this goal after realizing that she might be vulnerable as a student conducting field research there. She landed instead in Kenya, where she honed her interviewing and data collection skills.

    Today, empowered by her prior work, she has returned to Sudan. “I felt that the popular uprising in Sudan and ousting of the Islamist regime in 2019 should be documented and analyzed,” she says. “It was incredible that hundreds of thousands, if not millions, acted collectively to uproot a dictator, in the face of brutal violence from the state.”

    But “democracy is still uncertain there,” says Hassan. The broad coalition behind regime change “doesn’t know how to govern because different people and different sectors of society have different ideas about what democratic Sudan should look like,” she says. “Overthrowing an autocratic regime and having civil society come together to figure out what’s going to replace it require different things, and it’s unclear if a movement that accomplishes the first is well-suited to do the second.”

    Hassan believes that in order to create lasting democratization, “you need the hard work of building organizations, developing ways in which members learn to compromise among themselves, and make decisions and rules for how to move forward.”

    Hassan is enjoying the fall semester and teaching courses on autocracy and authoritarian regimes. She is excited as well about developing her work on African efforts at democratic mobilization in a political science department she describes as “policy-forward.”

    Over time, she hopes to connect with Institute scholars in the hard sciences to think about other challenges these nations are facing, such as climate change. “It’s really hot in Sudan, and it may be one of the first countries to become completely uninhabitable,” she says. “I’d like to explore strategies for growing crops differently or managing the exceedingly scarce resource of water, and figure out what kind of political discussions will be necessary to implement any changes. It is really critical to think about these problems in an interdisciplinary way.”

  • Engineers solve a mystery on the path to smaller, lighter batteries

    A discovery by MIT researchers could finally unlock the door to the design of a new kind of rechargeable lithium battery that is more lightweight, compact, and safe than current versions, a design that labs around the world have pursued for years.

    The key to this potential leap in battery technology is replacing the liquid electrolyte that sits between the positive and negative electrodes with a much thinner, lighter layer of solid ceramic material, and replacing one of the electrodes with solid lithium metal. This would greatly reduce the overall size and weight of the battery and remove the safety risk associated with liquid electrolytes, which are flammable. But that quest has been beset with one big problem: dendrites.

    Dendrites, whose name comes from the Greek word for tree, are projections of metal that can build up on the lithium surface and penetrate into the solid electrolyte, eventually crossing from one electrode to the other and shorting out the battery cell. Researchers haven’t been able to agree on what gives rise to these metal filaments, nor has there been much progress on how to prevent them and thus make lightweight solid-state batteries a practical option.

    The new research, being published today in the journal Joule in a paper by MIT Professor Yet-Ming Chiang, graduate student Cole Fincher, and five others at MIT and Brown University, seems to resolve the question of what causes dendrite formation. It also shows how dendrites can be prevented from crossing through the electrolyte.

    Chiang says in the group’s earlier work, they made a “surprising and unexpected” finding, which was that the hard, solid electrolyte material used for a solid-state battery can be penetrated by lithium, which is a very soft metal, during the process of charging and discharging the battery, as ions of lithium move between the two sides.

    This shuttling back and forth of ions causes the volume of the electrodes to change. That inevitably causes stresses in the solid electrolyte, which has to remain fully in contact with both of the electrodes that it is sandwiched between. “To deposit this metal, there has to be an expansion of the volume because you’re adding new mass,” Chiang says. “So, there’s an increase in volume on the side of the cell where the lithium is being deposited. And if there are even microscopic flaws present, this will generate a pressure on those flaws that can cause cracking.”

    Those stresses, the team has now shown, cause the cracks that allow dendrites to form. The solution to the problem turns out to be more stress, applied in just the right direction and with the right amount of force.

    While previously, some researchers thought that dendrites formed by a purely electrochemical process, rather than a mechanical one, the team’s experiments demonstrate that it is mechanical stresses that cause the problem.

    The process of dendrite formation normally takes place deep within the opaque materials of the battery cell and cannot be observed directly, so Fincher developed a way of making thin cells using a transparent electrolyte, allowing the whole process to be directly seen and recorded. “You can see what happens when you put a compression on the system, and you can see whether or not the dendrites behave in a way that’s commensurate with a corrosion process or a fracture process,” he says.

    The team demonstrated that they could directly manipulate the growth of dendrites simply by applying and releasing pressure, causing the dendrites to zig and zag in perfect alignment with the direction of the force.

    Applying mechanical stresses to the solid electrolyte doesn’t eliminate the formation of dendrites, but it does control the direction of their growth. This means they can be directed to remain parallel to the two electrodes and prevented from ever crossing to the other side, and thus rendered harmless.

    In their tests, the researchers used pressure induced by bending the material, which was formed into a beam with a weight at one end. But they say that in practice, there could be many different ways of producing the needed stress. For example, the electrolyte could be made with two layers of material that have different amounts of thermal expansion, so that there is an inherent bending of the material, as is done in some thermostats.

    Another approach would be to “dope” the material with atoms that would become embedded in it, distorting it and leaving it in a permanently stressed state. This is the same method used to produce the super-hard glass used in the screens of smartphones and tablets, Chiang explains. And the amount of pressure needed is not extreme: The experiments showed that pressures of 150 to 200 megapascals were sufficient to stop the dendrites from crossing the electrolyte.

    The required pressure is “commensurate with stresses that are commonly induced in commercial film growth processes and many other manufacturing processes,” so should not be difficult to implement in practice, Fincher adds.

    In fact, a different kind of stress, called stack pressure, is often applied to battery cells, by essentially squishing the material in the direction perpendicular to the battery’s plates — somewhat like compressing a sandwich by putting a weight on top of it. It was thought that this might help prevent the layers from separating. But the experiments have now demonstrated that pressure in that direction actually exacerbates dendrite formation. “We showed that this type of stack pressure actually accelerates dendrite-induced failure,” Fincher says.

    What is needed instead is pressure along the plane of the plates, as if the sandwich were being squeezed from the sides. “What we have shown in this work is that when you apply a compressive force you can force the dendrites to travel in the direction of the compression,” Fincher says, and if that direction is along the plane of the plates, the dendrites “will never get to the other side.”

    That could finally make it practical to produce batteries using a solid electrolyte and metallic lithium electrodes. Not only would these pack more energy into a given volume and weight, but they would eliminate the need for liquid electrolytes, which are flammable materials.

    Having demonstrated the basic principles involved, the team’s next step will be to try to apply these to the creation of a functional prototype battery, Chiang says, and then to figure out exactly what manufacturing processes would be needed to produce such batteries in quantity. Though they have filed for a patent, the researchers don’t plan to commercialize the system themselves, he says, as there are already companies working on the development of solid-state batteries. “I would say this is an understanding of failure modes in solid-state batteries that we believe the industry needs to be aware of and try to use in designing better products,” he says.

    The research team included Christos Athanasiou and Brian Sheldon at Brown University, and Colin Gilgenbach, Michael Wang, and W. Craig Carter at MIT. The work was supported by the U.S. National Science Foundation, the U.S. Department of Defense, the U.S. Defense Advanced Research Projects Agency, and the U.S. Department of Energy.

  • Earth can regulate its own temperature over millennia, new study finds

    The Earth’s climate has undergone some big changes, from global volcanism to planet-cooling ice ages and dramatic shifts in solar radiation. And yet life, for the last 3.7 billion years, has kept on beating.

    Now, a study by MIT researchers in Science Advances confirms that the planet harbors a “stabilizing feedback” mechanism that acts over hundreds of thousands of years to pull the climate back from the brink, keeping global temperatures within a steady, habitable range.

    Just how does it accomplish this? A likely mechanism is “silicate weathering” — a geological process by which the slow and steady weathering of silicate rocks involves chemical reactions that ultimately draw carbon dioxide out of the atmosphere and into ocean sediments, trapping the gas in rocks.

    Scientists have long suspected that silicate weathering plays a major role in regulating the Earth’s carbon cycle. The mechanism of silicate weathering could provide a geologically constant force in keeping carbon dioxide — and global temperatures — in check. But there’s never been direct evidence for the continual operation of such a feedback, until now.

    The new findings are based on a study of paleoclimate data that record changes in average global temperatures over the last 66 million years. The MIT team applied a mathematical analysis to see whether the data revealed any patterns characteristic of stabilizing phenomena that reined in global temperatures on a geologic timescale.

    They found that indeed there appears to be a consistent pattern in which the Earth’s temperature swings are dampened over timescales of hundreds of thousands of years. The duration of this effect is similar to the timescales over which silicate weathering is predicted to act.

    The results are the first to use actual data to confirm the existence of a stabilizing feedback, the mechanism of which is likely silicate weathering. This stabilizing feedback would explain how the Earth has remained habitable through dramatic climate events in the geologic past.

    “On the one hand, it’s good because we know that today’s global warming will eventually be canceled out through this stabilizing feedback,” says Constantin Arnscheidt, a graduate student in MIT’s Department of Earth, Atmospheric and Planetary Sciences (EAPS). “But on the other hand, it will take hundreds of thousands of years to happen, so not fast enough to solve our present-day issues.”

    The study is co-authored by Arnscheidt and Daniel Rothman, professor of geophysics at MIT.

    Stability in data

    Scientists have previously seen hints of a climate-stabilizing effect in the Earth’s carbon cycle: Chemical analyses of ancient rocks have shown that the flux of carbon in and out of Earth’s surface environment has remained relatively balanced, even through dramatic swings in global temperature. Furthermore, models of silicate weathering predict that the process should have some stabilizing effect on the global climate. And finally, the fact of the Earth’s enduring habitability points to some inherent, geologic check on extreme temperature swings.

    “You have a planet whose climate was subjected to so many dramatic external changes. Why did life survive all this time? One argument is that we need some sort of stabilizing mechanism to keep temperatures suitable for life,” Arnscheidt says. “But it’s never been demonstrated from data that such a mechanism has consistently controlled Earth’s climate.”

    Arnscheidt and Rothman sought to confirm whether a stabilizing feedback has indeed been at work, by looking at data of global temperature fluctuations through geologic history. They worked with a range of global temperature records compiled by other scientists, from the chemical composition of ancient marine fossils and shells, as well as preserved Antarctic ice cores.

    “This whole study is only possible because there have been great advances in improving the resolution of these deep-sea temperature records,” Arnscheidt notes. “Now we have data going back 66 million years, with data points at most thousands of years apart.”

    Speeding to a stop

    To the data, the team applied the mathematical theory of stochastic differential equations, which is commonly used to reveal patterns in widely fluctuating datasets.

    “We realized this theory makes predictions for what you would expect Earth’s temperature history to look like if there had been feedbacks acting on certain timescales,” Arnscheidt explains.

    Using this approach, the team analyzed the history of average global temperatures over the last 66 million years, considering the entire period over different timescales, such as tens of thousands of years versus hundreds of thousands, to see whether any patterns of stabilizing feedback emerged within each timescale.

    “To some extent, it’s like your car is speeding down the street, and when you put on the brakes, you slide for a long time before you stop,” Rothman says. “There’s a timescale over which frictional resistance, or a stabilizing feedback, kicks in, when the system returns to a steady state.”

    Without stabilizing feedbacks, fluctuations of global temperature should grow with timescale. But the team’s analysis revealed a regime in which fluctuations did not grow, implying that a stabilizing mechanism reined in the climate before fluctuations grew too extreme. The timescale for this stabilizing effect — hundreds of thousands of years — coincides with what scientists predict for silicate weathering.
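    The signature the analysis hunts for can be illustrated with a toy model. The sketch below is an illustration, not the authors’ actual method: a driftless random walk stands in for a climate with no feedback, and an Ornstein-Uhlenbeck process (the same noise plus a restoring term) for one with a stabilizing feedback. Without feedback, temperature fluctuations keep growing with timescale; with feedback, they stop growing once the feedback’s timescale (here 1/restoring = 100 steps) is passed.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(n_steps, restoring=0.0, noise=1.0, dt=1.0):
    """Simulate a temperature anomaly T with dT = -restoring * T * dt + noise * dW.

    restoring=0 gives a random walk (no feedback); restoring>0 gives an
    Ornstein-Uhlenbeck process (a stabilizing feedback with timescale
    1/restoring).
    """
    T = np.zeros(n_steps)
    kicks = noise * np.sqrt(dt) * rng.standard_normal(n_steps)
    for i in range(1, n_steps):
        T[i] = T[i - 1] - restoring * T[i - 1] * dt + kicks[i]
    return T

def fluctuation(T, lag):
    """Root-mean-square temperature change across a given timescale (lag)."""
    diffs = T[lag:] - T[:-lag]
    return float(np.sqrt(np.mean(diffs ** 2)))

no_feedback = simulate(100_000)               # random walk
feedback = simulate(100_000, restoring=0.01)  # stabilizing feedback

for lag in (100, 1_000, 10_000):
    print(lag, fluctuation(no_feedback, lag), fluctuation(feedback, lag))
```

    On this toy data, the no-feedback fluctuations grow roughly as the square root of the timescale, while the feedback fluctuations level off near a constant. The MIT analysis looks for exactly such a leveling-off regime in the 66-million-year temperature record.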

    Interestingly, Arnscheidt and Rothman found that on longer timescales, the data did not reveal any stabilizing feedbacks. That is, there doesn’t appear to be any recurring pull-back of global temperatures on timescales longer than a million years. Over these longer timescales, then, what has kept global temperatures in check?

    “There’s an idea that chance may have played a major role in determining why, after more than 3 billion years, life still exists,” Rothman offers.

    In other words, as the Earth’s temperature fluctuates over longer stretches, those fluctuations may simply happen to remain small enough, in the geologic sense, for a stabilizing feedback such as silicate weathering to periodically pull the climate back in check and, more to the point, keep it within a habitable range.

    “There are two camps: Some say random chance is a good enough explanation, and others say there must be a stabilizing feedback,” Arnscheidt says. “We’re able to show, directly from data, that the answer is probably somewhere in between. In other words, there was some stabilization, but pure luck likely also played a role in keeping Earth continuously habitable.”

    This research was supported, in part, by a MathWorks fellowship and the National Science Foundation.

  • Keeping indoor humidity levels at a “sweet spot” may reduce spread of Covid-19

    We know proper indoor ventilation is key to reducing the spread of Covid-19. Now, a study by MIT researchers finds that indoor relative humidity may also influence transmission of the virus.

    Relative humidity is the amount of moisture in the air compared to the total moisture the air can hold at a given temperature before saturating and forming condensation.

    In a study appearing today in the Journal of the Royal Society Interface, the MIT team reports that maintaining an indoor relative humidity between 40 and 60 percent is associated with relatively lower rates of Covid-19 infections and deaths, while indoor conditions outside this range are associated with worse Covid-19 outcomes. To put this into perspective, most people are comfortable between 30 and 50 percent relative humidity, and an airplane cabin is at around 20 percent relative humidity.

    The findings are based on the team’s analysis of Covid-19 data combined with meteorological measurements from 121 countries, from January 2020 through August 2020. Their study suggests a strong connection between regional outbreaks and indoor relative humidity.

    In general, the researchers found that whenever a region experienced a rise in Covid-19 cases and deaths before vaccines were available, the estimated indoor relative humidity in that region, on average, was either lower than 40 percent or higher than 60 percent regardless of season. Nearly all regions in the study experienced fewer Covid-19 cases and deaths during periods when estimated indoor relative humidity was within a “sweet spot” between 40 and 60 percent.

    “There’s potentially a protective effect of this intermediate indoor relative humidity,” suggests lead author Connor Verheyen, a PhD student in medical engineering and medical physics in the Harvard-MIT Program in Health Sciences and Technology.

    “Indoor ventilation is still critical,” says co-author Lydia Bourouiba, director of the MIT Fluid Dynamics of Disease Transmission Laboratory and associate professor in the departments of Civil and Environmental Engineering and Mechanical Engineering, and at the Institute for Medical Engineering and Science at MIT. “However, we find that maintaining an indoor relative humidity in that sweet spot — of 40 to 60 percent — is associated with reduced Covid-19 cases and deaths.”

    Seasonal swing?

    Since the start of the Covid-19 pandemic, scientists have considered the possibility that the virus’ virulence swings with the seasons. Infections and associated deaths appear to rise in winter and ebb in summer. But studies looking to link the virus’ patterns to seasonal outdoor conditions have yielded mixed results.

    Verheyen and Bourouiba examined whether Covid-19 is influenced instead by indoor — rather than outdoor — conditions, and, specifically, relative humidity. After all, they note, people in most societies spend more than 90 percent of their time indoors, where the majority of viral transmission has been shown to occur. What’s more, indoor conditions can be quite different from outdoor conditions as a result of climate control systems, such as heaters that significantly dry out indoor air.

    Could indoor relative humidity have affected the spread and severity of Covid-19 around the world? And could it help explain the differences in health outcomes from region to region?

    Tracking humidity

    For answers, the team focused on the early period of the pandemic when vaccines were not yet available, reasoning that vaccinated populations would obscure the influence of any other factor such as indoor humidity. They gathered global Covid-19 data, including case counts and reported deaths, from January 2020 to August 2020, and identified countries with at least 50 deaths, indicating at least one outbreak had occurred in those countries.

    In all, they focused on 121 countries where Covid-19 outbreaks occurred. For each country, they also tracked the local Covid-19 related policies, such as isolation, quarantine, and testing measures, and their statistical association with Covid-19 outcomes.

    For each day that Covid-19 data was available, they used meteorological data to calculate a country’s outdoor relative humidity. They then estimated the average indoor relative humidity, based on outdoor relative humidity and guidelines on temperature ranges for human comfort. For instance, guidelines report that humans are comfortable between 66 and 77 degrees Fahrenheit indoors. They also assumed that on average, most populations have the means to heat indoor spaces to comfortable temperatures. Finally, they collected experimental data, which they used to validate their estimation approach.

    For every instance when outdoor temperatures were below the typical human comfort range, they assumed indoor spaces were heated to reach that comfort range. Based on the added heating, they calculated the associated drop in indoor relative humidity.
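    The study’s exact estimation procedure isn’t given here, but the basic calculation can be sketched. If outdoor air is heated to a comfortable indoor temperature with no moisture added or removed, its vapor pressure stays fixed while the saturation vapor pressure rises with temperature, so relative humidity drops. A minimal sketch using the standard Magnus approximation (the 20 °C indoor temperature and the example values are illustrative, not the study’s own figures):

```python
import math

def saturation_vapor_pressure(temp_c):
    """Magnus approximation for saturation vapor pressure over water, in hPa."""
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

def indoor_rh(outdoor_temp_c, outdoor_rh_pct, indoor_temp_c=20.0):
    """Estimate indoor relative humidity (percent), assuming outdoor air is
    heated to indoor_temp_c with constant absolute moisture content."""
    vapor_pressure = outdoor_rh_pct / 100.0 * saturation_vapor_pressure(outdoor_temp_c)
    return 100.0 * vapor_pressure / saturation_vapor_pressure(indoor_temp_c)

# Winter air at 0 degrees C and 50 percent outdoor RH, heated to 20 degrees C:
print(indoor_rh(0.0, 50.0))
```

    Heating winter air at 0 °C and 50 percent outdoor relative humidity up to 20 °C drops the indoor relative humidity to roughly 13 percent, well below the 40 to 60 percent sweet spot.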

    In warmer times, outdoor and indoor relative humidity for each country were about the same, but they quickly diverged in colder times. While outdoor humidity remained around 50 percent throughout the year, indoor relative humidity for countries in the Northern and Southern Hemispheres dropped below 40 percent in their respective colder periods, when Covid-19 cases and deaths also spiked in these regions.

    For countries in the tropics, relative humidity was about the same indoors and outdoors throughout the year, with a gradual rise indoors during the region’s summer season, when high outdoor humidity likely raised the indoor relative humidity over 60 percent. They found this rise mirrored the gradual increase in Covid-19 deaths in the tropics.

    “We saw more reported Covid-19 deaths on the low and high end of indoor relative humidity, and less in this sweet spot of 40 to 60 percent,” Verheyen says. “This intermediate relative humidity window is associated with a better outcome, meaning fewer deaths and a deceleration of the pandemic.”

    “We were very skeptical initially, especially as the Covid-19 data can be noisy and inconsistent,” Bourouiba says. “We thus were very thorough trying to poke holes in our own analysis, using a range of approaches to test the limits and robustness of the findings, including taking into account factors such as government intervention. Despite all our best efforts, we found that even when considering countries with very strong versus very weak Covid-19 mitigation policies, or wildly different outdoor conditions, indoor — rather than outdoor — relative humidity maintains an underlying strong and robust link with Covid-19 outcomes.”

    It’s still unclear how indoor relative humidity affects Covid-19 outcomes. The team’s follow-up studies suggest that pathogens may survive longer in respiratory droplets in both very dry and very humid conditions.

    “Our ongoing work shows that there are emerging hints of mechanistic links between these factors,” Bourouiba says. “For now however, we can say that indoor relative humidity emerges in a robust manner as another mitigation lever that organizations and individuals can monitor, adjust, and maintain in the optimal 40 to 60 percent range, in addition to proper ventilation.”

    This research was made possible, in part, by an MIT Alumni Class fund, the Richard and Susan Smith Family Foundation, the National Institutes of Health, and the National Science Foundation.

  • With new heat treatment, 3D-printed metals can withstand extreme conditions

    A new MIT-developed heat treatment transforms the microscopic structure of 3D-printed metals, making the materials stronger and more resilient in extreme thermal environments. The technique could make it possible to 3D print high-performance blades and vanes for power-generating gas turbines and jet engines, which would enable new designs with improved fuel consumption and energy efficiency.

    Today’s gas turbine blades are manufactured through conventional casting processes in which molten metal is poured into complex molds and directionally solidified. These components are made from some of the most heat-resistant metal alloys on Earth, as they are designed to rotate at high speeds in extremely hot gas, extracting work to generate electricity in power plants and thrust in jet engines.

    There is growing interest in manufacturing turbine blades through 3D-printing, which, in addition to its environmental and cost benefits, could allow manufacturers to quickly produce more intricate, energy-efficient blade geometries. But efforts to 3D-print turbine blades have yet to clear a big hurdle: creep.

    In metallurgy, creep refers to a metal’s tendency to permanently deform in the face of persistent mechanical stress and high temperatures. While researchers have explored printing turbine blades, they have found that the printing process produces fine grains on the order of tens to hundreds of microns in size — a microstructure that is especially vulnerable to creep.

    “In practice, this would mean a gas turbine would have a shorter life or less fuel efficiency,” says Zachary Cordero, the Boeing Career Development Professor in Aeronautics and Astronautics at MIT. “These are costly, undesirable outcomes.”

    Cordero and his colleagues found a way to improve the structure of 3D-printed alloys by adding an additional heat-treating step, which transforms the as-printed material’s fine grains into much larger “columnar” grains — a sturdier microstructure that should minimize the material’s creep potential, since the “columns” are aligned with the axis of greatest stress. The researchers say the method, outlined today in Additive Manufacturing, clears the way for industrial 3D-printing of gas turbine blades.

    “In the near future, we envision gas turbine manufacturers will print their blades and vanes at large-scale additive manufacturing plants, then post-process them using our heat treatment,” Cordero says. “3D-printing will enable new cooling architectures that can improve the thermal efficiency of a turbine, so that it produces the same amount of power while burning less fuel and ultimately emits less carbon dioxide.”

    Cordero’s co-authors on the study are lead author Dominic Peachey, Christopher Carter, and Andres Garcia-Jimenez at MIT, Anugrahaprada Mukundan and Marie-Agathe Charpagne of the University of Illinois at Urbana-Champaign, and Donovan Leonard of Oak Ridge National Laboratory.

    Triggering a transformation

    The team’s new method is a form of directional recrystallization — a heat treatment that passes a material through a hot zone at a precisely controlled speed to meld a material’s many microscopic grains into larger, sturdier, and more uniform crystals.

    Directional recrystallization was invented more than 80 years ago and has been applied to wrought materials. In their new study, the MIT team adapted directional recrystallization for 3D-printed superalloys.

    The team tested the method on 3D-printed nickel-based superalloys — metals that are typically cast and used in gas turbines. In a series of experiments, the researchers placed 3D-printed samples of rod-shaped superalloys in a room-temperature water bath placed just below an induction coil. They slowly drew each rod out of the water and through the coil at various speeds, dramatically heating the rods to temperatures varying between 1,200 and 1,245 degrees Celsius.

    They found that drawing the rods at a particular speed (2.5 millimeters per hour) and through a specific temperature (1,235 degrees Celsius) created a steep thermal gradient that triggered a transformation in the material’s printed, fine-grained microstructure.
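    The draw rate and hot-zone temperature quoted above set a simple trade-off between processing time and part size. The sketch below uses the figures from the study (2.5 millimeters per hour, 1,235 degrees Celsius); the part length in the example is a hypothetical value for illustration, not one reported by the researchers.

```python
# Back-of-the-envelope sketch of the directional recrystallization
# schedule described above. The draw rate and hot-zone temperature come
# from the article; the part length below is a hypothetical example.

DRAW_RATE_MM_PER_H = 2.5   # speed the rod is pulled through the induction coil
HOT_ZONE_TEMP_C = 1235     # temperature that triggered recrystallization

def processing_time_hours(part_length_mm: float) -> float:
    """Hours needed to draw a part of the given length through the hot zone."""
    return part_length_mm / DRAW_RATE_MM_PER_H

# Example: a hypothetical 100 mm blade-scale section
print(processing_time_hours(100))  # → 40.0 hours
```

    The slow draw rate is one reason the team is exploring ways to speed up the process, as noted later in the article.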

    “The material starts as small grains with defects called dislocations that are like mangled spaghetti,” Cordero explains. “When you heat this material up, those defects can annihilate and reconfigure, and the grains are able to grow. We’re continuously elongating the grains by consuming the defective material and smaller grains — a process termed recrystallization.”

    Creep away

    After cooling the heat-treated rods, the researchers examined their microstructure using optical and electron microscopy, and found that the material’s printed microscopic grains were replaced with “columnar” grains, or long crystal-like regions that were significantly larger than the original grains.

    “We’ve completely transformed the structure,” says lead author Dominic Peachey. “We show we can increase the grain size by orders of magnitude, to massive columnar grains, which theoretically should lead to dramatic improvements in creep properties.”

    The team also showed they could manipulate the draw speed and temperature of the rod samples to tailor the material’s growing grains, creating regions of specific grain size and orientation. This level of control, Cordero says, can enable manufacturers to print turbine blades with site-specific microstructures that are resilient to specific operating conditions.

    Cordero plans to test the heat treatment on 3D-printed geometries that more closely resemble turbine blades. The team is also exploring ways to speed up the draw rate, as well as test a heat-treated structure’s resistance to creep. Then, they envision that the heat treatment could enable the practical application of 3D-printing to produce industrial-grade turbine blades, with more complex shapes and patterns.

    “New blade and vane geometries will enable more energy-efficient land-based gas turbines, as well as, eventually, aeroengines,” Cordero notes. “This could, from a baseline perspective, lead to lower carbon dioxide emissions, just through improved efficiency of these devices.”

    This research was supported, in part, by the U.S. Office of Naval Research. More

  • in

    MIT PhD students shed light on important water and food research

    One glance at the news lately will reveal countless headlines on the dire state of global water and food security. Pollution, supply chain disruptions, and the war in Ukraine are all threatening water and food systems, compounding climate change impacts from heat waves, drought, floods, and wildfires.

    Every year, MIT’s Abdul Latif Jameel Water and Food Systems Lab (J-WAFS) offers fellowships to outstanding MIT graduate students who are working on innovative ways to secure water and food supplies in light of these urgent worldwide threats. J-WAFS announced this year’s fellowship recipients last April. Aditya Ghodgaonkar and Devashish Gokhale were awarded Rasikbhai L. Meswani Fellowships for Water Solutions, which are made possible by a generous gift from Elina and Nikhil Meswani and family. James Zhang, Katharina Fransen, and Linzixuan (Rhoda) Zhang were awarded J-WAFS Fellowships for Water and Food Solutions. The J-WAFS Fellowship for Water and Food Solutions is funded in part by J-WAFS Research Affiliate companies: Xylem, Inc., a water technology company, and GoAigua, a company leading the digital transformation of the water industry.

    The five fellows were each awarded a stipend and full tuition for one semester. They also benefit from mentorship, networking connections, and opportunities to showcase their research.

    “This year’s cohort of J-WAFS fellows show an indefatigable drive to explore, create, and push back boundaries,” says John H. Lienhard, director of J-WAFS. “Their passion and determination to create positive change for humanity are evident in these unique video portraits, which describe their solutions-oriented research in water and food,” Lienhard adds.

    J-WAFS funder Community Jameel recently commissioned video portraits of each student that highlight their work and their inspiration to solve challenges in water and food. More about each J-WAFS fellow and their research follows.

    Play video

    Katharina Fransen

    In Professor Bradley Olsen’s lab in the Department of Chemical Engineering, Katharina Fransen works to develop biologically based, biodegradable plastics which can be used for food packaging that won’t pollute the environment. Fransen, a third-year PhD student, is motivated by the challenge of protecting the most vulnerable global communities from waste generated by the materials that are essential to connecting them to the global food supply. “We can’t ensure that all of our plastic waste gets recycled or reused, and so we want to make sure that if it does escape into the environment it can degrade, and that’s kind of where a lot of my research really comes in,” says Fransen. Most of her work involves creating polymers, or “really long chains of chemicals,” kind of like the paper rings a lot of us looped into chains as kids, Fransen explains. The polymers are optimized for food packaging applications to keep food fresher for longer, preventing food waste. Fransen says she finds the work “really interesting from the scientific perspective as well as from the idea that [she’s] going to make the world a little better with these new materials.” She adds, “I think it is both really fulfilling and really exciting and engaging.”

    Play video

    Aditya Ghodgaonkar

    “When I went to Kenya this past spring break, I had an opportunity to meet a lot of farmers and talk to them about what kind of maintenance issues they face,” says Aditya Ghodgaonkar, PhD candidate in the Department of Mechanical Engineering. Ghodgaonkar works with Associate Professor Amos Winter in the Global Engineering and Research (GEAR) Lab, where he designs hydraulic components for drip irrigation systems to make them water-efficient, off-grid, inexpensive, and low-maintenance. On his trip to Kenya, Ghodgaonkar gained firsthand knowledge from farmers about a common problem they encounter: clogging of drip irrigation emitters. He learned that clogging can be an expensive technical challenge to diagnose, mitigate, and resolve. He decided to focus his attention on designing emitters that are resistant to clogging, testing with sand and passive hydrodynamic filtration back in the lab at MIT. “I got into this from an academic standpoint,” says Ghodgaonkar. “It is only once I started working on the emitters, spoke with industrial partners that make these emitters, spoke with farmers, that I really truly appreciated the impact of what we’re doing.”

    Play video

    Devashish Gokhale

    Devashish Gokhale is a PhD student advised by Professor Patrick Doyle in the Department of Chemical Engineering. Gokhale’s commitment to global water security stems from his childhood in Pune, India, where both flooding and drought can occur depending on the time of year. “I’ve had these experiences where there’s been too much water and also too little water,” he recalls. At MIT, Gokhale is developing cost-effective, sustainable, and reusable materials for water treatment with a focus on the elimination of emerging contaminants and low-concentration pollutants like heavy metals. Specifically, he works on making and optimizing polymeric hydrogel microparticles that can absorb micropollutants. “I know how important it is to do something which is not just scientifically interesting, but something which is impactful in a real way,” says Gokhale. Before starting a research project he asks himself, “are people going to be able to afford this? Is it really going to reach the people who need it the most?” Adding these constraints in the beginning of the research process sometimes makes the problem more difficult to solve, but Gokhale notes that in the end, the solution is much more promising.

    Play video

    James Zhang

    “We don’t really think much about it, it’s transparent, odorless, we just turn on our sink in many parts of the world and it just flows through,” says James Zhang when talking about water. Yet he notes that “many other parts of the world face water scarcity and this will only get worse due to global climate change.” A PhD student in the Department of Mechanical Engineering, Zhang works in the Nano Engineering Laboratory with Professor Gang Chen. Zhang is working on a technology that uses light-induced evaporation to clean water. He is currently investigating the fundamental properties of how light at different wavelengths interacts with liquids at the surface, particularly with brackish water surfaces. With strong theoretical and experimental components, his research could lead to innovations in desalinating water at high energy efficiencies. Zhang hopes that the technology can one day “produce lots of clean water for communities around the world that currently don’t have access to fresh water,” and create a new appreciation for this common liquid that many of us might not think about on a day-to-day basis.

    Play video

    Linzixuan (Rhoda) Zhang

    “Around the world there are about 2 billion people currently suffering from micronutrient deficiency because they do not have access to very healthy, very fresh food,” says chemical engineering PhD candidate Linzixuan (Rhoda) Zhang. This fact led Zhang to develop a micronutrient delivery platform that fortifies foods with essential vitamins and nutrients. With her advisors, Professor Robert Langer and Research Scientist Ana Jaklenec, Zhang brings biomedical engineering approaches to global health issues. Zhang says that “one of the most serious problems is vitamin A deficiency, because vitamin A is not very stable.” She goes on to explain that although vitamin A is present in different vegetables, when the vegetables are cooked, vitamin A can easily degrade. Zhang helped develop a group of biodegradable polymers that can stabilize micronutrients under cooking and storage conditions. With this technology, vitamin A, for example, could be encapsulated and effectively stabilized under boiling water. The platform has also shown efficient release in a simulation of the stomach environment. Zhang says it is the “little, tiny steps every day that are pushing us forward to the final impactful product.” More

  • in

    Advancing the energy transition amidst global crises

    “The past six years have been the warmest on the planet, and our track record on climate change mitigation is drastically short of what it needs to be,” said Robert C. Armstrong, MIT Energy Initiative (MITEI) director and the Chevron Professor of Chemical Engineering, introducing MITEI’s 15th Annual Research Conference.

    At the symposium, participants from academia, industry, and finance acknowledged the deepening difficulties of decarbonizing a world rocked by geopolitical conflicts and suffering from supply chain disruptions, energy insecurity, inflation, and a persistent pandemic. In spite of this grim backdrop, the conference offered evidence of significant progress in the energy transition. Researchers provided glimpses of a low-carbon future, presenting advances in such areas as long-duration energy storage, carbon capture, and renewable technologies.

    In his keynote remarks, Ernest J. Moniz, the Cecil and Ida Green Professor of Physics and Engineering Systems Emeritus, founding director of MITEI, and former U.S. secretary of energy, highlighted “four areas that have materially changed in the last year” that could shake up, and possibly accelerate, efforts to address climate change.

    Extreme weather seems to be propelling the public and policy makers of both U.S. parties toward “convergence … at least in recognition of the challenge,” Moniz said. He perceives a growing consensus that climate goals will require — in diminishing order of certainty — firm (always-on) power to complement renewable energy sources, a fuel (such as hydrogen) flowing alongside electricity, and removal of atmospheric carbon dioxide (CO2).

    Russia’s invasion of Ukraine, with its “weaponization of natural gas” and global energy impacts, underscores the idea that climate, energy security, and geopolitics “are now more or less recognized widely as one conversation.” Moniz pointed as well to new U.S. laws on climate change and infrastructure that will amplify the role of science and technology and “address the drive to technological dominance by China.”

    The rapid transformation of energy systems will require a comprehensive industrial policy, Moniz said. Government and industry must select and rapidly develop low-carbon fuels, firm power sources (possibly including nuclear power), CO2 removal systems, and long-duration energy storage technologies. “We will need to make progress on all fronts literally in this decade to come close to our goals for climate change mitigation,” he concluded.

    Global cooperation?

    Over two days, conference participants delved into many of the issues Moniz raised. In one of the first panels, scholars pondered whether the international community could forge a coordinated climate change response. The United States’ rift with China, especially over technology trade policies, loomed large.

    “Hatred of China is a bipartisan hobby and passion, but a blanket approach isn’t right, even for the sake of national security,” said Yasheng Huang, the Epoch Foundation Professor of Global Economics and Management at the MIT Sloan School of Management. “Although the United States and China working together would have huge effects for both countries, it is politically unpalatable in the short term,” said F. Taylor Fravel, the Arthur and Ruth Sloan Professor of Political Science and director of the MIT Security Studies Program. John E. Parsons, deputy director for research at the MIT Center for Energy and Environmental Policy Research, suggested that the United States should use this moment “to get our own act together … and start doing things,” such as building nuclear power plants in a cost-effective way.

    Debating carbon removal

    Several panels took up the matter of carbon emissions and the most promising technologies for contending with them. Charles Harvey, MIT professor of civil and environmental engineering, and Howard Herzog, a senior research engineer at MITEI, set the stage early, debating whether capturing carbon was essential to reaching net-zero targets.

    “I have no trouble getting to net zero without carbon capture and storage,” said David Keith, the Gordon McKay Professor of Applied Physics at Harvard University, in a subsequent roundtable. Carbon capture seems more risky to Keith than solar geoengineering, which involves injecting sulfur into the stratosphere to offset CO2 and its heat-trapping impacts.

    There are new ways of moving carbon from where it’s a problem to where it’s safer. Kripa K. Varanasi, MIT professor of mechanical engineering, described a process for modulating the pH of ocean water to remove CO2. Timothy Krysiek, managing director for Equinor Ventures, talked about construction of a 900-kilometer pipeline transporting CO2 from northern Germany to a large-scale storage site located in Norwegian waters 3,000 meters below the seabed. “We can use these offshore Norwegian assets as a giant carbon sink for Europe,” he said.

    A startup showcase featured additional approaches to the carbon challenge. Mantel, which received MITEI Seed Fund money, is developing molten salt material to capture carbon for long-term storage or for use in generating electricity. Verdox has come up with an electrochemical process for capturing dilute CO2 from the atmosphere.

    But while much of the global warming discussion focuses on CO2, other greenhouse gases are menacing. Another panel discussed measuring and mitigating these pollutants. “Methane has 82 times more warming power than CO2 from the point of emission,” said Desirée L. Plata, MIT associate professor of civil and environmental engineering. “Cutting methane is the strongest lever we have to slow climate change in the next 25 years — really the only lever.”
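    The 82-to-1 warming factor Plata cites makes it straightforward to express a methane emission in CO2-equivalent terms. The sketch below uses only that factor from the quote above; the emission quantity in the example is a hypothetical illustration, not data from the conference.

```python
# CO2-equivalent conversion using the warming factor quoted above
# (methane ~82x the warming power of CO2 at the point of emission).
# The emission quantity in the example is hypothetical.

METHANE_WARMING_FACTOR = 82  # relative to CO2, per the quote above

def co2_equivalent_tonnes(methane_tonnes: float) -> float:
    """Warming-equivalent tonnes of CO2 for a given methane emission."""
    return methane_tonnes * METHANE_WARMING_FACTOR

# Example: a hypothetical 10-tonne methane leak
print(co2_equivalent_tonnes(10))  # → 820
```

    This multiplier is why a relatively small mass of avoided methane can rival much larger CO2 reductions in near-term warming impact.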

    Steven Hamburg, chief scientist and senior vice president of the Environmental Defense Fund, cautioned that emission of hydrogen molecules into the atmosphere can cause increases in other greenhouse gases such as methane, ozone, and water vapor. As researchers and industry turn to hydrogen as a fuel or as a feedstock for commercial processes, “we will need to minimize leakage … or risk increasing warming,” he said.

    Supply chains, markets, and new energy ventures

    In panels on energy storage and the clean energy supply chain, there were interesting discussions of challenges ahead. High-density energy materials such as lithium, cobalt, nickel, copper, and vanadium for grid-scale energy storage, electric vehicles (EVs), and other clean energy technologies can be difficult to source. “These often come from water-stressed regions, and we need to be super thoughtful about environmental stresses,” said Elsa Olivetti, the Esther and Harold E. Edgerton Associate Professor in Materials Science and Engineering. She also noted that in light of the explosive growth in demand for metals such as lithium, recycling EVs won’t be of much help. “The amount of material coming back from end-of-life batteries is minor,” she said, until EVs are much further along in their adoption cycle.

    Arvind Sanger, founder and managing partner of Geosphere Capital, said that the United States should be developing its own rare earths and minerals, although gaining the know-how will take time, and overcoming “NIMBYism” (not in my backyard-ism) is a challenge. Sanger emphasized that we must continue to use “denser sources of energy” to catalyze the energy transition over the next decade. In particular, Sanger noted that “for every transition technology, steel is needed,” and steel is made in furnaces that use coal and natural gas. “It’s completely woolly-headed to think we can just go to a zero-fossil fuel future in a hurry,” he said.

    The topic of power markets occupied another panel, which focused on ways to ensure the distribution of reliable and affordable zero-carbon energy. Integrating intermittent resources such as wind and solar into the grid requires a suite of retail markets and new digital tools, said Anuradha Annaswamy, director of MIT’s Active-Adaptive Control Laboratory. Tim Schittekatte, a postdoc at the MIT Sloan School of Management, proposed auctions as a way of insuring consumers against periods of high market costs.

    Another panel described the very different investment needs of new energy startups, such as longer research and development phases. Hooisweng Ow, technology principal at Eni Next LLC Ventures, which is developing drilling technology for geothermal energy, recommends joint development and partnerships to reduce risk. Michael Kearney SM ’11, PhD ’19, SM ’19 is a partner at The Engine, a venture firm built by MIT investing in path-breaking technology to solve the toughest challenges in climate and other problems. Kearney believes the emergence of new technologies and markets will bring on “a labor transition on an order of magnitude never seen before in this country,” he said. “Workforce development is not a natural zone for startups … and this will have to change.”

    Supporting the global South

    The opportunities and challenges of the energy transition look quite different in the developing world. In conversation with Robert Armstrong, Luhut Binsar Pandjaitan, the coordinating minister for maritime affairs and investment of the Republic of Indonesia, reported that his “nation is rich with solar, wind, and energy transition minerals like nickel and copper,” but that it cannot on its own develop renewable energy, reduce carbon emissions, and improve grid infrastructure. “Education is a top priority, and we are very far behind in high technologies. We need help and support from MIT to achieve our target,” he said.

    Technologies that could springboard Indonesia and other nations of the global South toward their climate goals are emerging in MITEI-supported projects and at young companies MITEI helped spawn. Among the promising innovations unveiled at the conference are new materials and designs for cooling buildings in hot climates and reducing the environmental costs of construction, and a sponge-like substance that passively sucks moisture out of the air to lower the energy required for running air conditioners in humid climates.

    Other ideas on the move from lab to market have great potential for industrialized nations as well, such as a computational framework for maximizing the energy output of ocean-based wind farms; a process for using ammonia as a renewable fuel with no CO2 emissions; long-duration energy storage derived from the oxidation of iron; and a laser-based method for unlocking geothermal steam to drive power plants. More

  • in

    Ocean microbes get their diet through a surprising mix of sources, study finds

    One of the smallest and mightiest organisms on the planet is a plant-like bacterium known to marine biologists as Prochlorococcus. The green-tinted microbe measures less than a micron across, and its populations suffuse through the upper layers of the ocean, where a single teaspoon of seawater can hold millions of the tiny organisms.

    Prochlorococcus grows through photosynthesis, using sunlight to convert the atmosphere’s carbon dioxide into organic carbon molecules. The microbe is responsible for 5 percent of the world’s photosynthesizing activity, and scientists have assumed that photosynthesis is the microbe’s go-to strategy for acquiring the carbon it needs to grow.

    But a new MIT study in Nature Microbiology today has found that Prochlorococcus relies on another carbon-feeding strategy more than previously thought.

    Organisms that use a mix of strategies to acquire carbon are known as mixotrophs. Most marine plankton are mixotrophs. And while Prochlorococcus is known to occasionally dabble in mixotrophy, scientists have assumed the microbe primarily lives a phototrophic lifestyle.

    The new MIT study shows that in fact, Prochlorococcus may be more of a mixotroph than it lets on. The microbe may get as much as one-third of its carbon through a second strategy: consuming the dissolved remains of other dead microbes.

    The new estimate may have implications for climate models, as the microbe is a significant force in capturing and “fixing” carbon in the Earth’s atmosphere and ocean.

    “If we wish to predict what will happen to carbon fixation in a different climate, or predict where Prochlorococcus will or will not live in the future, we probably won’t get it right if we’re missing a process that accounts for one-third of the population’s carbon supply,” says Mick Follows, a professor in MIT’s Department of Earth, Atmospheric and Planetary Sciences (EAPS), and its Department of Civil and Environmental Engineering.

    The study’s co-authors include first author and MIT postdoc Zhen Wu, along with collaborators from the University of Haifa, the Leibniz-Institute for Baltic Sea Research, the Leibniz-Institute of Freshwater Ecology and Inland Fisheries, and Potsdam University.

    Persistent plankton

    Since Prochlorococcus was first discovered in the Sargasso Sea in 1986, by MIT Institute Professor Sallie “Penny” Chisholm and others, the microbe has been observed throughout the world’s oceans, inhabiting the upper sunlit layers ranging from the surface down to about 160 meters. Within this range, light levels vary, and the microbe has evolved a number of ways to photosynthesize carbon in even low-lit regions.

    The organism has also evolved ways to consume organic compounds including glucose and certain amino acids, which could help the microbe survive for limited periods of time in dark ocean regions. But surviving on organic compounds alone is a bit like only eating junk food, and there is evidence that Prochlorococcus will die after a week in regions where photosynthesis is not an option.

    And yet, researchers including Daniel Sher of the University of Haifa, who is a co-author of the new study, have observed healthy populations of Prochlorococcus that persist deep in the sunlit zone, where the light intensity should be too low to maintain a population. This suggests that the microbes must be switching to a non-photosynthesizing, mixotrophic lifestyle in order to consume other organic sources of carbon.

    “It seems that at least some Prochlorococcus are using existing organic carbon in a mixotrophic way,” Follows says. “That stimulated the question: How much?”

    What light cannot explain

    In their new paper, Follows, Wu, Sher, and their colleagues looked to quantify the amount of carbon that Prochlorococcus is consuming through processes other than photosynthesis.

    The team looked first to measurements taken by Sher’s team, which previously took ocean samples at various depths in the Mediterranean Sea and measured the concentration of phytoplankton, including Prochlorococcus, along with the associated intensity of light and the concentration of nitrogen — an essential nutrient that is richly available in deeper layers of the ocean and that plankton can assimilate to make proteins.

    Wu and Follows used this data, and similar information from the Pacific Ocean, along with previous work from Chisholm’s lab, which established the rate of photosynthesis that Prochlorococcus could carry out in a given intensity of light.

    “We converted that light intensity profile into a potential growth rate — how fast the population of Prochlorococcus could grow if it were acquiring all its carbon by photosynthesis, with light as the limiting factor,” Follows explains.

    The team then compared this calculated rate to growth rates that were previously observed in the Pacific Ocean by several other research teams.

    “This data showed that, below a certain depth, there’s a lot of growth happening that photosynthesis simply cannot explain,” Follows says. “Some other process must be at work to make up the difference in carbon supply.”

    The researchers inferred that, in deeper, darker regions of the ocean, Prochlorococcus populations are able to survive and thrive by resorting to mixotrophy, including consuming organic carbon from detritus. Specifically, the microbe may be carrying out osmotrophy — a process by which an organism passively absorbs organic carbon molecules via osmosis.

    Judging by how fast the microbe is estimated to be growing below the sunlit zone, the team calculates that Prochlorococcus obtains up to one-third of its carbon diet through mixotrophic strategies.
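    The inference above amounts to a simple comparison: the fraction of the carbon supply that photosynthesis cannot explain is one minus the ratio of the light-supported growth rate to the observed growth rate. The sketch below illustrates that logic; the growth-rate values in the example are hypothetical placeholders, not the study's measurements.

```python
# Sketch of the inference described above: compare the growth rate that
# photosynthesis alone could support (from measured light intensity)
# with the observed growth rate; the shortfall is attributed to
# mixotrophy. The rates in the example are hypothetical.

def mixotrophic_fraction(photo_growth_rate: float, observed_growth_rate: float) -> float:
    """Fraction of the carbon supply that photosynthesis cannot explain."""
    if observed_growth_rate <= 0:
        raise ValueError("observed growth rate must be positive")
    # Clamp at zero: if light alone explains the growth, nothing is attributed
    # to mixotrophy.
    return max(0.0, 1.0 - photo_growth_rate / observed_growth_rate)

# e.g. light supports 0.2 divisions/day but the population grows at 0.3/day:
print(round(mixotrophic_fraction(0.2, 0.3), 2))  # → 0.33, i.e. ~one-third
```

    With illustrative numbers in roughly the study's regime, the shortfall comes out near the one-third figure the team reports.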

    “It’s kind of like going from a specialist to a generalist lifestyle,” Follows says. “If I only eat pizza, then if I’m 20 miles from a pizza place, I’m in trouble, whereas if I eat burgers as well, I could go to the nearby McDonald’s. People had thought of Prochlorococcus as a specialist, where they do this one thing (photosynthesis) really well. But it turns out they may have more of a generalist lifestyle than we previously thought.”

    Chisholm, who has both literally and figuratively written the book on Prochlorococcus, says the group’s findings “expand the range of conditions under which their populations can not only survive, but also thrive. This study changes the way we think about the role of Prochlorococcus in the microbial food web.”

    This research was supported, in part, by the Israel Science Foundation, the U.S. National Science Foundation, and the Simons Foundation. More