More stories

  • Microparticles could help prevent vitamin A deficiency

    Vitamin A deficiency is the world’s leading cause of childhood blindness, and in severe cases, it can be fatal. About one-third of the global population of preschool-aged children suffers from this vitamin deficiency, which is most prevalent in sub-Saharan Africa and South Asia.

    MIT researchers have now developed a new way to fortify foods with vitamin A, which they hope could help to improve the health of millions of people around the world. In a new study, they showed that encapsulating vitamin A in a protective polymer prevents the nutrient from being broken down during cooking or storage.

    “Vitamin A is a very important micronutrient, but it’s an unstable molecule,” says Ana Jaklenec, a research scientist at MIT’s Koch Institute for Integrative Cancer Research. “We wanted to see if our encapsulated vitamin A could fortify a food vehicle like bouillon cubes or flour, throughout storage and cooking, and whether the vitamin A could remain biologically active and be absorbed.”

    In a small clinical trial, the researchers showed that when people ate bread fortified with encapsulated vitamin A, the bioavailability of the nutrient was similar to when they consumed vitamin A on its own. The technology has been licensed to two companies that hope to develop it for use in food products.

    “This is a study that our team is really excited about because it shows that everything we did in test tubes and animals works safely and effectively in humans,” says Robert Langer, the David H. Koch Institute Professor at MIT and a member of the Koch Institute. “We hope this opens the door for someday helping millions, if not billions, of people in the developing world.”

    Jaklenec and Langer are the senior authors of the new study, which appears this week in the Proceedings of the National Academy of Sciences. The paper’s lead author is former MIT postdoc Wen Tang, who is now an associate professor at South China University of Technology.

    Nutrient stability

    Vitamin A is critical not only for vision but also for the functioning of the immune system and organs such as the heart and lungs. Efforts to add vitamin A to bread or other foods such as bouillon cubes, which are commonly consumed in West African countries, have been largely unsuccessful because the vitamin breaks down during storage or cooking.

    In a 2019 study, the MIT team showed that they could use a polymer called BMC to encapsulate nutrients, including iron, vitamin A, and several others. They showed that this protective coating improved the shelf life of the nutrients, and that people who consumed bread fortified with encapsulated iron were able to absorb the iron.

    BMC is classified by the FDA as “generally recognized as safe,” and is already used in coatings for drugs and dietary supplements. In the new study, the researchers focused on using this polymer to encapsulate vitamin A, a nutrient that is very sensitive to temperature and ultraviolet light.

    Using an industrial technique known as spinning disc processing, the researchers mixed vitamin A with the polymer to form particles 100 to 200 microns in diameter. They also coated the particles with starch, which prevents them from sticking to one another.

    The researchers found that vitamin A encapsulated in the polymer particles was more resistant to degradation by intense light, high temperatures, or boiling water. Under those conditions, much more vitamin A remained active than when the vitamin A was free or when it was delivered in a form called VitA 250, which is currently the most stable form of vitamin A used for food fortification.

    The researchers also showed that the encapsulated particles could be easily incorporated into flour or bouillon cubes. To test how well they would survive long-term storage, the researchers exposed the cubes to harsh conditions, as recommended by the World Health Organization: 40 degrees Celsius (104 degrees Fahrenheit) and 75 percent humidity. Under those conditions, the encapsulated vitamin A was much more stable than other forms of vitamin A. 

    “The enhanced stability of vitamin A with our technology can ensure that the vitamin A-fortified food does provide the recommended daily uptake of vitamin A, even after long-term storage in a hot humidified environment, and cooking processes such as boiling or baking,” Tang says. “People who are suffering from vitamin A deficiency and want to get vitamin A through fortified food will benefit, without changing their daily routines, and without wondering how much vitamin A is still in the food.”

    Vitamin absorption

    When the researchers cooked their encapsulated particles and then fed them to animals, they found that 30 percent of the vitamin A was absorbed, the same as free uncooked vitamin A, compared to about 3 percent of free vitamin A that had been cooked.

    Working with Biofortis, a company that does dietary clinical testing, the researchers then evaluated how well vitamin A was absorbed in people who ate foods fortified with the particles. For this study, the researchers incorporated the particles into bread, then measured vitamin A levels in the blood over a 24-hour period after the bread was consumed. They found that when vitamin A was encapsulated in the BMC polymer, it was absorbed from the food at levels comparable to free vitamin A, indicating that it is readily released in bioactive form.

    Two companies have licensed the technology and are focusing on developing products fortified with vitamin A and other nutrients. A benefit corporation called Particles for Humanity, funded by the Bill and Melinda Gates Foundation, is working with partners in Africa to incorporate this technology into existing fortification efforts. Another company called VitaKey, founded by Jaklenec, Langer, and others, is working on using this approach to add nutrients to a variety of foods and beverages.

    The research was funded by the Bill and Melinda Gates Foundation. Other authors of the paper include Jia Zhuang, Aaron Anselmo, Xian Xu, Aranda Duan, Ruojie Zhang, James Sugarman, Yingying Zeng, Evan Rosenberg, Tyler Graf, Kevin McHugh, Stephany Tzeng, Adam Behrens, Lisa Freed, Lihong Jing, Surangi Jayawardena, Shelley Weinstock, Xiao Le, Christopher Sears, James Oxley, John Daristotle, and Joe Collins.

  • Pursuing a practical approach to research

    Koroush Shirvan, the John Clark Hardwick Career Development Professor in the Department of Nuclear Science and Engineering (NSE), knows that the nuclear industry has traditionally been wary of innovations until they are shown to have proven utility. As a result, he has relentlessly focused on practical applications in his research, work that has netted him the 2022 Reactor Technology Award from the American Nuclear Society. “The award has usually recognized practical contributions to the field of reactor design and has not often gone to academia,” Shirvan says.

    One of these “practical contributions” is in the field of accident-tolerant fuels, a program launched by the U.S. Nuclear Regulatory Commission in the wake of the 2011 Fukushima Daiichi incident. The goal within this program, says Shirvan, is to develop new forms of nuclear fuel that can tolerate the extreme heat of accident conditions. His team, with students from more than 16 countries, is working on numerous possibilities that vary in composition and method of production.

    Another aspect of Shirvan’s research focuses on how radiation affects heat transfer mechanisms in the reactor; the team found fuel corrosion to be the driving force behind these changes. “[The research] informs how nuclear fuels perform in the reactor, from a practical point of view,” Shirvan says.

    Optimizing nuclear reactor design

    A summer internship when Shirvan was an undergraduate at the University of Florida at Gainesville seeded his drive to focus on practical applications in his studies. A nearby nuclear utility was losing millions because of crud accumulating on its fuel rods. To cope, the company had been loading fresh fuel before extracting the full life from earlier batches.

    Placement of fuel rods in nuclear reactors is a complex problem with many factors — the life of the fuel, the location of hot spots — affecting outcomes. Reactors, which hold roughly 200-800 fuel assemblies, have their fuel configurations changed every 18-24 months while satisfying close to 15-20 constraints. The mind-boggling nature of the problem means that plants have to rely on experienced engineers.

    During his internship, Shirvan optimized the program used to place fuel rods in the reactor. He found that certain rods in assemblies were more prone to the crud deposits, and reworked their configurations, optimizing for these rods’ performance instead of adding assemblies.

    In recent years, Shirvan has applied a branch of artificial intelligence — reinforcement learning — to the configuration problem and created a software program used by the largest U.S. nuclear utility. “This program gives even a layperson the ability to reconfigure the fuels and the reactor without having expert knowledge,” Shirvan says.
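    The article does not detail how Shirvan’s software works; as a purely illustrative sketch of applying reinforcement learning to a placement problem, the toy Q-learning example below arranges four fuel assemblies (two fresh, two once-burned) in a row while avoiding side-by-side fresh assemblies, a stand-in for the hot-spot constraints described above. Every name and number here is hypothetical, not from the actual program.

```python
import random
from collections import defaultdict

# Toy core: 4 positions in a row, with an inventory of 2 fresh and 2
# once-burned assemblies.  The reward penalizes adjacent fresh-fresh
# pairs, a stand-in for local power peaking (hot spots).
FRESH, BURNED = 0, 1
INVENTORY = {FRESH: 2, BURNED: 2}

def remaining(state):
    # Actions still available given the assemblies already placed.
    used = {FRESH: state.count(FRESH), BURNED: state.count(BURNED)}
    return [a for a in (FRESH, BURNED) if used[a] < INVENTORY[a]]

def reward(state):
    # 0 is best: no two fresh assemblies side by side.
    return -sum(1 for a, b in zip(state, state[1:]) if a == b == FRESH)

def train(episodes=2000, alpha=0.5, eps=0.2, seed=0):
    # Tabular Q-learning over partial layouts (states are tuples of
    # placements so far); reward arrives only at the full layout.
    rng = random.Random(seed)
    Q = defaultdict(float)
    for _ in range(episodes):
        state = ()
        while len(state) < 4:
            acts = remaining(state)
            a = rng.choice(acts) if rng.random() < eps else max(
                acts, key=lambda x: Q[(state, x)])
            nxt = state + (a,)
            target = reward(nxt) if len(nxt) == 4 else max(
                Q[(nxt, x)] for x in remaining(nxt))
            Q[(state, a)] += alpha * (target - Q[(state, a)])
            state = nxt
    return Q

def greedy_layout(Q):
    # Roll out the learned policy without exploration.
    state = ()
    while len(state) < 4:
        acts = remaining(state)
        state += (max(acts, key=lambda x: Q[(state, x)]),)
    return state

layout = greedy_layout(train())
print(layout, reward(layout))
```

    The real problem has hundreds of assemblies and many interacting constraints, which is exactly why a learned policy beats hand enumeration; this sketch only shows the shape of the state-action-reward loop.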

    From advanced math to counting jelly beans

    Shirvan’s own expertise in nuclear science and engineering developed quite organically. He grew up in Tehran, Iran, and when he was 14 the family moved to Gainesville, where Shirvan’s aunt and family live. He remembers an awkward couple of years at the new high school where he was grouped in with newly arrived international students, and placed in entry-level classes. “I went from doing advanced mathematics in Iran to counting jelly beans,” he laughs.

    Shirvan applied to the University of Florida for his undergraduate studies since it made economic sense; the school gave full scholarships to Floridian students who received a certain minimum SAT score. Shirvan qualified. His uncle, who was a professor in the nuclear engineering department then, encouraged Shirvan to take classes in the department. Under his uncle’s mentorship, the courses Shirvan took, and his internship, cemented his love of the interdisciplinary approach that the field demanded.

    Having always known that he wanted to teach — he remembers finishing his math tests early in Tehran so he could earn the reward of being class monitor — Shirvan knew graduate school was next. His uncle encouraged him to apply to MIT and to the University of Michigan, home to reputable programs in the field. Shirvan chose MIT because “only at MIT was there a program on nuclear design. There were faculty dedicated to designing new reactors, looking at multiple disciplines, and putting all of that together.” He went on to pursue his master’s and doctoral studies at NSE under the supervision of Professor Mujid Kazimi, focusing on compact pressurized and boiling water reactor designs. When Kazimi passed away suddenly in 2015, Shirvan, then a research scientist, switched to the tenure track to guide the professor’s team.

    Another project that Shirvan took on in 2015: leadership of MIT’s course on nuclear reactor technology for utility executives. Offered only by the Institute, the program is an introduction to nuclear engineering and safety for personnel who might not have much background in the area. “It’s a great course because you get to see what the real problems are in the energy sector … like grid stability,” Shirvan says.

    A multipronged approach to savings

    Another very real problem nuclear utilities face is cost. Contrary to what one hears on the news, one of the biggest stumbling blocks to building new nuclear facilities in the United States is cost, which today can be up to three times that of renewables, Shirvan says. While many approaches such as advanced manufacturing have been tried, Shirvan believes that the solution to decrease expenditures lies in designing more compact reactors.

    His team has developed an open-source advanced nuclear cost tool and has focused on two different designs: a small water reactor using compact steam technology and a horizontal gas reactor. Compactness also means making fuels more efficient, as Shirvan’s work does, and improving the heat-exchange devices. It all comes back to the basics and to bringing “commercially viable arguments in with your research,” Shirvan explains.

    Shirvan is excited about the future of the U.S. nuclear industry, and that the 2022 Inflation Reduction Act grants the same subsidies to nuclear as it does to renewables. In this new level playing field, advanced nuclear still has a long way to go in terms of affordability, he admits. “It’s time to push forward with cost-effective design,” Shirvan says. “I look forward to supporting this by continuing to guide these efforts with research from my team.”

  • A healthy wind

    Nearly 10 percent of today’s electricity in the United States comes from wind power. The renewable energy source benefits climate, air quality, and public health by displacing emissions of greenhouse gases and air pollutants that would otherwise be produced by fossil-fuel-based power plants.

    A new MIT study finds that the health benefits associated with wind power could more than quadruple if operators prioritized turning down output from the most polluting fossil-fuel-based power plants when energy from wind is available.

    In the study, published today in Science Advances, researchers analyzed the hourly activity of wind turbines, as well as the reported emissions from every fossil-fuel-based power plant in the country, between the years 2011 and 2017. They traced emissions across the country and mapped the pollutants to affected demographic populations. They then calculated the regional air quality and associated health costs to each community.

    The researchers found that in 2014, wind power that was associated with state-level policies improved air quality overall, resulting in $2 billion in health benefits across the country. However, only roughly 30 percent of these health benefits reached disadvantaged communities.

    The team further found that if the electricity industry were to reduce the output of the most polluting fossil-fuel-based power plants, rather than the most cost-saving plants, in times of wind-generated power, the overall health benefits could quadruple to $8.4 billion nationwide. However, the results would have a similar demographic breakdown.

    “We found that prioritizing health is a great way to maximize benefits in a widespread way across the U.S., which is a very positive thing. But it suggests it’s not going to address disparities,” says study co-author Noelle Selin, a professor in the Institute for Data, Systems, and Society and the Department of Earth, Atmospheric and Planetary Sciences at MIT. “In order to address air pollution disparities, you can’t just focus on the electricity sector or renewables and count on the overall air pollution benefits addressing these real and persistent racial and ethnic disparities. You’ll need to look at other air pollution sources, as well as the underlying systemic factors that determine where plants are sited and where people live.”

    Selin’s co-authors are lead author and former MIT graduate student Minghao Qiu PhD ’21, now at Stanford University, and Corwin Zigler at the University of Texas at Austin.

    Turn-down service

    In their new study, the team looked for patterns between periods of wind power generation and the activity of fossil-fuel-based power plants, to see how regional electricity markets adjusted the output of power plants in response to influxes of renewable energy.

    “One of the technical challenges, and the contribution of this work, is trying to identify which are the power plants that respond to this increasing wind power,” Qiu notes.

    To do so, the researchers compared two historical datasets from the period between 2011 and 2017: an hour-by-hour record of energy output of wind turbines across the country, and a detailed record of emissions measurements from every fossil-fuel-based power plant in the U.S. The datasets covered each of seven major regional electricity markets, each market providing energy to one or multiple states.

    “California and New York are each their own market, whereas the New England market covers around seven states, and the Midwest covers more,” Qiu explains. “We also cover about 95 percent of all the wind power in the U.S.”

    In general, they observed that, in times when wind power was available, markets adjusted by essentially scaling back the power output of natural gas and sub-bituminous coal-fired power plants. They noted that the plants that were turned down were likely chosen for cost-saving reasons, as certain plants were less costly to turn down than others.

    The team then used a sophisticated atmospheric chemistry model to simulate the wind patterns and chemical transport of emissions across the country, and determined where and at what concentrations the emissions generated fine particulates and ozone — two pollutants that are known to damage air quality and human health. Finally, the researchers mapped the general demographic populations across the country, based on U.S. census data, and applied a standard epidemiological approach to calculate a population’s health cost as a result of their pollution exposure.
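    The study’s exact health-impact model is not reproduced in this article. A minimal sketch of the standard epidemiological form such analyses use (a log-linear concentration-response function, monetized with a value of statistical life) might look like the following, where every coefficient and population figure is hypothetical rather than taken from the paper.

```python
import math

# Illustrative health-impact calculation in the standard epidemiological
# form: excess deaths avoided = baseline deaths x (1 - exp(-beta * dC)),
# monetized with a value of statistical life (VSL).  All numbers below
# are invented for illustration.
population = 1_000_000
baseline_mortality = 0.008   # annual deaths per person (assumed)
beta = 0.0058                # per ug/m3 of PM2.5 (assumed coefficient)
delta_c = 1.5                # ug/m3 reduction attributed to wind (assumed)
vsl = 9_000_000              # dollars per avoided death (assumed)

avoided = population * baseline_mortality * (1 - math.exp(-beta * delta_c))
benefit = avoided * vsl
print(round(avoided, 1), round(benefit / 1e6, 1))  # deaths avoided, $M
```

    Summing such a calculation over census tracts, pollutants, and hours of displaced generation is what yields the nationwide dollar figures the study reports.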

    This analysis revealed that, in the year 2014, a general cost-saving approach to displacing fossil-fuel-based energy in times of wind energy resulted in $2 billion in health benefits, or savings, across the country. A smaller share of these benefits went to disadvantaged populations, such as communities of color and low-income communities, though this disparity varied by state.

    “It’s a more complex story than we initially thought,” Qiu says. “Certain population groups are exposed to a higher level of air pollution, and those would be low-income people and racial minority groups. What we see is, developing wind power could reduce this gap in certain states but further increase it in other states, depending on which fossil-fuel plants are displaced.”

    Tweaking power

    The researchers then examined how the pattern of emissions and the associated health benefits would change if they prioritized turning down different fossil-fuel-based plants in times of wind-generated power. They tweaked the emissions data to reflect several alternative scenarios: one in which the most health-damaging, polluting power plants are turned down first; and two other scenarios in which the plants producing the most sulfur dioxide and carbon dioxide, respectively, are the first to reduce their output.
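    As a toy illustration of why the turn-down ordering matters, the sketch below compares a cost-based ordering (most expensive plants curtailed first) with a health-based ordering (most damaging plants first) for a handful of hypothetical plants. All plant names, costs, and damage figures are invented and are not from the study.

```python
# Hypothetical plants: (name, marginal cost $/MWh, health damage $/MWh).
plants = [
    ("gas_A",  30, 20),
    ("coal_B", 25, 120),
    ("coal_C", 28, 90),
    ("gas_D",  35, 15),
]
capacity = 100  # MWh each plant can be turned down (illustrative)
wind = 200      # MWh of wind energy available to absorb

def health_benefit(order):
    """Total avoided health damage when wind displaces plants in `order`."""
    left, benefit = wind, 0
    for _, _, dmg in order:
        take = min(left, capacity)
        benefit += take * dmg
        left -= take
        if left == 0:
            break
    return benefit

# Cost-saving dispatch turns down the priciest plants first; a
# health-based dispatch turns down the most damaging plants first.
cost_first = sorted(plants, key=lambda p: -p[1])
health_first = sorted(plants, key=lambda p: -p[2])
print(health_benefit(cost_first), health_benefit(health_first))
```

    In this contrived example the health-based ordering is several times more beneficial, mirroring in miniature the study’s finding that the same wind energy can buy very different health outcomes depending on which plants it displaces.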

    They found that while each scenario increased health benefits overall, and the first scenario in particular could quadruple health benefits, the original disparity persisted: Communities of color and low-income communities still experienced smaller health benefits than more well-off communities.

    “We got to the end of the road and said, there’s no way we can address this disparity by being smarter in deciding which plants to displace,” Selin says.

    Nevertheless, the study can help identify ways to improve the health of the general population, says Julian Marshall, a professor of environmental engineering at the University of Washington.

    “The detailed information provided by the scenarios in this paper can offer a roadmap to electricity-grid operators and to state air-quality regulators regarding which power plants are highly damaging to human health and also are likely to noticeably reduce emissions if wind-generated electricity increases,” says Marshall, who was not involved in the study.

    “One of the things that makes me optimistic about this area is, there’s a lot more attention to environmental justice and equity issues,” Selin concludes. “Our role is to figure out the strategies that are most impactful in addressing those challenges.”

    This work was supported, in part, by the U.S. Environmental Protection Agency, and by the National Institutes of Health.

  • Reversing the charge

    Owners of electric vehicles (EVs) are accustomed to plugging into charging stations at home and at work and filling up their batteries with electricity from the power grid. But someday soon, when these drivers plug in, their cars will also have the capacity to reverse the flow and send electrons back to the grid. As the number of EVs climbs, the fleet’s batteries could serve as a cost-effective, large-scale energy source, with potentially dramatic impacts on the energy transition, according to a new paper published by an MIT team in the journal Energy Advances.

    “At scale, vehicle-to-grid (V2G) can boost renewable energy growth, displacing the need for stationary energy storage and decreasing reliance on firm [always-on] generators, such as natural gas, that are traditionally used to balance wind and solar intermittency,” says Jim Owens, lead author and a doctoral student in the MIT Department of Chemical Engineering. Additional authors include Emre Gençer, a principal research scientist at the MIT Energy Initiative (MITEI), and Ian Miller, a research specialist for MITEI at the time of the study.

    The group’s work is the first comprehensive, systems-based analysis of future power systems, drawing on a novel mix of computational models integrating such factors as carbon emission goals, variable renewable energy (VRE) generation, and costs of building energy storage, production, and transmission infrastructure.

    “We explored not just how EVs could provide service back to the grid — thinking of these vehicles almost like energy storage on wheels — but also the value of V2G applications to the entire energy system and if EVs could reduce the cost of decarbonizing the power system,” says Gençer. “The results were surprising; I personally didn’t believe we’d have so much potential here.”

    Displacing new infrastructure

    As the United States and other nations pursue stringent goals to limit carbon emissions, electrification of transportation has taken off, with the rate of EV adoption rapidly accelerating. (Some projections show EVs supplanting internal combustion vehicles over the next 30 years.) With the rise of emission-free driving, though, there will be increased demand for energy. “The challenge is ensuring both that there’s enough electricity to charge the vehicles and that this electricity is coming from renewable sources,” says Gençer.

    But solar and wind energy are intermittent. Without adequate backup for these sources, such as stationary energy storage facilities using lithium-ion batteries, for instance, or large-scale, natural gas- or hydrogen-fueled power plants, achieving clean energy goals will prove elusive. More vexing, the costs of building the necessary new energy infrastructure run to hundreds of billions of dollars.

    This is precisely where V2G can play a critical, and welcome, role, the researchers reported. In their case study of a theoretical New England power system meeting strict carbon constraints, for instance, the team found that participation from just 13.9 percent of the region’s 8 million light-duty (passenger) EVs displaced 14.7 gigawatts of stationary energy storage. This added up to $700 million in savings — the anticipated costs of building new storage capacity.
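    For a sense of scale, the study’s New England figures can be sanity-checked with simple arithmetic; note that the implied per-vehicle and per-kilowatt numbers below are our derivation from the quoted figures, not values stated in the paper.

```python
# Back-of-envelope check of the New England case study figures.
evs = 8_000_000            # light-duty EVs in the region (from the study)
participation = 0.139      # fraction participating in V2G (from the study)
displaced_gw = 14.7        # stationary storage displaced, GW (from the study)
savings_usd = 700_000_000  # avoided storage cost, $ (from the study)

participating = evs * participation              # vehicles providing V2G
kw_per_vehicle = displaced_gw * 1e6 / participating  # implied power each
usd_per_kw = savings_usd / (displaced_gw * 1e6)      # implied storage cost
print(round(participating), round(kw_per_vehicle, 1), round(usd_per_kw))
```

    The implied contribution of roughly 13 kW per participating vehicle is comparable to a typical home fast-charger rating, which is one reason the aggregate numbers are plausible.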

    Their paper also described the role EV batteries could play at times of peak demand, such as hot summer days. “V2G technology has the ability to inject electricity back into the system to cover these episodes, so we don’t need to install or invest in additional natural gas turbines,” says Owens. “The way that EVs and V2G can influence the future of our power systems is one of the most exciting and novel aspects of our study.”

    Modeling power

    To investigate the impacts of V2G on their hypothetical New England power system, the researchers integrated their EV travel and V2G service models with two of MITEI’s existing modeling tools: the Sustainable Energy System Analysis Modeling Environment (SESAME) to project vehicle fleet and electricity demand growth, and GenX, which models the investment and operation costs of electricity generation, storage, and transmission systems. They incorporated such inputs as different EV participation rates, costs of generation for conventional and renewable power suppliers, charging infrastructure upgrades, travel demand for vehicles, changes in electricity demand, and EV battery costs.

    Their analysis found benefits from V2G applications in power systems (in terms of displacing energy storage and firm generation) at all levels of carbon emission restrictions, including one with no emissions caps at all. However, their models suggest that V2G delivers the greatest value to the power system when carbon constraints are most aggressive — at 10 grams of carbon dioxide per kilowatt hour load. Total system savings from V2G ranged from $183 million to $1,326 million, reflecting EV participation rates between 5 percent and 80 percent.

    “Our study has begun to uncover the inherent value V2G has for a future power system, demonstrating that there is a lot of money we can save that would otherwise be spent on storage and firm generation,” says Owens.

    Harnessing V2G

    For scientists seeking ways to decarbonize the economy, the vision of millions of EVs parked in garages or in office spaces and plugged into the grid for 90 percent of their operating lives proves an irresistible provocation. “There is all this storage sitting right there, a huge available capacity that will only grow, and it is wasted unless we take full advantage of it,” says Gençer.

    This is not a distant prospect. Startup companies are currently testing software that would allow two-way communication between EVs and grid operators or other entities. With the right algorithms, EVs would charge from and dispatch energy to the grid according to profiles tailored to each car owner’s needs, never depleting the battery and endangering a commute.

    “We don’t assume all vehicles will be available to send energy back to the grid at the same time, at 6 p.m. for instance, when most commuters return home in the early evening,” says Gençer. He believes that the vastly varied schedules of EV drivers will make enough battery power available to cover spikes in electricity use over an average 24-hour period. And there are other potential sources of battery power down the road, such as electric school buses that are employed only for short stints during the day and then sit idle.

    The MIT team acknowledges the challenges of V2G consumer buy-in. While EV owners relish a clean, green drive, they may not be as enthusiastic about handing over access to their car’s battery to a utility or an aggregator working with power system operators. Policies and incentives would help.

    “Since you’re providing a service to the grid, much as solar panel users do, you could be paid for your participation, and paid at a premium when electricity prices are very high,” says Gençer.

    “People may not be willing to participate ’round the clock, but if we have blackout scenarios like in Texas last year, or hot-day congestion on transmission lines, maybe we can turn on these vehicles for 24 to 48 hours, sending energy back to the system,” adds Owens. “If there’s a power outage and people wave a bunch of money at you, you might be willing to talk.”

    “Basically, I think this comes back to all of us being in this together, right?” says Gençer. “As you contribute to society by giving this service to the grid, you will get the full benefit of reducing system costs, and also help to decarbonize the system faster and to a greater extent.”

    Actionable insights

    Owens, who is building his dissertation on V2G research, is now investigating the potential impact of heavy-duty electric vehicles in decarbonizing the power system. “The last-mile delivery trucks of companies like Amazon and FedEx are likely to be the earliest adopters of EVs,” Owens says. “They are appealing because they have regularly scheduled routes during the day and go back to the depot at night, which makes them very useful for providing electricity and balancing services in the power system.”

    Owens is committed to “providing insights that are actionable by system planners, operators, and to a certain extent, investors,” he says. His work might come into play in determining what kind of charging infrastructure should be built, and where.

    “Our analysis is really timely because the EV market has not yet been developed,” says Gençer. “This means we can share our insights with vehicle manufacturers and system operators — potentially influencing them to invest in V2G technologies, avoiding the costs of building utility-scale storage, and enabling the transition to a cleaner future. It’s a huge win, within our grasp.”

    The research for this study was funded by MITEI’s Future Energy Systems Center.

  • Machinery of the state

    In Mai Hassan’s studies of Kenya, she documented the emergence of a sprawling administrative network officially billed as encouraging economic development, overseeing the population, and bolstering democracy. But Hassan’s field interviews and archival research revealed a more sinister purpose for the hundreds of administrative and security offices dotting the nation: “They were there to do the presidents’ bidding, which often involved coercing their own countrymen.”

    This research served as a catalyst for Hassan, who joined MIT as an associate professor of political science in July, to investigate what she calls the “politicized management of bureaucracy and the state.” She set out to “understand the motivations, capacities, and roles of people administering state programs and social functions,” she says. “I realized the state is not a faceless being, but instead comprised of bureaucrats carrying out functions on behalf of the state and the regime that runs it.”

    Today, Hassan’s portfolio encompasses not just the bureaucratic state but democratization efforts in Kenya and elsewhere in the East Africa region, including her native Sudan. Her research highlights the difficulties of democratization. “I’m finding that the conditions under which people come together for overthrowing an autocratic regime really matter, because those conditions may actually impede a nation from achieving democracy,” she says.

    A coordinated bureaucracy

    Hassan’s academic engagement with the state’s administrative machinery began during graduate school at Harvard University, where she earned her master’s and doctorate in government. While working with a community trash and sanitation program in some Kenyan Maasai communities, Hassan recalls “shepherding myself from office to office, meeting different bureaucrats to obtain the same approvals but for different jurisdictions.” The Kenyan state had recently set up hundreds of new local administrative units, motivated by what it claimed was the need for greater efficiency. But to Hassan’s eyes, “the administrative network was not well organized, seemed costly to maintain, and seemed to hinder — not bolster — development,” she says. What then, she wondered, was “the political logic behind such state restructuring?”

    Hassan began researching this bureaucratic transformation of Kenya, speaking with administrators in communities large and small who were charged with handling the business of the state. These studies yielded a wealth of findings for her dissertation, and for multiple journals.

    But upon finishing this tranche of research, Hassan realized that it was insufficient simply to study the structure of the state. “Understanding the role of new administrative structures for politics, development, and governance fundamentally requires that we understand who the government has put in charge of them,” she says. Among her insights:

    “The president’s office knows a lot of these administrators, and thinks about their strengths, limitations, and fit within a community,” says Hassan. Some administrators served the purposes of the central government by setting up water irrigation projects or building a new school. But in other villages, the state chose administrators who could act “much more coercively, ignoring development needs, throwing youth who supported the opposition into jail, and spending resources exclusively on policing.”

    Hassan’s work showed that in communities characterized by strong political opposition, “the local administration was always more coercive, regardless of an elected or autocratic president,” she says. Notably, the tenures of such officials proved shorter than those of their peers. “Once administrators get to know a community — going to church and the market with residents — it’s hard to coerce them,” explains Hassan.

    These short tenures come with costs, she notes: “Spending significant time in a station is useful for development, because you know exactly whom to hire if you want to build a school or get something done efficiently.” Politicizing these assignments undermines efforts at delivery of services and, more broadly, economic improvement nationwide. “Regimes that are more invested in retaining power must devote resources to establishing and maintaining control, resources that could otherwise be used for development and the welfare of citizens,” she says.

Hassan wove together her research covering three presidents over a 50-year period in her book, “Regime Threats and State Solutions: Bureaucratic Loyalty and Embeddedness in Kenya” (Cambridge University Press, 2020), which was named a Foreign Affairs Best Book of 2020.

    Sudanese roots

    The role of the state in fulfilling the needs of its citizens has long fascinated Hassan. Her grandfather, who had served as Sudan’s ambassador to the USSR, talked to her about the advantages of a centralized government “that allocated resources to reduce inequality,” she says.

    Politics often dominated the conversation in gatherings of Hassan’s family and friends. Her parents immigrated to northern Virginia when she was very young, and many relatives joined them, part of a steady flow of Sudanese fleeing political turmoil and oppression.

    “A lot of people had expected more from the Sudanese state after independence and didn’t get it,” she says. “People had hopes for what the government could and should do.”

Hassan’s Sudanese roots and ongoing connection to the Sudanese community have shaped her academic interests and goals. At the University of Virginia, she gravitated toward history and economics classes. But it was her time at the Ralph Bunche Summer Institute that perhaps proved most pivotal in her journey. This five-week intensive program is offered by the American Political Science Association to introduce underrepresented undergraduate students to doctoral studies. “It was really compelling in this program to think rigorously about all the political ideas I’d heard as I was growing up, and find ways to challenge some assertions empirically,” she says.

    Regime change and civil society

    At Harvard, Hassan first set out to focus on Sudan for her doctoral program. “There wasn’t much scholarship on the country, and what there was lacked rigor,” she says. “That was something that needed to change.” But she decided to postpone this goal after realizing that she might be vulnerable as a student conducting field research there. She landed instead in Kenya, where she honed her interviewing and data collection skills.

Today, empowered by her prior work, she has returned to Sudan. “I felt that the popular uprising in Sudan and ousting of the Islamist regime in 2019 should be documented and analyzed,” she says. “It was incredible that hundreds of thousands, if not millions, acted collectively to uproot a dictator, in the face of brutal violence from the state.”

But “democracy is still uncertain there,” says Hassan. The broad coalition behind regime change “doesn’t know how to govern because different people and different sectors of society have different ideas about what democratic Sudan should look like,” she says. “Overthrowing an autocratic regime and having civil society come together to figure out what’s going to replace it require different things, and it’s unclear if a movement that accomplishes the first is well-suited to do the second.”

    Hassan believes that in order to create lasting democratization, “you need the hard work of building organizations, developing ways in which members learn to compromise among themselves, and make decisions and rules for how to move forward.”

    Hassan is enjoying the fall semester and teaching courses on autocracy and authoritarian regimes. She is excited as well about developing her work on African efforts at democratic mobilization in a political science department she describes as “policy-forward.”

    Over time, she hopes to connect with Institute scholars in the hard sciences to think about other challenges these nations are facing, such as climate change. “It’s really hot in Sudan, and it may be one of the first countries to become completely uninhabitable,” she says. “I’d like to explore strategies for growing crops differently or managing the exceedingly scarce resource of water, and figure out what kind of political discussions will be necessary to implement any changes. It is really critical to think about these problems in an interdisciplinary way.”


    Engineers solve a mystery on the path to smaller, lighter batteries

    A discovery by MIT researchers could finally unlock the door to the design of a new kind of rechargeable lithium battery that is more lightweight, compact, and safe than current versions, and that has been pursued by labs around the world for years.

    The key to this potential leap in battery technology is replacing the liquid electrolyte that sits between the positive and negative electrodes with a much thinner, lighter layer of solid ceramic material, and replacing one of the electrodes with solid lithium metal. This would greatly reduce the overall size and weight of the battery and remove the safety risk associated with liquid electrolytes, which are flammable. But that quest has been beset with one big problem: dendrites.

    Dendrites, whose name comes from the Greek word for tree, are branching projections of metal that can build up on the lithium surface and penetrate into the solid electrolyte, eventually crossing from one electrode to the other and shorting out the battery cell. Researchers haven’t been able to agree on what gives rise to these metal filaments, nor has there been much progress on how to prevent them and thus make lightweight solid-state batteries a practical option.

    The new research, being published today in the journal Joule in a paper by MIT Professor Yet-Ming Chiang, graduate student Cole Fincher, and five others at MIT and Brown University, seems to resolve the question of what causes dendrite formation. It also shows how dendrites can be prevented from crossing through the electrolyte.

    Chiang says in the group’s earlier work, they made a “surprising and unexpected” finding, which was that the hard, solid electrolyte material used for a solid-state battery can be penetrated by lithium, which is a very soft metal, during the process of charging and discharging the battery, as ions of lithium move between the two sides.

    This shuttling back and forth of ions causes the volume of the electrodes to change. That inevitably causes stresses in the solid electrolyte, which has to remain fully in contact with both of the electrodes that it is sandwiched between. “To deposit this metal, there has to be an expansion of the volume because you’re adding new mass,” Chiang says. “So, there’s an increase in volume on the side of the cell where the lithium is being deposited. And if there are even microscopic flaws present, this will generate a pressure on those flaws that can cause cracking.”

    Those stresses, the team has now shown, cause the cracks that allow dendrites to form. The solution to the problem turns out to be more stress, applied in just the right direction and with the right amount of force.

    While previously, some researchers thought that dendrites formed by a purely electrochemical process, rather than a mechanical one, the team’s experiments demonstrate that it is mechanical stresses that cause the problem.

    The process of dendrite formation normally takes place deep within the opaque materials of the battery cell and cannot be observed directly, so Fincher developed a way of making thin cells using a transparent electrolyte, allowing the whole process to be directly seen and recorded. “You can see what happens when you put a compression on the system, and you can see whether or not the dendrites behave in a way that’s commensurate with a corrosion process or a fracture process,” he says.

    The team demonstrated that they could directly manipulate the growth of dendrites simply by applying and releasing pressure, causing the dendrites to zig and zag in perfect alignment with the direction of the force.

    Applying mechanical stresses to the solid electrolyte doesn’t eliminate the formation of dendrites, but it does control the direction of their growth. This means they can be directed to remain parallel to the two electrodes and prevented from ever crossing to the other side, and thus rendered harmless.

    In their tests, the researchers used pressure induced by bending the material, which was formed into a beam with a weight at one end. But they say that in practice, there could be many different ways of producing the needed stress. For example, the electrolyte could be made with two layers of material that have different amounts of thermal expansion, so that there is an inherent bending of the material, as is done in some thermostats.

    Another approach would be to “dope” the material with atoms that would become embedded in it, distorting it and leaving it in a permanently stressed state. This is the same method used to produce the super-hard glass used in the screens of smart phones and tablets, Chiang explains. And the amount of pressure needed is not extreme: The experiments showed that pressures of 150 to 200 megapascals were sufficient to stop the dendrites from crossing the electrolyte.

    The required pressure is “commensurate with stresses that are commonly induced in commercial film growth processes and many other manufacturing processes,” so should not be difficult to implement in practice, Fincher adds.

    In fact, a different kind of stress, called stack pressure, is often applied to battery cells, by essentially squishing the material in the direction perpendicular to the battery’s plates — somewhat like compressing a sandwich by putting a weight on top of it. It was thought that this might help prevent the layers from separating. But the experiments have now demonstrated that pressure in that direction actually exacerbates dendrite formation. “We showed that this type of stack pressure actually accelerates dendrite-induced failure,” Fincher says.

    What is needed instead is pressure along the plane of the plates, as if the sandwich were being squeezed from the sides. “What we have shown in this work is that when you apply a compressive force you can force the dendrites to travel in the direction of the compression,” Fincher says, and if that direction is along the plane of the plates, the dendrites “will never get to the other side.”

    That could finally make it practical to produce batteries using solid electrolyte and metallic lithium electrodes. Not only would these pack more energy into a given volume and weight, but they would eliminate the need for liquid electrolytes, which are flammable materials.

    Having demonstrated the basic principles involved, the team’s next step will be to try to apply these to the creation of a functional prototype battery, Chiang says, and then to figure out exactly what manufacturing processes would be needed to produce such batteries in quantity. Though they have filed for a patent, the researchers don’t plan to commercialize the system themselves, he says, as there are already companies working on the development of solid-state batteries. “I would say this is an understanding of failure modes in solid-state batteries that we believe the industry needs to be aware of and try to use in designing better products,” he says.

    The research team included Christos Athanasiou and Brian Sheldon at Brown University, and Colin Gilgenbach, Michael Wang, and W. Craig Carter at MIT. The work was supported by the U.S. National Science Foundation, the U.S. Department of Defense, the U.S. Defense Advanced Research Projects Agency, and the U.S. Department of Energy.


    Earth can regulate its own temperature over millennia, new study finds

    The Earth’s climate has undergone some big changes, from global volcanism to planet-cooling ice ages and dramatic shifts in solar radiation. And yet life, for the last 3.7 billion years, has kept on beating.

    Now, a study by MIT researchers in Science Advances confirms that the planet harbors a “stabilizing feedback” mechanism that acts over hundreds of thousands of years to pull the climate back from the brink, keeping global temperatures within a steady, habitable range.

    Just how does it accomplish this? A likely mechanism is “silicate weathering” — a geological process by which the slow and steady weathering of silicate rocks involves chemical reactions that ultimately draw carbon dioxide out of the atmosphere and into ocean sediments, trapping the gas in rocks.

    Scientists have long suspected that silicate weathering plays a major role in regulating the Earth’s carbon cycle. The mechanism of silicate weathering could provide a geologically constant force in keeping carbon dioxide — and global temperatures — in check. But there’s never been direct evidence for the continual operation of such a feedback, until now.

    The new findings are based on a study of paleoclimate data that record changes in average global temperatures over the last 66 million years. The MIT team applied a mathematical analysis to see whether the data revealed any patterns characteristic of stabilizing phenomena that reined in global temperatures on a geologic timescale.

    They found that indeed there appears to be a consistent pattern in which the Earth’s temperature swings are dampened over timescales of hundreds of thousands of years. The duration of this effect is similar to the timescales over which silicate weathering is predicted to act.

    The results are the first to use actual data to confirm the existence of a stabilizing feedback, the mechanism of which is likely silicate weathering. This stabilizing feedback would explain how the Earth has remained habitable through dramatic climate events in the geologic past.

    “On the one hand, it’s good because we know that today’s global warming will eventually be canceled out through this stabilizing feedback,” says Constantin Arnscheidt, a graduate student in MIT’s Department of Earth, Atmospheric and Planetary Sciences (EAPS). “But on the other hand, it will take hundreds of thousands of years to happen, so not fast enough to solve our present-day issues.”

    The study is co-authored by Arnscheidt and Daniel Rothman, professor of geophysics at MIT.

    Stability in data

    Scientists have previously seen hints of a climate-stabilizing effect in the Earth’s carbon cycle: Chemical analyses of ancient rocks have shown that the flux of carbon in and out of Earth’s surface environment has remained relatively balanced, even through dramatic swings in global temperature. Furthermore, models of silicate weathering predict that the process should have some stabilizing effect on the global climate. And finally, the fact of the Earth’s enduring habitability points to some inherent, geologic check on extreme temperature swings.

    “You have a planet whose climate was subjected to so many dramatic external changes. Why did life survive all this time? One argument is that we need some sort of stabilizing mechanism to keep temperatures suitable for life,” Arnscheidt says. “But it’s never been demonstrated from data that such a mechanism has consistently controlled Earth’s climate.”

    Arnscheidt and Rothman sought to confirm whether a stabilizing feedback has indeed been at work, by looking at data of global temperature fluctuations through geologic history. They worked with a range of global temperature records compiled by other scientists, from the chemical composition of ancient marine fossils and shells, as well as preserved Antarctic ice cores.

    “This whole study is only possible because there have been great advances in improving the resolution of these deep-sea temperature records,” Arnscheidt notes. “Now we have data going back 66 million years, with data points at most thousands of years apart.”

    Speeding to a stop

    To the data, the team applied the mathematical theory of stochastic differential equations, which is commonly used to reveal patterns in widely fluctuating datasets.

    “We realized this theory makes predictions for what you would expect Earth’s temperature history to look like if there had been feedbacks acting on certain timescales,” Arnscheidt explains.

    Using this approach, the team analyzed the history of average global temperatures over the last 66 million years, considering the entire period over different timescales, such as tens of thousands of years versus hundreds of thousands, to see whether any patterns of stabilizing feedback emerged within each timescale.

    “To some extent, it’s like your car is speeding down the street, and when you put on the brakes, you slide for a long time before you stop,” Rothman says. “There’s a timescale over which frictional resistance, or a stabilizing feedback, kicks in, when the system returns to a steady state.”

    Without stabilizing feedbacks, fluctuations of global temperature should grow with timescale. But the team’s analysis revealed a regime in which fluctuations did not grow, implying that a stabilizing mechanism reined in the climate before fluctuations grew too extreme. The timescale for this stabilizing effect — hundreds of thousands of years — coincides with what scientists predict for silicate weathering.
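The signature described above — fluctuations that grow with timescale and then stop growing once a feedback kicks in — can be sketched numerically. The following is a minimal Python illustration, not the authors’ actual analysis or data: it simulates an Ornstein-Uhlenbeck process, the simplest stochastic differential equation with a stabilizing (damping) feedback, and measures the typical size of temperature changes at different lags. The timescale `tau` and noise level `sigma` are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ornstein-Uhlenbeck process: dT = -(T / tau) dt + sigma dW.
# The -(T / tau) term is the stabilizing feedback; without it,
# T would be a random walk and its fluctuations would grow forever.
dt = 1.0        # time step (arbitrary units, think "thousands of years")
tau = 100.0     # feedback (damping) timescale
sigma = 1.0     # noise amplitude
n = 200_000     # number of steps

T = np.zeros(n)
for i in range(1, n):
    T[i] = T[i - 1] - (T[i - 1] / tau) * dt \
           + sigma * np.sqrt(dt) * rng.standard_normal()

def fluctuation(series, lag):
    """RMS change over a given timescale (a simple structure function)."""
    d = series[lag:] - series[:-lag]
    return np.sqrt(np.mean(d ** 2))

# Fluctuations grow for lags shorter than tau, then plateau:
# the feedback pulls the system back before they get any larger.
print(f"RMS change over 10 steps:     {fluctuation(T, 10):.2f}")
print(f"RMS change over 10,000 steps: {fluctuation(T, 10_000):.2f}")
print(f"RMS change over 50,000 steps: {fluctuation(T, 50_000):.2f}")
```

Run on a pure random walk instead (drop the damping term), the same measure keeps growing like the square root of the lag; the plateau beyond `tau` is the data signature of a stabilizing feedback that the analysis looks for.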

    Interestingly, Arnscheidt and Rothman found that on longer timescales, the data did not reveal any stabilizing feedbacks. That is, there doesn’t appear to be any recurring pull-back of global temperatures on timescales longer than a million years. Over these longer timescales, then, what has kept global temperatures in check?

    “There’s an idea that chance may have played a major role in determining why, after more than 3 billion years, life still exists,” Rothman offers.

    In other words, as the Earth’s temperature fluctuates over longer stretches, those fluctuations may simply happen to remain small enough, in the geologic sense, for a stabilizing feedback such as silicate weathering to periodically rein the climate back in, and, more to the point, keep it within a habitable range.

    “There are two camps: Some say random chance is a good enough explanation, and others say there must be a stabilizing feedback,” Arnscheidt says. “We’re able to show, directly from data, that the answer is probably somewhere in between. In other words, there was some stabilization, but pure luck likely also played a role in keeping Earth continuously habitable.”

    This research was supported, in part, by a MathWorks fellowship and the National Science Foundation.


    Keeping indoor humidity levels at a “sweet spot” may reduce spread of Covid-19

    We know proper indoor ventilation is key to reducing the spread of Covid-19. Now, a study by MIT researchers finds that indoor relative humidity may also influence transmission of the virus.

    Relative humidity is the amount of moisture in the air compared to the total moisture the air can hold at a given temperature before saturating and forming condensation.

    In a study appearing today in the Journal of the Royal Society Interface, the MIT team reports that maintaining an indoor relative humidity between 40 and 60 percent is associated with relatively lower rates of Covid-19 infections and deaths, while indoor conditions outside this range are associated with worse Covid-19 outcomes. To put this into perspective, most people are comfortable between 30 and 50 percent relative humidity, and an airplane cabin is at around 20 percent relative humidity.

    The findings are based on the team’s analysis of Covid-19 data combined with meteorological measurements from 121 countries, from January 2020 through August 2020. Their study suggests a strong connection between regional outbreaks and indoor relative humidity.

    In general, the researchers found that whenever a region experienced a rise in Covid-19 cases and deaths pre-vaccination, the estimated indoor relative humidity in that region, on average, was either lower than 40 percent or higher than 60 percent regardless of season. Nearly all regions in the study experienced fewer Covid-19 cases and deaths during periods when estimated indoor relative humidity was within a “sweet spot” between 40 and 60 percent.

    “There’s potentially a protective effect of this intermediate indoor relative humidity,” suggests lead author Connor Verheyen, a PhD student in medical engineering and medical physics in the Harvard-MIT Program in Health Sciences and Technology.

    “Indoor ventilation is still critical,” says co-author Lydia Bourouiba, director of the MIT Fluid Dynamics of Disease Transmission Laboratory and associate professor in the departments of Civil and Environmental Engineering and Mechanical Engineering, and at the Institute for Medical Engineering and Science at MIT. “However, we find that maintaining an indoor relative humidity in that sweet spot — of 40 to 60 percent — is associated with reduced Covid-19 cases and deaths.”

    Seasonal swing?

    Since the start of the Covid-19 pandemic, scientists have considered the possibility that the virus’ virulence swings with the seasons. Infections and associated deaths appear to rise in winter and ebb in summer. But studies looking to link the virus’ patterns to seasonal outdoor conditions have yielded mixed results.

    Verheyen and Bourouiba examined whether Covid-19 is influenced instead by indoor — rather than outdoor — conditions, and, specifically, relative humidity. After all, they note, people in most societies spend more than 90 percent of their time indoors, where the majority of viral transmission has been shown to occur. What’s more, indoor conditions can be quite different from outdoor conditions as a result of climate control systems, such as heaters that significantly dry out indoor air.

    Could indoor relative humidity have affected the spread and severity of Covid-19 around the world? And could it help explain the differences in health outcomes from region to region?

    Tracking humidity

    For answers, the team focused on the early period of the pandemic when vaccines were not yet available, reasoning that vaccinated populations would obscure the influence of any other factor such as indoor humidity. They gathered global Covid-19 data, including case counts and reported deaths, from January 2020 to August 2020, and identified countries with at least 50 deaths, indicating at least one outbreak had occurred in those countries.

    In all, they focused on 121 countries where Covid-19 outbreaks occurred. For each country, they also tracked the local Covid-19 related policies, such as isolation, quarantine, and testing measures, and their statistical association with Covid-19 outcomes.

    For each day that Covid-19 data was available, they used meteorological data to calculate a country’s outdoor relative humidity. They then estimated the average indoor relative humidity, based on outdoor relative humidity and guidelines on temperature ranges for human comfort. For instance, guidelines report that humans are comfortable between 66 and 77 degrees Fahrenheit indoors. They also assumed that on average, most populations have the means to heat indoor spaces to comfortable temperatures. Finally, they also collected experimental data, which they used to validate their estimation approach.

    For every instance when outdoor temperatures were below the typical human comfort range, they assumed indoor spaces were heated to reach that comfort range. Based on the added heating, they calculated the associated drop in indoor relative humidity.
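The core of that estimation step can be sketched with standard psychrometrics, though the paper’s actual procedure is more involved. Heating outdoor air without adding moisture leaves the water-vapor partial pressure unchanged, so indoor relative humidity is the outdoor value scaled by the ratio of saturation vapor pressures at the two temperatures. The Magnus approximation and the example temperatures below are illustrative assumptions, not figures from the study:

```python
import math

def saturation_vapor_pressure(t_celsius):
    """Magnus approximation for saturation vapor pressure over water, in hPa."""
    return 6.112 * math.exp(17.62 * t_celsius / (243.12 + t_celsius))

def indoor_rh(outdoor_rh, t_out, t_in):
    """Estimate indoor relative humidity (%) when outdoor air at t_out is
    heated to t_in with no moisture added or removed: the vapor partial
    pressure stays fixed while the saturation pressure rises."""
    return outdoor_rh * (saturation_vapor_pressure(t_out)
                         / saturation_vapor_pressure(t_in))

# Illustrative winter case: 80% RH outdoors at 0 °C,
# heated to a comfortable 21 °C (~70 °F) indoors.
print(f"Estimated indoor RH: {indoor_rh(80.0, 0.0, 21.0):.1f}%")
```

In this example the heated indoor air lands around 20 percent relative humidity — well below the 40 to 60 percent window — which is the kind of heating-driven drop the estimation captures for cold climates.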

    In warmer times, outdoor and indoor relative humidity for each country were about the same, but the two quickly diverged in colder times. While outdoor humidity remained around 50 percent throughout the year, indoor relative humidity for countries in the Northern and Southern Hemispheres dropped below 40 percent in their respective colder periods, when Covid-19 cases and deaths also spiked in these regions.

    For countries in the tropics, relative humidity was about the same indoors and outdoors throughout the year, with a gradual rise indoors during the region’s summer season, when high outdoor humidity likely raised the indoor relative humidity over 60 percent. They found this rise mirrored the gradual increase in Covid-19 deaths in the tropics.

    “We saw more reported Covid-19 deaths on the low and high end of indoor relative humidity, and less in this sweet spot of 40 to 60 percent,” Verheyen says. “This intermediate relative humidity window is associated with a better outcome, meaning fewer deaths and a deceleration of the pandemic.”

    “We were very skeptical initially, especially as the Covid-19 data can be noisy and inconsistent,” Bourouiba says. “We thus were very thorough trying to poke holes in our own analysis, using a range of approaches to test the limits and robustness of the findings, including taking into account factors such as government intervention. Despite all our best efforts, we found that even when considering countries with very strong versus very weak Covid-19 mitigation policies, or wildly different outdoor conditions, indoor — rather than outdoor — relative humidity maintains an underlying strong and robust link with Covid-19 outcomes.”

    It’s still unclear how indoor relative humidity affects Covid-19 outcomes. The team’s follow-up studies suggest that pathogens may survive longer in respiratory droplets in both very dry and very humid conditions.

    “Our ongoing work shows that there are emerging hints of mechanistic links between these factors,” Bourouiba says. “For now however, we can say that indoor relative humidity emerges in a robust manner as another mitigation lever that organizations and individuals can monitor, adjust, and maintain in the optimal 40 to 60 percent range, in addition to proper ventilation.”

    This research was made possible, in part, by an MIT Alumni Class fund, the Richard and Susan Smith Family Foundation, the National Institutes of Health, and the National Science Foundation.