More stories

  •

    Why the Earth needs a course correction now

    The massive impact of the Covid-19 pandemic on lives and economies underscores that our collective survival and well-being hinge on our willingness to confront environmental threats with global consequences. Key to protecting lives and making communities more resilient to such threats will be an emphasis on proactive, science-based decision-making at all levels of society. And among the most serious risks that science can help illuminate and alleviate are those resulting from human-induced climate change.

    To minimize those risks, the Paris Agreement commits nearly 200 nations to implement greenhouse gas emissions-reduction policies consistent with keeping the increase in the global average temperature since preindustrial times to well below 2 degrees Celsius, and to pursue efforts to further limit that increase to 1.5 C. Recognizing that the first set of submitted near-term Paris pledges, known as Nationally Determined Contributions (NDCs), is inadequate by itself to put the globe on track to meet those long-term targets and thus avoid the worst consequences of climate change, the accord calls for participating nations to strengthen their NDCs over time. To that end, the United States and a few other nations announced more stringent emissions-reduction goals for 2030 at the virtual climate summit convened by President Joe Biden in April.

    To support decision-makers now engaged in or impacted by this ongoing, international effort to stabilize the climate, the MIT Joint Program on the Science and Policy of Global Change has released its 2021 Global Change Outlook. Based on a rigorous, integrated analysis of population and economic growth, technological change, NDCs, Covid-19 impacts, and other factors, the report presents the Joint Program’s latest projections for the future of the Earth’s energy, food, water, and climate systems, as well as prospects for achieving the Paris Agreement’s short- and long-term climate goals.

    Projections are provided for a baseline “Paris Forever” scenario, in which current (as of March 2021) NDCs are maintained in perpetuity; a Paris 2 C scenario that caps global warming at 2 C by 2100; and two scenarios — “Accelerated Actions” (which includes the newly announced U.S. goal for 2030) and Paris 1.5 C — which limit warming to 1.5 C by 2100. Uncertainty is quantified using 400-member ensembles of projections for each scenario. This year’s outlook introduces a visualization tool that enables a higher-resolution exploration of the first three scenarios.
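The uncertainty quantification described above can be sketched in a few lines: given a 400-member ensemble of warming projections for one scenario, summary statistics such as the median, a 5-95 percent range, and threshold-exceedance probabilities fall out directly. The numbers below are synthetic stand-ins, not the Joint Program's actual model output.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a 400-member ensemble of year-2100 warming
# projections (degrees C above preindustrial) under one scenario.
# The distribution parameters are illustrative, not the report's.
ensemble = rng.normal(loc=2.8, scale=0.5, size=400)

median = np.median(ensemble)
p05, p95 = np.percentile(ensemble, [5, 95])
prob_above_2c = np.mean(ensemble > 2.0)  # fraction of members above 2 C

print(f"median warming: {median:.2f} C")
print(f"5-95% range:    {p05:.2f} to {p95:.2f} C")
print(f"P(warming > 2 C) = {prob_above_2c:.2f}")
```

The same percentile bands, computed per year rather than for a single horizon, are what an ensemble visualization tool typically plots as shaded uncertainty envelopes.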

    Energy

    More aggressive emissions-reduction policies would accelerate a shift away from fossil fuels and toward renewable energy sources between now and 2050.

    Under the Paris Forever scenario, the share of fossil fuels in the world’s energy mix drops during this period from about 80 percent to 70 percent, wind and solar expand nearly six-fold and natural gas by 50 percent, and electric vehicles (EVs) account for 38 percent of the light-duty vehicle (LDV) fleet. In the Paris 2 C scenario, the fossil fuel share drops to about 50 percent, wind and solar energy grow almost nine times and natural gas use expands by 25 percent, and EVs account for 50 percent of the global LDV fleet. The Accelerated Actions scenario squeezes out fossil fuels further and makes two-thirds of global LDVs electric.  

    “Electricity generation from renewable sources becomes a dominant source of power by 2050 in all scenarios, providing 70-80 percent of global power generation by mid-century in the climate stabilization scenarios,” says Joint Program Deputy Director Sergey Paltsev, a lead author of the report. “Climate policies essentially eliminate coal-based generation, while natural gas still keeps a sizeable share because of the need to support variable renewables. Resolving long-term energy storage issues is critical to full decarbonization.”

    Food and water

    Under the Paris Forever scenario, agriculture and food production will keep growing. This will increase pressure for land-use change, water use, and use of energy-intensive inputs, which will also lead to higher greenhouse gas (GHG) emissions. The Paris 2 C scenario shows low impacts on agriculture and food production trends by mid-century. Although economic growth tends to shift demand toward more protein-rich food sources, higher carbon costs associated with livestock production drive demand for those products downward, lowering their prices, and these impacts are transmitted to the rest of the food sector.

    The Paris Forever scenario indicates that more than half of the world’s population will undergo stresses on its water supply by 2050, and that three of every 10 people will live in water basins facing compounding societal and environmental pressures on water resources. The majority of expected increases in population under heightened water stress by mid-century cannot be avoided or reduced by climate mitigation efforts alone. Worldwide increases in population, economic growth, and associated water demands are largely a challenge of sustainability — one that can only be alleviated through widespread transformations of water systems’ storage capacity, conveyance, and water-use efficiencies.

    Climate and Paris goals

    The outlook shows a wide gap between current (as of March 2021) GHG emissions-reduction commitments and those needed to put the world on track to meet the Paris Agreement’s long-term climate goals.

    “Our projected global climate responses under the Paris Forever scenario indicate with near-certainty that the world will surpass critical GHG concentration thresholds and climate targets in the coming decades,” says Joint Program Deputy Director C. Adam Schlosser, a lead author of the report.

    Under Paris Forever, the world is likely to exceed 2 C global climate warming by 2065, 2.8 C by 2100, and 4.1 C by 2150. While many countries have made good progress toward their NDCs and declared more ambitious GHG emissions mitigation goals, financing to assist the least-developed countries in sustainable development is not forthcoming at the levels needed.

    The report’s projections indicate that the long-term climate targets of the Paris Agreement remain achievable, but come with different levels of risk. The Paris 2 C scenario shows negligible likelihood of even the “coolest” trajectories remaining below 1.5 C at the end of the century. The Paris 1.5 C scenario, however, can virtually assure the world of remaining below 2 C of global warming.

    An important consequence of climate change is altered precipitation levels. Between now and 2050 under Paris Forever, global precipitation will likely increase by about 1.5 centimeters per year — approximately an additional 7,400 cubic kilometers (or nearly 2 quadrillion gallons) each year. By 2100, the total change in precipitation will most likely rise to about 4 cm/year (or 21,200 km³/yr) — nearly triple the mid-century change. Paris 2 C halves global precipitation increases, and Paris 1.5 C reduces them to almost a third of the Paris Forever increases. These aggressive mitigation scenarios deliver considerable reductions in flood risk and associated adaptation costs.
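The volume figures above can be sanity-checked with a back-of-envelope calculation, assuming (as a simplification) that the precipitation increase falls uniformly over Earth's roughly 510 million square kilometers of surface. The results land within a few percent of the report's numbers.

```python
EARTH_SURFACE_KM2 = 510e6          # ~510 million square kilometers

def precip_volume_km3(depth_cm_per_yr):
    """Water volume from a uniform depth increase over Earth's surface."""
    depth_km = depth_cm_per_yr / 1e5   # 1 km = 100,000 cm
    return depth_km * EARTH_SURFACE_KM2

GALLONS_PER_KM3 = 1e12 / 3.785     # 1 km^3 = 1e12 liters; 3.785 L/gallon

mid_century = precip_volume_km3(1.5)   # vs. ~7,400 km^3/yr in the report
end_century = precip_volume_km3(4.0)   # vs. ~21,200 km^3/yr in the report

print(f"1.5 cm/yr -> {mid_century:,.0f} km^3/yr "
      f"(~{mid_century * GALLONS_PER_KM3:.1e} gallons)")
print(f"4.0 cm/yr -> {end_century:,.0f} km^3/yr")
```

The small gap from the published values likely reflects a more careful surface-area treatment in the underlying model; the "nearly 2 quadrillion gallons" figure checks out.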

    Reducing climate risk

    For the first time, the outlook explores two well-known sets of risks posed by climate change. Research highlighted in this report indicates that elevated climate-related physical risks will continue to evolve by mid-century, along with heightened transition risks that arise from shifts in the political, technological, social, and economic landscapes that are likely to occur during the transition to a low-carbon economy.

    “Our outlook shows that we could dramatically reduce overall climate risk through more ambitious and accelerated policy measures and investments aligned with meeting the Paris Agreement’s long-term 1.5 C or 2 C climate targets,” says MIT Joint Program Director Ronald Prinn. “Decision-makers in government, industry, and financial institutions can play a key role in moving us further along this path.”

  •

    MITEI researchers build a supply chain model to support the hydrogen economy

    Over the past decades, the need for carbon-free energy has driven increasing interest in hydrogen as an environmentally clean fuel. But shifting the economy away from fossil fuels to clean-burning hydrogen will require significant adjustments in current supply chains. To facilitate this transition, an MIT-led team of researchers has developed a new hydrogen supply chain planning model.

    “We propose flexible scheduling for trucks and pipelines, allowing them to serve as both storage and transmission,” says Guannan He, a postdoc at the MIT Energy Initiative (MITEI) and lead author of a recent paper published by IEEE Transactions on Sustainable Energy. “This is very important for green hydrogen produced from intermittent renewables, because this can provide extra flexibility to meet variability in supply and demand.”

    Hydrogen has been widely recognized as a promising path to decarbonizing many sectors of the economy because it packs in more energy by weight than even gasoline or natural gas, yet generates zero emissions when used as an energy source. Producing hydrogen, however, can generate significant emissions. According to the U.S. Office of Energy Efficiency and Renewable Energy, 95 percent of the hydrogen produced today is generated through steam methane reforming (SMR), an energy-intensive process in which methane reacts with water to produce hydrogen and carbon monoxide. A secondary part of this process adds steam to the cooled gas to convert carbon monoxide to carbon dioxide (CO2) and produce more hydrogen.
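A quick stoichiometry check makes the emissions problem concrete. Combining the two SMR steps described above gives the overall reaction CH4 + 2 H2O → CO2 + 4 H2, which sets a floor on the CO2 produced per kilogram of hydrogen; actual plants emit more once the energy needed to drive the process is counted.

```python
# Stoichiometric CO2 burden of steam methane reforming, from the
# overall reaction described above:  CH4 + 2 H2O -> CO2 + 4 H2
M_CO2 = 44.01   # molar mass of CO2, g/mol
M_H2 = 2.016    # molar mass of H2, g/mol

# One mole of CO2 is produced for every four moles of H2.
co2_per_kg_h2 = M_CO2 / (4 * M_H2)

print(f"{co2_per_kg_h2:.2f} kg CO2 per kg H2 from chemistry alone")
```

This ~5.5 kg/kg figure is the chemical minimum; it is why "gray" hydrogen from SMR carries a substantial carbon footprint even before process heat is accounted for.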

    Ultimately, hydrogen production today accounts for about 4 percent of CO2 emissions globally, says He, and that number will rise significantly if hydrogen becomes popular as a fuel for electric vehicles and such industrial processes as steel refining and ammonia production. Realizing the vision of creating an entirely decarbonized hydrogen economy therefore depends on using renewable energy to produce hydrogen, a task often accomplished through electrolysis, a process that extracts hydrogen from water electrochemically.

    However, using renewable energy requires storage to move energy from times and places with peak generation to those with peak demand. And, storage is expensive.

    The researchers expanded their thinking about storage to address this key concern: They used trucks in their model as a means of both fuel transmission and storage — since hydrogen can be readily stored in idled trucks. This tactic reduces costs in the hydrogen supply chain by about 9 percent by bringing down the need for other storage solutions, says He. “We found it very important to use the trucks in this way,” says He. “It can reduce the cost of the system and encourage renewable-based hydrogen production, instead of gas-based production.”

    Developing the model

    Previous studies have attempted to assess the potential benefit of hydrogen storage in power systems, but they have not considered infrastructure investment needs from the perspective of a whole hydrogen supply chain, He says. And such work is critical to enabling a hydrogen economy.

    For the new model, the research team — He; MITEI research scientists Emre Gençer and Dharik Mallapragada; Abhishek Bose, an MIT master’s student in technology and policy; and Clara F. Heuberger, a researcher at Shell Global Solutions International B.V. — adopted the perspective of a central planner interested in minimizing system costs and maximizing societal benefit. The researchers looked at costs associated with the four main steps in the hydrogen supply chain: production, storage, compression, and transmission. “Unless we take a holistic approach to analyzing the entire supply chain, it is hard to determine the prospects for hydrogen. This work fills that gap in the literature,” Gençer says.

    To ensure their model was as comprehensive as possible, the researchers included a wide range of hydrogen-related technologies, including SMR with and without carbon capture and storage, hydrogen transport as a gas or liquid, and transmission via pipeline and trucks. “We have developed a scalable modeling and decision-making tool for a hydrogen supply chain that fully captures the flexibility of various resources as well as components,” Gençer says.

    While considering all options, in the end the researchers found that pipelines were a less flexible option than trucks for transmission (although retrofitting gas pipelines could make hydrogen pipelines cost-effective for some uses), and trucking hydrogen gas was less expensive than trucking hydrogen in liquid form, since liquefaction has much higher energy consumption and capital costs than gas compression.

    They then proposed a flexible scheduling and routing model for hydrogen trucks that would enable the vehicles to be used as both transmission and storage, as needed. Computationally, this was a particularly challenging step, according to He. “This is a very complex optimization model,” he says. “We propose some techniques to reduce the complexity of the model.”

    The team chose to use judicious approximations for the number of trucks in the system and the needed commitment of SMR units, applying clustering and integer relaxation techniques. This enabled them to greatly improve the computational performance of their program without significantly impacting results in terms of cost and investment outcomes.
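The integer-relaxation idea can be illustrated with a toy, single-route example (all numbers below are invented, not from the paper): whole-truck counts are relaxed to fractional values, which makes the optimization far cheaper at scale, and a feasible plan is recovered afterward by rounding up.

```python
import math

# Hypothetical hydrogen demand per period on one route, and a
# fixed per-truck capacity (both values are illustrative only).
flows_kg = [1200, 800, 1500, 950]
truck_capacity_kg = 500

# Exact integer model: a whole number of trucks each period.
integer_trucks = [math.ceil(f / truck_capacity_kg) for f in flows_kg]

# Relaxed model: fractional truck counts -- a lower bound on cost
# that is much easier to optimize inside a large supply chain model.
relaxed_trucks = [f / truck_capacity_kg for f in flows_kg]

# Round the relaxed solution up to recover a feasible truck plan.
recovered = [math.ceil(t) for t in relaxed_trucks]

print("integer  :", integer_trucks, "-> total", sum(integer_trucks))
print("relaxed  :", [round(t, 2) for t in relaxed_trucks],
      "-> total", round(sum(relaxed_trucks), 2))
print("recovered:", recovered)
```

In a full model the relaxed problem is a linear program embedded among many other constraints; the gap between the relaxed bound and the rounded plan indicates how much optimality is sacrificed for tractability.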

    Case study of the Northeast

    Once the model was built, the researchers tested it by exploring the future hydrogen infrastructure needs of the U.S. Northeast under various carbon policy and hydrogen demand scenarios. Using 20 representative weeks from seven years of data, they simulated annual operations and determined the optimal mix of hydrogen infrastructure types given different carbon prices and the capital costs of electrolyzers.
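One common way to select representative weeks (the study's exact method is not described here) is to cluster the candidate weekly profiles and keep the real week nearest each cluster center. A minimal sketch with synthetic data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in: 7 years x 52 weeks of hourly profiles
# (168 hours per week). Real inputs would be demand/renewables data.
weeks = rng.random((364, 168))
k = 20  # number of representative weeks to keep

# A few iterations of plain k-means clustering.
centers = weeks[rng.choice(len(weeks), k, replace=False)]
for _ in range(10):
    dists = ((weeks[:, None, :] - centers[None]) ** 2).sum(-1)
    labels = np.argmin(dists, axis=1)
    for j in range(k):
        if (labels == j).any():
            centers[j] = weeks[labels == j].mean(0)

# Medoid = the actual historical week closest to each cluster center.
medoids = [int(np.argmin(((weeks - c) ** 2).sum(1))) for c in centers]
print(f"{len(set(medoids))} representative weeks selected")
```

Simulating annual operations over these medoid weeks, with appropriate weights, approximates a full chronological simulation at a fraction of the computational cost.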

    “We showed that steam methane reforming of natural gas with carbon capture will constitute a significant fraction of hydrogen production and production capacity even under very high carbon price scenarios,” Gençer says.

    However, He says the results also suggest there is real synergy between the use of electrolysis for hydrogen generation and the use of compressed-gas trucks for transmission and storage. This finding is important, he explains, because “once we invest in these assets, we cannot easily switch to others.”

    He adds that trucks are a significantly more flexible investment than stationary infrastructure, such as pipes and transmission lines; trucks can easily be rerouted to serve new energy-generation facilities and new areas of demand, or even be left sitting to provide storage until more transmission capacity is needed. By comparison, building new electricity transmission lines or pipelines takes time — and they cannot be quickly adapted to changing needs.

    “You have more renewables integrated into the system every day. People are installing rooftop solar panels, so you need more assets to transmit energy to other parts of the system,” He says, explaining that a flexible supply chain can make the most of renewable generation. “A transmission line can take 10 years to build, during which time those renewables cannot be used as well. Using smaller-scale, distributed, portable storage or mobile storage can solve this problem in a timely manner.”

    Indeed, He and other colleagues recently conducted related research into the potential application of utility-scale portable energy storage in California. In a paper published in Joule in February, they showed that mobilizing energy storage can significantly increase revenues from storage in many regions and improve renewable energy integration. “It’s more flexible” than such stationary solutions as additional grid capacity, He says. “When you don’t need mobile storage anymore, you can convert it into stationary storage.”

    Now that He and his colleagues have created their hydrogen supply chain planning model, the next step, according to He, is to provide planners with broad access to the tool. “We are developing open-source code so people can use it to develop optimal assets for different sectors,” He says. “We are trying to make the model better.”

    This research was supported by Shell New Energies Research and Technology and the MIT Energy Initiative Low-Carbon Energy Centers for Electric Power Systems and Carbon Capture, Utilization, and Storage. The research reported in Joule was supported by the National Natural Science Foundation of China and a grant from the U.S. Department of Energy.

  •

    Tiny particles power chemical reactions

    MIT engineers have discovered a new way of generating electricity using tiny carbon particles that can create a current simply by interacting with liquid surrounding them.

    The liquid, an organic solvent, draws electrons out of the particles, generating a current that could be used to drive chemical reactions or to power micro- or nanoscale robots, the researchers say.

    “This mechanism is new, and this way of generating energy is completely new,” says Michael Strano, the Carbon P. Dubbs Professor of Chemical Engineering at MIT. “This technology is intriguing because all you have to do is flow a solvent through a bed of these particles. This allows you to do electrochemistry, but with no wires.”

    In a new study describing this phenomenon, the researchers showed that they could use this electric current to drive a reaction known as alcohol oxidation — an organic chemical reaction that is important in the chemical industry.

    Strano is the senior author of the paper, which appears today in Nature Communications. The lead authors of the study are MIT graduate student Albert Tianxiang Liu and former MIT researcher Yuichiro Kunai. Other authors include former graduate student Anton Cottrill, postdocs Amir Kaplan and Hyunah Kim, graduate student Ge Zhang, and recent MIT graduates Rafid Mollah and Yannick Eatmon.

    Unique properties

    The new discovery grew out of Strano’s research on carbon nanotubes — hollow tubes made of a lattice of carbon atoms, which have unique electrical properties. In 2010, Strano demonstrated, for the first time, that carbon nanotubes can generate “thermopower waves.” When a carbon nanotube is coated with a layer of fuel, moving pulses of heat, or thermopower waves, travel along the tube, creating an electrical current.

    That work led Strano and his students to uncover a related feature of carbon nanotubes. They found that when part of a nanotube is coated with a Teflon-like polymer, it creates an asymmetry that makes it possible for electrons to flow from the coated to the uncoated part of the tube, generating an electrical current. Those electrons can be drawn out by submerging the particles in a solvent that is hungry for electrons.

    To harness this special capability, the researchers created electricity-generating particles by grinding up carbon nanotubes and forming them into a sheet of paper-like material. One side of each sheet was coated with a Teflon-like polymer, and the researchers then cut out small particles, which can be any shape or size. For this study, they made particles that were 250 microns by 250 microns.

    When these particles are submerged in an organic solvent such as acetonitrile, the solvent adheres to the uncoated surface of the particles and begins pulling electrons out of them.

    “The solvent takes electrons away, and the system tries to equilibrate by moving electrons,” Strano says. “There’s no sophisticated battery chemistry inside. It’s just a particle and you put it into solvent and it starts generating an electric field.”

    “This research cleverly shows how to extract the ubiquitous (and often unnoticed) electric energy stored in an electronic material for on-site electrochemical synthesis,” says Jun Yao, an assistant professor of electrical and computer engineering at the University of Massachusetts at Amherst, who was not involved in the study. “The beauty is that it points to a generic methodology that can be readily expanded to the use of different materials and applications in different synthetic systems.”

    Particle power

    The current version of the particles can generate about 0.7 volts of electricity per particle. In this study, the researchers also showed that they can form arrays of hundreds of particles in a small test tube. This “packed bed” reactor generates enough energy to power a chemical reaction called an alcohol oxidation, in which an alcohol is converted to an aldehyde or a ketone. Usually, this reaction is not performed using electrochemistry because it would require too much external current.

    “Because the packed bed reactor is compact, it has more flexibility in terms of applications than a large electrochemical reactor,” Zhang says. “The particles can be made very small, and they don’t require any external wires in order to drive the electrochemical reaction.”

    In future work, Strano hopes to use this kind of energy generation to build polymers using only carbon dioxide as a starting material. In a related project, he has already created polymers that can regenerate themselves using carbon dioxide as a building material, in a process powered by solar energy. This work is inspired by carbon fixation, the set of chemical reactions that plants use to build sugars from carbon dioxide, using energy from the sun.

    In the longer term, this approach could also be used to power micro- or nanoscale robots. Strano’s lab has already begun building robots at that scale, which could one day be used as diagnostic or environmental sensors. The idea of being able to scavenge energy from the environment to power these kinds of robots is appealing, he says.

    “It means you don’t have to put the energy storage on board,” he says. “What we like about this mechanism is that you can take the energy, at least in part, from the environment.”

    The research was funded by the U.S. Department of Energy and a seed grant from the MIT Energy Initiative.

  •

    Exploring the future of humanitarian technology

    The year 2030 is the target date for the United Nations’ Agenda for Sustainable Development. The agenda, adopted in 2015 by all UN member states including the United States, mobilizes global efforts to protect the planet, end poverty, foster peace, and safeguard the rights of all people. Nine years out from that date, the agenda’s sustainable development goals remain ambitious, and as relevant as ever.

    MIT Lincoln Laboratory has been growing its efforts to provide technology solutions in support of such goals. “We need to discuss innovative ways that advanced technology can address some of these most pressing humanitarian, climate, and health challenges,” says Jon Pitts, who leads Lincoln Laboratory’s Humanitarian Assistance and Disaster Relief Systems Group.

    To help foster these discussions, Pitts and Mischa Shattuck, who serves as the senior humanitarian advisor at Lincoln Laboratory, recently launched a new lecture series, called the Future of Humanitarian Technology.

    In the inaugural session on April 28, Lincoln Laboratory researchers presented three topics inherently linked to each other — those of climate change, disaster response, and global health. The webinar was free and open to the public.


    The Future of Humanitarian Technology: MIT Lincoln Laboratory hosted a seminar exploring climate change, disaster response, and global health technology and how these areas might look ten years from now.

    Accelerating sustainable technology

    Deb Campbell, a senior staff member in the HADR Systems Group, started the session with a discussion of how to accelerate the national and global response to climate change.

    “Because the timeline is so short and challenges so complex, it is essential to make good, evidence-based decisions on how to get to where we need to go,” she said. “We call this approach systems analysis and architecture, and by taking this approach we can create a national climate change resilience roadmap.”

    This roadmap implements more of what we already know how to do, for example utilizing wind and solar energy, and identifies gaps where research and development are needed to reach specific goals. One example is the transition to a fully zero-emission vehicle (ZEV) fleet in the United States in the coming decades; California has already directed that all of the state’s new car sales be ZEV by 2035. Systems analysis indicates that achieving this “fleet turnover” will require improved electric grid infrastructure, more charging stations, batteries with higher capacity and faster charging, and greener fuels as the transition is made from combustion engines.
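A minimal fleet-turnover sketch shows why the systems analysis points to decades of supporting infrastructure work: even with 100 percent ZEV sales by 2035, the on-road fleet converts only as fast as old cars retire. All parameters below are illustrative assumptions, not figures from the laboratory's study.

```python
# Assumed average vehicle lifetime, in years (illustrative).
VEHICLE_LIFE = 15

# ZEV share of each model-year cohort currently on the road;
# start from a fleet with no ZEVs.
fleet = [0.0] * VEHICLE_LIFE

def zev_sales_share(year):
    """Assumed linear ramp: 10% ZEV sales in 2021 -> 100% by 2035."""
    return min(1.0, 0.10 + 0.90 * (year - 2021) / (2035 - 2021))

for year in range(2021, 2051):
    fleet.pop(0)                          # oldest cohort retires
    fleet.append(zev_sales_share(year))   # new sales cohort enters
    if year in (2035, 2045, 2050):
        share = sum(fleet) / len(fleet)
        print(f"{year}: fleet is {share:.0%} ZEV")
```

Under these assumptions, the fleet is only about half ZEV in the year sales reach 100 percent, and full turnover takes until mid-century, which is why charging, grid, and battery investments must lead the sales mandate rather than follow it.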

    Campbell also stressed the importance of using regional proving grounds to accelerate the transition of new technologies across the country and globe. These proving grounds refer to areas where climate-related prototypes can be evaluated under the pressures of real-world conditions. For example, the Northeast has older, stressed energy infrastructure that needs upgrading to meet future demand, and is the most natural place to begin implementing and testing new systems. The Southwest, which faces water shortages, can test technologies for even more efficient use of water resources and ways to harvest water from air. Today, Campbell and her team are conducting a study to investigate a regional proving ground concept in Massachusetts.

    “We will need to continuously assess technology development and drive investments to meet these aggressive timelines,” Campbell added.

    Improving disaster response

    The United States experiences more natural disasters than any other country in the world and has spent $800 billion in the last 10 years on recovery, which on average takes seven years.

    “At the core of disaster support is information,” said Chad Council, also a researcher in the HADR Systems Group. “Knowing where impacts are and the severity of those impacts drives decisions on the quantity and type of support. This can lay the groundwork for a successful recovery … We know that the current approach is too slow and costly for years to come.”

    By 2030, Council contends that the government could save lives and reduce costs by leveraging a national remote sensing platform for disaster response. It would use an open architecture that integrates advanced sensor data, field data, modeling, and analytics driven by artificial intelligence to deliver critical information in a standard way to emergency managers across the country. This platform could allow for highly accurate virtual site inspections, wide-area search and rescue, determination of road damage at city-wide scales, and debris quantification.

    “To be clear, there’s no one-size-fits-all sensor platform. Some systems are good for a large-scale disaster, but for a small disaster, it might be faster for a local transportation department to fly a small drone to image damage,” Council said. “The key is that if this national platform is developed to produce the same data that local governments are used to, then this platform will be familiar and trustworthy when that level of disaster response is needed.”

    Over the next two years, the team plans to continue to work with the Federal Emergency Management Agency, the U.S. National Guard, national laboratories, and academia on this open architecture. In parallel, a prototype remote sensing asset will be shared across state and local governments to gain enthusiasm and trust. According to Council, a national remote sensing strategy for disaster response could be employed by the end of 2029.

    Predicting disease outbreaks

    Kajal Claypool, a senior staff member in the Biological and Chemical Technologies Group, concluded with a discussion on using artificial intelligence to predict and mitigate the spread of disease.

    She asked the audience to fast-forward nine years and imagine a convergence of three global health disasters: a new variant of Covid-30 spreading across the globe, vector-borne diseases surging in Central and South America, and the first Ebola carrier flying into Atlanta. “Well, what if we were able to bring together data from existing surveillance systems, social media, environmental conditions, weather, political unrest, and migration, and use AI analytics to predict an outbreak down to a geolocation, and that first carrier never gets on the airplane?” she asked. “None of these are a far stretch.”

    Artificial intelligence has been used to tackle some of these ideas, but the solutions are one-offs and siloed, Claypool said. One of the greatest impediments to using AI tools to solve global health challenges is harmonizing data, the process of bringing together data of varying semantics and file formats and transforming it into one cohesive dataset.
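As a toy illustration of that harmonization step, consider two hypothetical surveillance feeds (all field names and values below are invented) that report the same kind of case counts under different schemas and date formats, mapped onto one unified record shape:

```python
from datetime import datetime

# Two hypothetical feeds with mismatched schemas: different field
# names, a numeric vs. string case count, and different date formats.
feed_a = [{"district": "Kisumu", "malaria_cases": "42",
           "date": "2021-06-01"}]
feed_b = [{"region": "Mopti", "cases": 17, "reported": "01/06/2021"}]

def harmonize(record, mapping, date_fmt):
    """Map one source record onto the shared (location, cases, date) schema."""
    return {
        "location": record[mapping["location"]],
        "cases": int(record[mapping["cases"]]),
        "date": datetime.strptime(record[mapping["date"]], date_fmt).date(),
    }

unified = (
    [harmonize(r, {"location": "district", "cases": "malaria_cases",
                   "date": "date"}, "%Y-%m-%d") for r in feed_a]
    + [harmonize(r, {"location": "region", "cases": "cases",
                     "date": "reported"}, "%d/%m/%Y") for r in feed_b]
)
print(unified)
```

Real harmonization also has to reconcile coding systems, units, and missing data, which is why it dominates the effort in cross-border health analytics; but the per-source mapping pattern is the same.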

    “We believe the right solution is to build a federated, open, and secure data platform where data can be shared across stakeholders and nations without loss of control at the nation, state, or stakeholder level,” Claypool said. “These siloes must be broken down and capabilities available for low- and middle-income nations.”

    Over the next few years, the laboratory team aims to develop this global health AI platform, building it one disease and one region at a time. The proof of concept will start with malaria, which kills 1.2 million people annually. While there are a number of interventions available today to fight malaria outbreaks, including vaccines, Claypool said that predicting hot spots and providing the decision support needed to intervene are essential. The next major milestone would be to provide data-driven diagnostics and interventions across the globe for other disease conditions.

    “It’s an ambitious but achievable vision. It needs the right partnerships, trust, and vision to make this a reality, and reduce transmission of disease and save lives globally,” she said.

    Addressing humanitarian challenges is a growing R&D focus at Lincoln Laboratory. Last fall, the organization established a new research division, Biotechnology and Human Systems, to further explore global issues around climate change, health, and humanitarian assistance. 

    “Our goal is to build collaboration and communication with a broader community around all of these topics. They are all terribly important and complex and require significant global effort to make a difference,” Pitts says.

    The next event in this series will take place in September.

  •

    Accelerating AI at the speed of light

    Improved computing power and an exponential increase in data have helped fuel the rapid rise of artificial intelligence. But as AI systems become more sophisticated, they’ll need even more computational power, a demand that traditional computing hardware most likely won’t be able to keep up with. To solve the problem, MIT spinout Lightelligence is developing the next generation of computing hardware.

    The Lightelligence solution makes use of the silicon fabrication platform used for traditional semiconductor chips, but in a novel way. Rather than building chips that use electricity to carry out computations, Lightelligence develops components powered by light that are low-energy and fast, and they might just be the hardware we need to power the AI revolution. Compared to traditional architectures, the optical chips made by Lightelligence offer orders-of-magnitude improvements in speed, latency, and power consumption.

    To perform arithmetic operations, electronic chips need to combine tens, sometimes hundreds, of logic gates. This requires the chip’s transistors to switch on and off over multiple clock cycles, and every time a logic-gate transistor switches, it generates heat and consumes power.
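To make the gate-count point concrete, here is a toy sketch (not Lightelligence code) of a 1-bit full adder built from logic gates. The gate structure is standard digital logic; the Python framing is purely illustrative.

```python
# A 1-bit full adder built from logic gates. Even this tiny arithmetic step
# takes five gate evaluations, each costing a transistor switch (heat + power).
def full_adder(a, b, carry_in):
    s1 = a ^ b              # XOR gate
    total = s1 ^ carry_in   # XOR gate
    c1 = a & b              # AND gate
    c2 = s1 & carry_in      # AND gate
    carry_out = c1 | c2     # OR gate
    return total, carry_out

# Adding two 32-bit numbers chains 32 of these (~160 gate evaluations),
# and the carry must ripple through them over successive gate delays.
assert full_adder(1, 1, 0) == (0, 1)
assert full_adder(1, 1, 1) == (1, 1)
```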

    Not so with the chips produced by Lightelligence. In the optical domain, arithmetic computations are done with physics instead of with logic-gate transistors that require multiple clock cycles, and more cycles mean a longer wait for a result. “We precisely control how the photons interact with each other inside the chip,” says Yichen Shen PhD ’16, co-founder and CEO of Lightelligence. “It’s just light propagating through the chip, photons interfering with each other. The nature of the interference does the mathematics that we want it to do.”
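The idea that interference itself performs the arithmetic can be sketched with the textbook transfer-matrix description of a Mach-Zehnder interferometer. This is a generic physics model, not Lightelligence’s actual chip design, and the phase values below are arbitrary.

```python
import numpy as np

# A 50/50 beam splitter mixes two optical modes; its transfer matrix is unitary.
BS = (1 / np.sqrt(2)) * np.array([[1, 1j],
                                  [1j, 1]])

def mach_zehnder(theta, phi):
    """Transfer matrix of a Mach-Zehnder interferometer:
    beam splitter -> internal phase shift -> beam splitter -> output phase."""
    internal = np.diag([np.exp(1j * theta), 1])
    external = np.diag([np.exp(1j * phi), 1])
    return external @ BS @ internal @ BS

U = mach_zehnder(0.7, 1.3)

# Light "computes" the matrix-vector product simply by propagating:
x = np.array([1.0 + 0j, 0.5 - 0.2j])   # input field amplitudes
y = U @ x                               # output amplitudes after interference

# U is unitary, so optical power is conserved: no energy is burned on logic.
assert np.allclose(U.conj().T @ U, np.eye(2))
assert np.isclose(np.linalg.norm(y), np.linalg.norm(x))
```

Meshes of such interferometers can realize arbitrary matrix multiplications, which is why this architecture maps naturally onto the linear algebra at the heart of neural networks.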

    This process of interference generates very little heat, which means Shen’s optical computing chips enable much lower power consumption than their electron-powered counterparts. Shen points out that we’ve made use of fiber optics for long-distance communication for decades. “Think of the optical fibers spread across the bottom of the Pacific Ocean, and the light propagating through thousands of kilometers without losing much power. Lightelligence is bringing this concept for long-distance communication to on-chip compute.”

    With most forecasters projecting an end to Moore’s Law sometime in 2025, Shen believes his optic-driven solution is poised to address many of the computational challenges of the future. “We’re changing the fundamental way computing is done, and I think we’re doing it at the right time in history,” says Shen. “We believe optics is going to be the next computing platform, at least for linear operations like AI.”

    To be clear, Shen does not envision optics replacing the entire electronic computing industry. Rather, Lightelligence aims to accelerate certain linear algebra operations to perform quick, power-efficient tasks like those found in artificial neural networks.

    Much of AI compute happens in the cloud at data centers like the ones supporting Amazon or Microsoft. Because AI algorithms are computationally intensive, AI compute takes up a large percentage of data center capacity. Picture tens of thousands of servers, running continuously, burning millions of dollars’ worth of electricity. Now imagine replacing some of those conventional servers with Lightelligence servers that burn much less power at a fraction of the cost. “Our optical chips would greatly reduce the cost of data centers, or, put another way, greatly increase the computational capability of those data centers for AI applications,” says Shen.

    And what about self-driving vehicles? They rely on cameras and AI computation to make quick decisions. But a conventional digital electronic chip doesn’t “think” quickly enough to make the decisions necessary at high speeds. Faster computational imaging leads to faster decision-making. “Our chip completes these decision-making tasks at a fraction of the time of regular chips, which would enable the AI system within the car to make much quicker decisions and more precise decisions, enabling safer driving,” says Shen.

    Lightelligence boasts an all-MIT founding team, supported by 40 technical experts, including machine learning pioneers, leading photonics researchers, and semiconductor industry veterans intent on revolutionizing computing technology. Shen did his PhD work in the Department of Physics with professors Marin Soljačić and John Joannopoulos, where he developed an interest in the intersection of photonics and AI. “I realized that computation is a key enabler of modern artificial intelligence, and faster computing hardware would be needed to complement the growth of faster, smarter AI algorithms,” he says.

    Lightelligence was founded in 2017 when Shen teamed up with Soljačić and two other MIT alumni. Fellow co-founder Huaiyu Meng SM ’14, PhD ’18 received his doctorate in electrical engineering and now serves as Lightelligence’s vice president of photonics. Rounding out the founding team is Spencer Powers MBA ’16, who received his MBA from the MIT Sloan School of Management and is also a Lightelligence board member with extensive experience in the startup world.

    Shen and his team are not alone in this new field of optical computing, but they do have key advantages over their competitors. First off, they invented the technology at the Institute. Lightelligence is also the first company to have built a complete system of optical computing hardware, which it accomplished in April 2019. Shen is confident in the innovation potential of Lightelligence and what it could mean for the future, regardless of the competition. “There are news stories of teams working in this space, but we’re not only the first, we’re the fastest in terms of execution. I stand by that,” he says.

    But there’s another reason Shen’s not worried about the competition. He likens this stage in the evolution of the technology to the era when transistors were replacing vacuum tubes. Several transistor companies were making the leap, but they weren’t competing with each other so much as they were innovating to compete with the incumbent industry. “Having more competitors doing optical computing is good for us at this stage,” says Shen. “It makes for a louder voice, a bigger community to expand and enhance the whole ecosystem for optical computing.”

    By 2021, Shen anticipates that Lightelligence will have de-risked 80-90 percent of the technical challenges necessary for optical computing to be a viable commercial product. In the meantime, Lightelligence is making the most of its status as the newest member of the MIT Startup Exchange accelerator, STEX25, building deep relationships with tier-one customers on several niche applications where there is a pressing need for high-performance hardware, such as data centers and manufacturers.

  • in

    From gas to solar, bringing meaningful change to Nigeria’s energy systems

    Growing up, Awele Uwagwu’s view of energy was deeply influenced by the oil and gas industry. He was born and raised in Port Harcourt, a city on the southern coast of Nigeria, and his hometown shaped his initial interest in understanding the role of energy in our lives.

    “I basically grew up in a city colored by oil and gas,” says Uwagwu. “Many of the jobs in that area are in the oil sector, and I saw a lot of large companies coming in and creating new buildings and infrastructure. That very much tailored my interest in the energy sector. I kept thinking: What is all of this stuff going on, and what are all these big machines that I see every day? The more sinister side of it was: Why is the water bad? Why is the air bad? And, what can I do about it?”

    Uwagwu has shaped much of his educational and professional journey around answering that question: “What can I do about it?” He is now a senior at MIT, majoring in chemical engineering with a minor in energy studies.

    After attending high school in Nigeria’s capital city, Abuja, Uwagwu decided to pursue a degree in chemical engineering and briefly attended the University of Illinois at Urbana-Champaign in 2016. Unfortunately, the impacts of a global crash in oil prices made the situation difficult back in Nigeria, so he returned home and found employment at an oil services company working on a water purification process.

    It was during this time that he decided to apply to MIT. “I wanted to go to a really great place,” he says, “and I wanted to take my chances.” After only a few months of working at his new job, he was accepted to MIT.

    “At this point in my life I had a much clearer picture of what I wanted to do. I knew I wanted to be in the energy sector and make some sort of impact. But I didn’t quite know how I was going to do that,” he says.

    With this in mind, Uwagwu met with Rachel Shulman, the undergraduate academic coordinator at the MIT Energy Initiative, to learn about the different ways that MIT is engaged in energy. He eventually decided to become an energy studies minor and concentrate in energy engineering studies through the 10-ENG: Energy program in the Department of Chemical Engineering. Additionally, he participated in the Undergraduate Research Opportunities Program (UROP) in the lab of William H. Green, the Hoyt C. Hottel Professor in Chemical Engineering, focusing on understanding the different reaction pathways for the production of soot from the combustion of carbon.

    After this engaging experience, he reconnected with Shulman to get involved with another UROP, this time with a strong focus in renewable energy. She pointed him toward Ian Mathews — a postdoc in the Photovoltaic Research Laboratory and founder of Sensai Analytics — to discuss ways he could make a beneficial impact on the energy industry in Nigeria. This conversation led to a second UROP, under the supervision of Mathews. In that project, Uwagwu worked to figure out how cost-effective solar energy would be in Nigeria compared to petrol-powered generators, which are commonly used to supplement the unreliable national grid.

    “The idea we had is that these generators are really, really bad for the environment, whereas solar is cheap and better for the environment,” Uwagwu says. “But we needed to know if solar is actually affordable.” After setting up a software model and connecting with Leke Oyefeso, a friend back home, to get data on generators, they concluded that solar was cost-comparable and often cheaper than the generators.
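A cost comparison of that kind is often framed as a levelized cost of energy (LCOE): total discounted lifetime cost divided by total discounted lifetime output. The sketch below uses illustrative placeholder numbers, not the data Uwagwu and Oyefeso actually gathered.

```python
def lcoe(capex, annual_opex, annual_kwh, years, discount_rate=0.1):
    """Levelized cost of energy: discounted lifetime cost per discounted kWh."""
    cost = capex + sum(annual_opex / (1 + discount_rate) ** t
                       for t in range(1, years + 1))
    energy = sum(annual_kwh / (1 + discount_rate) ** t
                 for t in range(1, years + 1))
    return cost / energy

# Hypothetical inputs in USD: solar is capital-heavy with low running costs,
# while a petrol generator is cheap up front but fuel costs dominate.
solar = lcoe(capex=1200, annual_opex=30, annual_kwh=1600, years=20)
genset = lcoe(capex=300, annual_opex=900, annual_kwh=1600, years=20)
print(f"solar ~${solar:.2f}/kWh, generator ~${genset:.2f}/kWh")
```

With inputs in this ballpark the generator’s fuel bill overwhelms its low purchase price over a 20-year horizon, which is the shape of the result the pair reported.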

    Armed with this information and another completed UROP, Uwagwu thought, “What happens next?” Quickly an idea started forming, so he and Oyefeso went to Venture Mentoring Services at MIT to figure out how to leverage this knowledge to start a company that could deliver a unique and much-needed product to the Nigerian market.

    They ran through many different potential business plans and ideas, eventually deciding on creating software to design solar systems that are tailored to Nigeria’s specific needs and context. Having come up with the initial idea, they “chatted with people on the solar scene back home to see if this is even useful or if they even need this.”

    Through these discussions and market research, it became increasingly clear to them what sort of novel and pivotal product they could offer to help accelerate Nigeria’s burgeoning solar sector, and their initial idea took on a new shape: solar design software coupled with an online marketplace that connects solar providers to funding sources and energy consumers. In recognition of his unique venture, Uwagwu received a prestigious Legatum Fellowship, a program that offers entrepreneurial MIT students strong mentoring and networking opportunities, educational experiences, and substantial financial support.

    Since its founding in the summer of 2020, their startup, Idagba, has been hard at work getting its product ready for market. Starting a company in the midst of Covid-19 has created a set of unique challenges for Uwagwu and his team, especially as they operate on a whole other continent from their target market.

    “We wanted to travel to Lagos last summer but were unable to do so,” he says. “We can’t make the software without talking to the people and businesses who are going to use it, so there are a lot of Zoom and phone calls going on.”

    In spite of these challenges, Idagba is well on its path to commercialization. “Currently we are developing our minimum viable product,” comments Uwagwu. “The software is going to be very affordable, so there’s very little barrier for entry. We really want to help create this market for solar.”

    In some ways, Idagba is drawing lessons from the success of Mo Ibrahim and his mobile phone company, Celtel. In the late 1990s, Celtel was able to quickly and drastically lower the overall price of cell phones across many countries in Africa, allowing for the widespread adoption of mobile communication at a much faster pace than had been anticipated. To Uwagwu, this same idea can be replicated for solar markets. “We want to reduce the financial and technical barriers to entry for solar like he did for telecom.”

    This won’t be easy, but Uwagwu is up to the task. He sees his company taking off in three phases. The first is getting the design software online. After that has been accomplished — by mid-2021 — comes the hard part: getting customers and solar businesses connected and using the program. Once they have an existing user base and proven cash flow, the ultimate goal of the company is to create and facilitate an ecosystem of people wanting to push solar energy forward. This will make Idagba, as Uwagwu puts it, “the hub of solar energy in Nigeria.” Idagba has a long way to go before reaching that point, but Uwagwu is confident that the building blocks are in place to ensure its success.

    After graduating in June, Uwagwu will take up a full-time position at the prestigious consulting firm Bain & Company, where he plans to gain even more experience and connections to help grow his company. This opportunity will provide him with the knowledge and expertise to come back to Idagba and, as he says, “commit my life to this.”

    “This idea may seem ambitious and slightly nonsensical right now,” says Uwagwu, “but this venture has the potential to significantly push Nigeria away from unsustainable fossil fuel consumption to a much cleaner path.”

  • in

    Taking an indirect path into a bright future

    Matthew Johnston was a physics senior looking to postpone his entry into adulting. He had an intense four years at MIT; when he wasn’t in class, he was playing baseball and working various tech development gigs.

    Johnston had led the MIT Engineers baseball team to a conference championship, becoming the first player in his team’s history to be named a three-time Google Cloud Academic All-American. He put an exclamation mark on his career by hitting four home runs in his final game. 

    Johnston also developed a novel method of producing solar devices as a researcher with GridEdge Solar at MIT, and worked on a tax-loss harvesting research project as an intern at Impact Labs in San Francisco, California. As he contemplated post-graduation life, he liked the idea of gaining new experiences before committing to a company.

    Remotely Down Under

    MISTI-Australia matched him with an internship at Sydney-based Okra Solar, which manufactures smart solar charge controllers in Shenzhen, China, to help power off-the-grid remote villages in Southeast Asian countries such as Cambodia and the Philippines, as well as in Nigeria. 

    “I felt that I had so much more to learn before committing to a full-time job, and I wanted to see the world,” he says. “Working an internship for Okra in Sydney seemed like it would be the perfect buffer between university life and life in the real world. If all went well, maybe I would end up living in Sydney a while longer.”

    After graduating in May 2020 with a BS in physics, a minor in computer science, and a concentration in philosophy, he prepared to live in Sydney, with the possibility of travel to Shenzhen, when he received a familiar pitch: a curveball. 

    Like everyone else, he had hoped that the pandemic would wind down before his Down Under move, but when that didn’t happen, he pivoted to sharing a place with friends in Southern California, where they could hike and camp in nearby Sequoia National Park when they weren’t working remotely.

    On Okra’s software team, he focused on data science to streamline the maintenance and improve the reliability of Okra’s solar energy systems. However, his remote status didn’t mesh with an ongoing project to identify remote villages without grid access. So, he launched his own data project: designing a model to identify shaded solar panels based on their daily power output. That project was placed on hold until they could get more reliable data, but he gained experience setting up machine-learning problems as he developed a pipeline to retrieve, process, and load the data to train the model.

    “This project helped me understand that most of the effort in a data science problem goes into sourcing and processing the data. Unfortunately, it seemed that it was just a bit too early for the model to perform accurately.”
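A model along those lines might, for example, flag shading from the characteristic midday dip in a panel’s output profile. The sketch below is a hypothetical illustration of that idea, not Okra’s actual pipeline; the threshold, panel size, and clear-sky model are all invented for the example.

```python
import numpy as np

def shading_score(power, clear_sky):
    """Score how far a panel's daily output falls below the clear-sky
    expectation around midday, where shading dips are most visible.
    power, clear_sky: arrays of hourly output (kW) for one day."""
    midday = slice(10, 15)           # hours 10:00 through 14:00
    ratio = power[midday] / np.maximum(clear_sky[midday], 1e-9)
    return 1.0 - ratio.mean()        # 0 = unshaded; closer to 1 = heavily shaded

hours = np.arange(24)
clear = np.clip(np.sin((hours - 6) * np.pi / 12), 0, None) * 5.0  # idealized 5 kW panel
shaded = clear.copy()
shaded[11:14] *= 0.4                 # a tree shadow cuts midday output

print(shading_score(clear, clear))   # 0.0: no shading detected
print(shading_score(shaded, clear))  # well above zero: flag for inspection
```

As the article notes, the hard part in practice is not this scoring step but sourcing and cleaning enough reliable telemetry to train and validate such a model.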

    Team-powered engine

    Coordinating with a team of 23 people from more than 10 unique cultures, scattered across 11 countries in different time zones, presented yet another challenge. He responded with a productive workflow: leaving questions in his code reviews that would be answered by the next morning.

    “Working remotely is ultimately a bigger barrier to team cohesion than productivity,” he says. He overcame that hurdle as well; the Aussie team took a liking to him and nicknamed him Jonno. “They’re an awesome group to be around and aren’t afraid to laugh at themselves.”   

    Soon, Jonno was helping the service delivery team efficiently diagnose and resolve real issues in the field using sensor data. By automating the maintenance process in this way, Okra makes it possible for energy companies to deploy and manage last-mile energy projects at scale. Several months later, when he began contributing to the firmware team, he also took on the project of calculating a battery’s state of charge, with the goal of open-sourcing a robust and reliable algorithm.

    “Matt excelled despite the circumstance,” says Okra Solar co-founder and CEO Afnan Hannan. “Matt contributed to developing Okra’s automated field alerts system that monitors the health and performance of Okra’s solar systems, which are deployed across Southeast Asia and Africa. Additionally, Matt led the development of a state-of-the-art Kalman filter-based online state-of-charge (SoC) algorithm. This included research, prototyping, developing back-testing infrastructure, and finally implementing and deploying the solution on Okra’s microcontroller. An accurate and stable SoC has been a vital part of Okra’s cutting-edge Battery Sharing feature, for which we have Matt to thank.” 
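A Kalman-filter SoC estimator of the kind described blends coulomb counting (integrating measured current) with a voltage-based correction, weighting each by its uncertainty. The sketch below is a heavily simplified one-dimensional version with an assumed linear open-circuit-voltage model and made-up noise parameters, not Okra’s production algorithm.

```python
# Minimal 1-D Kalman filter for battery state of charge (SoC):
# predict with coulomb counting, correct with a voltage-based estimate.
def kalman_soc_step(soc, P, current_a, dt_s, measured_v,
                    capacity_ah=100.0, q_noise=1e-6, r_noise=1e-3,
                    v_empty=3.0, v_full=4.2):
    # Predict: integrate current (discharge current is positive).
    soc_pred = soc - current_a * dt_s / (capacity_ah * 3600.0)
    P_pred = P + q_noise                        # uncertainty grows with each step

    # Measure: map terminal voltage to SoC with a crude linear OCV model.
    soc_meas = (measured_v - v_empty) / (v_full - v_empty)

    # Correct: blend prediction and measurement by relative uncertainty.
    K = P_pred / (P_pred + r_noise)             # Kalman gain in [0, 1]
    soc_new = soc_pred + K * (soc_meas - soc_pred)
    P_new = (1.0 - K) * P_pred
    return soc_new, P_new

soc, P = 0.80, 1e-2
for _ in range(60):   # one minute of 50 A discharge, sampled every second
    soc, P = kalman_soc_step(soc, P, current_a=50.0, dt_s=1.0, measured_v=3.95)
print(round(soc, 3))
```

A production estimator would replace the linear OCV map with a measured curve, model temperature and internal resistance, and tune the noise terms against field data; the microcontroller implementation Hannan describes adds those layers on top of this basic predict-correct loop.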

    Full power

    After six months, Johnston joined Okra full time in January, moving to Phnom Penh, Cambodia, to join some of the team in person and immerse himself in firmware and data science. In the short term, the goal is to electrify villages to provide access to much cheaper and more accessible energy.

    “Previously, the only way many of these villages could access electricity was by charging a car battery using a diesel generator,” he says. “This process is very expensive, and it is impossible to charge many batteries simultaneously. In contrast, Okra provides cheap, accessible, and renewable energy for the entire village.”

    To see an Okra project firsthand, Johnston and others travel to villages that can be a 30-minute boat ride from the nearest town, where they demonstrate small appliances that many in the world take for granted, such as using an electric blender to make a smoothie.

    “It’s really amazing to see how hard-to-reach these villages are and how much electricity can help them,” says Johnston. “Something as simple as using a rice cooker instead of a wood fire can save a family countless hours of chopping wood. It also helps us think about how we can improve our product, both for the users and the energy companies.”   

    “In the long term, the vision is that by providing electricity, we can introduce the possibility of online education and more productive uses of power, allowing these communities to join the modern economy.”

    While getting to Phnom Penh was a challenge, he credits MIT for hitting yet another home run.

    “I think two of the biggest things I learned from both baseball and physics were how to learn challenging things and how to overcome failure. It takes persistence to keep digging for more information and practicing what you’ve already failed, and this same way of thinking has helped me to develop my professional skills. At the same time, I am grateful for the time I spent studying philosophy. Thinking deeply about what might lead to a meaningful life for myself and for others has led me to stumble upon opportunities like this one.”

  • in

    Phonon catalysis could lead to a new field

    Batteries and fuel cells often rely on a process known as ion diffusion to function. In ion diffusion, ionized atoms move through solid materials, similar to the way water is absorbed by rice as it cooks. And just like cooking rice, ion diffusion is strongly temperature-dependent and requires high temperatures to happen quickly.
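That temperature sensitivity is usually captured by an Arrhenius law, in which the diffusivity grows exponentially as temperature rises. The parameter values in this sketch are illustrative, not taken from the study.

```python
import math

def diffusivity(T_kelvin, D0=1e-7, Ea_ev=0.5):
    """Arrhenius form D = D0 * exp(-Ea / (kB * T)), the standard model for
    thermally activated ion diffusion. D0 and Ea here are illustrative."""
    kB = 8.617e-5  # Boltzmann constant in eV/K
    return D0 * math.exp(-Ea_ev / (kB * T_kelvin))

# Doubling the absolute temperature speeds up diffusion by orders of magnitude,
# which is why conventional devices run so hot.
print(diffusivity(600) / diffusivity(300))
```

Because the dependence is exponential, exciting only the phonons that drive hopping, rather than heating everything, offers the rate boost of a hot material without the thermal burden.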

    This temperature dependence can be limiting, as the materials used in some systems, like fuel cells, need to withstand temperatures sometimes in excess of 1,000 degrees Celsius. In a new study, a team of researchers at MIT and the University of Muenster in Germany demonstrated a new effect in which ion diffusion is enhanced while the material remains cold, by exciting only a select set of vibrations known as phonons. This new approach, which the team refers to as “phonon catalysis,” could lead to an entirely new field of research. Their work was published in Cell Reports Physical Science.

    In the study, the research team used a computational model to determine which vibrations actually caused ions to move during ion diffusion. Rather than increasing the temperature of the entire material, they increased the temperature of just those specific vibrations in a process they refer to as targeted phonon excitation.

    “We only heated up the vibrations that matter, and in doing so we were able to show that you could keep the material cold, but have it behave just like it’s very hot,” says Asegun Henry, professor of mechanical engineering and co-author of the study.

    This ability to keep materials cool during ion diffusion could have a wide range of applications. In the case of fuel cells, if the entire cell doesn’t need to be exposed to extremely high temperatures, engineers could use cheaper materials to build them. This would lower the cost of fuel cells and help them last longer, addressing the short lifetimes of many fuel cells.

    The process could also have implications for lithium-ion batteries.

    “Discovering new ion conductors is critical to advance lithium batteries, and opportunities include enabling the use of lithium metal, which can potentially double the energy of lithium-ion batteries. Unfortunately, the fundamental understanding of ion conduction is lacking,” adds Yang Shao-Horn, W.M. Keck Professor of Energy and co-author.

    This new work builds upon her previous research, specifically the work of Sokseiha Muy PhD ’18 on design principles for ion conductors, which shows lowering phonon energy in structures reduces the barrier for ion diffusion and potentially increases ion conductivity. Kiarash Gordiz, a postdoc working jointly with Henry’s Atomistic Simulation and Energy Research Group and Shao-Horn’s Electrochemical Energy Laboratory, wondered if they could combine Shao-Horn’s research on ion conduction with Henry’s research on heat transfer.

    “Using Professor Shao-Horn’s previous work on ion conductors as a starting point, we set out to determine exactly which phonon modes are contributing to ion diffusion,” says Gordiz.

    Henry, Gordiz, and their team used a model for lithium phosphate, which is often found in lithium-ion batteries. Using a computational method known as normal mode analysis, along with nudged elastic-band calculations and molecular dynamics simulations, the research group quantitatively computed how much each phonon contributes to the ion diffusion process in lithium phosphate.

    Armed with this knowledge, researchers could use lasers to selectively excite or heat up specific phonons, rather than exposing the entire material to high temperatures. This method could open up a new world of possibilities.

    The dawn of a new field

    Henry believes this method could lead to the creation of a new research field — one he refers to as “phonon catalysis.” While the new work focuses specifically on ion diffusion, Henry sees applications in chemical reactions, phase transformations, and other temperature-dependent phenomena.

    “Our group is fascinated by the idea that you may be able to catalyze all kinds of things now that we have the technique to figure out which phonons matter,” says Henry. “All of these reactions that usually require extreme temperatures could now happen at room temperature.”

    Henry and his team have begun exploring potential applications for phonon catalysis. Gordiz has been looking at using the method for lithium superionic conductors, which could be used in clean energy storage. The team is also considering applications such as room-temperature superconductors and even the creation of diamonds, which normally requires extremely high pressures and temperatures but could potentially be triggered at much lower temperatures through phonon catalysis.

    “This idea of selective excitation, focusing only on the parts that you need rather than everything, could be a very big kind of paradigm shift for how we operate things,” says Henry. “We need to start thinking of temperature as a spectrum and not just a single number.”

    The researchers plan to show more examples of targeted phonon excitation working in different materials. Moving forward, they hope to demonstrate experimentally that their computational model works in these materials.