More stories

  • How to solve a bottleneck for CO2 capture and conversion

    Removing carbon dioxide from the atmosphere efficiently is often seen as a crucial need for combating climate change, but systems for removing carbon dioxide suffer from a tradeoff. Chemical compounds that efficiently remove CO₂ from the air do not easily release it once captured, and compounds that release CO₂ efficiently are not very efficient at capturing it. Optimizing one part of the cycle tends to make the other part worse.

    Now, using nanoscale filtering membranes, researchers at MIT have added a simple intermediate step that facilitates both parts of the cycle. The new approach could improve the efficiency of electrochemical carbon dioxide capture and release by six times and cut costs by at least 20 percent, they say.

    The new findings are reported today in the journal ACS Energy Letters, in a paper by MIT doctoral students Simon Rufer, Tal Joseph, and Zara Aamer, and professor of mechanical engineering Kripa Varanasi.

    “We need to think about scale from the get-go when it comes to carbon capture, as making a meaningful impact requires processing gigatons of CO₂,” says Varanasi. “Having this mindset helps us pinpoint critical bottlenecks and design innovative solutions with real potential for impact. That’s the driving force behind our work.”

    Many carbon-capture systems work using chemicals called hydroxides, which readily combine with carbon dioxide to form carbonate. That carbonate is fed into an electrochemical cell, where the carbonate reacts with an acid to form water and release carbon dioxide. The process can take ordinary air with only about 400 parts per million of carbon dioxide and generate a stream of 100 percent pure carbon dioxide, which can then be used to make fuels or other products.

    Both the capture and release steps operate in the same water-based solution, but the first step needs a solution with a high concentration of hydroxide ions, and the second step needs one high in carbonate ions.
    “You can see how these two steps are at odds,” says Varanasi. “These two systems are circulating the same sorbent back and forth. They’re operating on the exact same liquid. But because they need two different types of liquids to operate optimally, it’s impossible to operate both systems at their most efficient points.”

    The team’s solution was to decouple the two parts of the system and introduce a third part in between. Essentially, after the hydroxide in the first step has been mostly chemically converted to carbonate, special nanofiltration membranes then separate ions in the solution based on their charge. Carbonate ions have a charge of 2, while hydroxide ions have a charge of 1. “The nanofiltration is able to separate these two pretty well,” Rufer says.

    Once separated, the hydroxide ions are fed back to the absorption side of the system, while the carbonates are sent ahead to the electrochemical release stage. That way, both ends of the system can operate at their more efficient ranges. Varanasi explains that in the electrochemical release step, protons are being added to the carbonate to cause the conversion to carbon dioxide and water, but if hydroxide ions are also present, the protons will react with those ions instead, producing just water.

    “If you don’t separate these hydroxides and carbonates,” Rufer says, “the way the system fails is you’ll add protons to hydroxide instead of carbonate, and so you’ll just be making water rather than extracting carbon dioxide. That’s where the efficiency is lost. Using nanofiltration to prevent this was something that we aren’t aware of anyone proposing before.”

    Testing showed that the nanofiltration could separate the carbonate from the hydroxide solution with about 95 percent efficiency, validating the concept under realistic conditions, Rufer says. The next step was to assess how much of an effect this would have on the overall efficiency and economics of the process.
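The routing described above (carbonate toward the release cell, hydroxide back to the absorber, with roughly 95 percent separation) can be sketched as a toy mass balance. The 95 percent figure comes from the article; the feed composition below is purely illustrative, not a measured value.

```python
# Toy mass balance for charge-selective nanofiltration of the sorbent
# stream. Carbonate (charge 2-) should go to the electrochemical release
# cell; hydroxide (charge 1-) should return to the absorber.

def split_stream(carbonate_mol, hydroxide_mol, separation_eff=0.95):
    """Route each ion toward its intended destination.

    separation_eff is the fraction of each ion that ends up where it
    should; the remainder slips into the wrong stream.
    """
    to_release = {
        "carbonate": carbonate_mol * separation_eff,
        "hydroxide": hydroxide_mol * (1 - separation_eff),
    }
    to_absorber = {
        "carbonate": carbonate_mol * (1 - separation_eff),
        "hydroxide": hydroxide_mol * separation_eff,
    }
    return to_release, to_absorber

# Illustrative feed: mostly carbonate after absorption, some hydroxide left.
release, absorber = split_stream(carbonate_mol=0.8, hydroxide_mol=0.2)
```

With these numbers, only 0.01 mol of hydroxide reaches the release cell, which is the point of the intermediate step: protons there are spent on carbonate rather than wasted making water.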
    They created a techno-economic model, incorporating electrochemical efficiency, voltage, absorption rate, capital costs, nanofiltration efficiency, and other factors.

    The analysis showed that present systems cost at least $600 per ton of carbon dioxide captured, while with the nanofiltration component added, that drops to about $450 a ton. What’s more, the new system is much more stable, continuing to operate at high efficiency even under variations in the ion concentrations in the solution.

    “In the old system without nanofiltration, you’re sort of operating on a knife’s edge,” Rufer says; if the concentration varies even slightly in one direction or the other, efficiency drops off drastically. “But with our nanofiltration system, it kind of acts as a buffer where it becomes a lot more forgiving. You have a much broader operational regime, and you can achieve significantly lower costs.”

    He adds that this approach could apply not only to the direct air capture systems they studied specifically, but also to point-source systems — which are attached directly to emissions sources such as power plants — or to the next stage of the process, converting captured carbon dioxide into useful products such as fuel or chemical feedstocks. Those conversion processes, he says, “are also bottlenecked in this carbonate and hydroxide tradeoff.”

    In addition, this technology could lead to safer alternative chemistries for carbon capture, Varanasi says. “A lot of these absorbents can at times be toxic, or damaging to the environment. By using a system like ours, you can improve the reaction rate, so you can choose chemistries that might not have the best absorption rate initially but can be improved to enable safety.”

    Varanasi adds that “the really nice thing about this is we’ve been able to do this with what’s commercially available,” and with a system that can easily be retrofitted to existing carbon-capture installations.
    If the costs can be further brought down to about $200 a ton, it could be viable for widespread adoption. With ongoing work, he says, “we’re confident that we’ll have something that can become economically viable” and that will ultimately produce valuable, saleable products.

    Rufer notes that even today, “people are buying carbon credits at a cost of over $500 per ton. So, at this cost we’re projecting, it is already commercially viable in that there are some buyers who are willing to pay that price.” But bringing the price down further should increase the number of buyers who would consider buying the credit, he says. “It’s just a question of how widespread we can make it.” Recognizing this growing market demand, Varanasi says, “Our goal is to provide industry scalable, cost-effective, and reliable technologies and systems that enable them to directly meet their decarbonization targets.”

    The research was supported by Shell International Exploration and Production Inc. through the MIT Energy Initiative and the U.S. National Science Foundation, and made use of the facilities at MIT.nano.
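The cost figures quoted in the article can be checked with simple arithmetic. The sketch below uses only the numbers reported above ($600/ton baseline, ~$450/ton with nanofiltration, ~$200/ton viability target); the variable names are ours.

```python
# Back-of-the-envelope check of the article's cost figures.

baseline_cost = 600.0        # $/ton CO2, present systems (article figure)
nanofiltration_cost = 450.0  # $/ton CO2, with nanofiltration (article figure)
target_cost = 200.0          # $/ton CO2, cited threshold for wide adoption

# Fractional saving from adding the nanofiltration step.
savings_fraction = 1 - nanofiltration_cost / baseline_cost
print(savings_fraction)      # prints 0.25

# Remaining cost gap to the viability target.
remaining_gap = nanofiltration_cost - target_cost  # $250 per ton
```

A 25 percent reduction is consistent with the article's "at least 20 percent" claim, since $600/ton is described as a lower bound on current costs.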

  • SLB joins the MIT.nano Consortium

    SLB, a global company creating technology to address the world’s energy challenges, has joined the MIT.nano Consortium.

    The MIT.nano Consortium is a platform for academia-industry collaboration, fostering research and innovation in nanoscale science and engineering.

    “The addition of SLB to the MIT.nano Consortium represents a powerful synergy between academic innovation and leading industry,” says Vladimir Bulović, the founding faculty director of MIT.nano and the Fariborz Maseeh (1990) Professor of Emerging Technologies at MIT. “SLB’s expertise in developing energy technologies and its commitment to decarbonization aligns with MIT’s mission to address the many challenges of climate change. Their addition to the consortium, and collaborations that will follow, will empower the MIT.nano community to advance critical research in this domain.”

    For 100 years, SLB has developed strategies and systems to unlock access to energy beneath the Earth’s surface. The company’s founder, Conrad Schlumberger, conceived the idea of using electrical measurements to map subsurface rock bodies back in 1912. Since then, SLB has continued to open new fronts in energy exploration — innovating in oil and gas, scaling new technologies, and designing digital solutions. Applying decades of innovation in science and engineering, SLB has committed to accelerating the decarbonization of the energy sector and supporting the global transition to low-carbon energy systems.

    With more than 900 facilities in over 120 countries, SLB adds to the global industry perspective of the MIT.nano Consortium and the broader MIT research community.

    “Taking a nanoscale approach to the scientific and technological challenges we face in the decarbonization domains is an endeavor that SLB is excited to embark on with MIT.nano,” says Smaine Zeroug, SLB research director and ambassador to MIT.
    “We are confident our engagement with MIT.nano and the extensive research network they offer access to will ultimately lead to field-viable solutions.”

    SLB has a longstanding relationship with MIT. The company, formerly named Schlumberger, donated specialized software to the MIT Seismic Visualization Laboratory in 1999 to enable MIT researchers and students to use three-dimensional seismic data in their studies of the Earth’s upper crust. SLB is also a current member of the MIT CSAIL Alliances.

    As a member of the MIT.nano Consortium, SLB will gain unparalleled access to MIT.nano’s dynamic user community, providing opportunities to share expertise and guide advances in nanoscale technology.

    MIT.nano continues to welcome new companies as sustaining members. For details, and to see a list of current members, visit the MIT.nano Consortium page.

  • For clean ammonia, MIT engineers propose going underground

    Ammonia is the most widely produced chemical in the world today, used primarily as a source for nitrogen fertilizer. Its production is also a major source of greenhouse gas emissions — the highest in the whole chemical industry.

    Now, a team of researchers at MIT has developed an innovative way of making ammonia without the usual fossil-fuel-powered chemical plants that require high heat and pressure. Instead, they have found a way to use the Earth itself as a geochemical reactor, producing ammonia underground. The process uses Earth’s naturally occurring heat and pressure, provided free of charge and free of emissions, as well as the reactivity of minerals already present in the ground.

    The trick the team devised is to inject water underground, into an area of iron-rich subsurface rock. The water carries with it a source of nitrogen and particles of a metal catalyst, allowing the water to react with the iron to generate clean hydrogen, which in turn reacts with the nitrogen to make ammonia. A second well is then used to pump that ammonia up to the surface.

    The process, which has been demonstrated in the lab but not yet in a natural setting, is described today in the journal Joule. The paper’s co-authors are MIT professors of materials science and engineering Iwnetim Abate and Ju Li, graduate student Yifan Gao, and five others at MIT.

    “When I first produced ammonia from rock in the lab, I was so excited,” Gao recalls. “I realized this represented an entirely new and never-reported approach to ammonia synthesis.”

    The standard method for making ammonia is called the Haber-Bosch process, which was developed in Germany in the early 20th century to replace natural sources of nitrogen fertilizer such as mined deposits of bat guano, which were becoming depleted. But the Haber-Bosch process is very energy intensive: It requires temperatures of 400 degrees Celsius and pressures of 200 atmospheres, and this means it needs huge installations in order to be efficient.
    Some areas of the world, such as sub-Saharan Africa and Southeast Asia, have few or no such plants in operation. As a result, the shortage or extremely high cost of fertilizer in these regions has limited their agricultural production.

    The Haber-Bosch process “is good. It works,” Abate says. “Without it, we wouldn’t have been able to feed 2 out of the total 8 billion people in the world right now,” he says, referring to the portion of the world’s population whose food is grown with ammonia-based fertilizers. But because of the emissions and energy demands, a better process is needed, he says.

    Burning fuel to generate heat is responsible for about 20 percent of the greenhouse gases emitted from plants using the Haber-Bosch process. Making hydrogen accounts for the remaining 80 percent. But ammonia, the molecule NH3, is made up only of nitrogen and hydrogen. There’s no carbon in the formula, so where do the carbon emissions come from? The standard way of producing the needed hydrogen is by processing methane gas with steam, breaking down the gas into pure hydrogen, which gets used, and carbon dioxide gas that gets released into the air.

    Other processes exist for making low- or no-emissions hydrogen, such as by using solar or wind-generated electricity to split water into oxygen and hydrogen, but that process can be expensive. That’s why Abate and his team worked on developing a system to produce what they call geological hydrogen. Some places in the world, including some in Africa, have been found to naturally generate hydrogen underground through chemical reactions between water and iron-rich rocks.
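Why hydrogen production dominates the emissions can be seen from the stoichiometry of the two reactions involved: steam-methane reforming (overall CH4 + 2 H2O → CO2 + 4 H2) and Haber-Bosch synthesis (N2 + 3 H2 → 2 NH3). The sketch below computes the CO2 that is chemically unavoidable when the hydrogen comes from methane; it deliberately excludes the extra emissions from burning fuel for process heat (the roughly 20 percent share mentioned above), so it is a floor, not a plant-level figure.

```python
# Stoichiometric floor on CO2 emissions per kg of conventional ammonia,
# counting only the CO2 inherent in steam-methane reforming.

M_CO2, M_NH3 = 44.01, 17.03             # molar masses, g/mol

h2_per_nh3 = 3 / 2                      # mol H2 per mol NH3 (Haber-Bosch)
co2_per_h2 = 1 / 4                      # mol CO2 per mol H2 (reforming)
co2_per_nh3 = h2_per_nh3 * co2_per_h2   # mol CO2 per mol NH3

kg_co2_per_kg_nh3 = co2_per_nh3 * M_CO2 / M_NH3
print(round(kg_co2_per_kg_nh3, 2))      # prints 0.97
```

So even before any process heat is counted, nearly a kilogram of CO2 accompanies every kilogram of methane-derived ammonia, which is what the geological route avoids.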
    These pockets of naturally occurring hydrogen can be mined, just like natural methane reservoirs, but the extent and locations of such deposits are still relatively unexplored.

    Abate realized this process could be created or enhanced by pumping water, laced with copper and nickel catalyst particles to speed up the process, into the ground in places where such iron-rich rocks were already present. “We can use the Earth as a factory to produce clean flows of hydrogen,” he says.

    He recalls thinking about the problem of the emissions from hydrogen production for ammonia: “The ‘aha!’ moment for me was thinking, how about we link this process of geological hydrogen production with the process of making Haber-Bosch ammonia?”

    That would solve the biggest problem of the underground hydrogen production process, which is how to capture and store the gas once it’s produced. Hydrogen is a very tiny molecule — the smallest of them all — and hard to contain. But by implementing the entire Haber-Bosch process underground, the only material that would need to be sent to the surface would be the ammonia itself, which is easy to capture, store, and transport.

    The only extra ingredient needed to complete the process was the addition of a source of nitrogen, such as nitrate or nitrogen gas, into the water-catalyst mixture being injected into the ground. Then, as the hydrogen gets released from water molecules after interacting with the iron-rich rocks, it can immediately bond with the nitrogen atoms also carried in the water, with the deep underground environment providing the high temperatures and pressures required by the Haber-Bosch process.
    A second well near the injection well then pumps the ammonia out and into tanks on the surface.

    “We call this geological ammonia,” Abate says, “because we are using subsurface temperature, pressure, chemistry, and geologically existing rocks to produce ammonia directly.”

    Whereas transporting hydrogen requires expensive equipment to cool and liquefy it, and virtually no pipelines exist for its transport (except near oil refinery sites), transporting ammonia is easier and cheaper. It’s about one-sixth the cost of transporting hydrogen, and there are already more than 5,000 miles of ammonia pipelines and 10,000 terminals in place in the U.S. alone. What’s more, Abate explains, ammonia, unlike hydrogen, already has a substantial commercial market in place, with production volume projected to grow by two to three times by 2050, as it is used not only for fertilizer but also as feedstock for a wide variety of chemical processes.

    For example, ammonia can be burned directly in gas turbines, engines, and industrial furnaces, providing a carbon-free alternative to fossil fuels. It is being explored for maritime shipping and aviation as an alternative fuel, and as a possible space propellant.

    Another upside to geological ammonia is that untreated wastewater, including agricultural runoff, which tends to be rich in nitrogen already, could serve as the water source and be treated in the process. “We can tackle the problem of treating wastewater, while also making something of value out of this waste,” Abate says.

    Gao adds that this process “involves no direct carbon emissions, presenting a potential pathway to reduce global CO2 emissions by up to 1 percent.” To arrive at this point, he says, the team “overcame numerous challenges and learned from many failed attempts.
    For example, we tested a wide range of conditions and catalysts before identifying the most effective one.”

    The project was seed-funded under a flagship project of MIT’s Climate Grand Challenges program, the Center for the Electrification and Decarbonization of Industry. Professor Yet-Ming Chiang, co-director of the center, says, “I don’t think there’s been any previous example of deliberately using the Earth as a chemical reactor. That’s one of the key novel points of this approach.”

    Chiang emphasizes that even though it is a geological process, it happens very fast, not on geological timescales. “The reaction is fundamentally over in a matter of hours,” he says. “The reaction is so fast that this answers one of the key questions: Do you have to wait for geological times? And the answer is absolutely no.”

    Professor Elsa Olivetti, a mission director of the newly established Climate Project at MIT, says, “The creative thinking by this team is invaluable to MIT’s ability to have impact at scale. Coupling these exciting results with, for example, advanced understanding of the geology surrounding hydrogen accumulations represents the whole-of-Institute efforts the Climate Project aims to support.”

    “This is a significant breakthrough for the future of sustainable development,” says Geoffrey Ellis, a geologist at the U.S. Geological Survey, who was not associated with this work. He adds, “While there is clearly more work that needs to be done to validate this at the pilot stage and to get this to the commercial scale, the concept that has been demonstrated is truly transformative. The approach of engineering a system to optimize the natural process of nitrate reduction by Fe2+ is ingenious and will likely lead to further innovations along these lines.”

    The initial work on the process has been done in the laboratory, so the next step will be to prove the process using a real underground site.
    “We think that kind of experiment can be done within the next one to two years,” Abate says. This could open doors to using a similar approach for other chemical production processes, he adds.

    The team has applied for a patent and aims to work towards bringing the process to market.

    “Moving forward,” Gao says, “our focus will be on optimizing the process conditions and scaling up tests, with the goal of enabling practical applications for geological ammonia in the near future.”

    The research team also included Ming Lei, Bachu Sravan Kumar, Hugh Smith, Seok Hee Han, and Lokesh Sangabattula, all at MIT. Additional funding was provided by the National Science Foundation, and the work was carried out, in part, through the use of MIT.nano facilities.

  • Nanoscale transistors could enable more efficient electronics

    Silicon transistors, which are used to amplify and switch signals, are a critical component in most electronic devices, from smartphones to automobiles. But silicon semiconductor technology is held back by a fundamental physical limit that prevents transistors from operating below a certain voltage.

    This limit, known as “Boltzmann tyranny,” hinders the energy efficiency of computers and other electronics, especially with the rapid development of artificial intelligence technologies that demand faster computation.

    In an effort to overcome this fundamental limit of silicon, MIT researchers fabricated a different type of three-dimensional transistor using a unique set of ultrathin semiconductor materials.

    Their devices, featuring vertical nanowires only a few nanometers wide, can deliver performance comparable to state-of-the-art silicon transistors while operating efficiently at much lower voltages than conventional devices.

    “This is a technology with the potential to replace silicon, so you could use it with all the functions that silicon currently has, but with much better energy efficiency,” says Yanjie Shao, an MIT postdoc and lead author of a paper on the new transistors.

    The transistors leverage quantum mechanical properties to simultaneously achieve low-voltage operation and high performance within an area of just a few square nanometers. Their extremely small size would enable more of these 3D transistors to be packed onto a computer chip, resulting in fast, powerful electronics that are also more energy-efficient.

    “With conventional physics, there is only so far you can go. The work of Yanjie shows that we can do better than that, but we have to use different physics.
    There are many challenges yet to be overcome for this approach to be commercial in the future, but conceptually, it really is a breakthrough,” says senior author Jesús del Alamo, the Donner Professor of Engineering in the MIT Department of Electrical Engineering and Computer Science (EECS).

    They are joined on the paper by Ju Li, the Tokyo Electric Power Company Professor in Nuclear Engineering and professor of materials science and engineering at MIT; EECS graduate student Hao Tang; MIT postdoc Baoming Wang; and professors Marco Pala and David Esseni of the University of Udine in Italy. The research appears today in Nature Electronics.

    Surpassing silicon

    In electronic devices, silicon transistors often operate as switches. Applying a voltage to the transistor causes electrons to move over an energy barrier from one side to the other, switching the transistor from “off” to “on.” By switching, transistors represent binary digits to perform computation.

    A transistor’s switching slope reflects the sharpness of the “off” to “on” transition. The steeper the slope, the less voltage is needed to turn on the transistor and the greater its energy efficiency.

    But because of how electrons move across an energy barrier, Boltzmann tyranny requires a certain minimum voltage to switch the transistor at room temperature.

    To overcome the physical limit of silicon, the MIT researchers used a different set of semiconductor materials — gallium antimonide and indium arsenide — and designed their devices to leverage a unique phenomenon in quantum mechanics called quantum tunneling.

    Quantum tunneling is the ability of electrons to penetrate barriers.
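The "certain minimum voltage" behind Boltzmann tyranny has a standard quantitative form: for electrons going over a barrier by thermionic emission, the subthreshold swing can be no sharper than SS = ln(10) · kT/q, about 60 mV per decade of current at room temperature. The formula is textbook device physics rather than something stated in the article, so treat the sketch below as background:

```python
# Thermionic (Boltzmann) limit on a transistor's subthreshold swing.
# Tunneling transistors can beat this because electrons pass through
# the barrier rather than over it.

import math

k_B = 1.380649e-23       # Boltzmann constant, J/K
q = 1.602176634e-19      # elementary charge, C
T = 300.0                # room temperature, K

ss_limit_mv_per_decade = math.log(10) * k_B * T / q * 1000
print(round(ss_limit_mv_per_decade, 1))  # prints 59.5
```

This is why "below the fundamental limit" later in the article means a switching slope steeper than roughly 60 mV/decade.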
    The researchers fabricated tunneling transistors, which leverage this property to encourage electrons to push through the energy barrier rather than going over it.

    “Now, you can turn the device on and off very easily,” Shao says.

    But while tunneling transistors can enable sharp switching slopes, they typically operate with low current, which hampers the performance of an electronic device. Higher current is necessary to create powerful transistor switches for demanding applications.

    Fine-grained fabrication

    Using tools at MIT.nano, MIT’s state-of-the-art facility for nanoscale research, the engineers were able to carefully control the 3D geometry of their transistors, creating vertical nanowire heterostructures with a diameter of only 6 nanometers. They believe these are the smallest 3D transistors reported to date.

    Such precise engineering enabled them to achieve a sharp switching slope and high current simultaneously. This is possible because of a phenomenon called quantum confinement.

    Quantum confinement occurs when an electron is confined to a space that is so small that it can’t move around. When this happens, the effective mass of the electron and the properties of the material change, enabling stronger tunneling of the electron through a barrier.

    Because the transistors are so small, the researchers can engineer a very strong quantum confinement effect while also fabricating an extremely thin barrier.

    “We have a lot of flexibility to design these material heterostructures so we can achieve a very thin tunneling barrier, which enables us to get very high current,” Shao says.

    Precisely fabricating devices that were small enough to accomplish this was a major challenge.

    “We are really into single-nanometer dimensions with this work. Very few groups in the world can make good transistors in that range.
    Yanjie is extraordinarily capable to craft such well-functioning transistors that are so extremely small,” says del Alamo.

    When the researchers tested their devices, the sharpness of the switching slope was below the fundamental limit that can be achieved with conventional silicon transistors. Their devices also performed about 20 times better than similar tunneling transistors.

    “This is the first time we have been able to achieve such sharp switching steepness with this design,” Shao adds.

    The researchers are now striving to enhance their fabrication methods to make transistors more uniform across an entire chip. With such small devices, even a 1-nanometer variance can change the behavior of the electrons and affect device operation. They are also exploring vertical fin-shaped structures, in addition to vertical nanowire transistors, which could potentially improve the uniformity of devices on a chip.

    “This work definitively steps in the right direction, significantly improving the broken-gap tunnel field-effect transistor (TFET) performance. It demonstrates steep slope together with a record drive current. It highlights the importance of small dimensions, extreme confinement, and low-defectivity materials and interfaces in the fabricated broken-gap TFET. These features have been realized through a well-mastered and nanometer-size-controlled process,” says Aryan Afzalian, a principal member of the technical staff at the nanoelectronics research organization imec, who was not involved with this work.

    This research is funded, in part, by Intel Corporation.

  • Scientists develop an affordable sensor for lead contamination

    Engineers at MIT, Nanyang Technological University, and several companies have developed a compact and inexpensive technology for detecting and measuring lead concentrations in water, potentially enabling a significant advance in tackling this persistent global health issue.

    The World Health Organization estimates that 240 million people worldwide are exposed to drinking water that contains unsafe amounts of toxic lead, which can affect brain development in children, cause birth defects, and produce a variety of neurological, cardiac, and other damaging effects. In the United States alone, an estimated 10 million households still get drinking water delivered through lead pipes.

    “It’s an unaddressed public health crisis that leads to over 1 million deaths annually,” says Jia Xu Brian Sia, an MIT postdoc and the senior author of the paper describing the new technology.

    But testing for lead in water requires expensive, cumbersome equipment and typically takes days to produce results. Or it uses simple test strips that reveal only a yes-or-no answer about the presence of lead, with no information about its concentration. Current EPA regulations require drinking water to contain no more than 15 parts per billion of lead, a concentration so low it is difficult to detect.

    The new system, which could be ready for commercial deployment within two or three years, could detect lead concentrations as low as 1 part per billion with high accuracy, using a simple chip-based detector housed in a handheld device. The technology gives nearly instant quantitative measurements and requires just a droplet of water.

    The findings are described in a paper appearing today in the journal Nature Communications, by Sia, MIT graduate student and lead author Luigi Ranno, Professor Juejun Hu, and 12 others at MIT and other institutions in academia and industry.

    The team set out to find a simple detection method based on the use of photonic chips, which use light to perform measurements.
    The challenging part was finding a way to attach to the photonic chip surface certain ring-shaped molecules known as crown ethers, which can capture specific ions such as lead. After years of effort, they were able to achieve that attachment via a chemical process known as Fischer esterification. “That is one of the essential breakthroughs we have made in this technology,” Sia says.

    In testing the new chip, the researchers showed that it can detect lead in water at concentrations as low as one part per billion. At much higher concentrations, which may be relevant for testing environmental contamination such as mine tailings, the accuracy is within 4 percent.

    The device works in water with varying levels of acidity, ranging from pH values of 6 to 8, “which covers most environmental samples,” Sia says. They have tested the device with seawater as well as tap water, and verified the accuracy of the measurements.

    In order to achieve such levels of accuracy, current testing requires a device called an inductively coupled plasma mass spectrometer. “These setups can be big and expensive,” Sia says. The sample processing can take days and requires experienced technical personnel.

    While the new chip system they developed is “the core part of the innovation,” Ranno says, further work will be needed to develop this into an integrated, handheld device for practical use. “For making an actual product, you would need to package it into a usable form factor,” he explains. This would involve having a small chip-based laser coupled to the photonic chip. “It’s a matter of mechanical design, some optical design, some chemistry, and figuring out the supply chain,” he says. While that takes time, he says, the underlying concepts are straightforward.

    The system can be adapted to detect other similar contaminants in water, including cadmium, copper, lithium, barium, cesium, and radium, Ranno says.
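To put the article's concentration figures in chemical terms, a short conversion sketch. The molar mass of lead and the dilute-solution approximation 1 ppb ≈ 1 µg/L are standard values, not figures from the article:

```python
# Convert mass-based lead concentrations in water to molar units.

M_PB = 207.2  # molar mass of lead, g/mol

def ppb_to_nanomolar(ppb):
    """Convert ppb (by mass) of lead in water to nmol/L, assuming
    1 ppb = 1 microgram per liter (valid for dilute aqueous samples)."""
    grams_per_liter = ppb * 1e-6
    return grams_per_liter / M_PB * 1e9

epa_limit_nM = ppb_to_nanomolar(15)  # EPA's 15 ppb regulatory level
detection_nM = ppb_to_nanomolar(1)   # reported detection limit
print(round(epa_limit_nM, 1), round(detection_nM, 1))  # prints 72.4 4.8
```

A 1 ppb detection limit thus corresponds to sensing lead at roughly a 5-nanomolar concentration, which is why conventional measurements at this level have required instruments like ICP mass spectrometers.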
    The device could be used with simple cartridges that can be swapped out to detect different elements, each using slightly different crown ethers that can bind to a specific ion.

    “There’s this problem that people don’t measure their water enough, especially in the developing countries,” Ranno says. “And that’s because they need to collect the water, prepare the sample, and bring it to these huge instruments that are extremely expensive.” Instead, “having this handheld device, something compact that even untrained personnel can just bring to the source for on-site monitoring, at low costs,” could make regular, widespread testing feasible.

    Hu, who is the John F. Elliott Professor of Materials Science and Engineering, says, “I’m hoping this will be quickly implemented, so we can benefit human society. This is a good example of a technology coming from a lab innovation where it may actually make a very tangible impact on society, which is of course very fulfilling.”

    “If this study can be extended to simultaneous detection of multiple metal elements, especially the presently concerning radioactive elements, its potential would be immense,” says Hou Wang, an associate professor of environmental science and engineering at Hunan University in China, who was not associated with this work.

    Wang adds, “This research has engineered a sensor capable of instantaneously detecting lead concentration in water. This can be utilized in real time to monitor the lead pollution concentration in wastewater discharged from industries such as battery manufacturing and lead smelting, facilitating the establishment of industrial wastewater monitoring systems.
I think the innovative aspects and developmental potential of this research are quite commendable.”

Wang Qian, a principal research scientist at the Institute of Materials Research in Singapore, who also was not affiliated with this work, says, “The ability for the pervasive, portable, and quantitative detection of lead has proved to be challenging primarily due to cost concerns. This work demonstrates the potential to do so in a highly integrated form factor and is compatible with large-scale, low-cost manufacturing.”

The team included researchers at MIT, at Nanyang Technological University and Temasek Laboratories in Singapore, at the University of Southampton in the U.K., and at companies Fingate Technologies, in Singapore, and Vulcan Photonics, headquartered in Malaysia. The work used facilities at MIT.nano, the Harvard University Center for Nanoscale Systems, NTU’s Center for Micro- and Nano-Electronics, and the Nanyang Nanofabrication Center.
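To put the chip’s one-part-per-billion detection limit in perspective, here is a back-of-envelope conversion to molar concentration (a sketch, not from the article; lead’s molar mass of 207.2 g/mol is the only assumed constant):

```python
# Back-of-envelope: convert a mass-based ppb concentration in water to molarity.
# For dilute aqueous solutions, 1 ppb by mass is approximately 1 microgram per liter.
PB_MOLAR_MASS_G_PER_MOL = 207.2  # molar mass of lead (Pb)

def ppb_to_molarity(ppb: float, molar_mass_g_per_mol: float) -> float:
    """Convert ppb (by mass) in water to mol/L, assuming density of ~1 kg/L."""
    grams_per_liter = ppb * 1e-6  # 1 ppb ≈ 1 µg/L = 1e-6 g/L
    return grams_per_liter / molar_mass_g_per_mol

molarity = ppb_to_molarity(1.0, PB_MOLAR_MASS_G_PER_MOL)
print(f"1 ppb Pb ≈ {molarity:.2e} mol/L")  # 1 ppb Pb ≈ 4.83e-09 mol/L
```

At roughly five nanomoles per liter, a detection limit this low is what makes on-site screening of drinking water plausible with such a device.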

  • in

    Seizing solar’s bright future

    Consider the dizzying ascent of solar energy in the United States: In the past decade, solar capacity increased nearly 900 percent, with electricity production eight times greater in 2023 than in 2014. The jump from 2022 to 2023 alone was 51 percent, with a record 32 gigawatts (GW) of solar installations coming online. In the past four years, more solar has been added to the grid than any other form of generation. Installed solar now tops 179 GW, enough to power nearly 33 million homes. The U.S. Department of Energy (DOE) is so bullish on the sun that its decarbonization plans envision solar satisfying 45 percent of the nation’s electricity demands by 2050.

    But the continued rapid expansion of solar requires advances in technology, notably to improve the efficiency and durability of solar photovoltaic (PV) materials and manufacturing. That’s where Optigon, a three-year-old MIT spinout company, comes in.

    “Our goal is to build tools for research and industry that can accelerate the energy transition,” says Dane deQuilettes, the company’s co-founder and chief science officer. “The technology we have developed for solar will enable measurements and analysis of materials as they are being made both in lab and on the manufacturing line, dramatically speeding up the optimization of PV.”

    With roots in MIT’s vibrant solar research community, Optigon is poised for a 2024 rollout of technology it believes will drastically pick up the pace of solar power and other clean energy projects.

    Beyond silicon

    Silicon, the material mainstay of most PV, is limited by the laws of physics in the efficiencies it can achieve converting photons from the sun into electrical energy. Silicon-based solar cells can theoretically reach power conversion levels of just 30 percent, and real-world efficiency levels hover in the low 20s.
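As a quick sanity check on the headline figures above (a rough sketch using only the numbers quoted in the article, not an official methodology):

```python
# Rough arithmetic on the article's U.S. solar statistics.
installed_capacity_gw = 179   # installed solar capacity (GW)
homes_powered_millions = 33   # homes that capacity can power (millions)

# Implied installed capacity per home: GW to kW is a factor of 1e6,
# and millions of homes to homes is also 1e6, so the factors cancel.
kw_per_home = installed_capacity_gw * 1e6 / (homes_powered_millions * 1e6)
print(f"~{kw_per_home:.1f} kW of installed capacity per home")  # ~5.4 kW

# A "nearly 900 percent" increase over the decade means roughly a 10x multiple.
growth_multiple = 1 + 900 / 100
print(f"Capacity grew ~{growth_multiple:.0f}x since 2014")  # ~10x
```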
    But beyond the physical limitations of silicon, there is another issue at play for many researchers and the solar industry in the United States and elsewhere: China dominates the silicon PV market, from supply chains to manufacturing. Scientists are eagerly pursuing alternative materials, either for enhancing silicon’s solar conversion capacity or for replacing silicon altogether.

    In the past decade, a family of crystal-structured semiconductors known as perovskites has risen to the fore as a next-generation PV material candidate. Perovskite devices lend themselves to a novel manufacturing process using printing technology that could circumvent the supply chain juggernaut China has built for silicon. Perovskite solar cells can be stacked on each other or layered atop silicon PV, to achieve higher conversion efficiencies. Because perovskite technology is flexible and lightweight, modules can be used on roofs and other structures that cannot support heavier silicon PV, lowering costs and enabling a wider range of building-integrated solar devices.

    But these new materials require testing, both during R&D and then on assembly lines, where missing or defective optical, electrical, or dimensional properties in the nano-sized crystal structures can negatively impact the end product.

    “The actual measurement and data analysis processes have been really, really slow, because you have to use a bunch of separate tools that are all very manual,” says Optigon co-founder and chief executive officer Anthony Troupe ’21.
    “We wanted to come up with tools for automating detection of a material’s properties, for determining whether it could make a good or bad solar cell, and then for optimizing it.”

    “Our approach packed several non-contact, optical measurements using different types of light sources and detectors into a single system, which together provide a holistic, cross-sectional view of the material,” says Brandon Motes ’21, ME ’22, co-founder and chief technical officer.

    “This breakthrough in achieving millisecond timescales for data collection and analysis means we can take research-quality tools and actually put them on a full production system, getting extremely detailed information about products being built at massive, gigawatt scale in real-time,” says Troupe.

    This streamlined system takes measurements “in the snap of the fingers, unlike the traditional tools,” says Joseph Berry, director of the US Manufacturing of Advanced Perovskites Consortium and a senior research scientist at the National Renewable Energy Laboratory. “Optigon’s techniques are high precision and allow high throughput, which means they can be used in a lot of contexts where you want rapid feedback and the ability to develop materials very, very quickly.”

    According to Berry, Optigon’s technology may give the solar industry not just better materials, but the ability to pump out high-quality PV products at a brisker clip than is currently possible. “If Optigon is successful in deploying their technology, then we can more rapidly develop the materials that we need, manufacturing with the requisite precision again and again,” he says. “This could lead to the next generation of PV modules at a much, much lower cost.”

    Measuring makes the difference

    With Small Business Innovation Research funding from DOE to commercialize its products and a grant from the Massachusetts Clean Energy Center, Optigon has settled into a space at the climate technology incubator Greentown Labs in Somerville, Massachusetts.
    Here, the team is preparing for this spring’s launch of its first commercial product, whose genesis lies in MIT’s GridEdge Solar Research Program.

    Led by Vladimir Bulović, a professor of electrical engineering and the director of MIT.nano, the GridEdge program was established with funding from the Tata Trusts to develop lightweight, flexible, and inexpensive solar cells for distribution to rural communities around the globe. When deQuilettes joined the group in 2017 as a postdoc, he was tasked with directing the program and building the infrastructure to study and make perovskite solar modules.

    “We were trying to understand once we made the material whether or not it was good,” he recalls. “There were no good commercial metrology [the science of measurements] tools for materials beyond silicon, so we started to build our own.” Recognizing the group’s need for greater expertise on the problem, especially in the areas of electrical, software, and mechanical engineering, deQuilettes put a call out for undergraduate researchers to help build metrology tools for new solar materials.

    “Forty people inquired, but when I met Brandon and Anthony, something clicked; it was clear we had a complementary skill set,” says deQuilettes. “We started working together, with Anthony coming up with beautiful designs to integrate multiple measurements, and Brandon creating boards to control all of the hardware, including different types of lasers. We started filing multiple patents and that was when we saw it all coming together.”

    “We knew from the start that metrology could vastly improve not just materials, but production yields,” says Troupe.
    Adds deQuilettes, “Our goal was getting to the highest performance orders of magnitude faster than it would ordinarily take, so we developed tools that would not just be useful for research labs but for manufacturing lines to give live feedback on quality.”

    The device Optigon designed for industry is the size of a football, “with sensor packages crammed into a tiny form factor, taking measurements as material flows directly underneath,” says Motes. “We have also thought carefully about ways to make interaction with this tool as seamless and, dare I say, as enjoyable as possible, streaming data both to a dashboard an operator can watch and to a custom database.”

    Photovoltaics is just the start

    The company may have already found its market niche. “A research group paid us to use our in-house prototype because they have such a burning need to get these sorts of measurements,” says Troupe, and according to Motes, “Potential customers ask us if they can buy the system now.” deQuilettes says, “Our hope is that we become the de facto company for doing any sort of characterization metrology in the United States and beyond.”

    Challenges lie ahead for Optigon: product launches, full-scale manufacturing, technical assistance, and sales. Greentown Labs offers support, as does MIT’s own rich community of solar researchers and entrepreneurs. But the founders are already thinking about next phases.

    “We are not limiting ourselves to the photovoltaics area,” says deQuilettes. “We’re planning on working in other clean energy materials such as batteries and fuel cells.”

    That’s because the team wants to make the maximum impact on the climate challenge. “We’ve thought a lot about the potential our tools will have on reducing carbon emissions, and we’ve done a really in-depth analysis looking at how our system can increase production yields of solar panels and other energy technologies, reducing materials and energy wasted in conventional optimization,” deQuilettes says.
    “If we look across all these sectors, we can expect to offset about 1,000 million metric tons of CO2 [carbon dioxide] per year in the not-too-distant future.”

    The team has written scale into its business plan. “We want to be the key enabler for bringing these new energy technologies to market,” says Motes. “We envision being deployed on every manufacturing line making these types of materials. It’s our goal to walk around and know that if we see a solar panel deployed, there’s a pretty high likelihood that it will be one we measured at some point.”

  • in

    Propelling atomically layered magnets toward green computers

    Globally, computation is booming at an unprecedented rate, fueled by the rapid growth of artificial intelligence. With this, the staggering energy demand of the world’s computing infrastructure has become a major concern, and the development of computing devices that are far more energy-efficient is a leading challenge for the scientific community. 

    Use of magnetic materials to build computing devices like memories and processors has emerged as a promising avenue for creating “beyond-CMOS” computers, which would use far less energy compared to traditional computers. Magnetization switching in magnets can be used in computation the same way that a transistor switches between open and closed to represent the 0s and 1s of binary code. 
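As a purely illustrative analogy (a toy model, not the device physics described in this article), the transistor-like role of magnetization switching can be sketched as a one-bit element written by current pulses:

```python
# Toy model: a magnetic bit whose state flips only when a current pulse
# exceeds a critical switching threshold. Illustrative only; real
# spin-orbit-torque switching involves far richer dynamics.
class MagneticBit:
    def __init__(self, critical_current: float):
        self.critical_current = critical_current  # threshold to switch
        self.state = 0  # magnetization "down" = 0, "up" = 1

    def pulse(self, current: float, polarity: int) -> int:
        """Apply a current pulse; polarity (+1/-1) selects the target state."""
        if abs(current) >= self.critical_current:
            self.state = 1 if polarity > 0 else 0
        return self.state

bit = MagneticBit(critical_current=1.0)
bit.pulse(1.5, +1)   # strong pulse writes a 1
bit.pulse(0.5, -1)   # sub-threshold pulse leaves the state unchanged
print(bit.state)     # 1
```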

    While much of the research along this direction has focused on using bulk magnetic materials, a new class of magnetic materials — called two-dimensional van der Waals magnets — provides superior properties that can improve the scalability and energy efficiency of magnetic devices to make them commercially viable. 

    Although the benefits of shifting to 2D magnetic materials are evident, their practical introduction into computers has been hindered by some fundamental challenges. Until recently, 2D magnetic materials could operate only at very low temperatures, much like superconductors. So bringing their operating temperatures above room temperature has remained a primary goal. Additionally, for use in computers, it is important that they can be controlled electrically, without the need for magnetic fields. Bridging this fundamental gap, so that 2D magnetic materials can be electrically switched above room temperature without any magnetic fields, could potentially catapult the translation of 2D magnets into the next generation of “green” computers.

    A team of MIT researchers has now achieved this critical milestone by designing a “van der Waals atomically layered heterostructure” device where a 2D van der Waals magnet, iron gallium telluride, is interfaced with another 2D material, tungsten ditelluride. In an open-access paper published March 15 in Science Advances, the team shows that the magnet can be toggled between the 0 and 1 states simply by applying pulses of electrical current across their two-layer device. 

    The Future of Spintronics: Manipulating Spins in Atomic Layers without External Magnetic Fields. Video: Deblina Sarkar

    “Our device enables robust magnetization switching without the need for an external magnetic field, opening up unprecedented opportunities for ultra-low power and environmentally sustainable computing technology for big data and AI,” says lead author Deblina Sarkar, the AT&T Career Development Assistant Professor at the MIT Media Lab and Center for Neurobiological Engineering, and head of the Nano-Cybernetic Biotrek research group. “Moreover, the atomically layered structure of our device provides unique capabilities including improved interface and possibilities of gate voltage tunability, as well as flexible and transparent spintronic technologies.”

    Sarkar is joined on the paper by first author Shivam Kajale, a graduate student in Sarkar’s research group at the Media Lab; Thanh Nguyen, a graduate student in the Department of Nuclear Science and Engineering (NSE); Nguyen Tuan Hung, an MIT visiting scholar in NSE and an assistant professor at Tohoku University in Japan; and Mingda Li, associate professor of NSE.

    Breaking the mirror symmetries 

    When electric current flows through heavy metals like platinum or tantalum, the electrons get segregated in the materials based on their spin component, a phenomenon called the spin Hall effect, says Kajale. The way this segregation happens depends on the material, and particularly its symmetries.

    “The conversion of electric current to spin currents in heavy metals lies at the heart of controlling magnets electrically,” Kajale notes. “The microscopic structure of conventionally used materials, like platinum, has a kind of mirror symmetry, which restricts the spin currents only to in-plane spin polarization.”

    Kajale explains that two mirror symmetries must be broken to produce an “out-of-plane” spin component that can be transferred to a magnetic layer to induce field-free switching. “Electrical current can ‘break’ the mirror symmetry along one plane in platinum, but its crystal structure prevents the mirror symmetry from being broken in a second plane.”

    In their earlier experiments, the researchers used a small magnetic field to break the second mirror plane. To get rid of the need for a magnetic nudge, Kajale, Sarkar, and colleagues looked instead for a material with a structure that could break the second mirror plane without outside help. This led them to another 2D material, tungsten ditelluride. The tungsten ditelluride that the researchers used has an orthorhombic crystal structure, so the material itself has one broken mirror plane. Thus, by applying current along its low-symmetry axis (parallel to the broken mirror plane), the resulting spin current has an out-of-plane spin component that can directly induce switching in the ultra-thin magnet interfaced with the tungsten ditelluride. 

    “Because it’s also a 2D van der Waals material, it can ensure that when we stack the two materials together, we get pristine interfaces and a good flow of electron spins between the materials,” says Kajale. 

    Becoming more energy-efficient 

    Computer memory and processors built from magnetic materials use less energy than traditional silicon-based devices. And the van der Waals magnets can offer higher energy efficiency and better scalability compared to bulk magnetic material, the researchers note. 

    The electrical current density used for switching the magnet translates to how much energy is dissipated during switching. A lower density means a much more energy-efficient material. “The new design has one of the lowest current densities in van der Waals magnetic materials,” Kajale says. “The switching current is an order of magnitude lower than what bulk materials require, which translates to something like two orders of magnitude improvement in energy efficiency.”
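The jump from one order of magnitude in current density to roughly two orders of magnitude in energy follows from Joule heating scaling with the square of the current density, assuming comparable resistivity, device volume, and pulse duration. A back-of-envelope sketch:

```python
# Why a 10x reduction in switching current density gives ~100x lower energy:
# Joule dissipation per switch scales as E ~ J^2 * rho * V * t, so with
# resistivity (rho), volume (V), and pulse time (t) held fixed, E ~ J^2.
def relative_switching_energy(current_density_ratio: float) -> float:
    """Energy relative to a reference device, given the ratio J_new / J_old."""
    return current_density_ratio ** 2

ratio = relative_switching_energy(0.1)  # one order of magnitude lower J
print(f"Relative switching energy: {ratio:g}")  # Relative switching energy: 0.01
```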

    The research team is now looking at similar low-symmetry van der Waals materials to see if they can reduce current density even further. They are also hoping to collaborate with other researchers to find ways to manufacture the 2D magnetic switch devices at commercial scale. 

    This work was carried out, in part, using the facilities at MIT.nano. It was funded by the Media Lab, the U.S. National Science Foundation, and the U.S. Department of Energy.

  • in

    New MIT.nano equipment to accelerate innovation in “tough tech” sectors

    A new set of advanced nanofabrication equipment will make MIT.nano one of the world’s most advanced research facilities in microelectronics and related technologies, unlocking new opportunities for experimentation and widening the path for promising inventions to become impactful new products.

    The equipment, provided by Applied Materials, will significantly expand MIT.nano’s nanofabrication capabilities, making them compatible with wafers — thin, round slices of semiconductor material — up to 200 millimeters, or 8 inches, in diameter, a size widely used in industry. The new tools will allow researchers to prototype a vast array of new microelectronic devices using state-of-the-art materials and fabrication processes. At the same time, the 200-millimeter compatibility will support close collaboration with industry and enable innovations to be rapidly adopted by companies and mass produced.

    MIT.nano’s leaders say the equipment, which will also be available to scientists outside of MIT, will dramatically enhance their facility’s capabilities, allowing experts in the region to more efficiently explore new approaches in “tough tech” sectors, including advanced electronics, next-generation batteries, renewable energies, optical computing, biological sensing, and a host of other areas — many likely yet to be imagined.

    “The toolsets will provide an accelerative boost to our ability to launch new technologies that can then be given to the world at scale,” says MIT.nano Director Vladimir Bulović, who is also the Fariborz Maseeh Professor of Emerging Technology. “MIT.nano is committed to its expansive mission — to build a better world. We provide toolsets and capabilities that, in the hands of brilliant researchers, can effectively move the world forward.”

    The announcement comes as part of an agreement between MIT and Applied Materials, Inc. that, together with a grant to MIT from the Northeast Microelectronics Coalition (NEMC) Hub, commits more than $40 million of estimated private and public investment to add advanced nanofabrication equipment and capabilities at MIT.nano.

    “We don’t believe there is another space in the United States that will offer the same kind of versatility, capability, and accessibility, with 8-inch toolsets integrated right next to more fundamental toolsets for research discoveries,” Bulović says. “It will create a seamless path to accelerate the pace of innovation.”

    Pushing the boundaries of innovation

    Applied Materials is the world’s largest supplier of equipment for manufacturing semiconductors, displays, and other advanced electronics. The company will provide at MIT.nano several state-of-the-art process tools capable of supporting 150- and 200-millimeter wafers and will enhance and upgrade an existing tool owned by MIT. In addition to assisting MIT.nano in the day-to-day operation and maintenance of the equipment, Applied Materials engineers will develop new process capabilities to benefit researchers and students from MIT and beyond.

    “This investment will significantly accelerate the pace of innovation and discovery in microelectronics and microsystems,” says Tomás Palacios, director of MIT’s Microsystems Technology Laboratories and the Clarence J. Lebel Professor in Electrical Engineering. “It’s wonderful news for our community, wonderful news for the state, and, in my view, a tremendous step forward toward implementing the national vision for the future of innovation in microelectronics.”

    Nanoscale research at universities is traditionally conducted on machines that are less compatible with industry, which makes academic innovations more difficult to turn into impactful, mass-produced products. Jorg Scholvin, associate director for MIT.nano’s shared fabrication facility, says the new machines, when combined with MIT.nano’s existing equipment, represent a step-change improvement in that area: Researchers will be able to take an industry-standard wafer and build their technology on top of it to prove to companies it works on existing devices, or to co-fabricate new ideas in close collaboration with industry partners.

    “In the journey from an idea to a fully working device, the ability to begin on a small scale, figure out what you want to do, rapidly debug your designs, and then scale it up to an industry-scale wafer is critical,” Scholvin says. “It means a student can test out their idea on wafer-scale quickly and directly incorporate insights into their project so that their processes are scalable. Providing such proof-of-principle early on will accelerate the idea out of the academic environment, potentially reducing years of added effort. Other tools at MIT.nano can supplement work on the 200-millimeter wafer scale, but the higher throughput and higher precision of the Applied equipment will provide researchers with repeatability and accuracy that is unprecedented for academic research environments. Essentially what you have is a sharper, faster, more precise tool to do your work.”

    Scholvin predicts the equipment will lead to exponential growth in research opportunities.

    “I think a key benefit of these tools is they allow us to push the boundary of research in a variety of different ways that we can predict today,” Scholvin says. “But then there are also unpredictable benefits, which are hiding in the shadows waiting to be discovered by the creativity of the researchers at MIT. With each new application, more ideas and paths usually come to mind — so that over time, more and more opportunities are discovered.”

    Because the equipment is available for use by people outside of the MIT community, including regional researchers, industry partners, nonprofit organizations, and local startups, it will also enable new collaborations.

    “The tools themselves will be an incredible meeting place — a place that can, I think, transpose the best of our ideas in a much more effective way than before,” Bulović says. “I’m extremely excited about that.”

    Palacios notes that while microelectronics is best known for work making transistors smaller to fit on microprocessors, it’s a vast field that enables virtually all the technology around us, from wireless communications and high-speed internet to energy management, personalized health care, and more.

    He says he’s personally excited to use the new machines to do research around power electronics and semiconductors, including exploring promising new materials like gallium nitride, which could dramatically improve the efficiency of electronic devices.

    Fulfilling a mission

    MIT.nano’s leaders say a key driver of commercialization will be startups, both from MIT and beyond.

    “This is not only going to help the MIT research community innovate faster, it’s also going to enable a new wave of entrepreneurship,” Palacios says. “We’re reducing the barriers for students, faculty, and other entrepreneurs to be able to take innovation and get it to market. That fits nicely with MIT’s mission of making the world a better place through technology. I cannot wait to see the amazing new inventions that our colleagues and students will come out with.”

    Bulović says the announcement aligns with the mission laid out by MIT’s leaders at MIT.nano’s inception.

    “We have the space in MIT.nano to accommodate these tools, we have the capabilities inside MIT.nano to manage their operation, and as a shared and open facility, we have methodologies by which we can welcome anyone from the region to use the tools,” Bulović says. “That is the vision MIT laid out as we were designing MIT.nano, and this announcement helps to fulfill that vision.”