More stories

    Study: EV charging stations boost spending at nearby businesses

    Charging stations for electric vehicles are essential for cleaning up the transportation sector. A new study by MIT researchers suggests they’re good for business, too.

    The study found that, in California, opening a charging station boosted annual spending at each nearby business by an average of about $1,500 in 2019 and about $400 between January 2021 and June 2023. The spending bump amounts to thousands of extra dollars annually for nearby businesses, with the increase particularly pronounced for businesses in underresourced areas.

    The study’s authors hope the research paints a more holistic picture of the benefits of EV charging stations, beyond environmental factors.

    “These increases are equal to a significant chunk of the cost of installing an EV charger, and I hope this study sheds light on these economic benefits,” says lead author Yunhan Zheng MCP ’21, SM ’21, PhD ’24, a postdoc at the Singapore-MIT Alliance for Research and Technology (SMART). “The findings could also diversify the income stream for charger providers and site hosts, and lead to more informed business models for EV charging stations.”

    Zheng’s co-authors on the paper, which was published today in Nature Communications, are David Keith, a senior lecturer at the MIT Sloan School of Management; Jinhua Zhao, an MIT professor of cities and transportation; and alumni Shenhao Wang MCP ’17, SM ’17, PhD ’20 and Mi Diao MCP ’06, PhD ’10.

    Understanding the EV effect

    Increasing the number of electric vehicle charging stations is seen as a key prerequisite for the transition to a cleaner, electrified transportation sector. As such, the 2021 U.S. Infrastructure Investment and Jobs Act committed $7.5 billion to build a national network of public electric vehicle chargers across the U.S. But a large amount of private investment will also be needed to make charging stations ubiquitous.

    “The U.S. is investing a lot in EV chargers and really encouraging EV adoption, but many EV charging providers can’t make enough money at this stage, and getting to profitability is a major challenge,” Zheng says.

    EV advocates have long argued that the presence of charging stations brings economic benefits to surrounding communities, but Zheng says previous studies on their impact relied on surveys or were small in scale. Her team wanted to put advocates’ claims on a more empirical footing.

    For their study, the researchers collected data from over 4,000 charging stations in California and 140,000 businesses, relying on anonymized credit and debit card transactions to measure changes in consumer spending. The researchers used data from 2019 through June of 2023, skipping the year 2020 to minimize the impact of the pandemic.

    To judge whether charging stations caused customer spending increases, the researchers compared data from businesses within 500 meters of new charging stations before and after their installation. They also analyzed transactions from similar businesses in the same time frame that weren’t near charging stations.

    Supercharging nearby businesses

    The researchers found that installing a charging station boosted annual spending at nearby establishments by an average of 1.4 percent in 2019 and 0.8 percent from January 2021 to June 2023. While that might sound like a small amount per business, it amounts to thousands of dollars in overall consumer spending increases. Specifically, those percentages translate to almost $23,000 in cumulative spending increases in 2019 and about $3,400 per year from 2021 through June 2023.

    Zheng says the decline in spending increases between the two periods might be due to a saturation of EV chargers, leading to lower utilization, as well as an overall decrease in spending per business after the Covid-19 pandemic and a reduced number of businesses served by each EV charging station in the second period. Despite this decline, the annual impact of a charging station on all its surrounding businesses would still cover approximately 11.2 percent of the average infrastructure and installation cost of a standard charging station.

    Through both time frames, the spending increases were highest for businesses within about a football field’s distance from the new stations. They were also significant for businesses in disadvantaged and low-income areas, as designated by California and the Justice40 Initiative.

    “The positive impacts of EV charging stations on businesses are not constrained solely to some high-income neighborhoods,” Wang says. “It highlights the importance for policymakers to develop EV charging stations in marginalized areas, because they not only foster a cleaner environment, but also serve as a catalyst for enhancing economic vitality.”

    Zheng believes the findings hold a lesson for charging station developers seeking to improve the profitability of their projects.

    “The joint gas station and convenience store business model could also be adapted to EV charging stations,” Zheng says. “Traditionally, many gas stations are affiliated with retail store chains, which enables owners to both sell fuel and attract customers to diversify their revenue stream. EV charging providers could consider a similar approach to internalize the positive impact of EV charging stations.”

    Zheng also says the findings could support the creation of new funding models for charging stations, such as multiple businesses sharing the costs of construction so they can all benefit from the added spending.

    Those changes could accelerate the creation of charging networks, but Zheng cautions that further research is needed to understand how much the study’s findings can be extrapolated to other areas. She encourages other researchers to study the economic effects of charging stations and hopes future research includes states beyond California and even other countries.

    “A huge number of studies have focused on retail sales effects from traditional transportation infrastructure, such as rail and subway stations, bus stops, and street configurations,” Zhao says. “This research provides evidence for an important, emerging piece of transportation infrastructure and shows a consistently positive effect on local businesses, paving the way for future research in this area.”

    The research was supported, in part, by the Singapore-MIT Alliance for Research and Technology (SMART) and the Singapore National Research Foundation. Diao was partially supported by the Natural Science Foundation of Shanghai and the Fundamental Research Funds for the Central Universities of China.
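
    The before-and-after comparison against similar businesses with no nearby charger is a classic difference-in-differences design. As a minimal illustrative sketch of how such an estimate works (the spending figures below are hypothetical, not the study's data):

    ```python
    # Difference-in-differences sketch with hypothetical numbers
    # (not data from the study).
    # Mean annual spending per business, before and after a nearby station opens:
    treated_before, treated_after = 100_000.0, 103_000.0  # near a new charger
    control_before, control_after = 100_000.0, 101_600.0  # no charger nearby

    # Effect attributable to the charger = treated change minus control change
    did = (treated_after - treated_before) - (control_after - control_before)
    pct_uplift = 100 * did / treated_before

    print(f"Estimated uplift: ${did:,.0f} per business ({pct_uplift:.1f}%)")
    # -> Estimated uplift: $1,400 per business (1.4%)
    ```

    Subtracting the control group's change strips out spending trends, such as post-pandemic recovery, that would have happened with or without the charger.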

    How to increase the rate of plastics recycling

    While recycling systems and bottle deposits have become increasingly widespread in the U.S., actual rates of recycling are “abysmal,” according to a team of MIT researchers who studied the rates for recycling of PET, the plastic commonly used in beverage bottles. However, their findings suggest some ways to change this.

    The present rate of recycling for PET, or polyethylene terephthalate, bottles nationwide is about 24 percent and has remained stagnant for a decade, the researchers say. But their study indicates that with a nationwide bottle deposit program, the rates could increase to 82 percent, with nearly two-thirds of all PET bottles being recycled into new bottles, at a net cost of just a penny a bottle when demand is robust. At the same time, they say, policies would be needed to ensure a sufficient demand for the recycled material.

    The findings are being published today in the Journal of Industrial Ecology, in a paper by MIT professor of materials science and engineering Elsa Olivetti, graduate students Basuhi Ravi and Karan Bhuwalka, and research scientist Richard Roth.

    The team looked at PET bottle collection and recycling rates in different states, as well as in other nations with and without bottle deposit policies and with or without curbside recycling programs, along with the inputs and outputs of various recycling companies and methods. The researchers say this study is the first to look in detail at the interplay between public policies and the end-to-end realities of the packaging production and recycling market.

    They found that bottle deposit programs are highly effective in the areas where they are in place, but at present there is not nearly enough collection of used bottles to meet the targets set by the packaging industry. Their analysis suggests that a uniform nationwide bottle deposit policy could achieve the levels of recycling that have been mandated by proposed legislation and corporate commitments.

    The recycling of PET is highly successful in terms of quality, with new products made from all-recycled material virtually matching the qualities of virgin material. And brands have shown that new bottles can be safely made with 100 percent postconsumer waste. But the team found that collection of the material is a crucial bottleneck that leaves processing plants unable to meet their needs. However, with the right policies in place, “one can be optimistic,” says Olivetti, who is the Jerry McAfee Professor in Engineering and the associate dean of the School of Engineering.

    “A message that we have found in a number of cases in the recycling space is that if you do the right work to support policies that think about both the demand but also the supply,” then significant improvements are possible, she says. “You have to think about the response and the behavior of multiple actors in the system holistically to be viable,” she says. “We are optimistic, but there are many ways to be pessimistic if we’re not thinking about that in a holistic way.”

    For example, the study found that it is important to consider the needs of existing municipal waste-recovery facilities. While expanded bottle deposit programs are essential to increase recycling rates and provide the feedstock to companies recycling PET into new products, the current facilities that process material from curbside recycling programs will lose revenue from PET bottles, which are a relatively high-value product compared to the other materials in the recycled waste stream. These companies would lose a source of their income if the bottles are collected through deposit programs, leaving them with only the lower-value mixed plastics.

    The researchers developed economic models based on rates of collection found in the states with deposit programs, recycled-content requirements, and other policies, and used these models to extrapolate to the nation as a whole. Overall, they found that the supply needs of packaging producers could be met through a nationwide bottle deposit system with a 10-cent deposit per bottle, at a net cost of about 1 cent per bottle produced when demand is strong. This need not be a federal program, but rather one where the implementation would be left up to the individual states, Olivetti says.

    Other countries have been much more successful in implementing deposit systems that result in very high participation rates. Several European countries manage to collect more than 90 percent of PET bottles for recycling, for example. But in the U.S., less than 29 percent are collected, and after losses in the recycling chain about 24 percent actually get recycled, the researchers found. And whereas 73 percent of Americans have access to curbside recycling, presently only 10 states have bottle deposit systems in place.

    Yet the demand is there so far. “There is a market for this material,” says Olivetti. While bottles collected through mixed-waste collection can still be recycled to some extent, those collected through deposit systems tend to be much cleaner and require less processing, and so are more economical to recycle into new bottles, or into textiles.

    To be effective, policies need to not just focus on increasing rates of recycling, but on the whole cycle of supply and demand and the different players involved, Olivetti says. Safeguards would need to be in place to protect existing recycling facilities from the lost revenues they would suffer as a result of bottle deposits, perhaps in the form of subsidies funded by fees on the bottle producers, to avoid putting these essential parts of the processing chain out of business. And other policies may be needed to ensure the continued market for the material that gets collected, including recycled-content requirements and extended producer responsibility regulations, the team found.

    At this stage, it’s important to focus on the specific waste streams that can most effectively be recycled, and PET, along with many metals, clearly fits that category. “When we start to think about mixed plastic streams, that’s much more challenging from an environmental perspective,” she says. “Recycling systems need to be pursuing extended producers’ responsibility, or specifically thinking about materials designed more effectively toward recycled content,” she says.

    It’s also important to address “what the right metrics are to design for sustainably managed materials streams,” she says. “It could be energy use, could be circularity [for example, making old bottles into new bottles], could be around waste reduction, and making sure those are all aligned. That’s another kind of policy coordination that’s needed.”
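
    The study's "net cost of about 1 cent per bottle" folds together handling costs and recovered-material revenue. A toy version of such per-bottle accounting, using hypothetical parameter values rather than the paper's actual inputs, looks like this:

    ```python
    # Toy per-bottle deposit-system cost model. Parameter values are
    # hypothetical illustrations, not figures from the paper.
    redemption_rate = 0.82     # fraction of bottles returned (projected rate)
    handling_cost = 0.030      # $ per returned bottle: collection + processing
    material_revenue = 0.018   # $ per returned bottle: recycled-PET sale value

    # Costs and revenues accrue only on returned bottles, so the net cost
    # per bottle *produced* scales with the redemption rate.
    net_cost = redemption_rate * (handling_cost - material_revenue)
    print(f"Net cost: {100 * net_cost:.1f} cents per bottle produced")
    # -> Net cost: 1.0 cents per bottle produced
    ```

    When demand for recycled PET weakens, the material_revenue term shrinks and the same arithmetic shows the net cost rising, which is why the authors stress demand-side policies alongside collection.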

    Satellite-based method measures carbon in peat bogs

    Peat bogs in the tropics store vast amounts of carbon, but logging, plantations, road building, and other activities have destroyed large swaths of these ecosystems in places like Indonesia and Malaysia. Peat formations are essentially permanently flooded forestland, where dead leaves and branches accumulate because the water table prevents their decomposition.

    The pileup of organic material gives these formations a distinctive domed shape, somewhat raised in the center and tapering toward the edges. Determining how much carbon is contained in each formation has required laborious on-the-ground sampling, and so has been limited in its coverage.

    Now, researchers from MIT and Singapore have developed a mathematical analysis of how peat formations build and develop that makes it possible to evaluate their carbon content and dynamics mostly from simple elevation measurements. These can be carried out by satellites, without requiring ground-based sampling. This analysis, the team says, should enable more precise and accurate assessments of the amount of carbon that would be released by any proposed draining of peatlands, and, conversely, how much of that carbon release could be avoided by protecting them.

    The research is being reported today in the journal Nature, in a paper by Alexander Cobb, a postdoc with the Singapore-MIT Alliance for Research and Technology (SMART); Charles Harvey, an MIT professor of civil and environmental engineering; and six others.

    Although it is the tropical peatlands that are at greatest risk — because they are the ones most often drained for timber harvesting or the creation of plantations for palm oil, acacia, and other crops — the new formulas the team derived apply to peatlands all over the globe, from Siberia to New Zealand. The formula requires just two inputs. The first is elevation data from a single transect of a given peat dome — that is, a series of elevation measurements along an arbitrary straight line cutting across from one edge of the formation to the other. The second input is a site-specific factor the team devised that relates to the type of peat bog involved and the internal structure of the formation, which together determine how much of the carbon within remains safely submerged in water, where it can’t be oxidized.

    “The saturation by water prevents oxygen from getting in, and if oxygen gets in, microbes breathe it and eat the peat and turn it into carbon dioxide,” Harvey explains.

    “There is an internal surface inside the peat dome below which the carbon is safe because it can’t be drained, because the bounding rivers and water bodies are such that it will keep saturated up to that level even if you cut canals and try to drain it,” he adds. In between the visible surface of the bog and this internal layer is the “vulnerable zone” of peat that can rapidly decompose and release its carbon compounds or become dry enough to promote fires that also release the carbon and pollute the air.

    Through years of on-the-ground sampling and testing, and detailed analysis comparing the ground data with satellite lidar data on surface elevations, the team was able to figure out a kind of universal mathematical formula that describes the structure of peat domes of all kinds and in all locations. They tested it by comparing their predicted results with field measurements from several widely distributed locations, including Alaska, Maine, Quebec, Estonia, Finland, Brunei, and New Zealand.

    These bogs contain carbon that has in many cases accumulated over thousands of years but can be released in just a few years when the bogs are drained. “If we could have policies to preserve these, it is a tremendous opportunity to reduce carbon fluxes to the atmosphere. This framework or model gives us the understanding, the intellectual framework, to figure out how to do that,” Harvey says.

    Many people assume that the biggest source of greenhouse gas emissions from cutting down these forested lands is the decomposition of the trees themselves. “The misconception is that that’s the carbon that goes to the atmosphere,” Harvey says. “It’s actually a small amount, because the real fluxes to the atmosphere come from draining” the peat bogs. “Then, the much larger pool of carbon, which is underground beneath the forest, oxidizes and goes to the air, or catches fire and burns.”

    But there is hope, he says, that much of this drained peatland can still be restored before the stored carbon all gets released. First of all, he says, “you’ve got to stop draining it.” That can be accomplished by damming up the drainage canals. “That’s what’s good about this mathematical framework: You need to figure out how to do that, where to put your dams. There’s all sorts of interesting complexities. If you just dam up the canal, the water may flow around it. So, it’s a neat geometric and engineering project to figure out how to do this.”

    While much of the peatland in southeast Asia has already been drained, the new analysis should make it possible to make much more accurate assessments of less-well-studied peatlands in places like the Amazon basin, New Guinea, and the Congo basin, which are also threatened by development.

    The new formulation should also help to make some carbon offset programs more reliable, because it is now possible to calculate accurately the carbon content of a given peatland. “It’s quantifiable, because the peat is 100 percent organic carbon. So, if you just measure the change in the surface going up or down, you can say with pretty good certainty how much carbon has been accumulated or lost, whereas if you go to a rainforest, it’s virtually impossible to calculate the amount of underground carbon, and it’s pretty hard to calculate what’s above ground too,” Harvey says. “But this is relatively easy to calculate with satellite measurements of elevation.”
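
    Harvey's point that surface-elevation change maps almost directly onto carbon gain or loss can be turned into a back-of-the-envelope calculation. The bulk density and carbon fraction below are assumed illustrative values, not figures from the paper:

    ```python
    # Back-of-the-envelope carbon change from a peat surface elevation change.
    # Bulk density and carbon fraction are illustrative assumptions.
    area_m2 = 1_000_000.0   # 1 km^2 of peatland
    d_elev_m = -0.05        # surface dropped 5 cm (e.g., after partial drainage)
    bulk_density = 90.0     # kg of dry peat per m^3 (assumed typical value)
    carbon_frac = 0.5       # roughly half of dry peat mass is carbon (assumed)

    carbon_change_kg = area_m2 * d_elev_m * bulk_density * carbon_frac
    print(f"Carbon change: {carbon_change_kg / 1000:,.0f} tonnes")
    # -> Carbon change: -2,250 tonnes
    ```

    A satellite tracking the dome's surface height therefore doubles as a carbon gauge, which is the basis of the quantification Harvey describes.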

    “We can turn the knob,” he says, “because we have this mathematical framework for how the hydrology, the water table position, affects the growth and loss of peat. We can design a scheme that will change emissions by X amount, for Y dollars.”

    The research team included Rene Dommain, Kimberly Yeap, and Cao Hannan at Nanyang Technological University in Singapore, Nathan Dadap at Stanford University, Bodo Bookhagen at the University of Potsdam, Germany, and Paul Glaser at the University of Minnesota. The work was supported by the National Research Foundation Singapore through the SMART program, by the U.S. National Science Foundation, and by Singapore’s Office for Space Technology and Industry.

    A new microneedle-based drug delivery technique for plants

    Worsening environmental conditions caused by climate change, an ever-growing human population, scarcity of arable land, and limited resources are pressuring the agriculture industry to adopt more sustainable and precise practices that foster more efficient use of resources (e.g., water, fertilizers, and pesticides) and mitigate environmental impacts. Developing delivery systems that efficiently deploy agrochemicals such as micronutrients, pesticides, and antibiotics in crops is therefore crucial: it will help ensure high productivity and high produce quality while minimizing the waste of resources.

    Now, researchers in Singapore and the U.S. have developed the first-ever microneedle-based drug delivery technique for plants. The method can be used to precisely deliver controlled amounts of agrochemicals to specific plant tissues for research purposes. When applied in the field, it could one day be used in precision agriculture to improve crop quality and disease management.

    The work is led by researchers from the Disruptive and Sustainable Technologies for Agricultural Precision (DiSTAP) interdisciplinary research group at the Singapore-MIT Alliance for Research and Technology (SMART), MIT’s research enterprise in Singapore, and their collaborators from MIT and the Temasek Life Sciences Laboratory (TLL).

    Current standard practices for agrochemical application in plants, such as foliar spray, are inefficient due to off-target application, quick runoff in the rain, and rapid degradation of the active ingredients. These practices also cause significant detrimental environmental side effects, such as water and soil contamination, biodiversity loss, and degraded ecosystems; and public health concerns, such as respiratory problems, chemical exposure, and food contamination.

    The novel silk-based microneedle technique circumvents these limitations by delivering a known amount of payload directly into a plant’s deep tissues, improving efficacy in promoting plant growth and managing disease. The technique is minimally invasive, delivering the compound without causing long-term damage to the plants, and it is environmentally sustainable: it minimizes resource wastage and mitigates the adverse side effects caused by agrochemical contamination of the environment. Additionally, it will help foster precise agricultural practices and provide new tools to study plants and design crop traits, helping to ensure food security.

    Described in a paper titled “Drug Delivery in Plants Using Silk Microneedles,” published in a recent issue of Advanced Materials, the research studies the first-ever polymeric microneedles used to deliver small compounds to a wide variety of plants and the plant response to biomaterial injection. Through gene expression analysis, the researchers could closely examine the reactions to drug delivery following microneedle injection. Minimal scar and callus formation were observed, suggesting minimal injection-induced wounding to the plant. The proof of concept provided in this study opens the door to plant microneedles’ application in plant biology and agriculture, enabling new means to regulate plant physiology and study metabolisms via efficient and effective delivery of payloads.

    The study optimized the design of microneedles to target the systemic transport system in Arabidopsis (mouse-ear cress), the chosen model plant. Gibberellic acid (GA3), a widely used plant growth regulator in agriculture, was selected for the delivery. The researchers found that delivering GA3 through microneedles was more effective in promoting growth than traditional methods (such as foliar spray). They then confirmed the effectiveness using genetic methods and demonstrated that the technique is applicable to various plant species, including vegetables, cereals, soybeans, and rice.

    Professor Benedetto Marelli, co-corresponding author of the paper, principal investigator at DiSTAP, and associate professor of civil and environmental engineering at MIT, shares, “The technique saves resources as compared to current methods of agrochemical delivery, which suffer from wastage. During the application, the microneedles break through the tissue barriers and release compounds directly inside the plants, avoiding agrochemical losses. The technique also allows for precise control of the amounts of the agrochemical used, ensuring high-tech precision agriculture and crop growth to optimize yield.”

    “The first-of-its-kind technique is revolutionary for the agriculture industry. It also minimizes resource wastage and environmental contamination. In the future, with automated microneedle application as a possibility, the technique may be used in high-tech outdoor and indoor farms for precise agrochemical delivery and disease management,” adds Yunteng Cao, the first author of the paper and postdoc at MIT.

    “This work also highlights the importance of using genetic tools to study plant responses to biomaterials. Analyzing these responses at the genetic level offers a comprehensive understanding of these responses, thereby serving as a guide for the development of future biomaterials that can be used across the agri-food industry,” says Sally Koh, the co-first author of this work and PhD candidate from NUS and TLL.

    The future seems promising as Professor Daisuke Urano, co-corresponding author of the paper, TLL principal investigator, and NUS adjunct assistant professor elaborates, “Our research has validated the use of silk-based microneedles for agrochemical application, and we look forward to further developing the technique and microneedle design into a scalable model for manufacturing and commercialization. At the same time, we are also actively investigating potential applications that could have a significant impact on society.”

    The study of drug delivery in plants using silk microneedles expanded upon previous research supervised by Marelli. The original idea was conceived at SMART and MIT by Marelli, Cao, and Professor Nam-Hai Chua, co-lead principal investigator at DiSTAP. Researchers from TLL and the National University of Singapore, Professor Daisuke Urano and Koh, joined the study to contribute biological perspectives. The research was carried out by SMART and supported by the National Research Foundation Singapore (NRF) under its Campus for Research Excellence And Technological Enterprise (CREATE) program.

    SMART was established by MIT and NRF in 2007. SMART is the first entity in CREATE, developed by NRF. SMART serves as an intellectual and innovation hub for research interactions between MIT and Singapore, undertaking cutting-edge research in areas of interest to both parties. SMART currently comprises an Innovation Center and interdisciplinary research groups: Antimicrobial Resistance, Critical Analytics for Manufacturing Personalized-Medicine, DiSTAP, Future Urban Mobility, and Low Energy Electronic Systems.

    Nanotube sensors are capable of detecting and distinguishing gibberellin plant hormones

    Researchers from the Disruptive and Sustainable Technologies for Agricultural Precision (DiSTAP) interdisciplinary research group of the Singapore-MIT Alliance for Research and Technology (SMART), MIT’s research enterprise in Singapore, and their collaborators from Temasek Life Sciences Laboratory have developed the first-ever nanosensor that can detect and distinguish gibberellins (GAs), a class of hormones in plants that are important for growth. The novel nanosensors are nondestructive, unlike conventional collection methods, and have been successfully tested in living plants. Applied in the field for early-stage plant stress monitoring, the sensors could prove transformative for agriculture and plant biotechnology, giving farmers interested in high-tech precision agriculture and crop management a valuable tool to optimize yield.

    The researchers designed near-infrared fluorescent carbon nanotube sensors capable of detecting and distinguishing two plant hormones, GA3 and GA4. Belonging to a class of plant hormones known as gibberellins, GA3 and GA4 are diterpenoid phytohormones produced by plants that play an important role in modulating diverse processes involved in plant growth and development. GAs are thought to have been among the driving forces behind the “green revolution” of the 1960s, which is in turn credited with averting famine and saving the lives of many worldwide. The continued study of gibberellins could lead to further breakthroughs in agricultural science and have implications for food security.

    Climate change, global warming, and rising sea levels are contaminating farming soil with saltwater, raising soil salinity. In turn, high soil salinity is known to negatively regulate GA biosynthesis and promote GA metabolism, reducing the GA content of plants. The new nanosensors developed by the SMART researchers allow for the study of GA dynamics in living plants under salinity stress at a very early stage, potentially enabling farmers to make early interventions once the sensors are eventually applied in the field. This forms the basis of early-stage stress detection.

    Currently, methods to detect GA3 and GA4 typically require mass spectrometry-based analysis, a time-consuming and destructive process. In contrast, the new sensors developed by the researchers are highly selective for the respective GAs and offer real-time, in vivo monitoring of changes in GA levels across a broad range of plant species.

    Described in a paper titled “Near-Infrared Fluorescent Carbon Nanotube Sensors for the Plant Hormone Family Gibberellins” published in the journal Nano Letters, the research represents a breakthrough for early-stage plant stress detection and holds tremendous potential to advance plant biotechnology and agriculture. This paper builds on previous research by the team at SMART DiSTAP on single-walled carbon nanotube-based nanosensors using the corona phase molecular recognition (CoPhMoRe) platform.

    Based on the CoPhMoRe concept introduced by the lab of MIT Professor Michael Strano, the novel sensors are able to detect GA kinetics in the roots of a variety of model and non-model plant species, including Arabidopsis, lettuce, and basil, as well as GA accumulation during lateral root emergence, highlighting the importance of GA in root system architecture. This was made possible by the researchers’ related development of a new coupled Raman/near-infrared fluorimeter that enables self-referencing of nanosensor near-infrared fluorescence with its Raman G-band, a hardware innovation that removes the need for a separate reference nanosensor and greatly simplifies the instrumentation requirements by using a single optical channel to measure hormone concentration.
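
    The self-referencing idea, dividing the nanosensor's near-infrared fluorescence by its own Raman G-band measured on the same optical channel, can be sketched as a simple ratio that cancels factors common to both signals. The signal values below are hypothetical:

    ```python
    # Sketch of a self-referenced readout: NIR fluorescence / Raman G-band.
    # Both signals scale with sensor amount and excitation power, so their
    # ratio isolates the hormone-dependent fluorescence response.
    def self_referenced(nir_fluorescence: float, raman_g_band: float) -> float:
        return nir_fluorescence / raman_g_band

    # The same hormone level measured at two different excitation powers:
    low_power = self_referenced(nir_fluorescence=120.0, raman_g_band=40.0)
    high_power = self_referenced(nir_fluorescence=360.0, raman_g_band=120.0)
    print(low_power, high_power)
    # -> 3.0 3.0  (readout unchanged despite 3x the excitation power)
    ```

    Because shared factors divide out, the ratio tracks hormone concentration without a separate reference sensor, which is what lets a single optical channel suffice.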

    Using the reversible GA nanosensors, the researchers detected increased endogenous GA levels in mutant plants producing greater amounts of GA20ox1, a key enzyme in GA biosynthesis, as well as decreased GA levels in plants under salinity stress. When exposed to salinity stress, researchers also found that lettuce growth was severely stunted — an indication that only became apparent after 10 days. In contrast, the GA nanosensors reported decreased GA levels after just six hours, demonstrating their efficacy as a much earlier indicator of salinity stress.

    “Our CoPhMoRe technique allows us to create nanoparticles that act like natural antibodies in that they can recognize and lock onto specific molecules. But they tend to be far more stable than alternatives. We have used this method to successfully create nanosensors for plant signals such as hydrogen peroxide and heavy-metal pollutants like arsenic in plants and soil,” says Strano, the Carbon P. Dubbs Professor of Chemical Engineering at MIT who is co-corresponding author and DiSTAP co-lead principal investigator. “The method works to create sensors for organic molecules like synthetic auxin — an important plant hormone — as we have shown. This latest breakthrough now extends this success to a plant hormone family called gibberellins — an exceedingly difficult one to recognize.”

Strano adds: “The resulting technology offers a rapid, real-time, and in vivo method to monitor changes in GA levels in virtually any plant, and can replace current sensing methods, which are laborious, destructive, species-specific, and much less efficient.”

    Mervin Chun-Yi Ang, associate scientific director at DiSTAP and co-first author of the paper, says, “More than simply a breakthrough in plant stress detection, we have also demonstrated a hardware innovation in the form of a new coupled Raman/NIR fluorimeter that enabled self-referencing of SWNT sensor fluorescence with its Raman G-band, representing a major advance in the translation of our nanosensing tool sets to the field. In the near future, our sensors can be combined with low-cost electronics, portable optodes, or microneedle interfaces for industrial use, transforming how the industry screens for and mitigates plant stress in food crops and potentially improving growth and yield.”

    The new sensors could yet have a variety of industrial applications and use cases. Daisuke Urano, a Temasek Life Sciences Laboratory principal investigator, National University of Singapore (NUS) adjunct assistant professor, and co-corresponding author of the paper, explains, “GAs are known to regulate a wide range of plant development processes, from shoot, root, and flower development, to seed germination and plant stress responses. With the commercialization of GAs, these plant hormones are also sold to growers and farmers as plant growth regulators to promote plant growth and seed germination. Our novel GA nanosensors could be applied in the field for early-stage plant stress monitoring, and also be used by growers and farmers to track the uptake or metabolism of GA in their crops.”

    The design and development of the nanosensors, creation and validation of the coupled Raman/near infrared fluorimeter and related image/data processing algorithms, as well as statistical analysis of readouts from plant sensors for this study were performed by SMART and MIT. The Temasek Life Sciences Laboratory was responsible for the design, execution, and analysis of plant-related studies, including validation of nanosensors in living plants.

This research was carried out by SMART and supported by the National Research Foundation of Singapore under its Campus for Research Excellence And Technological Enterprise (CREATE) program. The DiSTAP program, led by Strano and Singapore co-lead principal investigator Professor Chua Nam Hai, addresses deep problems in food production in Singapore and the world by developing a suite of impactful and novel analytical, genetic, and biomaterial technologies. The goal is to fundamentally change how plant biosynthetic pathways are discovered, monitored, engineered, and ultimately translated to meet the global demand for food and nutrients. Scientists from MIT, Temasek Life Sciences Laboratory, Nanyang Technological University (NTU) and NUS are collaboratively developing new tools for the continuous measurement of important plant metabolites and hormones for novel discovery, deeper understanding and control of plant biosynthetic pathways in ways not yet possible, especially in the context of green leafy vegetables; leveraging these new techniques to engineer plants with highly desirable properties for global food security, including high-yield density production and drought and pathogen resistance; and applying these technologies to improve urban farming.

SMART was established by MIT and the National Research Foundation of Singapore in 2007. SMART serves as an intellectual and innovation hub for research interactions between MIT and Singapore, undertaking cutting-edge research projects in areas of interest to both Singapore and MIT. SMART currently comprises an Innovation Center and five interdisciplinary research groups: Antimicrobial Resistance, Critical Analytics for Manufacturing Personalized-Medicine, DiSTAP, Future Urban Mobility, and Low Energy Electronic Systems.


    SMART Innovation Center awarded five-year NRF grant for new deep tech ventures

The Singapore-MIT Alliance for Research and Technology (SMART), MIT’s research enterprise in Singapore, has announced a five-year grant awarded to the SMART Innovation Center (SMART IC) by the National Research Foundation Singapore (NRF) as part of its Research, Innovation and Enterprise 2025 Plan. The SMART IC plays a key role in accelerating innovation and entrepreneurship in Singapore and will channel the grant toward refining and commercializing developments in the field of deep technologies through financial support and training.

    Singapore has recently expanded its innovation ecosystem to hone deep technologies to solve complex problems in areas of pivotal importance. While there has been increased support for deep tech here, with investments in deep tech startups surging from $324 million in 2020 to $861 million in 2021, startups of this nature tend to take a longer time to scale, get acquired, or get publicly listed due to increased time, labor, and capital needed. By providing researchers with financial and strategic support from the early stages of their research and development, the SMART IC hopes to accelerate this process and help bring new and disruptive technologies to the market.

    “SMART’s Innovation Center prides itself as being one of the key drivers of research and innovation, by identifying and nurturing emerging technologies and accelerating them towards commercialization,” says Howard Califano, director of SMART IC. “With the support of the NRF, we look forward to another five years of further growing the ecosystem by ensuring an environment where research — and research funds — are properly directed to what the market and society need. This is how we will be able to solve problems faster and more efficiently, and ensure that value is generated from scientific research.”

    Set up in 2009 by MIT and funded by the NRF, the SMART IC furthers SMART’s goals by nurturing promising and innovative technologies that faculty and research teams in Singapore are working on. Some emerging technologies include, but are not limited to, biotechnology, biomedical devices, information technology, new materials, nanotechnology, and energy innovations.

    Having trained over 300 postdocs since its inception, the SMART IC has supported the launch of 55 companies that have created over 3,300 jobs. Some of these companies were spearheaded by SMART’s interdisciplinary research groups, including biotech companies Theonys and Thrixen, autonomous vehicle software company nuTonomy, and integrated circuit company New Silicon. During the RIE 2020 period, 66 Ignition Grants and 69 Innovation Grants were awarded to SMART’s researchers, as well as faculty at other Singapore universities and research institutes.

    The following four programs are open to researchers from education and research facilities, as well as institutes of higher learning, in Singapore:

Innovation Grant 2.0: The SMART Innovation Center’s enhanced flagship program, the Innovation Grant 2.0, is a gated, three-phase program focused on enabling scientist-entrepreneurs to launch a successful venture, with training and intense monitoring across all phases. The grant provides up to S$800,000 and is open to all areas of deep technology (engineering, artificial intelligence, biomedical, new materials, etc.). The first grant call for the Innovation Grant 2.0 is open through Oct. 15. Researchers, scientists, and engineers at Singapore’s public institutions of higher learning, research centers, public hospitals, and medical research centers, especially those working on disruptive technologies with commercial potential, are invited to apply.

    I2START Grant: In collaboration with SMART, the National Health Innovation Center Singapore, and Enterprise Singapore, this novel integrated program will develop master classes on venture building, with a focus on medical devices, diagnostics, and medical technologies. The grant amount is up to S$1,350,000. Applications are accepted throughout the year.

STDR Stream 2: The Singapore Therapeutics Development Review (STDR) program is jointly operated by SMART, the Agency for Science, Technology and Research (A*STAR), and the Experimental Drug Development Center. The grant is awarded in two phases, a Pre-Pilot phase of S$100,000 and a Pilot phase of S$830,000, for a potential combined total of up to S$930,000. The next STDR Pre-Pilot grant call will open on Sept. 15.

Central Gap Fund: The SMART IC is an Innovation and Enterprise Office under the NRF’s Central Gap Fund. This program supports projects that have already received an Innovation Grant 2.0, STDR Stream 2, or I2START grant but require additional funding to bridge to seed or Series A funding, with possible funding of up to S$5 million. Applications are accepted throughout the year.

    The SMART IC will also continue developing robust entrepreneurship mentorship programs and regular industry events to encourage closer collaboration among faculty innovators and the business community.

    “SMART, through the Innovation Center, is honored to be able to help researchers take these revolutionary technologies to the marketplace, where they can contribute to the economy and society. The projects we fund are commercialized in Singapore, ensuring that the local economy is the first to benefit,” says Eugene Fitzgerald, chief executive officer and director of SMART, and professor of materials science and engineering at MIT.

    SMART was established by MIT and the NRF in 2007 and serves as an intellectual and innovation hub for cutting-edge research of interest to both parties. SMART is the first entity in the Campus for Research Excellence and Technological Enterprise. SMART currently comprises an Innovation Center and five Interdisciplinary Research Groups: Antimicrobial Resistance, Critical Analytics for Manufacturing Personalized-Medicine, Disruptive and Sustainable Technologies for Agricultural Precision, Future Urban Mobility, and Low Energy Electronic Systems.

The SMART IC was set up by MIT and the NRF in 2009. It identifies and nurtures a broad range of emerging technologies including but not limited to biotechnology, biomedical devices, information technology, new materials, nanotechnology, and energy innovations, and accelerates them toward commercialization. The SMART IC runs a rigorous grant system that identifies and funds promising projects to help them de-risk their technologies, conduct proof-of-concept experiments, and determine go-to-market strategies. It also prides itself on robust entrepreneurship boot camps and mentorship, and frequent industry events to encourage closer collaboration among faculty innovators and the business community. SMART’s Innovation Grant program is the only scheme that is open to all institutes of higher learning and research institutes across Singapore.


    Engineers enlist AI to help scale up advanced solar cell manufacturing

Perovskites are a family of materials that is currently the leading contender to replace today’s silicon-based solar photovoltaics. They hold the promise of panels that are far thinner and lighter, that could be made with ultra-high throughput at room temperature instead of at hundreds of degrees, and that are cheaper and easier to transport and install. But bringing these materials from controlled laboratory experiments into a product that can be manufactured competitively has been a long struggle.

Manufacturing perovskite-based solar cells involves simultaneously optimizing a dozen or more variables, even within one particular manufacturing approach among many possibilities. But a new system based on a novel approach to machine learning could speed up the development of optimized production methods and help make the next generation of solar power a reality.

    The system, developed by researchers at MIT and Stanford University over the last few years, makes it possible to integrate data from prior experiments, and information based on personal observations by experienced workers, into the machine learning process. This makes the outcomes more accurate and has already led to the manufacturing of perovskite cells with an energy conversion efficiency of 18.5 percent, a competitive level for today’s market.

    The research is reported today in the journal Joule, in a paper by MIT professor of mechanical engineering Tonio Buonassisi, Stanford professor of materials science and engineering Reinhold Dauskardt, recent MIT research assistant Zhe Liu, Stanford doctoral graduate Nicholas Rolston, and three others.

    Perovskites are a group of layered crystalline compounds defined by the configuration of the atoms in their crystal lattice. There are thousands of such possible compounds and many different ways of making them. While most lab-scale development of perovskite materials uses a spin-coating technique, that’s not practical for larger-scale manufacturing, so companies and labs around the world have been searching for ways of translating these lab materials into a practical, manufacturable product.

    “There’s always a big challenge when you’re trying to take a lab-scale process and then transfer it to something like a startup or a manufacturing line,” says Rolston, who is now an assistant professor at Arizona State University. The team looked at a process that they felt had the greatest potential, a method called rapid spray plasma processing, or RSPP.

    The manufacturing process would involve a moving roll-to-roll surface, or series of sheets, on which the precursor solutions for the perovskite compound would be sprayed or ink-jetted as the sheet rolled by. The material would then move on to a curing stage, providing a rapid and continuous output “with throughputs that are higher than for any other photovoltaic technology,” Rolston says.

    “The real breakthrough with this platform is that it would allow us to scale in a way that no other material has allowed us to do,” he adds. “Even materials like silicon require a much longer timeframe because of the processing that’s done. Whereas you can think of [this approach as more] like spray painting.”

    Within that process, at least a dozen variables may affect the outcome, some of them more controllable than others. These include the composition of the starting materials, the temperature, the humidity, the speed of the processing path, the distance of the nozzle used to spray the material onto a substrate, and the methods of curing the material. Many of these factors can interact with each other, and if the process is in open air, then humidity, for example, may be uncontrolled. Evaluating all possible combinations of these variables through experimentation is impossible, so machine learning was needed to help guide the experimental process.

But while most machine-learning systems use raw data such as measurements of the electrical and other properties of test samples, they don’t typically incorporate human experience, such as the experimenters’ qualitative observations of the visual and other properties of the test samples, or information from other experiments reported by other researchers. So the team found a way to incorporate such outside information into the machine-learning model, using a probability factor based on a mathematical technique called Bayesian optimization.
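The mechanics of seeding a Bayesian optimization loop with prior observations can be sketched in a few dozen lines. This is a rough illustration, not the team's code (their actual code is open-sourced on GitHub): the single process variable, the quadratic "efficiency" response, and all numbers below are invented, and a real run would optimize a dozen variables at once.

```python
import numpy as np
from math import erf, sqrt, pi

def rbf(a, b, length=30.0):
    """Squared-exponential kernel over a 1-D process variable."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / length) ** 2)

def gp_posterior(x_obs, y_obs, x_query, noise=1e-6):
    """Gaussian-process posterior mean and standard deviation at query points."""
    K = rbf(x_obs, x_obs) + noise * np.eye(len(x_obs))
    Ks = rbf(x_obs, x_query)
    K_inv = np.linalg.inv(K)
    mu = Ks.T @ K_inv @ y_obs
    var = 1.0 - np.einsum("ij,ik,kj->j", Ks, K_inv, Ks)  # diag(Ks.T K^-1 Ks)
    return mu, np.sqrt(np.maximum(var, 1e-12))

_norm_cdf = np.vectorize(lambda z: 0.5 * (1.0 + erf(z / sqrt(2.0))))

def expected_improvement(mu, sigma, best):
    """EI acquisition: expected gain over the best observation so far."""
    z = (mu - best) / sigma
    pdf = np.exp(-0.5 * z ** 2) / sqrt(2.0 * pi)
    return (mu - best) * _norm_cdf(z) + sigma * pdf

# Hypothetical process response: conversion efficiency (%) vs. one knob
# (say, a curing temperature in deg C); the peak location is unknown to the loop.
def efficiency(temp):
    return 18.0 - 0.002 * (temp - 120.0) ** 2

candidates = np.linspace(60.0, 180.0, 121)

# Prior knowledge enters as seed observations from earlier experiments.
x = np.array([70.0, 160.0])
y = efficiency(x)

for _ in range(10):
    yc = y - y.mean()                     # zero-mean GP on centered data
    mu, sigma = gp_posterior(x, yc, candidates)
    x_next = candidates[np.argmax(expected_improvement(mu, sigma, yc.max()))]
    x = np.append(x, x_next)
    y = np.append(y, efficiency(x_next))

print(f"suggested optimum: {x[np.argmax(y)]:.0f} deg C, efficiency {y.max():.2f} %")
```

Each round, the surrogate model is refit to everything observed so far, and expected improvement trades off exploring uncertain regions against exploiting promising ones, which is how relatively few physical experiments can home in on good settings.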

    Using the system, he says, “having a model that comes from experimental data, we can find out trends that we weren’t able to see before.” For example, they initially had trouble adjusting for uncontrolled variations in humidity in their ambient setting. But the model showed them “that we could overcome our humidity challenges by changing the temperature, for instance, and by changing some of the other knobs.”

    The system now allows experimenters to much more rapidly guide their process in order to optimize it for a given set of conditions or required outcomes. In their experiments, the team focused on optimizing the power output, but the system could also be used to simultaneously incorporate other criteria, such as cost and durability — something members of the team are continuing to work on, Buonassisi says.

    The researchers were encouraged by the Department of Energy, which sponsored the work, to commercialize the technology, and they’re currently focusing on tech transfer to existing perovskite manufacturers. “We are reaching out to companies now,” Buonassisi says, and the code they developed has been made freely available through an open-source server. “It’s now on GitHub, anyone can download it, anyone can run it,” he says. “We’re happy to help companies get started in using our code.”

Already, several companies are gearing up to produce perovskite-based solar panels, even though they are still working out the details of how to produce them, says Liu, who is now at Northwestern Polytechnical University in Xi’an, China. He says companies there are not yet doing large-scale manufacturing, but instead starting with smaller, high-value applications such as building-integrated solar tiles where appearance is important. Three of these companies “are on track or are being pushed by investors to manufacture 1 meter by 2-meter rectangular modules [comparable to today’s most common solar panels], within two years,” he says.

“The problem is, they don’t have a consensus on what manufacturing technology to use,” Liu says. The RSPP method, developed at Stanford, “still has a good chance” to be competitive, he says. And the machine learning system the team developed could prove to be important in guiding the optimization of whatever process ends up being used.

    “The primary goal was to accelerate the process, so it required less time, less experiments, and less human hours to develop something that is usable right away, for free, for industry,” he says.

    “Existing work on machine-learning-driven perovskite PV fabrication largely focuses on spin-coating, a lab-scale technique,” says Ted Sargent, University Professor at the University of Toronto, who was not associated with this work, which he says demonstrates “a workflow that is readily adapted to the deposition techniques that dominate the thin-film industry. Only a handful of groups have the simultaneous expertise in engineering and computation to drive such advances.” Sargent adds that this approach “could be an exciting advance for the manufacture of a broader family of materials” including LEDs, other PV technologies, and graphene, “in short, any industry that uses some form of vapor or vacuum deposition.” 

The team also included Austin Flick and Thomas Colburn at Stanford and Zekun Ren at the Singapore-MIT Alliance for Research and Technology (SMART). In addition to the Department of Energy, the work was supported by a fellowship from the MIT Energy Initiative, the Graduate Research Fellowship Program from the National Science Foundation, and the SMART program.


    SMART researchers develop method for early detection of bacterial infection in crops

Researchers from the Disruptive and Sustainable Technologies for Agricultural Precision (DiSTAP) Interdisciplinary Research Group (IRG) of the Singapore-MIT Alliance for Research and Technology (SMART), MIT’s research enterprise in Singapore, and their local collaborators from Temasek Life Sciences Laboratory (TLL) have developed a rapid Raman spectroscopy-based method for detecting and quantifying early bacterial infection in crops. The Raman spectral biomarkers and diagnostic algorithm enable noninvasive, early diagnosis of bacterial infections in crop plants, which can be critical for plant disease management and agricultural productivity.

    Due to the increasing demand for global food supply and security, there is a growing need to improve agricultural production systems and increase crop productivity. Globally, bacterial pathogen infection in crop plants is one of the major contributors to agricultural yield losses. Climate change also adds to the problem by accelerating the spread of plant diseases. Hence, developing methods for rapid and early detection of pathogen-infected crops is important to improve plant disease management and reduce crop loss.

    The breakthrough by SMART and TLL researchers offers a faster and more accurate method to detect bacterial infection in crop plants at an earlier stage, as compared to existing techniques. The new results appear in a paper titled “Rapid detection and quantification of plant innate immunity response using Raman spectroscopy” published in the journal Frontiers in Plant Science.

    “The early detection of pathogen-infected crop plants is a significant step to improve plant disease management,” says Chua Nam Hai, DiSTAP co-lead principal investigator, professor, TLL deputy chair, and co-corresponding author. “It will allow the fast and selective removal of pathogen load and curb the further spread of disease to other neighboring crops.”

    Traditionally, plant disease diagnosis involves a simple visual inspection of plants for disease symptoms and severity. “Visual inspection methods are often ineffective, as disease symptoms usually manifest only at relatively later stages of infection, when the pathogen load is already high and reparative measures are limited. Hence, new methods are required for rapid and early detection of bacterial infection. The idea would be akin to having medical tests to identify human diseases at an early stage, instead of waiting for visual symptoms to show, so that early intervention or treatment can be applied,” says MIT Professor Rajeev Ram, who is a DiSTAP principal investigator and co-corresponding author on the paper.

    While existing techniques, such as current molecular detection methods, can detect bacterial infection in plants, they are often limited in their use. Molecular detection methods largely depend on the availability of pathogen-specific gene sequences or antibodies to identify bacterial infection in crops; the implementation is also time-consuming and nonadaptable for on-site field application due to the high cost and bulky equipment required, making it impractical for use in agricultural farms.

    “At DiSTAP, we have developed a quantitative Raman spectroscopy-based algorithm that can help farmers to identify bacterial infection rapidly. The developed diagnostic algorithm makes use of Raman spectral biomarkers and can be easily implemented in cloud-based computing and prediction platforms. It is more effective than existing techniques as it enables accurate identification and early detection of bacterial infection, both of which are crucial to saving crop plants that would otherwise be destroyed,” explains Gajendra Pratap Singh, scientific director and principal investigator at DiSTAP and co-lead author.
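The general shape of such a biomarker-based decision rule can be sketched as follows. This is a generic illustration, not the published algorithm: the band positions, the normalization band, and the threshold are hypothetical stand-ins for the validated Raman spectral biomarkers and calibration.

```python
import numpy as np

# Hypothetical Raman shifts (cm^-1) of stress-associated biomarker bands and a
# stress-insensitive normalization band; the real algorithm uses its own
# validated biomarkers and weights.
BIOMARKER_BANDS = [1004, 1156, 1525]
REFERENCE_BAND = 1440

def band_intensity(shifts, spectrum, center, half_width=5):
    """Mean intensity inside a window around a Raman band."""
    mask = np.abs(shifts - center) <= half_width
    return spectrum[mask].mean()

def infection_score(shifts, spectrum):
    """Sum of biomarker band intensities, normalized by the reference band."""
    ref = band_intensity(shifts, spectrum, REFERENCE_BAND)
    return sum(band_intensity(shifts, spectrum, b) for b in BIOMARKER_BANDS) / ref

def is_infected(shifts, spectrum, threshold=2.5):
    """Yes/no readout: True when the score exceeds a calibrated threshold."""
    return infection_score(shifts, spectrum) > threshold

# Synthetic spectra for demonstration: Gaussian bands on a flat baseline.
shifts = np.arange(800.0, 1801.0, 1.0)

def add_band(spec, center, height, sigma=4.0):
    return spec + height * np.exp(-0.5 * ((shifts - center) / sigma) ** 2)

healthy = add_band(np.ones_like(shifts), REFERENCE_BAND, 2.0)
infected = add_band(np.ones_like(shifts), REFERENCE_BAND, 2.0)
for b in BIOMARKER_BANDS:
    healthy = add_band(healthy, b, 0.5)    # weak biomarker response
    infected = add_band(infected, b, 4.0)  # elevated immune-response bands

print(is_infected(shifts, healthy), is_infected(shifts, infected))
```

A rule of this form reduces to a handful of arithmetic operations per spectrum, which is why it lends itself to cloud-based prediction platforms and a simple yes-or-no readout in the field.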

    A portable Raman system can be used on farms and provides farmers with an accurate and simple yes-or-no response when used to test for the presence of bacterial infections in crops. The development of this rapid and noninvasive method could improve plant disease management and have a transformative impact on agricultural farms by efficiently reducing agricultural yield loss and increasing productivity.

    “Using the diagnostic algorithm method, we experimented on several edible plants such as choy sum,” says DiSTAP and TLL principal investigator and co-corresponding author Rajani Sarojam. “The results showed that the Raman spectroscopy-based method can swiftly detect and quantify innate immunity response in plants infected with bacterial pathogens. We believe that this technology will be beneficial for agricultural farms to increase their productivity by reducing their yield loss due to plant diseases.”

    The researchers are currently working on the development of high-throughput, custom-made portable or hand-held Raman spectrometers that will allow Raman spectral analysis to be quickly and easily performed on field-grown crops.

SMART and TLL discovered the Raman spectral biomarkers and developed the diagnostic algorithm. TLL also confirmed and validated the detection method using mutant plants. The research is carried out by SMART and supported by the National Research Foundation of Singapore under its Campus for Research Excellence And Technological Enterprise (CREATE) program.

    SMART was established by MIT and the NRF in 2007. The first entity in CREATE developed by NRF, SMART serves as an intellectual and innovation hub for research interactions between MIT and Singapore, undertaking cutting-edge research projects in areas of interest to both Singapore and MIT. SMART currently comprises an Innovation Center and five IRGs: Antimicrobial Resistance, Critical Analytics for Manufacturing Personalized-Medicine, DiSTAP, Future Urban Mobility, and Low Energy Electronic Systems. SMART research is funded by the NRF under the CREATE program.

Led by Professor Michael Strano of MIT and Professor Chua Nam Hai of Temasek Life Sciences Laboratory, the DiSTAP program addresses deep problems in food production in Singapore and the world by developing a suite of impactful and novel analytical, genetic, and biomaterial technologies. The goal is to fundamentally change how plant biosynthetic pathways are discovered, monitored, engineered, and ultimately translated to meet the global demand for food and nutrients. Scientists from MIT, TLL, Nanyang Technological University, and National University of Singapore are collaboratively developing new tools for the continuous measurement of important plant metabolites and hormones for novel discovery, deeper understanding and control of plant biosynthetic pathways in ways not yet possible, especially in the context of green leafy vegetables; leveraging these new techniques to engineer plants with highly desirable properties for global food security, including high-yield density production and drought and pathogen resistance; and applying these technologies to improve urban farming.