More stories

  • Kerry Emanuel: A climate scientist and meteorologist in the eye of the storm

    Kerry Emanuel once joked that whenever he retired, he would start a “hurricane safari” so other people could experience what it’s like to fly into the eye of a hurricane.

    “All of a sudden, the turbulence stops, the sun comes out, bright sunshine, and it’s amazingly calm. And you’re in this grand stadium [of clouds miles high],” he says. “It’s quite an experience.”

    While the hurricane safari is unlikely to come to fruition — “You can’t just conjure up a hurricane,” he explains — Emanuel, a world-leading expert on links between hurricanes and climate change, is retiring from teaching in the Department of Earth, Atmospheric and Planetary Sciences (EAPS) at MIT after a more than 40-year career.

    Best known for his foundational contributions to the science of tropical cyclones, climate, and links between them, Emanuel has also been a prominent voice in public debates on climate change, and what we should do about it.

    “Kerry has had an enormous effect on the world through the students and junior scientists he has trained,” says William Boos PhD ’08, an atmospheric scientist at the University of California at Berkeley. “He’s a brilliant enough scientist and theoretician that he didn’t need any of us to accomplish what he has, but he genuinely cares about educating new generations of scientists and helping to launch their careers.”

    In recognition of Emanuel’s teaching career and contributions to science, a symposium was held in his honor at MIT on June 21 and 22, organized by several of his former students and collaborators, including Boos. Research presented at the symposium focused on the many fields influenced by Emanuel’s more than 200 published research papers — on everything from forecasting the risks posed by tropical cyclones to understanding how rainfall is produced by continent-sized patterns of atmospheric circulation.

    Emanuel’s career observing perturbations of Earth’s atmosphere started earlier than he can remember. “According to my older brother, from the age of 2, I would crawl to the window whenever there was a thunderstorm,” he says. At first, those were the rolling thunderheads of the Midwest where he grew up, then it was the edges of hurricanes during a few teenage years in Florida. Eventually, he would find himself watching from the very eye of the storm, both physically and mathematically.

    Emanuel attended MIT both as an undergraduate studying Earth and planetary sciences, and for his PhD in meteorology, writing a dissertation on thunderstorms that form ahead of cold fronts. Within the department, he worked with some of the central figures of modern meteorology such as Jule Charney, Fred Sanders, and Edward Lorenz — the founder of chaos theory.

    After receiving his PhD in 1978, Emanuel joined the faculty of the University of California at Los Angeles. During this period, he also took a semester sabbatical to film the wind speeds of tornadoes in Texas and Oklahoma. After three years, he returned to MIT and joined the Department of Meteorology in 1981. Two years later, the department merged with Earth and Planetary Sciences to form EAPS as it is known today, where Emanuel has remained ever since.

    At MIT, he shifted scales. The thunderstorms and tornadoes that had been the focus of Emanuel’s research up to then were local atmospheric phenomena, or “mesoscale” in the language of meteorologists. The larger “synoptic scale” storms that are hurricanes blew into Emanuel’s research when, as a young faculty member, he was asked to teach a class in tropical meteorology; in prepping for the class, Emanuel found his notes on hurricanes from graduate school no longer made sense.

    “I realized I didn’t understand them because they couldn’t have been correct,” he says. “And so I set out to try to find a much better theoretical formulation for hurricanes.”

    He soon made two important contributions. In 1986, his paper “An Air-Sea Interaction Theory for Tropical Cyclones. Part I: Steady-State Maintenance” developed a new theory for upper limits of hurricane intensity given atmospheric conditions. This work in turn led to even larger-scale questions to address. “That upper bound had to be dependent on climate, and it was likely to go up if we were to warm the climate,” Emanuel says — a phenomenon he explored in another paper, “The Dependence of Hurricane Intensity on Climate,” which showed how warming sea surface temperatures and changing atmospheric conditions from a warming climate would make hurricanes more destructive.

    “In my view, this is among the most remarkable achievements in theoretical geophysics,” says Adam Sobel PhD ’98, an atmospheric scientist at Columbia University who got to know Emanuel after he graduated and became interested in tropical meteorology. “From first principles, using only pencil-and-paper analysis and physical reasoning, he derives a quantitative bound on hurricane intensity that has held up well over decades of comparison to observations” and underpins current methods of predicting hurricane intensity and how it changes with climate.
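
    For readers curious about the mathematics, one commonly cited closed form of this potential intensity bound (paraphrased here; notation varies across the literature and the formula is not quoted in the article) is

        V_{\max}^{2} \;=\; \frac{C_k}{C_D}\,\frac{T_s - T_o}{T_o}\,\left(k_0^{*} - k\right)

    where C_k and C_D are the surface exchange coefficients for enthalpy and momentum, T_s and T_o are the sea surface and outflow temperatures, and k_0^* - k is the enthalpy disequilibrium between the ocean surface and the overlying air. Warmer oceans and colder outflow raise the bound, which is the link to climate that Emanuel went on to explore.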

    This and diverse subsequent work led to numerous honors, including election to the American Philosophical Society, the National Academy of Sciences, and the American Academy of Arts and Sciences.

    Emanuel’s research was never confined to academic circles, however; when politicians and industry leaders voiced loud opposition to the idea that human-caused climate change posed a threat, he spoke up.

    “I felt kind of a duty to try to counter that,” says Emanuel. “I thought it was an interesting challenge to see if you could go out and convince what some people call climate deniers, skeptics, that this was a serious risk and we had to treat it as such.”

    In addition to many public lectures and media appearances discussing climate change, Emanuel penned a book for general audiences titled “What We Know About Climate Change,” as well as a widely read primer on climate change and risk assessment designed to influence business leaders.

    “Kerry has an unmatched physical understanding of tropical climate phenomena,” says Emanuel’s colleague, Susan Solomon, the Lee and Geraldine Martin Professor of Environmental Studies at EAPS. “But he’s also a great communicator and has generously given his time to public outreach. His book ‘What We Know About Climate Change’ is a beautiful piece of work that is readily understandable and has captivated many a non-expert reader.”

    Along with a number of other prominent climate scientists, Emanuel also began advocating for expanding nuclear power as the most rapid path to decarbonizing the world’s energy systems.

    “I think the impediment to nuclear is largely irrational in the United States,” he says. “So, I’ve been trying to fight that just like I’ve been trying to fight climate denial.”

    One lesson Emanuel has taken from his public work on climate change is that skeptical audiences often respond better to issues framed in positive terms than to doom and gloom; he’s found that emphasizing the potential benefits of the energy transition, rather than the sacrifices it involves, can engage otherwise wary audiences.

    “It’s really not opposition to science, per se,” he says. “It’s fear of the societal changes they think are required to do something about it.”

    He has also worked to raise awareness about how insurance companies significantly underestimate climate risks in their policies, in particular by basing hurricane risk on unreliable historical data. One recent practical result has been a project by the First Street Foundation to assess the true flood risk of every property in the United States using hurricane models Emanuel developed.

    “I think it’s transformative,” Emanuel says of the project with First Street. “That may prove to be the most substantive research I’ve done.”
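
    As a purely illustrative sketch (not Emanuel’s models or First Street’s methodology; every number and function below is invented for the example), risk assessments built on synthetic storms work roughly like this: generate a large catalog of physically plausible storms, score the hazard each one produces at a site, and estimate exceedance probabilities from the catalog rather than from the short historical record.

        # Illustrative only: estimate the annual probability that hurricane-driven
        # flood depth at a site exceeds a threshold, using a synthetic storm catalog.
        # The "hazard model" here is a toy placeholder, not a physical model.
        import numpy as np

        rng = np.random.default_rng(0)

        N_YEARS = 100_000        # simulated years
        STORM_RATE = 0.3         # hypothetical mean landfalling storms per year at the site
        THRESHOLD_M = 1.5        # flood depth of interest (meters)

        def toy_flood_depth_m(n_storms, rng):
            """Toy stand-in for a physics-based hazard model: flood depth per storm."""
            intensity = rng.weibull(1.5, size=n_storms)       # arbitrary storm "intensity"
            return np.maximum(0.0, 2.0 * intensity - 1.0)     # arbitrary depth mapping

        storms_per_year = rng.poisson(STORM_RATE, size=N_YEARS)
        exceedance_years = sum(
            1 for n in storms_per_year
            if n and (toy_flood_depth_m(n, rng) > THRESHOLD_M).any()
        )
        print(f"Estimated annual exceedance probability: {exceedance_years / N_YEARS:.4f}")

    The appeal of this style of analysis, as described above, is that the synthetic catalog can be regenerated under warmer-climate conditions, which a short and unreliable historical record cannot provide.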

    Though Emanuel is retiring from teaching, he has no plans to stop working. “When I say ‘retire’ it’s in quotes,” he says. In 2011, Emanuel and Professor of Geophysics Daniel Rothman founded the Lorenz Center, a climate research center at MIT in honor of Emanuel’s mentor and friend Edward Lorenz. Emanuel will continue to participate in work at the center, which aims to counter what Emanuel describes as a trend away from “curiosity-driven” work in climate science.

    “Even if there were no such thing as global warming, [climate science] would still be a really, really exciting field,” says Emanuel. “There’s so much to understand about climate, about the climates of the past, about the climates of other planets.”

    In addition to work with the Lorenz Center, he’s become interested once again in tornadoes and severe local storms, and understanding whether climate also controls such local phenomena. He’s also involved in two of MIT’s Climate Grand Challenges projects focused on translating climate hazards to explicit financial and health risks — what will bring the dangers of climate change home to people, he says, is for the public to understand more concrete risks, like agricultural failure, water shortages, electricity shortages, and severe weather events. Capturing that will drive the next few years of his work.

    “I’m going to be stepping up research in some respects,” he says, now living full-time at his home in Maine.

    Of course, “retiring” does mean a bit more free time for new pursuits, like learning a language or an instrument, and “rediscovering the art of sailing,” says Emanuel. He’s looking forward to those days on the water, whatever storms are to come.

  • Tapping into the million-year energy source below our feet

    There’s an abandoned coal power plant in upstate New York that most people regard as a useless relic. But MIT’s Paul Woskov sees things differently.

    Woskov, a research engineer in MIT’s Plasma Science and Fusion Center, notes the plant’s power turbine is still intact and the transmission lines still run to the grid. Using an approach he’s been working on for the last 14 years, he’s hoping it will be back online, completely carbon-free, within the decade.

    In fact, Quaise Energy, the company commercializing Woskov’s work, believes if it can retrofit one power plant, the same process will work on virtually every coal and gas power plant in the world.

    Quaise is hoping to accomplish those lofty goals by tapping into the energy source below our feet. The company plans to vaporize enough rock to create the world’s deepest holes and harvest geothermal energy at a scale that could satisfy human energy consumption for millions of years. It hasn’t yet solved all the related engineering challenges, but Quaise’s founders have set an ambitious timeline to begin harvesting energy from a pilot well by 2026.

    The plan would be easier to dismiss as unrealistic if it were based on a new and unproven technology. But Quaise’s drilling systems center around a microwave-emitting device called a gyrotron that has been used in research and manufacturing for decades.

    “This will happen quickly once we solve the immediate engineering problems of transmitting a clean beam and having it operate at a high energy density without breakdown,” explains Woskov, who is not formally affiliated with Quaise but serves as an advisor. “It’ll go fast because the underlying technology, gyrotrons, are commercially available. You could place an order with a company and have a system delivered right now — granted, these beam sources have never been used 24/7, but they are engineered to be operational for long time periods. In five or six years, I think we’ll have a plant running if we solve these engineering problems. I’m very optimistic.”

    Woskov and many other researchers have been using gyrotrons to heat material in nuclear fusion experiments for decades. It wasn’t until 2008, however, after the MIT Energy Initiative (MITEI) published a request for proposals on new geothermal drilling technologies, that Woskov thought of using gyrotrons for a new application.

    “[Gyrotrons] haven’t been well-publicized in the general science community, but those of us in fusion research understood they were very powerful beam sources — like lasers, but in a different frequency range,” Woskov says. “I thought, why not direct these high-power beams, instead of into fusion plasma, down into rock and vaporize the hole?”

    As power from other renewable energy sources has exploded in recent decades, geothermal energy has plateaued, mainly because geothermal plants only exist in places where natural conditions allow for energy extraction at relatively shallow depths of up to 400 feet beneath the Earth’s surface. At a certain point, conventional drilling becomes impractical because deeper crust is both hotter and harder, which wears down mechanical drill bits.

    Woskov’s idea to use gyrotron beams to vaporize rock sent him on a research journey that has never really stopped. With some funding from MITEI, he began running tests, quickly filling his office with small rock formations he’d blasted with millimeter waves from a small gyrotron in MIT’s Plasma Science and Fusion Center.

    Woskov displaying samples in his lab in 2016.

    Photo: Paul Rivenberg


    Around 2018, Woskov’s rocks got the attention of Carlos Araque ’01, SM ’02, who had spent his career in the oil and gas industry and was the technical director of MIT’s investment fund The Engine at the time.

    That year, Araque and Matt Houde, who’d been working with geothermal company AltaRock Energy, founded Quaise. Quaise was soon given a grant by the Department of Energy to scale up Woskov’s experiments using a larger gyrotron.

    With the larger machine, the team hopes to vaporize a hole 10 times the depth of Woskov’s lab experiments. That is expected to be accomplished by the end of this year. After that, the team will vaporize a hole 10 times the depth of the previous one — what Houde calls a 100-to-1 hole.

    “That’s something [the DOE] is particularly interested in, because they want to address the challenges posed by material removal over those greater lengths — in other words, can we show we’re fully flushing out the rock vapors?” Houde explains. “We believe the 100-to-1 test also gives us the confidence to go out and mobilize a prototype gyrotron drilling rig in the field for the first field demonstrations.”

    Tests on the 100-to-1 hole are expected to be completed sometime next year. Quaise is also hoping to begin vaporizing rock in field tests late next year. The short timeline reflects the progress Woskov has already made in his lab.

    Although more engineering research is needed, ultimately, the team expects to be able to drill and operate these geothermal wells safely. “We believe, because of Paul’s work at MIT over the past decade, that most if not all of the core physics questions have been answered and addressed,” Houde says. “It’s really engineering challenges we have to answer, which doesn’t mean they’re easy to solve, but we’re not working against the laws of physics, to which there is no answer. It’s more a matter of overcoming some of the more technical and cost considerations to making this work at a large scale.”

    The company plans to begin harvesting energy from pilot geothermal wells that reach rock temperatures of up to 500 C by 2026. From there, the team hopes to begin repurposing coal and natural gas plants using its system.

    “We believe, if we can drill down to 20 kilometers, we can access these super-hot temperatures in greater than 90 percent of locations across the globe,” Houde says.
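
    A rough back-of-the-envelope check (my own illustration using a typical continental geothermal gradient, not a figure from Quaise) shows why depths of around 20 kilometers correspond to the roughly 500 C rock temperatures mentioned above:

        # Back-of-the-envelope: rock temperature at depth, assuming a typical
        # continental geothermal gradient. Values are illustrative, not Quaise's data.
        SURFACE_TEMP_C = 15.0        # assumed mean surface temperature
        GRADIENT_C_PER_KM = 25.0     # typical continental gradient, roughly 25-30 C/km
        DEPTH_KM = 20.0

        temp_at_depth = SURFACE_TEMP_C + GRADIENT_C_PER_KM * DEPTH_KM
        print(f"Estimated rock temperature at {DEPTH_KM:.0f} km: ~{temp_at_depth:.0f} C")
        # -> ~515 C with these assumptions, consistent with the ~500 C pilot-well target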

    Quaise’s work with the DOE is addressing what it sees as the biggest remaining questions about drilling holes of unprecedented depth and pressure, such as material removal and determining the best casing to keep the hole stable and open. For the latter problem of well stability, Houde believes additional computer modeling is needed and expects to complete that modeling by the end of 2024.

    By drilling the holes at existing power plants, Quaise will be able to move faster than if it had to get permits to build new plants and transmission lines. And by making its millimeter-wave drilling equipment compatible with the existing global fleet of drilling rigs, the company will also be able to tap into the oil and gas industry’s global workforce.

    “At these high temperatures [we’re accessing], we’re producing steam very close to, if not exceeding, the temperature that today’s coal and gas-fired power plants operate at,” Houde says. “So, we can go to existing power plants and say, ‘We can replace 95 to 100 percent of your coal use by developing a geothermal field and producing steam from the Earth, at the same temperature you’re burning coal to run your turbine, directly replacing carbon emissions.’”

    Transforming the world’s energy systems in such a short timeframe is something the founders see as critical to help avoid the most catastrophic global warming scenarios.

    “There have been tremendous gains in renewables over the last decade, but the big picture today is we’re not going nearly fast enough to hit the milestones we need for limiting the worst impacts of climate change,” Houde says. “[Deep geothermal] is a power resource that can scale anywhere and has the ability to tap into a large workforce in the energy industry to readily repackage their skills for a totally carbon free energy source.”

  • Making hydrogen power a reality

    For decades, government and industry have looked to hydrogen as a potentially game-changing tool in the quest for clean energy. As far back as the early days of the Clinton administration, energy sector observers and public policy experts have extolled the virtues of hydrogen — to the point that some people have joked that hydrogen is the energy of the future, “and always will be.”

    Even as wind and solar power have become commonplace in recent years, hydrogen has been held back by high costs and other challenges. But the fuel may finally be poised to have its moment. At the MIT Energy Initiative Spring Symposium — entitled “Hydrogen’s role in a decarbonized energy system” — experts discussed hydrogen production routes, hydrogen consumption markets, the path to a robust hydrogen infrastructure, and policy changes needed to achieve a “hydrogen future.”

    During one panel, “Options for producing low-carbon hydrogen at scale,” four experts laid out existing and planned efforts to leverage hydrogen for decarbonization. 

    “The race is on”

    Huyen N. Dinh, a senior scientist and group manager at the National Renewable Energy Laboratory (NREL), is the director of HydroGEN, a consortium of several U.S. Department of Energy (DOE) national laboratories that accelerates research and development of innovative and advanced water splitting materials and technologies for clean, sustainable, and low-cost hydrogen production.

    For the past 14 years, Dinh has worked on fuel cells and hydrogen production for NREL. “We think that the 2020s is the decade of hydrogen,” she said. Dinh believes that the energy carrier is poised to come into its own over the next few years, pointing to several domestic and international activities surrounding the fuel and citing a Hydrogen Council report that projected the future impacts of hydrogen — including 30 million jobs and $2.5 trillion in global revenue by 2050.

    “Now is the time for hydrogen, and the global race is on,” she said.

    Dinh also explained the parameters of the Hydrogen Shot — the first of the DOE’s “Energy Earthshots” aimed at accelerating breakthroughs for affordable and reliable clean energy solutions. Hydrogen fuel currently costs around $5 per kilogram to produce, and the Hydrogen Shot’s stated goal is to bring that down by 80 percent to $1 per kilogram within a decade.

    The Hydrogen Shot will be facilitated by $9.5 billion in funding for at least four clean hydrogen hubs located in different parts of the United States, as well as extensive research and development, manufacturing, and recycling from last year’s bipartisan infrastructure law. Still, Dinh noted that it took more than 40 years for solar and wind power to become cost competitive, and now industry, government, national lab, and academic leaders are hoping to achieve similar reductions in hydrogen fuel costs over a much shorter time frame. In the near term, she said, stakeholders will need to improve the efficiency, durability, and affordability of hydrogen production through electrolysis (using electricity to split water) using today’s renewable and nuclear power sources. Over the long term, the focus may shift to splitting water more directly through heat or solar energy, she said.

    “The time frame is short, the competition is intense, and a coordinated effort is critical for domestic competitiveness,” Dinh said.

    Hydrogen across continents

    Wambui Mutoru, principal engineer for international commercial development, exploration, and production international at the Norwegian global energy company Equinor, said that hydrogen is an important component in the company’s ambitions to be carbon-neutral by 2050. The company, in collaboration with partners, has several hydrogen projects in the works, and Mutoru laid out the company’s Hydrogen to Humber project in Northern England. Currently, the Humber region emits more carbon dioxide than any other industrial cluster in the United Kingdom — 50 percent more, in fact, than the next-largest carbon emitter.   

    “The ambition here is for us to deploy the world’s first at-scale hydrogen value chain to decarbonize the Humber industrial cluster,” Mutoru said.

    The project consists of three components: a clean hydrogen production facility, an onshore hydrogen and carbon dioxide transmission network, and offshore carbon dioxide transportation and storage operations. Mutoru highlighted the importance of carbon capture and storage in hydrogen production. Equinor, she said, has captured and sequestered carbon offshore for more than 25 years, storing more than 25 million tons of carbon dioxide during that time.

    Mutoru also touched on Equinor’s efforts to build a decarbonized energy hub in the Appalachian region of the United States, covering territory in Ohio, West Virginia, and Pennsylvania. By 2040, she said, the company’s ambition is to produce about 1.5 million tons of clean hydrogen per year in the region — roughly equivalent to 6.8 gigawatts of electricity — while also storing 30 million tons of carbon dioxide.
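
    That equivalence can be sanity-checked from hydrogen’s energy content (my own arithmetic; the article does not say which heating value Equinor used, and the higher heating value gives the closest match to 6.8 gigawatts):

        # Sanity check of "1.5 million tons of hydrogen per year ~ 6.8 GW".
        # Assumption: continuous average power based on hydrogen's heating value.
        TONS_PER_YEAR = 1.5e6
        KG_PER_TON = 1_000.0
        HOURS_PER_YEAR = 8_760.0
        HEATING_VALUES_KWH_PER_KG = {"HHV": 39.4, "LHV": 33.3}   # higher/lower heating value

        for label, kwh_per_kg in HEATING_VALUES_KWH_PER_KG.items():
            energy_kwh = TONS_PER_YEAR * KG_PER_TON * kwh_per_kg
            avg_gw = energy_kwh / HOURS_PER_YEAR / 1e6    # kWh per hour = kW; 1e6 kW = 1 GW
            print(f"{label}: ~{avg_gw:.1f} GW average")   # HHV: ~6.7 GW, LHV: ~5.7 GW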

    Mutoru acknowledged that the biggest challenge facing potential hydrogen producers is the current lack of viable business models. “Resolving that challenge requires cross-industry collaboration, and supportive policy frameworks so that the market for hydrogen can be built and sustained over the long term,” she said.

    Confronting barriers

    Gretchen Baier, executive external strategy and communications leader for Dow, noted that the company already produces hydrogen in multiple ways. For one, Dow operates the world’s largest ethane cracker, in Texas. An ethane cracker heats ethane to break apart molecular bonds to form ethylene, with hydrogen as one of the byproducts of the process. Also, Baier showed a slide of the 1891 patent for the electrolysis of brine water, which also produces hydrogen. The company still engages in this practice, but Dow does not have an effective way of utilizing the resulting hydrogen for its own fuel.

    “Just take a moment to think about that,” Baier said. “We’ve been talking about hydrogen production and the cost of it, and this is basically free hydrogen. And it’s still too much of a barrier to somewhat recycle that and use it for ourselves. The environment is clearly changing, and we do have plans for that, but I think that kind of sets some of the challenges that face industry here.”

    However, Baier said, hydrogen is expected to play a significant role in Dow’s future as the company attempts to decarbonize by 2050. The company, she said, plans to optimize hydrogen allocation and production, retrofit turbines for hydrogen fueling, and purchase clean hydrogen. By 2040, Dow expects more than 60 percent of its sites to be hydrogen-ready.

    Baier noted that hydrogen fuel is not a “panacea,” but rather one among many potential contributors as industry attempts to reduce or eliminate carbon emissions in the coming decades. “Hydrogen has an important role, but it’s not the only answer,” she said.

    “This is real”

    Colleen Wright is vice president of corporate strategy for Constellation, which recently separated from Exelon Corporation. (Exelon now owns the former company’s regulated utilities, such as Commonwealth Edison and Baltimore Gas and Electric, while Constellation owns the competitive generation and supply portions of the business.) Wright stressed the advantages of nuclear power in hydrogen production, which she said include superior economics, low barriers to implementation, and scalability.

    “A quarter of emissions in the world are currently from hard-to-decarbonize sectors — the industrial sector, steel making, heavy-duty transportation, aviation,” she said. “These are really challenging decarbonization sectors, and as we continue to expand and electrify, we’re going to need more supply. We’re also going to need to produce clean hydrogen using emissions-free power.”

    “The scale of nuclear power plants is uniquely suited to be able to scale hydrogen production,” Wright added. She mentioned Constellation’s Nine Mile Point site in the State of New York, which received a DOE grant for a pilot program that will see a proton exchange membrane electrolyzer installed at the site.

    “We’re very excited to see hydrogen go from a [research and development] conversation to a commercial conversation,” she said. “We’ve been calling it a little bit of a ‘middle-school dance.’ Everybody is standing around the circle, waiting to see who’s willing to put something at stake. But this is real. We’re not dancing around the edges. There are a lot of people who are big players, who are willing to put skin in the game today.”

  • Donald Sadoway wins European Inventor Award for liquid metal batteries

    MIT Professor Donald Sadoway has won the 2022 European Inventor Award, in the category for Non-European Patent Office Countries, for his work on liquid metal batteries that could enable the long-term storage of renewable energy.

    Sadoway is the John F. Elliott Professor of Materials Chemistry in MIT’s Department of Materials Science and Engineering, and a longtime supporter and friend of the Materials Research Laboratory.

    “By enabling the large-scale storage of renewable energy, Donald Sadoway’s invention is a huge step towards the deployment of carbon-free electricity generation,” says António Campinos, president of the European Patent Office. “He has spent his career studying electrochemistry and has transformed this expertise into an invention that represents a huge step forward in the transition to green energy.”

    Sadoway was honored at the 2022 European Inventor Award ceremony on June 21. The award is one of Europe’s most prestigious innovation prizes and is presented annually to outstanding inventors from Europe and beyond who have made an exceptional contribution to society, technological progress, and economic growth.

    When accepting the award in Munich, Sadoway told the audience:

    “I am astonished. When I look at all the patented technologies that are represented at this event I see an abundance of excellence, all of them solutions to pressing problems. I wonder if the judges are assessing not only degrees of excellence but degrees of urgency. The liquid metal battery addresses an existential threat to the health of our atmosphere which is related to climate change.

    “By hosting this event the EPO celebrates invention. The thread that connects all the inventors is their efforts to make the world a better place. In my judgment there is no nobler pursuit. So perhaps this is a celebration of nobility.”

    Sadoway’s liquid metal batteries consist of three liquid layers of different densities, which naturally separate in the same way as oil and vinegar do in a salad dressing. The top and bottom layers are made from molten metals, with a middle layer of molten salt.

    To keep the metals liquid, the batteries need to operate at extremely high temperatures, so Sadoway designed a system that is self-heating and insulated, requiring no external heating or cooling. They have a lifespan of more than 20 years, can maintain 99 percent of their capacity over 5,000 charging cycles, and have no combustible materials, meaning there is no fire risk.
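
    For context on what that retention figure implies (my own arithmetic, assuming uniform per-cycle fade, which the article does not specify), the average loss per cycle is tiny:

        # What "99 percent of capacity over 5,000 cycles" implies per cycle,
        # assuming uniform (geometric) fade. Illustration only.
        TOTAL_RETENTION = 0.99
        CYCLES = 5_000

        per_cycle_retention = TOTAL_RETENTION ** (1 / CYCLES)
        per_cycle_fade_pct = (1 - per_cycle_retention) * 100
        print(f"Average capacity fade per cycle: ~{per_cycle_fade_pct:.6f}%")   # ~0.0002%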

    In 2010, with a patent for his invention and support from Bill Gates, Sadoway co-founded Ambri, based in Marlborough, Massachusetts, just outside Boston, to develop a commercial product. The company will soon install a unit on a 3,700-acre development for a data center in Nevada. This battery will store energy from a reported 500 megawatts of on-site renewable generation, the same output as a natural gas power plant.

    Born in 1950 into a family of Ukrainian immigrants in Canada, Sadoway studied chemical metallurgy, specializing in what he calls “extreme electrochemistry” — chemical reactions in molten salts and liquid metals that have been heated to over 500 degrees Celsius. After earning his BASc, MASc, and PhD, all from the University of Toronto, he joined the faculty at MIT in 1978.

  • Helping renewable energy projects succeed in local communities

    Jungwoo Chun makes surprising discoveries about sustainability initiatives by zooming in on local communities.

    His discoveries lie in understanding how renewable energy infrastructure develops at a local level. With so many stakeholders in a community — citizens, government officials, businesses, and other organizations — the development process gets complicated very quickly. Chun works to unpack stakeholder relationships to help local renewable energy projects move forward.

    While his interests today are in local communities around the U.S., Chun comes from a global background. Growing up, his family moved frequently due to his dad’s work. He lived in Seoul, South Korea until elementary school and then hopped from city to city around Asia, spending time in China, Hong Kong, and Singapore. When it was time for college, he returned to South Korea, majoring in international studies at Korea University and later completing his master’s there in the same field.

    After graduating, Chun wanted to leverage his international expertise to tackle climate change. So, he pursued a second master’s in international environmental policy with William Moomaw at Tufts University.

    During that time, Chun came across an article on climate change by David Victor, a professor in public policy at the University of California at San Diego. Victor argued that while international efforts to fight climate change are necessary, more tangible progress can be made through local efforts catered to each country. That prompted Chun to think a step further: “What can we do in the local community to make a little bit of a difference, which could add up to something big in the long term?”

    With a renewed direction for his goals, Chun arrived at the MIT Department of Urban Studies and Planning, specializing in environmental policy and planning. But he was still missing that final inspirational spark to proactively pursue his goals — until he began working with his primary advisor, Lawrence Susskind, the Ford Professor of Urban and Environmental Planning and director of the Science Impact Collaborative.

    For previous research projects, “I would just do what I was told,” Chun says, but his new advisor “really opened [his] eyes” to being an active member of the community. From the start, Susskind has encouraged Chun to share his research ideas and has shown him how to leverage his research skills for public service. Over the past few years, Chun has also taught several classes with Susskind, learning to approach education thoughtfully for an engaging and equitable classroom. Because of their relationship, Chun now always searches for ways to make a difference through research, teaching, and public service.

    Understanding renewable energy projects at a local level

    For his main dissertation project with Susskind, Chun is studying community-owned solar energy projects, working to understand what makes them successful.

    Often, communities don’t have the required expertise to carry out these projects on their own and instead look to advisory organizations for help. But little research has been done on these organizations and the roles that they play in developing solar energy infrastructure.

    Through over 200 surveys and counting, Chun has discovered that these organizations act as life-long collaborators to communities and are critical in getting community-owned solar projects up and running. At the start of these projects, they walk communities through a mountain of logistics for setting up solar energy infrastructure, including permit applications, budgeting, and contractor employment. After the infrastructure is in place, the organizations stay involved, serving as consultants when needed and sometimes even becoming partners.

    Because of these roles, Chun calls these organizations “intermediaries,” drawing a parallel with roles in conflict resolution. “But it’s much more than that,” he adds. Intermediaries help local communities “build a movement [for community-owned solar energy projects] … and empower them to be independent and self-sustaining.”

    Chun is also working on another project with Susskind, looking at situations where communities are opposed to renewable energy infrastructure. For this project, Chun is supervising and mentoring a group of five undergraduates. Together, they are trying to pinpoint the reasons behind local opposition to renewable energy projects.

    The idea for this project emerged two years ago, when Chun heard in the news that many solar and wind projects were being delayed or cancelled due to local opposition. But the reasons for this opposition weren’t thoroughly researched.

    “When we started to dig a little deeper, [we found that] communities oppose these projects even though they aren’t opposed to renewable energy,” Chun says. The primary reasons for opposition lie in land use concerns, including financial challenges, health and safety concerns, and, ironically, environmental consequences. By better understanding these concerns, Chun hopes to help more renewable energy projects succeed and bring society closer to a sustainable future.

    Bringing research to the classroom and community

    Right now, Chun is looking to bring his research insights on renewable energy infrastructure into the classroom. He’s developing a course on renewable energy that will act as a “clinic” where students will work with communities to understand their concerns about potential renewable energy projects. The students’ findings will then be passed on to project leaders to help them address these concerns.

    This new course is modeled after 11.074/11.274 (Cybersecurity Clinic), which Chun has helped develop over the past few years. In this clinic, students work with local governments in New England to assess potential cybersecurity vulnerabilities in their digital systems. At first, “a lot of city governments were very skeptical, like ‘students doing service for us…?’” Chun says. “But in the end, they were all very satisfied with the outcome” and found the assessments “impactful.”

    Since the Cybersecurity Clinic kicked off, other universities have approached Chun and his co-instructors about developing their own regional clinics. Now, there are cybersecurity clinics operating around the world. “That’s been a huge success,” Chun says. Going forward, “we’d like to expand the benefit of this clinic [to address] communities opposing renewable energy [projects].” The new course will be a philosophical trifecta for Chun, combining his commitments to research, teaching, and public service.

    Chun plans to wrap up his PhD at the end of this summer and is currently writing his dissertation on community-owned solar energy projects. “I’m done with all the background work — working the soil and throwing the seeds in the right place,” he says. “It’s now time to gather all the crops and present the work.”

  • Study finds natural sources of air pollution exceed air quality guidelines in many regions

    Alongside climate change, air pollution is one of the biggest environmental threats to human health. Tiny particles known as particulate matter or PM2.5 (named for their diameter of just 2.5 micrometers or less) are a particularly hazardous type of pollutant. These particles are produced from a variety of sources, including wildfires and the burning of fossil fuels, and can enter our bloodstream, travel deep into our lungs, and cause respiratory and cardiovascular damage. Exposure to particulate matter is responsible for millions of premature deaths globally every year.

    In response to the increasing body of evidence on the detrimental effects of PM2.5, the World Health Organization (WHO) recently updated its air quality guidelines, lowering its recommended annual PM2.5 exposure guideline by 50 percent, from 10 micrograms per cubic meter (μg/m3) to 5 μg/m3. These updated guidelines signify an aggressive attempt to promote the regulation and reduction of anthropogenic emissions in order to improve global air quality.

    A new study by researchers in the MIT Department of Civil and Environmental Engineering explores whether the updated air quality guideline of 5 μg/m3 is realistically attainable across different regions of the world, particularly if anthropogenic emissions are aggressively reduced.

    The first question the researchers wanted to investigate was to what degree moving to a no-fossil-fuel future would help different regions meet this new air quality guideline.

    “The answer we found is that eliminating fossil-fuel emissions would improve air quality around the world, but while this would help some regions come into compliance with the WHO guidelines, for many other regions high contributions from natural sources would impede their ability to meet that target,” says senior author Colette Heald, the Germeshausen Professor in the MIT departments of Civil and Environmental Engineering, and Earth, Atmospheric and Planetary Sciences. 

    The study by Heald, Professor Jesse Kroll, and graduate students Sidhant Pai and Therese Carter, published June 6 in the journal Environmental Science and Technology Letters, finds that over 90 percent of the global population is currently exposed to average annual concentrations that are higher than the recommended guideline. The authors go on to demonstrate that over 50 percent of the world’s population would still be exposed to PM2.5 concentrations that exceed the new air quality guidelines, even in the absence of all anthropogenic emissions.

    This is due to the large natural sources of particulate matter — dust, sea salt, and organics from vegetation — that still exist in the atmosphere when anthropogenic emissions are removed from the air. 

    “If you live in parts of India or northern Africa that are exposed to large amounts of fine dust, it can be challenging to reduce PM2.5 exposures below the new guideline,” says Sidhant Pai, co-lead author and graduate student. “This study challenges us to rethink the value of different emissions abatement controls across different regions and suggests the need for a new generation of air quality metrics that can enable targeted decision-making.”

    The researchers conducted a series of model simulations to explore the viability of achieving the updated PM2.5 guidelines worldwide under different emissions reduction scenarios, using 2019 as a representative baseline year. 

    Their model simulations used a suite of different anthropogenic sources that could be turned on and off to study the contribution of a particular source. For instance, the researchers conducted a simulation that turned off all human-based emissions in order to determine the amount of PM2.5 pollution that could be attributed to natural and fire sources. By analyzing the chemical composition of the PM2.5 aerosol in the atmosphere (e.g., dust, sulfate, and black carbon), the researchers were also able to get a more accurate understanding of the most important PM2.5 sources in a particular region. For example, elevated PM2.5 concentrations in the Amazon were shown to predominantly consist of carbon-containing aerosols from sources like deforestation fires. Conversely, nitrogen-containing aerosols were prominent in Northern Europe, with large contributions from vehicles and fertilizer usage. The two regions would thus require very different policies and methods to improve their air quality. 
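
    A minimal sketch of that “turn sources on and off” attribution logic (illustrative Python only; the actual study ran a full chemical transport model, and the numbers below are made up):

        # Illustrative source attribution by "zeroing out" emission categories, in the
        # spirit of the simulations described above. A real study runs a chemical
        # transport model; here each run just sums made-up contributions for one region.
        def run_model(sources_enabled):
            """Stand-in for a model returning annual-mean PM2.5 (ug/m3) for one region."""
            hypothetical_contributions = {
                "anthropogenic": 18.0,
                "fires": 4.0,
                "dust_and_sea_salt": 7.0,
                "biogenic_organics": 3.0,
            }
            return sum(v for k, v in hypothetical_contributions.items() if k in sources_enabled)

        all_sources = {"anthropogenic", "fires", "dust_and_sea_salt", "biogenic_organics"}
        baseline = run_model(all_sources)
        no_anthro = run_model(all_sources - {"anthropogenic"})

        print(f"Baseline PM2.5:                {baseline:.1f} ug/m3")
        print(f"Without anthropogenic sources: {no_anthro:.1f} ug/m3")
        print(f"Attributed to human activity:  {baseline - no_anthro:.1f} ug/m3")
        print(f"Meets the 5 ug/m3 WHO guideline without anthropogenic emissions? {no_anthro <= 5.0}")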

    “Analyzing particulate pollution across individual chemical species allows for mitigation and adaptation decisions that are specific to the region, as opposed to a one-size-fits-all approach, which can be challenging to execute without an understanding of the underlying importance of different sources,” says Pai. 

    When the WHO air quality guidelines were last updated in 2005, they had a significant impact on environmental policies. Scientists could look at an area that was not in compliance and suggest high-level solutions to improve the region’s air quality. But as the guidelines have tightened, globally-applicable solutions to manage and improve air quality are no longer as evident. 

    “Another benefit of speciating is that some of the particles have different toxicity properties that are correlated to health outcomes,” says Therese Carter, co-lead author and graduate student. “It’s an important area of research that this work can help motivate. Being able to separate out that piece of the puzzle can provide epidemiologists with more insights on the different toxicity levels and the impact of specific particles on human health.”

    The authors view these new findings as an opportunity to expand and iterate on the current guidelines.  

    “Routine and global measurements of the chemical composition of PM2.5 would give policymakers information on what interventions would most effectively improve air quality in any given location,” says Jesse Kroll, a professor in the MIT departments of Civil and Environmental Engineering and Chemical Engineering. “But it would also provide us with new insights into how different chemical species in PM2.5 affect human health.”

    “I hope that as we learn more about the health impacts of these different particles, our work and that of the broader atmospheric chemistry community can help inform strategies to reduce the pollutants that are most harmful to human health,” adds Heald.

  • Migration Summit addresses education and workforce development in displacement

    “Refugees can change the world with access to education,” says Alnarjes Harba, a refugee from Syria who recently shared her story at the 2022 Migration Summit — a first-of-its-kind, global convening to address the challenges that displaced communities face in accessing education and employment.

    At the age of 13, Harba was displaced to Lebanon, where she graduated at the top of her high school class. But because of her refugee status, she recalls, no university in her host country would accept her. Today, Harba is a researcher in health-care architecture. She holds a bachelor’s degree from Southern New Hampshire University, where she was part of the Global Education Movement, a program providing refugees with pathways to higher education and work.

    Like many of the Migration Summit’s participants, Harba shared her story to call attention not only to the barriers to refugee education, but also to the opportunities to create more education-to-employment pathways like MIT Refugee Action Hub’s (ReACT) certificate programs for displaced learners.

    Organized by MIT ReACT, the MIT Abdul Latif Jameel World Education Lab (J-WEL), Na’amal, Karam Foundation, and Paper Airplanes, the Migration Summit sought to center the voices and experiences of those most directly impacted by displacement — both in narratives about the crisis and in the search for solutions. Themed “Education and Workforce Development in Displacement,” this year’s summit welcomed more than 900 attendees from over 30 countries across a total of 40 interactive virtual sessions led by displaced learners, educators, and activists working to support communities in displacement.

    Sessions highlighted the experiences of refugees, migrants, and displaced learners, as well as current efforts across the education and workforce development landscape, ranging from pK-12 initiatives to post-secondary programs, workforce training to entrepreneurship opportunities.

    Overcoming barriers to access

    The vision for the Migration Summit developed, in part, out of the need to raise more awareness about the long-standing global displacement crisis. According to the United Nations High Commissioner for Refugees (UNHCR), 82.4 million people worldwide today are forcibly displaced, a figure that doesn’t include the estimated 12 million people who have fled their homes in Ukraine since February.

    “Refugees not only leave their countries; they leave behind a thousand memories, their friends, their families,” says Mondiant Dogon, a human rights activist, refugee ambassador, and author who gave the Migration Summit’s opening keynote address. “Education is the most important thing that can happen to refugees. In that way, we can leave behind the refugee camps and build our own independent future.”

    Yet, as the stories of the summit’s participants highlight, many in displacement have lost their livelihoods or had their education disrupted — only to face further challenges when trying to access education or find work in their new places of residence. Obstacles range from legal restrictions, language and cultural barriers, and unaffordable costs to lack of verifiable credentials. UNHCR estimates that only 5 percent of refugees have access to higher education, compared to the global average of 39 percent.

    “There is another problem related to forced displacement — dehumanization of migrants,” says Lina Sergie Attar, the founder and CEO of Karam Foundation. “They are unjustly positioned as enemies, as a threat.”

    But as Blein Alem, an MIT ReACT alum and refugee from Eritrea, explains, “No one chooses to be a refugee — it just occurs. Whether by conflict, war, human rights violations, just because you have refugee status does not mean that you are not willing to make a change in your life and access to education and work.” Several participants, including Alem, shared that, even with a degree in hand, their refugee status limited their ability to work in their new countries of residence.

    Displaced communities face complex and structural challenges in accessing education and workforce development opportunities. Because of the varying and vast effects of displacement, efforts to address these challenges range in scale and focus and differ across sectors. As Lorraine Charles, co-founder and director of Na’amal, noted in the Migration Summit’s closing session, many organizations find themselves working in silos, or even competing with each other for funding and other resources. As a result, solution-making has been fragmented, with persistent gaps between different sectors that are, in fact, working toward the same goals.

    Imagining a modular, digital, collaborative approach

    A key takeaway from the month’s discussions, then, is the need to rethink the response to refugee education and workforce challenges. During the session, “From Intentions to Impact: Decolonizing Refugee Response,” participants emphasized the systemic nature of these challenges. Yet formal responses, such as the 1951 Refugee Convention, have been largely inadequate — in some instances even oppressing the communities they’re meant to support, explains Sana Mustafa, director of partnership and engagement for Asylum Access.

    “We have the opportunity to rethink how we are handling the situation,” Mustafa says, calling for more efforts to include refugees in the design and development of solutions.

    Presenters also agreed that educational institutions, particularly universities, could play a vital role in providing more pathways for refugees and displaced learners. Key to this is rethinking the structure of education itself, including its delivery.

    “The challenge right now is that degrees are monolithic,” says Sanjay Sarma, vice president for MIT Open Learning, who gave the keynote address on “Pathways to Education, Livelihood, and Hope.” “They’re like those gigantic rocks at Stonehenge or in other megalithic sites. What we need is a much more granular version of education: bricks. Bricks were invented several thousand years ago, but we don’t really have that yet formally and extensively in education.”

    “There is no way we can accommodate thousands and thousands of refugees face-to-face,” says Shai Reshef, the founder and president of University of the People. “The only path is a digital one.”

    Ultimately, explains Demetri Fadel of Karam Foundation, “We really need to think about how to create a vision of education as a right for every person all around the world.”

    Underlying many of the Migration Summit’s conclusions is the awareness that there is still much work to be done. However, as the summit’s co-chair Lana Cook said in her closing remarks, “This was not a convening of despair, but one about what we can build together.”

    The summit’s organizers are currently putting together a public report of the key findings that have emerged from the month’s conversations, including recommendations for thematic working groups and future Migration Summit activities.

  • Cracking the case of Arctic sea ice breakup

    Despite its below-freezing temperatures, the Arctic is warming twice as fast as the rest of the planet. As Arctic sea ice melts, fewer bright surfaces are available to reflect sunlight back into space. When fractures open in the ice cover, the water underneath gets exposed. Dark, ice-free water absorbs the sun’s energy, heating the ocean and driving further melting — a vicious cycle. This warming in turn melts glacial ice, contributing to rising sea levels.
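
    To illustrate the magnitude of that feedback, here is a rough comparison using typical albedo values for bare sea ice and open water and a representative summer insolation (my own numbers, not figures from the article):

        # Rough illustration of the ice-albedo feedback: how much more solar energy
        # open water absorbs than sea ice. Typical values, not measurements from the article.
        INSOLATION_W_M2 = 200.0      # representative Arctic summer daily-mean insolation
        ALBEDO_SEA_ICE = 0.6         # typical for bare sea ice (higher with snow cover)
        ALBEDO_OPEN_WATER = 0.06     # typical for open ocean

        absorbed_ice = (1 - ALBEDO_SEA_ICE) * INSOLATION_W_M2
        absorbed_water = (1 - ALBEDO_OPEN_WATER) * INSOLATION_W_M2
        print(f"Absorbed over ice:   {absorbed_ice:.0f} W/m^2")     # ~80 W/m^2
        print(f"Absorbed over water: {absorbed_water:.0f} W/m^2")   # ~188 W/m^2, more than double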

    Warming climate and rising sea levels endanger the nearly 40 percent of the U.S. population living in coastal areas, the billions of people who depend on the ocean for food and their livelihoods, and species such as polar bears and Arctic foxes. Reduced ice coverage is also making the once-impassable region more accessible, opening up new shipping lanes and ports. Interest in using these emerging trans-Arctic routes for product transit, extraction of natural resources (e.g., oil and gas), and military activity is turning an area traditionally marked by low tension and cooperation into one of global geopolitical competition.

    As the Arctic opens up, predicting when and where the sea ice will fracture becomes increasingly important in strategic decision-making. However, huge gaps exist in our understanding of the physical processes contributing to ice breakup. Researchers at MIT Lincoln Laboratory seek to help close these gaps by turning a data-sparse environment into a data-rich one. They envision deploying a distributed set of unattended sensors across the Arctic that will persistently detect and geolocate ice fracturing events. Concurrently, the network will measure various environmental conditions, including water temperature and salinity, wind speed and direction, and ocean currents at different depths. By correlating these fracturing events and environmental conditions, they hope to discover meaningful insights about what is causing the sea ice to break up. Such insights could help predict the future state of Arctic sea ice to inform climate modeling, climate change planning, and policy decision-making at the highest levels.

    “We’re trying to study the relationship between ice cracking, climate change, and heat flow in the ocean,” says Andrew March, an assistant leader of Lincoln Laboratory’s Advanced Undersea Systems and Technology Group. “Do cracks in the ice cause warm water to rise and more ice to melt? Do undersea currents and waves cause cracking? Does cracking cause undersea waves? These are the types of questions we aim to investigate.”

    Arctic access

    In March 2022, Ben Evans and Dave Whelihan, both researchers in March’s group, traveled for 16 hours across three flights to Prudhoe Bay, located on the North Slope of Alaska. From there, they boarded a small specialized aircraft and flew another 90 minutes to a three-and-a-half-mile-long sheet of ice floating 160 nautical miles offshore in the Arctic Ocean. In the weeks before their arrival, the U.S. Navy’s Arctic Submarine Laboratory had transformed this inhospitable ice floe into a temporary operating base called Ice Camp Queenfish, named after the first Sturgeon-class submarine to operate under the ice and the fourth to reach the North Pole. The ice camp featured a 2,500-foot-long runway, a command center, sleeping quarters to accommodate up to 60 personnel, a dining tent, and an extremely limited internet connection.

    At Queenfish, for the next four days, Evans and Whelihan joined U.S. Navy, Army, Air Force, Marine Corps, and Coast Guard members, and members of the Royal Canadian Air Force and Navy and United Kingdom Royal Navy, who were participating in Ice Exercise (ICEX) 2022. Over the course of about three weeks, more than 200 personnel stationed at Queenfish, Prudhoe Bay, and aboard two U.S. Navy submarines participated in this biennial exercise. The goals of ICEX 2022 were to assess U.S. operational readiness in the Arctic; increase our country’s experience in the region; advance our understanding of the Arctic environment; and continue building relationships with other services, allies, and partner organizations to ensure a free and peaceful Arctic. The infrastructure provided for ICEX also enables scientists to conduct research — either in person or by sending their equipment for exercise organizers to deploy on their behalf — in an environment that would otherwise be extremely difficult and expensive to access.

    In the Arctic, windchill temperatures can plummet to as low as 60 degrees Fahrenheit below zero, cold enough to freeze exposed skin within minutes. Winds and ocean currents can cause the entire camp to drift beyond the reach of nearby emergency rescue aircraft, and the ice can crack at any moment. To ensure the safety of participants, a team of Navy meteorological specialists continually monitors the ever-changing conditions. The original camp location for ICEX 2022 had to be evacuated and relocated after a massive crack formed in the ice, delaying Evans’ and Whelihan’s trip. Even at the newly selected site, a large crack formed behind the camp and another necessitated moving a number of tents.

    “Such cracking events are only going to increase as the climate warms, so it’s more critical now than ever to understand the physical processes behind them,” Whelihan says. “Such an understanding will require building technology that can persist in the environment despite these incredibly harsh conditions. So, it’s a challenge not only from a scientific perspective but also an engineering one.”

    “The weather always gets a vote, dictating what you’re able to do out here,” adds Evans. “The Arctic Submarine Laboratory does a lot of work to construct the camp and make it a safe environment where researchers like us can come to do good science. ICEX is really the only opportunity we have to go onto the sea ice in a place this remote to collect data.”

    A legacy of sea ice experiments

    Though this trip was Whelihan’s and Evans’ first to the Arctic region, staff from the laboratory’s Advanced Undersea Systems and Technology Group have been conducting experiments at ICEX since 2018. However, because of the Arctic’s remote location and extreme conditions, data collection has rarely been continuous over long periods of time or widespread across large areas. The team now hopes to change that by building low-cost, expendable sensing platforms consisting of co-located devices that can be left unattended for automated, persistent, near-real-time monitoring. 

    “The laboratory’s extensive expertise in rapid prototyping, seismo-acoustic signal processing, remote sensing, and oceanography makes us a natural fit to build this sensor network,” says Evans.

    In the months leading up to the Arctic trip, the team collected seismometer data at Firepond, part of the laboratory’s Haystack Observatory site in Westford, Massachusetts. Through this local data collection, they aimed to gain a sense of what anthropogenic (human-induced) noise would look like so they could begin to anticipate the kinds of signatures they might see in the Arctic. They also collected ice melting/fracturing data during a thaw cycle and correlated these data with the weather conditions (air temperature, humidity, and pressure). Through this analysis, they detected an increase in seismic signals as the temperature rose above 32 F — an indication that air temperature and ice cracking may be related.
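    As a rough illustration of that kind of analysis (not the team's actual pipeline), the Python sketch below counts strong seismic excursions per hour in a single-channel trace and checks how those counts track co-located air temperature readings; the file names, sampling rate, and detection threshold are all assumed.

```python
# Minimal sketch (not the team's pipeline): relate hourly counts of strong
# seismic excursions to air temperature during a thaw cycle.
# Assumptions: one seismometer channel sampled at FS Hz and hourly air
# temperatures covering the same period; file names are hypothetical.
import numpy as np

FS = 100                                      # sampling rate, Hz (assumed)
trace = np.load("firepond_trace.npy")         # continuous ground-motion trace
temps_f = np.load("hourly_air_temp_f.npy")    # one reading per hour, deg F

# Envelope of ground motion: RMS over non-overlapping 1-second windows.
n_sec = len(trace) // FS
rms = np.sqrt((trace[: n_sec * FS].reshape(n_sec, FS) ** 2).mean(axis=1))

# Count seconds per hour whose RMS exceeds a noise-based threshold.
threshold = 5.0 * np.median(rms)
n_hours = min(n_sec // 3600, len(temps_f))
counts = (rms[: n_hours * 3600].reshape(n_hours, 3600) > threshold).sum(axis=1)

# How strongly do hourly counts track air temperature, and how do they
# differ on either side of freezing?
r = np.corrcoef(temps_f[:n_hours], counts)[0, 1]
above = counts[temps_f[:n_hours] > 32].mean()
below = counts[temps_f[:n_hours] <= 32].mean()
print(f"correlation with air temperature: r = {r:.2f}")
print(f"mean exceedances per hour: {above:.1f} above 32 F, {below:.1f} below")
```

    A real pipeline would also screen out anthropogenic noise before counting, but even a crude tally like this makes the contrast across the freezing point easy to quantify.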

    A sensing network

    At ICEX, the team deployed various commercial off-the-shelf sensors and new sensors developed by the laboratory and University of New Hampshire (UNH) to assess their resiliency in the frigid environment and to collect an initial dataset.

    “One aspect that differentiates these experiments from those of the past is that we concurrently collected seismo-acoustic data and environmental parameters,” says Evans.

    The commercial technologies were seismometers to detect the vibrational energy released when sea ice fractures or collides with other ice floes; a hydrophone (underwater microphone) array to record the acoustic energy created by ice-fracturing events; a sound speed profiler to measure the speed of sound through the water column; and a conductivity, temperature, and depth (CTD) profiler to measure the salinity (related to conductivity), temperature, and pressure (related to depth) throughout the water column. The speed of sound in the ocean primarily depends on these three quantities. 
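    To make that dependence concrete, here is a small sketch (with illustrative rather than measured values) that evaluates one widely cited empirical fit for seawater sound speed, Mackenzie's 1981 nine-term equation in temperature, salinity, and depth, over a notional upper-ocean Arctic profile. The fit is nominally valid for roughly 2 to 30 degrees Celsius, so near-freezing Arctic water sits at the edge of its range.

```python
# Minimal sketch: Mackenzie's (1981) empirical fit for sound speed in seawater
# as a function of temperature (deg C), salinity (ppt), and depth (m).
# The sample profile below is illustrative, not data from the exercise.
def sound_speed_mackenzie(t_c: float, s_ppt: float, d_m: float) -> float:
    """Approximate sound speed in m/s. Nominally valid for roughly
    2-30 deg C, 25-40 ppt, and 0-8000 m."""
    return (1448.96
            + 4.591 * t_c
            - 5.304e-2 * t_c**2
            + 2.374e-4 * t_c**3
            + 1.340 * (s_ppt - 35.0)
            + 1.630e-2 * d_m
            + 1.675e-7 * d_m**2
            - 1.025e-2 * t_c * (s_ppt - 35.0)
            - 7.139e-13 * t_c * d_m**3)

# Notional upper-ocean Arctic profile: (depth m, temperature deg C, salinity ppt).
for depth, temp, sal in [(0, -1.8, 30.0), (20, -1.6, 31.5),
                         (40, -0.5, 33.0), (80, 0.5, 34.0)]:
    print(f"{depth:3d} m: {sound_speed_mackenzie(temp, sal, depth):7.1f} m/s")
```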

    To precisely measure the temperature across the entire water column at one location, they deployed an array of transistor-based temperature sensors developed by the laboratory’s Advanced Materials and Microsystems Group in collaboration with the Advanced Functional Fabrics of America Manufacturing Innovation Institute. The small temperature sensors run along the length of a thread-like polymer fiber embedded with multiple conductors. This fiber platform, which can support a broad range of sensors, can be unspooled hundreds of feet below the water’s surface to concurrently measure temperature or other water properties — the fiber deployed in the Arctic also contained accelerometers to measure depth — at many points in the water column. Traditionally, temperature profiling has required moving a device up and down through the water column.

    The team also deployed a high-frequency echosounder supplied by Anthony Lyons and Larry Mayer, collaborators at UNH’s Center for Coastal and Ocean Mapping. This active sonar uses acoustic energy to detect internal waves, or waves occurring beneath the ocean’s surface.

    “You may think of the ocean as a homogeneous body of water, but it’s not,” Evans explains. “Different currents can exist as you go down in depth, much like how you can get different winds when you go up in altitude. The UNH echosounder allows us to see the different currents in the water column, as well as ice roughness when we turn the sensor to look upward.”

    “The reason we care about currents is that we believe they will tell us something about how warmer water from the Atlantic Ocean is coming into contact with sea ice,” adds Whelihan. “Not only is that water melting ice but it also has lower salt content, resulting in oceanic layers and affecting how long ice lasts and where it lasts.”

    Back home, the team has begun analyzing their data. For the seismic data, this analysis involves distinguishing any ice events from various sources of anthropogenic noise, including generators, snowmobiles, footsteps, and aircraft. Similarly, the researchers know their hydrophone array acoustic data are contaminated by energy from a sound source that another research team participating in ICEX placed in the water. Based on their physics, icequakes — the seismic events that occur when ice cracks — have characteristic signatures that can be used to identify them. One approach is to manually find an icequake and use that signature as a guide for finding other icequakes in the dataset.
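    A minimal version of that template-matching approach (sketched here in Python; the data files and threshold are hypothetical, and this is not the team's code) slides a hand-picked icequake waveform along the continuous record and flags windows whose normalized correlation with the template exceeds a threshold:

```python
# Minimal sketch of waveform template matching: slide a hand-picked icequake
# template along a continuous trace and flag high-similarity windows.
# Data files and the detection threshold are hypothetical.
import numpy as np

trace = np.load("hydrophone_trace.npy")      # continuous record
template = np.load("icequake_template.npy")  # manually picked event waveform
n = len(template)

t = (template - template.mean()) / (template.std() + 1e-12)

# Normalized cross-correlation at every possible alignment (brute force).
scores = np.empty(len(trace) - n + 1)
for i in range(len(scores)):
    w = trace[i:i + n]
    w = (w - w.mean()) / (w.std() + 1e-12)
    scores[i] = np.dot(w, t) / n             # ranges roughly from -1 to 1

# Keep alignments above threshold, suppressing duplicates within one
# template length of a previous detection.
threshold = 0.7
detections, last = [], -n
for i in np.flatnonzero(scores > threshold):
    if i - last >= n:
        detections.append(i)
        last = i
print(f"{len(detections)} candidate icequakes above correlation {threshold}")
```

    A production version would typically use FFT-based correlation and pick local maxima rather than the brute-force loop shown here, but the idea is the same: one well-characterized event becomes the search key for all the others.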

    From their water column profiling sensors, they identified an interesting evolution in the sound speed profile 30 to 40 meters below the ocean surface, related to a mass of colder water moving in later in the day. The group’s physical oceanographer believes this change in the profile is due to water coming up from the Bering Sea, water that initially comes from the Atlantic Ocean. The UNH-supplied echosounder also generated an interesting signal at a similar depth.

    “Our supposition is that this result has something to do with the large sound speed variation we detected, either directly because of reflections off that layer or because of plankton, which tend to rise on top of that layer,” explains Evans.  

    A future predictive capability

    Going forward, the team will continue mining their collected data and use these data to begin building algorithms capable of automatically detecting and localizing — and ultimately predicting — ice events correlated with changes in environmental conditions. To complement their experimental data, they have initiated conversations with organizations that model the physical behavior of sea ice, including the National Oceanic and Atmospheric Administration and the National Ice Center. Merging the laboratory’s expertise in sensor design and signal processing with their expertise in ice physics would provide a more complete understanding of how the Arctic is changing.
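    One simple way to frame the localization piece of that problem, sketched below under an assumed sensor geometry, a constant propagation speed, and hypothetical arrival-time picks, is a grid search for the source position whose predicted arrival-time differences across a small seismometer array best match the observed ones:

```python
# Minimal sketch: locate an ice event from arrival-time differences across a
# small array of sensors, assuming a constant propagation speed in the ice.
# Sensor positions, speed, and picked arrival times are illustrative.
import numpy as np

sensors = np.array([[0.0, 0.0], [500.0, 0.0],
                    [0.0, 500.0], [500.0, 500.0]])     # positions, m
speed = 3000.0                                          # assumed speed, m/s
arrivals = np.array([0.212, 0.145, 0.233, 0.178])       # picked arrivals, s

# Search a grid of candidate source locations around the array.
best, best_cost = None, np.inf
for x in np.linspace(-250, 750, 201):
    for y in np.linspace(-250, 750, 201):
        # Predicted travel times; differencing against the first sensor
        # removes the unknown origin time of the event.
        d = np.hypot(sensors[:, 0] - x, sensors[:, 1] - y) / speed
        resid = (arrivals - arrivals[0]) - (d - d[0])
        cost = np.sum(resid**2)
        if cost < best_cost:
            best, best_cost = (x, y), cost

print(f"best-fit source location: {best}, misfit {best_cost:.2e} s^2")
```

    In practice, the propagation speed would itself need to be estimated or measured, which is one reason collecting environmental data alongside the seismo-acoustic recordings matters.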

    The laboratory team will also start exploring cost-effective engineering approaches for integrating the sensors into packages hardened for deployment in the harsh environment of the Arctic.

    “Until these sensors are truly unattended, the human factor of usability is front and center,” says Whelihan. “Because it’s so cold, equipment can break accidentally. For example, at ICEX 2022, our waterproof enclosure for the seismometers survived, but the enclosure for its power supply, which was made out of a cheaper plastic, shattered in my hand when I went to pick it up.”

    The sensor packages will not only need to withstand the frigid environment but also be able to “phone home” over some sort of satellite data link and sustain their power. The team plans to investigate whether waste heat from processing can keep the instruments warm and how energy could be harvested from the Arctic environment.

    Before the next ICEX scheduled for 2024, they hope to perform preliminary testing of their sensor packages and concepts in Arctic-like environments. While attending ICEX 2022, they engaged with several other attendees — including the U.S. Navy, Arctic Submarine Laboratory, National Ice Center, and University of Alaska Fairbanks (UAF) — and identified cold room experimentation as one area of potential collaboration. Testing can also be performed at outdoor locations a bit closer to home and more easily accessible, such as the Great Lakes in Michigan and a UAF-maintained site in Barrow, Alaska. In the future, the laboratory team may have an opportunity to accompany U.S. Coast Guard personnel on ice-breaking vessels traveling from Alaska to Greenland. The team is also thinking about possible venues for collecting data far removed from human noise sources.

    “Since I’ve told colleagues, friends, and family I was going to the Arctic, I’ve had a lot of interesting conversations about climate change and what we’re doing there and why we’re doing it,” Whelihan says. “People don’t have an intrinsic, automatic understanding of this environment and its impact because it’s so far removed from us. But the Arctic plays a crucial role in helping to keep the global climate in balance, so it’s imperative we understand the processes leading to sea ice fractures.”

    This work is funded through Lincoln Laboratory's internally administered R&D portfolio on climate.