More stories

  • Scientists and musicians tackle climate change together

    Audiences may travel long distances to see their favorite musical acts in concert or to attend large music festivals, which can add to their personal carbon footprint of emissions that are steadily warming the planet. But these same audiences, and the performers they follow, are often quite aware of the dangers of climate change and eager to contribute to ways of curbing those emissions.

    How should the industry reconcile these two perspectives, and how should it harness the enormous influence that musicians have on their fans to help promote action on climate change?

    That was the focus of a wide-ranging discussion on Monday hosted by MIT’s Environmental Solutions Initiative, titled “Artists and scientists together on climate solutions.” The event, which was held live at the Media Lab’s Bartos Theater and streamed online, featured John Fernandez, director of ESI; Dava Newman, director of the Media Lab; Tony McGuinness, a musician with the group Above and Beyond; and Anna Johnson, the sustainability and environment officer at Involved Group, an organization dedicated to embedding sustainability in business operations in the arts and culture fields.

    Fernandez pointed out in opening the discussion that when it comes to influencing people’s attitudes and behavior, changes tend to come about not just through information from a particular field, but through a whole culture. “We started thinking about how we might work with artists, how to have scientists and engineers, inventors, and designers working with artists on the challenges that we really need to face,” he said.

    Dealing with the climate change issue, he said, “is not about 2050 or 2100. This is about 2030. This is about this decade. This is about the next two or three years, really shifting that curve” to lowering the world’s greenhouse gas emissions. “It’s not going to be done just with science and engineering,” he added. “It’s got to be done with artists and business and everyone else. It’s not only ‘all of the above’ solutions, it’s ‘all of the above’ people, coming together to solve this problem.”

    Newman, who is also a professor in MIT’s Department of Aeronautics and Astronautics and has served as a NASA deputy administrator, said that while scientists and engineers can produce vast amounts of useful data that clearly demonstrate the dramatic changes the Earth’s climate is undergoing, communicating that information effectively is often a challenge for these specialists. “That data is just the data, but that doesn’t change the hearts and minds,” she said.

    “As scientists, having the data from our satellites, looking down, but also flying airplanes into the atmosphere, … we have the sensors, and then what can we do with it all? … How do we change human behavior? That’s the part I don’t know how to do,” Newman said. “I can have the technology, I can get precision measurements, I can study it, but really at the end of the day, we have to change human behavior, and that is so hard.”

    And that’s where the world of art and music can play a part, she said. “The best way that I know how to do it is with artistic experiences. You can have one moving experience and when you wake up tomorrow, maybe you’re going to do something a little different.” To help generate the compassion and empathy needed to affect behavior positively, she said, “that’s where we turn to the storytellers. We turn to the visionaries.”

    McGuinness, whose electronic music trio has performed for millions of people around the world, said that his own awareness of the urgency of the climate issue came from his passion for scuba diving, and the dramatic changes he has seen over the last two decades. Diving at a coral reef off Palau in the South Pacific, he returned to what had been a lush, brightly colored ecosystem and found that “immediately when you put your face under the water, you’re looking at the surface of the moon. It was a horrible shock to see this.”

    After this and other similar diving experiences, he said, “I just came away shocked and stunned,” realizing that the kinds of underwater experiences he had enjoyed would no longer exist for his children. After reading more on the subject of global warming, “that really sort of tipped me over the edge. And I was like, this is probably the most important thing for living beings now. And that’s sort of where I’ve remained ever since.”

    While his group Above and Beyond has performed one song specifically related to global warming, he doesn’t expect that to be the most impactful way of using their influence. Rather, they’re trying to lead by example, he said, by paying more attention to everything from the supply chains of the merchandise sold at concerts to the emissions generated by travel to the concerts. They’re also being selective about concert venues and making an effort to find performance spaces that are making a significant effort to curb their emissions.

    “If people start voting with their wallets,” McGuinness said, “and there are companies that are doing better than others and are doing the right thing, maybe it’ll catch on. I guess that’s what we can hope for.”

    Understanding these kinds of issues, involving supply chains, transportation, and facilities associated with the music industry, has been the focus of much of Johnson’s work, through the organization Involved Group, which has entered into a collaboration with MIT through the Environmental Solutions Initiative. “It’s these kinds of novel partnerships that have so much potential to catalyze the change that we need to see at an incredible pace,” she said. Already, her group has worked with MIT on mapping out where emissions occur throughout the various aspects of the music industry.

    At a recent music festival in London, she said, the group interviewed hundreds of participants, including audience members, band members, and the crew. “We explored people’s level of awareness of the issues around climate change and environmental degradation,” she said. “And what was really interesting was that there was clearly a lot of awareness of the issue across those different stakeholders, and what felt like a real, genuine level of concern and also of motivation, to want to deepen their understanding of what their contribution on a personal level really meant.”

    Working together across the boundaries of different disciplines and areas of expertise could be crucial to winning the battle against global warming, Newman said. “That’s usually how breakthroughs work,” she said. “If we’re really looking to have impact, it’s going to be from teams of people who are trained across the disciplines.” She pointed out that 90 percent of MIT students are also musicians: “It does go together!” she said. “I think going forward, we have to create new academia, new opportunities that are truly multidisciplinary.”

  • A robot that finds lost items

    A busy commuter is ready to walk out the door, only to realize they’ve misplaced their keys and must search through piles of stuff to find them. Rapidly sifting through clutter, they wish they could figure out which pile was hiding the keys.

    Researchers at MIT have created a robotic system that can do just that. The system, RFusion, is a robotic arm with a camera and radio frequency (RF) antenna attached to its gripper. It fuses signals from the antenna with visual input from the camera to locate and retrieve an item, even if the item is buried under a pile and completely out of view.

    The RFusion prototype the researchers developed relies on RFID tags, which are cheap, battery-less tags that can be stuck to an item and reflect signals sent by an antenna. Because RF signals can travel through most surfaces (like the mound of dirty laundry that may be obscuring the keys), RFusion is able to locate a tagged item within a pile.

    Using machine learning, the robotic arm automatically zeroes in on the object’s exact location, moves the items on top of it, grasps the object, and verifies that it picked up the right thing. The camera, antenna, robotic arm, and AI are fully integrated, so RFusion can work in any environment without requiring a special setup.

    While finding lost keys is helpful, RFusion could have many broader applications in the future, like sorting through piles to fulfill orders in a warehouse, identifying and installing components in an auto manufacturing plant, or helping an elderly individual perform daily tasks in the home, though the current prototype isn’t yet fast enough for these uses.

    “This idea of being able to find items in a chaotic world is an open problem that we’ve been working on for a few years. Having robots that are able to search for things under a pile is a growing need in industry today. Right now, you can think of this as a Roomba on steroids, but in the near term, this could have a lot of applications in manufacturing and warehouse environments,” said senior author Fadel Adib, associate professor in the Department of Electrical Engineering and Computer Science and director of the Signal Kinetics group in the MIT Media Lab.

    Co-authors include research assistant Tara Boroushaki, the lead author; electrical engineering and computer science graduate student Isaac Perper; research associate Mergen Nachin; and Alberto Rodriguez, the Class of 1957 Associate Professor in the Department of Mechanical Engineering. The research will be presented at the Association for Computing Machinery Conference on Embedded Networked Sensor Systems next month.

    Sending signals

    RFusion begins searching for an object using its antenna, which bounces signals off the RFID tag (like sunlight being reflected off a mirror) to identify a spherical area in which the tag is located. It combines that sphere with the camera input, which narrows down the object’s location. For instance, the item can’t be located on an area of a table that is empty.
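    To make that fusion step concrete, here is a minimal sketch (not the authors’ code; the function name and tolerance value are assumptions) of how an RF-derived range estimate can be intersected with camera points to narrow the search region:

    ```python
    # Illustrative sketch: the RFID round trip gives a distance from the antenna,
    # i.e., a sphere of possible tag positions; keeping only camera points near
    # that sphere rules out places the item cannot be (such as empty table space).
    import numpy as np

    def candidate_points(points_xyz, antenna_pos, rf_range, tolerance=0.05):
        """points_xyz: (N, 3) camera points in meters; rf_range: RF distance estimate."""
        dists = np.linalg.norm(points_xyz - np.asarray(antenna_pos), axis=1)
        return points_xyz[np.abs(dists - rf_range) < tolerance]
    ```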

    But even once the robot has a general idea of where the item is, it would still need to swing its arm widely around the room, taking additional measurements, to pin down the exact location, which is slow and inefficient.

    The researchers used reinforcement learning to train a neural network that can optimize the robot’s trajectory to the object. In reinforcement learning, the algorithm is trained through trial and error with a reward system.

    “This is also how our brain learns. We get rewarded from our teachers, from our parents, from a computer game, etc. The same thing happens in reinforcement learning. We let the agent make mistakes or do something right and then we punish or reward the network. This is how the network learns something that is really hard for it to model,” Boroushaki explains.

    In the case of RFusion, the optimization algorithm was rewarded when it limited the number of moves it had to make to localize the item and the distance it had to travel to pick it up.
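    A hedged sketch of a reward with that shape (the weights and success bonus are illustrative assumptions, not values from the paper) might look like this:

    ```python
    # Each measurement move costs a fixed penalty plus a penalty proportional to
    # the distance the arm travels; successfully localizing the tag earns a bonus.
    def step_reward(moved_distance, localized, move_penalty=1.0,
                    distance_penalty=10.0, success_bonus=100.0):
        reward = -move_penalty - distance_penalty * moved_distance
        if localized:
            reward += success_bonus
        return reward
    ```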

    Once the system pinpoints the exact spot, the neural network uses combined RF and visual information to predict how the robotic arm should grasp the object, including the angle of the hand and the width of the gripper, and whether it must remove other items first. It also scans the item’s tag one last time to make sure it picked up the right object.
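    As an illustration only (the architecture and feature sizes below are assumptions, not the published model), a small network along these lines could map fused RF and visual features to those grasp parameters:

    ```python
    import torch
    import torch.nn as nn

    class GraspHead(nn.Module):
        """Toy grasp predictor: fused RF + visual features in, grasp parameters out."""
        def __init__(self, rf_dim=16, visual_dim=128):
            super().__init__()
            self.backbone = nn.Sequential(
                nn.Linear(rf_dim + visual_dim, 64), nn.ReLU(),
                nn.Linear(64, 64), nn.ReLU(),
            )
            self.angle = nn.Linear(64, 1)      # hand angle
            self.width = nn.Linear(64, 1)      # gripper opening
            self.declutter = nn.Linear(64, 1)  # logit: remove items on top first?

        def forward(self, rf_feat, visual_feat):
            h = self.backbone(torch.cat([rf_feat, visual_feat], dim=-1))
            return self.angle(h), self.width(h), torch.sigmoid(self.declutter(h))
    ```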

    Cutting through clutter

    The researchers tested RFusion in several different environments. They buried a keychain in a box full of clutter and hid a remote control under a pile of items on a couch.

    But feeding all of the camera data and RF measurements to the reinforcement learning algorithm would have overwhelmed the system. So, drawing on the method a GPS receiver uses to consolidate data from satellites, they summarized the RF measurements and limited the visual data to the area right in front of the robot.
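    That kind of data reduction might look like the following sketch (the median summary and crop window are assumptions for illustration, not the authors’ exact method):

    ```python
    import numpy as np

    def summarize_rf(range_readings):
        # Collapse many noisy RF range readings into one robust estimate.
        return float(np.median(range_readings))

    def crop_visual(depth_image, center_row, center_col, half_window=64):
        # Keep only the camera data in a window directly in front of the robot.
        r0 = max(center_row - half_window, 0)
        c0 = max(center_col - half_window, 0)
        return depth_image[r0:center_row + half_window, c0:center_col + half_window]
    ```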

    Their approach worked well — RFusion had a 96 percent success rate when retrieving objects that were fully hidden under a pile.

    “Sometimes, if you only rely on RF measurements, there is going to be an outlier, and if you rely only on vision, there is sometimes going to be a mistake from the camera. But if you combine them, they are going to correct each other. That is what made the system so robust,” Boroushaki says.

    In the future, the researchers hope to increase the speed of the system so it can move smoothly, rather than stopping periodically to take measurements. This would enable RFusion to be deployed in a fast-paced manufacturing or warehouse setting.

    Beyond its potential industrial uses, a system like this could even be incorporated into future smart homes to assist people with any number of household tasks, Boroushaki says.

    “Every year, billions of RFID tags are used to identify objects in today’s complex supply chains, including clothing and lots of other consumer goods. The RFusion approach points the way to autonomous robots that can dig through a pile of mixed items and sort them out using the data stored in the RFID tags, much more efficiently than having to inspect each item individually, especially when the items look similar to a computer vision system,” says Matthew S. Reynolds, CoMotion Presidential Innovation Fellow and associate professor of electrical and computer engineering at the University of Washington, who was not involved in the research. “The RFusion approach is a great step forward for robotics operating in complex supply chains where identifying and ‘picking’ the right item quickly and accurately is the key to getting orders fulfilled on time and keeping demanding customers happy.”

    The research is sponsored by the National Science Foundation, a Sloan Research Fellowship, NTT DATA, Toppan, Toppan Forms, and the Abdul Latif Jameel Water and Food Systems Lab.