Sixteen of the tiny flying quadrotor robots built by a team at the University of Pennsylvania's General Robotics, Automation, Sensing and Perception (GRASP) Laboratory gave a dazzling sound and light show recently at the Cannes Lions International Festival of Creativity in France. Flying in swarms and equipped with mirrors, they demonstrated their dexterity in a performance at the Saatchi & Saatchi New Directors' Showcase 2012. (You can watch a video of the Cannes performance at the bottom of this post.)
These autonomous, hand-sized acrobats were developed by the electrical engineer Alex Kushleyev and the mechanical engineer Daniel Mellinger, who formed KMel Robotics late last year. In the Cannes show video, the quadrotors dance and manipulate sound and light. On the KMel Robotics Website, credit is given to Vicon for motion capture systems, Analog Devices for MEMS rate gyroscopes, and Murata for accelerometer sensors.
Flying in swarms and equipped with mirrors, 16 of the University of Pennsylvania's tiny flying quadrotor robots dazzled an audience at Cannes by manipulating sound and light and dancing to music. (Source: KMel Robotics)
Since we first reported on the quadrotors, more information on them has become available, primarily in the form of videos on the GRASP Lab's site. The longest and most complete video is a TED talk by Vijay Kumar, a professor at Penn's School of Engineering and Applied Science (SEAS) and former head of the GRASP Lab.
The talk gives a wealth of details. For example, the robots weigh between 50 g and 4 kg and change their motion by varying each rotor's speed relative to the others; an onboard processor computes those speeds from data supplied by onboard sensors, including gyroscopes. Kumar also presents the robots as alternatives to unmanned aerial vehicles for military applications, and notes they could be useful in building construction and first-responder reconnaissance.
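The rotor-speed idea Kumar describes can be sketched in a few lines. This is a purely illustrative mixer for a hypothetical "+"-configuration quadrotor (the function name and sign conventions are my own assumptions, not anything from the GRASP Lab's code):

```python
# Hypothetical sketch, not KMel's actual control code.
# A "+"-configuration quadrotor turns a desired collective thrust plus
# small roll/pitch/yaw corrections into four per-rotor commands by
# adding and subtracting those corrections.

def mix(thrust, roll, pitch, yaw):
    """Map thrust and roll/pitch/yaw corrections to rotor commands,
    returned in (front, right, back, left) order."""
    front = thrust + pitch - yaw
    back  = thrust - pitch - yaw
    right = thrust - roll + yaw
    left  = thrust + roll + yaw
    return front, right, back, left

# Pure hover: all four rotors spin at the same speed.
print(mix(1.0, 0.0, 0.0, 0.0))  # (1.0, 1.0, 1.0, 1.0)
```

Tilting the craft is just an imbalance: a positive pitch command speeds up the front rotor and slows the back one, which is why "relative speed" is the key phrase in Kumar's talk.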
Kushleyev, Mellinger, and Kumar have authored a paper giving further details (PDF). They presented the paper at the Robotics Science and Systems 2012 conference in Sydney.
In our previous story, we hypothesized that the GRASP Lab's participation in the Scalable sWarms of Autonomous Robots and Mobile Sensors (SWARMS) project might explain the lack of details about the quadrotors in earlier announcements. The SWARMS project, associated with the Army Research Office and the Army Institute of Collaborative Biotechnology, posts military-related goals on its Website. It combines robotics, systems engineering, artificial intelligence, control theory, and biology to apply biologically inspired models of swarm behaviors to large networked groups of autonomous vehicles that respond to high-level management commands. The robots could be used for things like rescue missions after natural disasters.
According to Kumar's TED talk, the quadrotors' abilities are clearly related to this project. However, according to a news story on the SEAS Website, KMel does not plan to use its technology for surveillance or military applications. The company designs, builds, and programs customized versions of the quadrotors "for research that will advance algorithms, sensing capabilities, and control mechanisms for group cooperation and autonomous flight."
Interesting way to show off the capabilities of the robots, but I have to admit, it's hard for me to envision a role with real utility other than some good old entertainment. It reminded me of a scene straight out of the Blue Man Group. On a more serious note, these robots are obviously quite powerful and I imagine have the potential to be put to good use.
I agree, the music was totally over the top, but hey, that's show business, and the sponsoring company is an ad agency. I would think the KMel guys decided to participate either because they were asked to (after a video of the swarm playing the James Bond theme went viral) or because it looked like a great opportunity to demonstrate their tech, or both. Although KMel has said its technology is not aimed at military uses, the original U of Pa development was definitely done in that context. I can easily visualize military uses for swarming robots that can coordinate their movements so precisely.
Admittedly, if I hadn't seen the earlier videos that you've posted, Ann, I'm not sure I would have understood what I was looking at in this rather dark video. Seeing the earlier video and then watching this, however, I understand how stunning this technology is. It's a testament to the incredible creativity of the engineers at the University of Pennsylvania, and also a testament to the engineers who made the sensors, particularly the MEMS gyroscopes.
To Charles's point, I thought similarly: the dark video does not do justice to the incredible technological synchronization we are witnessing.
These flying robots are like a magical blend of micromechanics, lightweight electronics, and amazingly executed software algorithms, producing an absolutely life-like performance. Incredibly impressive.
I'm speculating that the software algorithms might actually mic the music, processing tones and loudness to help determine movements and rates of change in flight speed. Literally dancing to the music?
God willing, I'm only about halfway through my career, but I've seen this pattern often enough to have a guess at what will happen next. The first time I saw it was in graduate school, when our department received a huge grant to buy new personal computers. The much-respected professor in charge of the purchase insisted that the PCs be delivered with green-monochrome monitors rather than the much more expensive color CRTs. His reasoning was "Why the hell do we need color in science? Color screens are only for playing games and watching porn." Now that data visualization is central to the discovery process, it's hard to believe that attitude ever existed.
Another example was evident at the February 2006 TED conference, where Jeff Han presented a futuristic multi-touch interface. Microsoft made some acquisitions and demonstrated its "Surface" concept soon after, but it wasn't until Apple launched the iPad in April 2010 that multi-touch went mainstream.
The applications of autonomous swarms are following a similar pattern: first defense, then entertainment, and next commercial applications. In a few short years, autonomous swarms will be commonplace, with all sorts of practical uses. It may become difficult to imagine how we got along without them.
In this case, I believe that the lights are controlled by the music. In show biz, DMX controllers are used to remotely set the on/off state and intensity of lighting. (A popular instance of this can be found in the YouTube videos of Christmas light displays set to the Trans-Siberian Orchestra's rendition of 'Carol of the Bells', etc., and DIY project plans are available to construct DMX controllers.) It makes sense that the QRs are just flying a pattern to fixed points in space and time, with the appropriate mirror angles. A more interesting variation, as suggested by the opening sequence, would be to have the QRs' altitude control the music pitch. A LIDAR range value can be converted to a pitch value fairly simply.
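To make that last point concrete, here's a minimal sketch of the range-to-pitch idea. The function name and the flight-envelope and frequency numbers are my own illustrative assumptions, not anything from the show:

```python
# Hypothetical sketch: map a LIDAR altitude reading (meters) to an
# audio frequency (Hz), so a quadrotor's height plays a pitch.

def altitude_to_freq(alt_m, alt_min=0.5, alt_max=3.0,
                     f_low=220.0, f_high=880.0):
    """Linearly map altitude onto a frequency range, clamping
    readings outside the expected flight envelope."""
    alt = max(alt_min, min(alt_max, alt_m))
    frac = (alt - alt_min) / (alt_max - alt_min)
    return f_low + frac * (f_high - f_low)

print(altitude_to_freq(0.5))  # 220.0  (floor of the envelope -> A3)
print(altitude_to_freq(3.0))  # 880.0  (ceiling of the envelope -> A5)
```

Feed the result to any synthesizer or tone generator and each robot's altitude becomes a note; a logarithmic mapping would track musical intervals more naturally, but the linear version shows how simple the conversion is.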
Diogenes, thanks for that explanation: it makes sense that the lights are controlled by the music. But the text accompanying the video says the swarm manipulates the music. Any idea how it does that? I would have guessed the opposite to be true, i.e., that the music determined the robots' movements.
Right, as I suggested, "literally dancing to the music." I've also seen the Christmas light extravaganza choreographed to the Trans-Siberian Orchestra, and I'm thinking this may be similarly executed. While I have no expertise on the specific means, I was thinking along the lines of a 25-year-old technology: the LED bars of a graphic audio equalizer responding to sound, with sound pattern and intensity directly affecting light response. I imagine it would be a rather direct exercise for one skilled in the art.
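The equalizer analogy is easy to sketch: loudness in, light level out. This is a hypothetical illustration (the windowing and 0-255 DMX-style scaling are my assumptions, not how the Cannes show was actually wired):

```python
import math

# Hypothetical sketch of the graphic-equalizer idea: convert a short
# window of audio samples into a 0-255 DMX-style light intensity.

def window_to_dmx(samples):
    """RMS loudness of one audio window, scaled to a DMX channel value.
    Samples are assumed to lie in [-1.0, 1.0]."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return min(255, int(rms * 255))

quiet = [0.05 * math.sin(i / 10) for i in range(100)]
loud  = [0.9 * math.sin(i / 10) for i in range(100)]
print(window_to_dmx(quiet) < window_to_dmx(loud))  # True
```

A real equalizer would first split the signal into frequency bands (e.g., with an FFT) and drive one channel per band, but the loudness-to-intensity mapping above is the core of the effect.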