The European Space Agency (ESA) wants you to play video games with flying drones so it can develop better space robots. A free app for the iPhone and iPad lets owners of Parrot AR.Drone quadricopters fly their remote-controlled robots through simulated dockings with the International Space Station. By crowdsourcing the data, ESA figures it will arrive faster at better robotic docking strategies.
The new AstroDrone app is part of a larger ESA project run by its Advanced Concepts Team. This artificial-intelligence project aims to collect data, via crowdsourcing, about how robots navigate their environments. In particular, it aims to improve robots' ability to autonomously estimate distances to other objects, a skill clearly needed in the complex task of docking in space.
A free iPhone video game app turns your Parrot AR.Drone into a simulated spacecraft, which you can use to simulate docking on the International Space Station. You get points for accuracy and speed, and the European Space Agency gets tons of data to help make better space robots. (Source: European Space Agency/Anneke Le Floc'h)
In the AstroDrone experiment, the team's goal is to discover whether robots can learn to accurately estimate distances merely by looking at still images. Players put an augmented-reality marker on a feature in physical reality, to indicate that this object represents the Space Station's docking port. They then try to dock the AR.Drone on an image of the Space Station's port as fast as possible, but also carefully and accurately. Players can win extra points for docking the drone with the correct orientation, as well as for achieving a low-speed final approach. (Watch a video of an AR.Drone being docked on a model of the Space Station at the ESA's European Space Research and Technology Centre, below.)
Partial image data, in the form of Speeded-Up Robust Features (SURF), is identified by the OpenSURF algorithm the team uses. The algorithm identifies image features along with their scale and orientation. The identification of a feature's size, shape, and orientation, and the extraction of such partial data from a still image, sounds very much like the sophisticated machine vision software used industrially for object recognition and defect detection.
On the player's iPhone, SURFs are extracted from a sequence of still images using Willow Garage's OpenCV library. Players then send this data, along with the robot's height, attitude angles, velocities, and other state information, to a central database. Players can also send their performance data when they log their scores on the app's high-score table. All of this input is anonymous: no GPS data or raw video images are sent. The only data players transmit are the abstracted mathematical features of the images the drone sees and uses for navigation, plus velocity readings.
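To give a feel for what "abstracted mathematical features" means, here is a toy interest-point detector in Python/NumPy. It is a simple Harris-style corner response, not the SURF/OpenSURF pipeline the team actually uses, and the function name, test image, and threshold are invented for illustration; but it shows how an image can be reduced to a sparse set of feature locations rather than raw pixels.

```python
import numpy as np

def harris_response(img, k=0.04):
    """Compute a Harris-style corner response for a grayscale image.

    Toy stand-in for a real interest-point detector such as SURF:
    locations with a strong response are 'features' that can be
    described and transmitted instead of the raw image.
    """
    # Image gradients via central finite differences
    Ix = np.zeros_like(img, dtype=float)
    Iy = np.zeros_like(img, dtype=float)
    Ix[:, 1:-1] = (img[:, 2:] - img[:, :-2]) / 2.0
    Iy[1:-1, :] = (img[2:, :] - img[:-2, :]) / 2.0

    # Structure-tensor components, smoothed with a small box filter
    def box(a, r=1):
        out = np.zeros_like(a)
        for dy in range(-r, r + 1):
            for dx in range(-r, r + 1):
                out += np.roll(np.roll(a, dy, axis=0), dx, axis=1)
        return out / (2 * r + 1) ** 2

    Ixx, Iyy, Ixy = box(Ix * Ix), box(Iy * Iy), box(Ix * Iy)
    det = Ixx * Iyy - Ixy ** 2
    trace = Ixx + Iyy
    return det - k * trace ** 2

# Synthetic image with one bright square: its corners should respond
img = np.zeros((32, 32))
img[10:20, 10:20] = 1.0
R = harris_response(img)
ys, xs = np.where(R > 0.1 * R.max())  # sparse set of feature locations
```

The response is largest at the square's corners (both gradients present), near zero in flat regions, and negative along edges, so thresholding it yields a handful of feature coordinates in place of 1,024 pixels.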
The AR.Drone 2.0, which can be controlled from an iPad or iPhone, carries plenty of hardware to make the project possible, including two cameras: a 30-fps HD 720p camera with a 92-degree wide-angle lens, and a 60-fps vertical QVGA camera for measuring ground speed. Its embedded 1-GHz 32-bit ARM Cortex-A8 CPU includes an 800-MHz video DSP and runs Linux 2.6.32. Memory is 1 Gbit of 200-MHz DDR2 RAM. Sensors include a 3-axis accelerometer, a 3-axis gyroscope, a 3-axis magnetometer, and ultrasound sensors for measuring altitude above the ground.
With about 500,000 AR.Drone quadricopters in gamers' hands, the ESA team hopes that a large player base and a high volume of input will help researchers develop spacecraft that can dock and land themselves autonomously and correctly. The key is building a database of SURFs for objects of standard size, paired with corresponding distances. From this information, algorithms can be developed that accurately extract distances from images once an object and its features have been identified in the database.
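The idea that a known object size plus an apparent size yields a distance rests on simple projective geometry. Here is a minimal sketch in Python; the function name and the numbers (marker size, focal length in pixels) are hypothetical, not taken from the ESA project.

```python
def distance_from_apparent_size(real_size_m, apparent_size_px, focal_length_px):
    """Pinhole-camera model: an object of known physical size that spans
    a known number of pixels sits at a distance proportional to the
    focal length and inversely proportional to its apparent size."""
    return focal_length_px * real_size_m / apparent_size_px

# Hypothetical example: a 0.2-m docking marker spanning 100 px,
# seen by a camera with a 700-px focal length, is 1.4 m away.
print(distance_from_apparent_size(0.2, 100, 700))  # 1.4
```

Once a feature in the database has a known real-world size, every sighting of it in a still image gives the robot a distance estimate of this kind, with no rangefinder required.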
The team is planning more levels of the game that will have players guide the AR.Drone to connect with other space objects -- such as the agency's planned 2014 rendezvous of its Rosetta probe with comet 67P/Churyumov-Gerasimenko -- as well as versions of the app for other, unspecified devices.
Chuck, I understand the fear. I've read a ton of science fiction since age 11, so I probably have some of the same worries you do about the Robot Takeover and the War with the Machines (oops, wrong universe). At least I'll probably be long gone if that happens.
Relying on "crowdsourcing" is not a fast-moving option. Depending on whether people are paid, the work can take a while to push through. People either have to absolutely love the product or company, or get paid, to help a product along...
Or in the immortal words of one Internet meme: "Ain't nobody got time for that!"
It seems that the data created as various folks attempt to dock at a simulated space station is somehow going to be useful. Right??? The game will certainly obtain lots of data. But the main value of data is created when it is condensed into knowledge, and the value of knowledge is that it can lead to insight and understanding. Exactly how that happens here is not completely clear to me just yet.
Cabe, if you're responding to my comment, "that didn't take much motivation," I meant that people who have already invested $300 in the AR.Drone--not small change if you ask me--therefore don't need much motivation to use it in this crowdsourcing app. You had said that getting people to participate might be problematic, especially if they weren't paid. But dedicated gamers like these don't need an excuse to play.
William, there's more detail about what the researchers are doing, and plan to do, in the links we gave in the article, including a list of references at the end of the AI project article. There's also a list of publications by the Advanced Concepts Team here: http://www.esa.int/gsp/ACT/publications/index.htm
Ann, thanks for pointing out that the links had additional useful information. Unfortunately for me, I guess, I seldom follow links placed in articles, partly because some of those links have been quite slow in the past.