The European Space Agency (ESA) wants you to play video games with flying drones so it can develop better space robots. A free app that runs on the iPhone or iPad lets owners of Parrot AR.Drone quadricopters fly their remote-controlled robots through simulated dockings with the International Space Station. By crowdsourcing the task, the ESA figures it will arrive faster at better robotic docking strategies.
The new AstroDrone app is part of a larger ESA project run by its Advanced Concepts Team. This Artificial Intelligence project aims at collecting data via crowdsourcing about how robots navigate their environments. In particular, it's aimed at improving robots' abilities to autonomously estimate distances to other objects, a skillset clearly needed in the complex task of docking in space.
A free iPhone video game app turns your Parrot AR.Drone into a simulated spacecraft, which you can use to practice docking on the International Space Station. You get points for accuracy and speed, and the European Space Agency gets tons of data to help make better space robots. (Source: European Space Agency/Anneke Le Floc'h)
In the AstroDrone experiment, the team's goal is to discover whether robots can learn to accurately estimate distances merely by looking at still images. Players put an augmented-reality marker on a feature in physical reality, to indicate that this object represents the Space Station's docking port. They then try to dock the AR.Drone on an image of the Space Station's port as fast as possible, but also carefully and accurately. Players can win extra points for docking the drone with the correct orientation, as well as for achieving a low-speed final approach. (Watch a video of an AR.Drone being docked on a model of the Space Station at the ESA's European Space Research and Technology Centre, below.)
The partial image data, known as Speeded-Up Robust Features (SURF), is identified with the open-source OpenSURF algorithm, which locates distinctive image features along with their scales and orientations. The identification of an object's size, shape, and orientation, and the extraction of such partial data from a still image, sounds very much like the sophisticated industrial machine-vision software used in object recognition and defect detection.
On the player's iPhone, SURFs are extracted from a sequence of still images using Willow Garage's OpenCV. Players then send this data, along with information about the robot's height, attitude angles, velocities, and other data about its state, to a central database. Also, as players log their game scores on the app's high-score table, they can send their performance data. All of this input is anonymous: no GPS data or raw video images are sent. The only data players send is the abstracted mathematical features of images that the drone sees and uses for navigation, as well as velocity readings.
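The article doesn't publish the app's actual upload format, but as a minimal sketch of the kind of anonymized record described above (all field names and values here are assumptions for illustration), abstracted feature descriptors would be paired with the drone's state, with no GPS coordinates or raw frames included:

```python
import json
from dataclasses import dataclass, asdict
from typing import List

# Hypothetical record mirroring the data described in the article:
# abstracted image features plus the drone's state -- no GPS, no raw video.
@dataclass
class FeatureRecord:
    descriptors: List[List[float]]  # e.g. SURF descriptor vectors
    altitude_m: float               # from the ultrasound altimeter
    attitude_deg: List[float]       # roll, pitch, yaw
    velocity_mps: List[float]       # vx, vy, vz

def encode_record(record: FeatureRecord) -> str:
    """Serialize one observation for upload to the central database."""
    return json.dumps(asdict(record))

sample = FeatureRecord(
    descriptors=[[0.1, 0.1, 0.1, 0.1]],  # truncated descriptor, for illustration
    altitude_m=1.2,
    attitude_deg=[0.0, 2.5, 90.0],
    velocity_mps=[0.3, 0.0, 0.0],
)
payload = encode_record(sample)
```

The design point is the same one the article makes: only abstracted mathematical features and state readings leave the device, never imagery or location.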
The AR.Drone 2.0, which can be controlled by an iPad or iPhone, carries a lot of hardware to make the project possible, including two different cameras: a 30-fps HD 720p camera with a 92-degree wide-angle lens, and a 60-fps vertical QVGA camera for measuring ground speed. Its embedded 1-GHz 32-bit ARM Cortex-A8 CPU has an 800-MHz video DSP and runs Linux 2.6.32. Memory is 1 Gbit of 200-MHz DDR2 RAM. Sensors include a 3-axis accelerometer, a 3-axis gyroscope, a 3-axis magnetometer, and ultrasound sensors for measuring altitude above the ground.
With about 500,000 AR.Drone quadricopters in gamers' hands, the ESA team hopes using high numbers of players and multiple inputs will help the researchers develop spacecraft that are autonomous and can independently, and correctly, dock and land themselves. The key to this is creating a database of SURFs for objects with standard sizes, combined with a set of corresponding distances. From this information, algorithms can be developed that accurately extract distances from images once an object and its features are identified in the database.
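The article doesn't spell out how distance is recovered once an object and its true size are matched in the database, but a simple pinhole-camera model illustrates the idea (the focal length and sizes below are made up for illustration):

```python
def distance_from_apparent_size(focal_px: float,
                                real_size_m: float,
                                apparent_size_px: float) -> float:
    """Pinhole-camera estimate: distance = focal_length * real_size / apparent_size.

    If a docking-port marker 0.30 m wide spans 60 px in an image taken
    with a 700 px focal length, the camera is about 3.5 m away.
    """
    return focal_px * real_size_m / apparent_size_px

d = distance_from_apparent_size(700.0, 0.30, 60.0)  # -> 3.5
```

In a scheme like the one described, the feature database would supply an object's physical size while the matched features in the live image supply its apparent size, from which a distance estimate follows.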
The team is planning more levels of the game that will guide the AR.Drone to connect with other space objects -- such as the agency's planned 2014 rendezvous of its Rosetta probe with comet 67P/Churyumov-Gerasimenko -- as well as versions of the app for other, unspecified devices.
Excellent story and a unique application for mobile devices. I don't think I have the dexterity and coordination to make this work, but I'm sure there is a whole generation of gamers that can help them gather data. Thanks.
Glad you enjoyed the story, Al. I thought it was a fun app, and also good to know that at least some space agencies are open to the crowdsourcing concept. I've heard of SETI's requests for help from millions of people with PCs, but not anything about NASA using the crowdsourcing approach. Anyone know?
Every time I see one of these stories, I can't help but think of an old Chicago-ism: "They're muscling in our rackets." Increasingly, I'm seeing a lot of tasks that robots can do more effectively than humans. And the kicker to this story is they now want all of us to help them learn. I know it's logical; it's all in the best interests of science and technology; it's probably helpful to mankind in a hundred different ways that I can't even imagine. Still, I have this niggling fear, and I know it's not the most enlightened view -- but they're muscling in our rackets.
Chuck, your comments make me think of black-and-white '40s gangster movies, many of them set in Chicago. Anyway, in this case I think teaching robots how to dock on the ISS makes more sense than trying to teach humans to do it. Of course, astronauts and people who want to be astronauts might not agree.
Ann, maybe there's a movie here -- a cross between 2001: A Space Odyssey and Public Enemy. As to it making sense for applications that humans don't want to do, I concur. From a completely logical standpoint, I know that robotics make sense in countless applications. Admittedly, my fear is illogical. Still, every time I read one of these stories...
Chuck, I understand the fear. I've read a ton of science fiction since age 11, so I probably have some of the same worries you do about the Robot Takeover and the War with the Machines (oops, wrong universe). At least I'll probably be long gone if that happens.
Relying on "crowdsourcing" is not a fast-moving option. Depending on whether people are paid, work takes a while to push through. People either have to absolutely love the product or company, or get paid to help a product along...
Or, in the immortal words of one Internet meme: "Ain't nobody got time for that!"
Cabe, if you're responding to my comment, "that didn't take much motivation," I meant that people who have already invested $300 in the AR.Drone--not small change if you ask me--therefore don't need much motivation to use it in this crowdsourcing app. You had said that getting people to participate might be problematic, especially if they weren't paid. But dedicated gamers like these don't need an excuse to play.
Al, one aspect of AI that these researchers are trying to fulfill with crowdsourcing is to gather lots of data (lots of people in the crowd). Even if you were not very skilled at it, that data might be useful as well.
The main point of crowdsourcing is big numbers, so lots of data. In this case, ESA has designed the app to motivate players to do an excellent job, thus providing the type of data they want, i.e., what are the best ways to approach Object A reliably with millions of copies of Object B, in turn learning methods that can be taught to other robots.
In addition to using crowdsourcing to improve its design, ESA also builds public awareness of its space programs (which could ultimately lead to more favorable funding for certain programs by the public and their governments).
It seems that the data created as various folks attempt to dock at a simulated space station is somehow going to be useful. Right??? The game will certainly obtain lots of data, for sure. But the main value of data is created when it is condensed into knowledge, and the value of knowledge is that it can lead to insight and understanding. Exactly how that happens in this situation is not completely clear to me just yet.
William, there's more detail about what the researchers are doing, and plan to do, in the links we gave in the article, including a list of references at the end of the AI project article. There's also a list of publications by the Advanced Concepts Team here: http://www.esa.int/gsp/ACT/publications/index.htm
Ann, thanks for pointing out that the links had additional useful information. Unfortunately for me, I guess, I seldom follow links placed in articles, partly because some of those links have been quite slow in the past.