The European Space Agency (ESA) wants you to play video games with flying drones so it can develop better space robots. A free app that runs on the iPhone or iPad lets owners of Parrot AR.Drone quadricopters navigate their remote-controlled robots to perform dockings in a simulation of the International Space Station. By crowdsourcing the data, the ESA figures it will arrive faster at improved robotic docking strategies.
The new AstroDrone app is part of a larger ESA project run by its Advanced Concepts Team. This artificial intelligence project aims to collect crowdsourced data about how robots navigate their environments. In particular, it is aimed at improving robots' ability to autonomously estimate distances to other objects, a skill clearly needed in the complex task of docking in space.
A free iPhone video game app turns your Parrot AR.Drone into a simulated spacecraft, which you can use to simulate docking on the International Space Station. You get points for accuracy and speed, and the European Space Agency gets tons of data to help make better space robots. (Source: European Space Agency/Anneke Le Floc'h)
In the AstroDrone experiment, the team's goal is to discover whether robots can learn to accurately estimate distances merely by looking at still images. Players put an augmented-reality marker on a feature in physical reality, to indicate that this object represents the Space Station's docking port. They then try to dock the AR.Drone on an image of the Space Station's port as fast as possible, but also carefully and accurately. Players can win extra points for docking the drone with the correct orientation, as well as for achieving a low-speed final approach. (Watch a video of an AR.Drone being docked on a model of the Space Station at the ESA's European Space Research and Technology Centre, below.)
Partial image data, in the form of Speeded-Up Robust Features (SURF), is identified using the OpenSURF implementation of the SURF algorithm. The algorithm identifies distinctive image regions along with their scales and orientations. The identification of an object's size, shape, and orientation, and the extraction of such partial data from a still image, sounds very much like the sophisticated industrial machine-vision software used in object recognition and defect detection.
On the player's iPhone, SURFs are extracted from a sequence of still images using Willow Garage's OpenCV. Players then send this data, along with information about the robot's height, attitude angles, velocities, and other data about its state, to a central database. Also, as players log their game scores on the app's high-score table, they can send their performance data. All of this input is anonymous: no GPS data or raw video images are sent. The only data players send is the abstracted mathematical features of images that the drone sees and uses for navigation, as well as velocity readings.
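The kind of anonymized record described above can be sketched as follows. This is only an illustration of the privacy claim, not ESA's actual upload schema; every field name here is hypothetical.

```python
import json

def build_record(surf_descriptors, altitude_m, attitude_deg, velocity_mps):
    """Bundle abstracted image features and drone state into a
    JSON-serializable record -- no GPS fix and no raw video frames.
    (Hypothetical field names; not ESA's real schema.)"""
    return {
        "surf_descriptors": surf_descriptors,  # lists of floats, not pixels
        "altitude_m": altitude_m,              # from the ultrasound sensor
        "attitude_deg": attitude_deg,          # roll, pitch, yaw
        "velocity_mps": velocity_mps,          # vx, vy, vz
    }

record = build_record(
    surf_descriptors=[[0.12, -0.05, 0.33], [0.08, 0.41, -0.19]],
    altitude_m=1.4,
    attitude_deg=(2.0, -1.5, 87.0),
    velocity_mps=(0.3, 0.0, -0.1),
)
payload = json.dumps(record)  # what would go to the central database
```

The point of the sketch is what is *absent*: only abstracted mathematical features and state readings are serialized, matching the article's claim that no location or imagery leaves the phone.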
The AR.Drone 2.0, which can be controlled by an iPad or iPhone, carries a lot of hardware to make the project possible, including two different cameras: a 30-fps HD 720p camera with a 92-degree wide-angle lens, and a 60-fps vertical QVGA camera for measuring ground speed. Its embedded 1-GHz 32-bit ARM Cortex-A8 CPU has an 800-MHz video DSP and runs Linux 2.6.32. Memory is 1 Gbit of 200-MHz DDR2 RAM. Sensors include a 3-axis accelerometer, a 3-axis gyroscope, a 3-axis magnetometer, and ultrasound sensors for measuring ground altitude.
With about 500,000 AR.Drone quadricopters in gamers' hands, the ESA team hopes that a large number of players and inputs will help the researchers develop autonomous spacecraft that can independently, and correctly, dock and land themselves. The key to this is creating a database of SURFs for objects of standard sizes, combined with a set of corresponding distances. From this information, algorithms can be developed that accurately extract distances from images once an object and its features are identified in the database.
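The core idea above, recovering distance once an object of known physical size has been recognized, can be sketched with the standard pinhole-camera relation Z = f·W/w. The formula is textbook; the numbers below are illustrative and are not ESA's calibration values.

```python
def distance_from_apparent_size(focal_px, real_width_m, apparent_width_px):
    """Pinhole-camera distance estimate: Z = f * W / w.

    focal_px          -- camera focal length expressed in pixels
    real_width_m      -- known physical width of the recognized object (meters)
    apparent_width_px -- width the object spans in the image (pixels)
    """
    return focal_px * real_width_m / apparent_width_px

# Illustrative numbers: a 0.8 m docking-port target spanning 160 px,
# seen by a camera with a 640 px focal length, is about 3.2 m away.
d = distance_from_apparent_size(focal_px=640, real_width_m=0.8, apparent_width_px=160)
```

In the crowdsourced scheme, the database of SURFs for known objects would supply the "known physical size" half of this equation, and the matched features in a new image would supply the apparent size.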
The team is planning more levels of the game that will have players guide the AR.Drone to dock with other space objects -- such as the agency's planned 2014 rendezvous of its Rosetta probe with comet 67P/Churyumov-Gerasimenko -- as well as versions of the app for other, unspecified devices.
Al, one aspect of AI that these researchers are trying to fulfill with crowdsourcing is to gather lots of data (lots of people in the crowd). Even if you were not very skilled at it, that data might be useful as well.
Ann, maybe there's a movie here -- a cross between 2001: A Space Odyssey and Public Enemy. As to it making sense for applications that humans don't want to do, I concur. From a completely logical standpoint, I know that robotics make sense in countless applications. Admittedly, my fear is illogical. Still, every time I read one of these stories...
Chuck, your comments make me think of black and white 40s gangster movies, many of them located in Chicago. Anyway, in this case I think teaching robots how to dock on the ISS makes more sense than trying to teach humans to do it. Of course, astronauts and people who want to be astronauts might not agree.
In addition to using crowdsourcing to improve its designs, the ESA also builds public awareness of its space programs (which could ultimately lead to more favorable funding for certain programs by the public and their governments).
Every time I see one of these stories, I can't help but think of an old Chicago-ism: "They're muscling in our rackets." Increasingly, I'm seeing a lot of tasks that robots can do more effectively than humans. And the kicker to this story is they now want all of us to help them learn. I know it's logical; it's all in the best interests of science and technology; it's probably helpful to mankind in a hundred different ways that I can't even imagine. Still, I have this niggling fear, and I know it's not the most enlightened view -- but they're muscling in our rackets.
Glad you enjoyed the story, Al. I thought it was a fun app, and also good to know that at least some space agencies are open to the crowdsourcing concept. I've heard of SETI's requests for help from millions of people with PCs, but not anything about NASA using the crowdsourcing approach. Anyone know?
Excellent story and a unique application for mobile devices. I don't think I have the dexterity and coordination to make this work, but I'm sure there is a whole generation of gamers that can help them gather data. Thanks.