In September we wrote about researchers using functional magnetic resonance imaging (fMRI) to enable thought control of a robot thousands of miles away, in a scenario reminiscent of the movie Avatar. The news was exciting, but the accompanying video from the Japanese AIST (National Institute of Advanced Industrial Science and Technology) was a bit disappointing. A larger, clearer video, showing the control of a bigger, more sophisticated nearby robot, is now available from the joint robotics laboratory (JRL) of the AIST and the French CNRS (Centre National de la Recherche Scientifique).
According to Abderrahmane Kheddar, director of the CNRS-AIST JRL, in a video interview conducted by DigInfo TV, the idea behind the research is to let people feel as if they are truly embodied in the robot. This is especially important for paraplegics and tetraplegics, one of the target groups for the technology, who could use it to navigate. For example, "a paraplegic patient in Rome would be able to pilot a humanoid robot for sightseeing in Japan," said Kheddar.
By focusing their attention on patterns created by flickering lights on a PC screen, which are associated with specific actions, users can control which actions they want a robot to perform, where the robot moves, and how it interacts with its environment.
(Source: CNRS-AIST Joint Robotics Laboratory)
In the video, a researcher wears a cap peppered with electrodes and watches a PC screen, while another researcher helps keep the robot upright. The screen associates flickering symbols with specific actions; by focusing their attention on one symbol, the user indicates which action the robot should perform.
"We read the electric activities of the brain that are transferred to this PC, and then there is a signal processing unit which is trying to classify what the user is thinking," said Kheddar. "As you see here there are several icons that can be associated with tasks, or you can recognize an object that will flicker automatically. With different frequencies we can recognize which frequency the user is focusing his attention to and then we can select this object. Since the object is associated with a task, it's easy to instruct the robot which task it has to perform."
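The classification Kheddar describes is characteristic of steady-state visually evoked potential (SSVEP) interfaces: each on-screen target flickers at a distinct frequency, and the EEG over the visual cortex shows elevated power at the frequency the user is attending to. The article does not specify the lab's actual algorithm or frequencies, so the following is only a minimal sketch of that general idea, using simulated EEG and hypothetical frequency-to-task mappings:

```python
import numpy as np

# Hypothetical flicker frequencies (Hz) mapped to robot tasks.
# Illustrative only; the CNRS-AIST system's actual values are not given.
TASKS = {7.0: "walk forward", 9.0: "turn left",
         11.0: "turn right", 13.0: "grasp object"}

def classify_ssvep(signal, fs, candidates):
    """Return the candidate flicker frequency with the most spectral power.

    signal     -- 1-D array of EEG samples (e.g. from an occipital electrode)
    fs         -- sampling rate in Hz
    candidates -- stimulus frequencies to test
    """
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    # Power at the FFT bin nearest each candidate frequency
    power = {f: spectrum[np.argmin(np.abs(freqs - f))] for f in candidates}
    return max(power, key=power.get)

# Simulate 2 s of EEG: a weak 9 Hz evoked response buried in noise.
fs = 256
t = np.arange(0, 2.0, 1.0 / fs)
rng = np.random.default_rng(0)
eeg = 0.5 * np.sin(2 * np.pi * 9.0 * t) + rng.normal(0.0, 1.0, t.size)

detected = classify_ssvep(eeg, fs, TASKS)
print(detected, "->", TASKS[detected])  # 9.0 -> turn left
```

Because each frequency is tied to a task, picking the dominant frequency is enough to tell the robot what to do, which is what makes this scheme practical for users who cannot move.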
In related work taking place at the AIST's Intelligent Systems Research Institute in Tsukuba, Japan, researchers from both countries are pursuing means to increase a humanoid robot's functional autonomy. Specific research topics include task and motion planning and control, control of reactive behaviors, and cooperation between human and robot via a multimodal interface that integrates a brain-computer interface (BCI), vision, and haptics. The joint project includes multiple collaborative research projects with other research institutes in both Europe and Japan.
In the US, a Brown University study earlier this year resulted in the control of a robotic arm by a paralyzed woman who had been unable to move her arms and legs for 15 years. For the first time since her paralysis, tetraplegic Cathy Hutchinson controlled the arm to pick up a bottle and guide it to her mouth to take a drink.