Update on Thought-Controlled Robots

Ann R. Thryft

November 27, 2012

In September we wrote about researchers using functional magnetic resonance imaging (fMRI) to enable thought control of a robot thousands of miles away, in a scenario reminiscent of the movie Avatar. The news was exciting, but the accompanying video from the Japanese AIST (National Institute of Advanced Industrial Science and Technology) was a bit disappointing. A larger, clearer video, showing the control of a bigger, more sophisticated nearby robot, is now available from the joint robotics laboratory (JRL) of the AIST and the French CNRS (Centre National de la Recherche Scientifique).

According to Abderrahmane Kheddar, director of the CNRS-AIST JRL, in a video interview conducted by DigiInfo TV, the idea behind the research is to let people feel as if they are truly embodied in the robot. This is especially important for paraplegic and tetraplegic patients, one of the groups the technology targets, who could use it to navigate the world remotely. For example, "a paraplegic patient in Rome would be able to pilot a humanoid robot for sightseeing in Japan," said Kheddar.


In the video, a researcher wears a cap peppered with electrodes and watches a PC screen. Flashing symbols control where the robot moves and how it interacts with its environment, while another researcher helps keep it upright. The screen associates flickering lights with specific actions; by focusing attention on one, the user indicates which action the robot should perform.

"We read the electric activities of the brain that are transferred to this PC, and then there is a signal processing unit which is trying to classify what the user is thinking," said Kheddar. "As you see here there are several icons that can be associated with tasks, or you can recognize an object that will flicker automatically. With different frequencies we can recognize which frequency the user is focusing his attention to and then we can select this object. Since the object is associated with a task, it's easy to instruct the robot which task it has to perform."
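The scheme Kheddar describes is a steady-state visual evoked potential (SSVEP) style interface: each on-screen icon flickers at a distinct frequency, and the signal processing unit identifies which frequency dominates the user's EEG. A minimal sketch of that classification step is below; the sampling rate, flicker frequencies, and task names are illustrative assumptions, not details of the lab's actual pipeline.

```python
import numpy as np

FS = 256  # EEG sampling rate in Hz (assumed)
# Hypothetical icons and their flicker frequencies in Hz
ICON_FREQS = {"move_forward": 6.0, "turn_left": 8.0, "grasp_object": 10.0}

def classify_ssvep(eeg, fs=FS, freqs=ICON_FREQS):
    """Return the icon whose flicker frequency carries the most spectral power."""
    spectrum = np.abs(np.fft.rfft(eeg))          # magnitude spectrum of the EEG window
    bins = np.fft.rfftfreq(len(eeg), d=1.0 / fs) # frequency of each FFT bin

    def power_at(f):
        # power in the bin closest to the candidate flicker frequency
        return spectrum[np.argmin(np.abs(bins - f))]

    return max(freqs, key=lambda icon: power_at(freqs[icon]))

# Simulate two seconds of EEG from a user attending to the 8 Hz icon:
t = np.arange(FS * 2) / FS
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 8.0 * t) + 0.5 * rng.standard_normal(t.size)
print(classify_ssvep(eeg))  # -> "turn_left"
```

Because each icon maps to a robot task, picking the dominant frequency is enough to select the task; real systems add band-pass filtering and multi-electrode averaging before this step.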

In related work taking place at the AIST's Intelligent Systems Research Institute in Tsukuba, Japan, researchers from both countries are pursuing means to increase a humanoid robot's functional autonomy. Specific research topics include task and motion planning and control, control of reactive behaviors, and cooperation between human and robot via a multimodal interface that integrates a brain-computer interface (BCI), vision, and haptics. The joint project includes multiple collaborative research projects with other research institutes in both Europe and Japan.

In the US, a Brown University study earlier this year enabled a paralyzed woman, who had been unable to move her arms and legs for 15 years, to control a robotic arm. For the first time since her paralysis, tetraplegic Cathy Hutchinson used the arm to pick up a bottle and steer it toward her to take a drink.


About the Author(s)

Ann R. Thryft

Ann R. Thryft has written about manufacturing- and electronics-related technologies for Design News, EE Times, Test & Measurement World, EDN, RTC Magazine, COTS Journal, Nikkei Electronics Asia, Computer Design, and Electronic Buyers' News (EBN). She's introduced readers to several emerging trends: industrial cybersecurity for operational technology, industrial-strength metals 3D printing, RFID, software-defined radio, early mobile phone architectures, open network server and switch/router architectures, and set-top box system design. At EBN Ann won two independently judged Editorial Excellence awards for Best Technology Feature. She holds a BA in Cultural Anthropology from Stanford University and a Certified Business Communicator certificate from the Business Marketing Association (formerly B/PAA).
