Video: Wearable Sensor Builds Maps on the Fly

The same MIT researchers who are helping the US military create robots that can autonomously generate 3D maps of their immediate location have developed similar technology humans can wear to navigate new and potentially dangerous environments.

Researchers at MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) have built a wearable system that senses the environment of its wearer and builds a digital map of the area as the person moves through it. The ultimate goal for the technology's development -- funded by the Air Force and the Office of Naval Research -- is to help emergency responders safely find their way through an unfamiliar area after a disaster, and possibly locate survivors, according to MIT.

The prototype of the sensor platform consists of several small devices affixed to an iPad-sized sheet of hard plastic, which the wearer straps to the chest like a backpack worn backward.

Researchers have tested the system on a graduate student who wandered through MIT hallways while the system's sensors used a wireless connection to send data to a laptop in a conference room away from the scene. As the student walked, the system drew a map of his route on the laptop, allowing people in the room to track his progress. The technology is based on a navigation system CSAIL engineers have been developing to let robots move autonomously through new and changing environments.

That system uses a low-cost camera -- such as the one in Microsoft's Kinect motion-sensing input device -- to create images of the environment. It also uses algorithms based on simultaneous localization and mapping (SLAM), which let the robots constantly update their maps and keep track of their own location within them as they learn new information. In fact, the Kinect is becoming a driver of artificial intelligence that helps robots interact with their environments more effectively and autonomously. Engineers have even started a crowdsourcing project that lets people use their Kinect cameras to easily create 3D scans of anything and everything around them, in hopes of making it easier to program environment-sensing AI into robots.
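MIT has not published the details of CSAIL's algorithm here, but the core SLAM idea -- keep an estimate of your own pose and a map, and refine both as new observations arrive -- can be sketched roughly as below. This is a toy illustration in Python; the class, update rules, and landmark names are hypothetical, not the researchers' code.

```python
# Toy sketch of the SLAM idea described above: the system keeps an estimate
# of its own pose and a map of landmarks, updating both as it moves and
# observes. Real SLAM uses probabilistic filters or graph optimization;
# this is only a conceptual illustration, not the CSAIL implementation.
import math

class ToySlam:
    def __init__(self):
        self.x, self.y, self.heading = 0.0, 0.0, 0.0  # pose estimate
        self.landmarks = {}                           # landmark id -> (x, y)

    def predict(self, distance, turn):
        """Dead-reckon the pose from odometry (distance moved, heading change)."""
        self.heading += turn
        self.x += distance * math.cos(self.heading)
        self.y += distance * math.sin(self.heading)

    def observe(self, landmark_id, rng, bearing):
        """Place (or refine) a landmark seen at a range and bearing from the pose."""
        lx = self.x + rng * math.cos(self.heading + bearing)
        ly = self.y + rng * math.sin(self.heading + bearing)
        if landmark_id in self.landmarks:
            # Blend old and new estimates; real systems weight by uncertainty.
            ox, oy = self.landmarks[landmark_id]
            lx, ly = (ox + lx) / 2, (oy + ly) / 2
        self.landmarks[landmark_id] = (lx, ly)

slam = ToySlam()
slam.predict(distance=1.0, turn=0.0)          # move forward one meter
slam.observe("door_7", rng=2.5, bearing=0.3)  # spot a feature ahead and to the left
print(slam.landmarks)
```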

To turn the camera-based robotic sensing device into something a person could carry, MIT researchers made a number of modifications so humans, too, could map their environment on the fly. For example, one of the system's sensors is a laser rangefinder that sweeps a laser beam in an arc and measures the time it takes for light pulses to return, from which it calculates the distance to walls. A human, however -- particularly one moving through the rubble left by a disaster -- jostles the sensor far more than a robot does, producing less accurate readings.
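The time-of-flight arithmetic behind such a rangefinder is straightforward: the distance to a wall is half the round-trip time of a pulse multiplied by the speed of light, and sweeping the beam through an arc turns each reading into a 2D wall point. A minimal sketch follows; the function names and sample timings are illustrative, not the device's firmware.

```python
# Minimal sketch of the time-of-flight geometry a sweeping laser rangefinder
# relies on: half the round-trip time times the speed of light gives the range,
# and the sweep angle turns each range into a 2D point on a wall.
import math

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def range_from_round_trip(seconds):
    """Distance to the reflecting surface for one laser pulse."""
    return SPEED_OF_LIGHT * seconds / 2.0

def sweep_to_points(round_trip_times, start_angle, end_angle):
    """Convert an arc of pulse timings into (x, y) wall points, sensor at origin."""
    n = len(round_trip_times)
    points = []
    for i, t in enumerate(round_trip_times):
        angle = start_angle + (end_angle - start_angle) * i / max(n - 1, 1)
        r = range_from_round_trip(t)
        points.append((r * math.cos(angle), r * math.sin(angle)))
    return points

# Example: three pulses across a 90-degree arc, returning in 13-20 nanoseconds
times = [13e-9, 17e-9, 20e-9]
print(sweep_to_points(times, -math.pi / 4, math.pi / 4))
```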

The robot also has sensors in its wheels that provide distance information, which a person doesn't have, and someone responding to a disaster might have to traverse several levels of a building, requiring a map-generating sensor that recognizes changes in altitude.

To adapt the robot system to a wearable one, the team added a series of accelerometers and gyroscopes along with a camera, and also experimented with a barometer, since changes in air pressure can indicate a change in floor level, researchers said.
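The article doesn't say how the prototype interprets the barometer's readings, but the standard barometric formula relates a drop in pressure to a gain in altitude, which can then be compared against a typical storey height to flag a likely floor change. A rough sketch, with the threshold and function names chosen here purely for illustration:

```python
# Rough sketch of how a barometer could flag a floor change: convert pressure
# to altitude with the standard barometric formula, then watch for jumps larger
# than a typical storey height. The 3 m threshold is an assumption, not MIT's value.

SEA_LEVEL_PRESSURE = 1013.25  # hectopascals

def pressure_to_altitude(pressure_hpa, reference_hpa=SEA_LEVEL_PRESSURE):
    """International barometric formula: altitude in meters from pressure in hPa."""
    return 44330.0 * (1.0 - (pressure_hpa / reference_hpa) ** (1.0 / 5.255))

def floor_change(prev_pressure_hpa, curr_pressure_hpa, storey_height_m=3.0):
    """Return the number of floors apparently climbed (negative = descended)."""
    delta = pressure_to_altitude(curr_pressure_hpa) - pressure_to_altitude(prev_pressure_hpa)
    return round(delta / storey_height_m)

# Example: a drop of roughly 0.4 hPa corresponds to climbing about one storey
print(floor_change(1013.2, 1012.8))  # -> 1
```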

One key commonality, however, is the camera, which is integral to both systems. It snaps photos of the environment every few meters, and software extracts about 200 visual features from each image, associating them with a particular location on the map.
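MIT hasn't said which feature detector the software uses. As one concrete illustration, OpenCV's ORB detector can be capped at roughly 200 features per frame and keyed to the spot where the frame was captured; the detector choice, file name, and keyframe structure below are assumptions, not the researchers' pipeline.

```python
# Illustration of extracting a bounded number of visual features from a frame
# and tagging them with the map location where the photo was taken. ORB is one
# common detector; the article does not specify which one the MIT software uses.
import cv2

orb = cv2.ORB_create(nfeatures=200)  # cap at roughly 200 features per image

def keyframe(image_path, map_location):
    """Return the features of one snapshot, associated with a map location."""
    image = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    keypoints, descriptors = orb.detectAndCompute(image, None)
    return {
        "location": map_location,          # e.g. (x, y) on the generated map
        "keypoints": [kp.pt for kp in keypoints],
        "descriptors": descriptors,        # used later to recognize the same place
    }

# Example usage: one keyframe every few meters along the wearer's path
frame = keyframe("hallway_012.png", map_location=(14.2, 3.7))
print(len(frame["keypoints"]), "features stored for", frame["location"])
```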
