To detect items on a ship's hull, HULS uses sonar. Researchers process the sonar signals into a grainy point cloud that, at low resolution, can reveal something like a ship's propeller but cannot show where one object ends and another begins, according to the researchers. Seeing something smaller, such as a 10cm mine -- about the size of an iPod -- therefore required a clearer picture from the robot's sonar, the researchers said. HULS would also need finer images to avoid colliding with propellers and other protrusions from the ship.
To produce such a picture for the control system, the researchers adapted an algorithm used in computer graphics to generate a 3D mesh model from their sonar data. In the second phase of the research, they programmed HULS to swim closer to the ship and navigate along the hull using this mesh model, covering each of the model's points, which are spaced 10cm apart. By covering the hull in this way -- which the researchers compared to mowing a lawn one strip at a time -- the robot could detect a small mine.
MIT researchers have tested the algorithms by creating underwater models of two vessels -- the Curtiss, a 183-meter military support ship in San Diego, and the Seneca, an 82-meter cutter in Boston. More tests are scheduled in Boston Harbor this month before the new control system can be used in practice.
The Navy has a number of robotics and unmanned vehicle projects in the works, including one to develop an unmanned vessel to perform tasks too dangerous for manned ships. Indeed, the military is increasingly exploring unmanned aircraft and other vehicle designs to keep military personnel out of harm's way.