Ann, this is an interesting technology, though the video was underwhelming. It's always interesting to hear the speculation researchers offer about their own developments. I wonder if anyone really tracks how accurate those predictions turn out to be.
I agree, Chuck. It would help to see what they mean by tight spaces. My guess is that since it's squishy, it can fit into places that a "hard" robot would not be able to fit through. However, it's still tethered, so that could be a hindrance to maneuverability.
Lou, much of this robotics research, like other research, doesn't get all the way to a full-blown product/system. That's because some of it consists of fundamental investigations of how things work, and some of it just doesn't pan out. In general, that's pretty typical of advances in both the sciences and technology. As many commenters have noted, making people aware of what other engineers are thinking up can be inspiring.
Thanks, Nadine, glad you enjoyed the post. Even though, as Lou noted, it's not a great video and the movements of the robot are rather crude, it's still fun to watch. I thought the prosthetics apps seemed a bit far-fetched, but the search-and-rescue ones make sense for navigating tight spaces and acting as a type of sentinel by lighting up. What I'd like to see is the untethered stage of this beastie.
Rob, I think you nailed that--surveillance is supposed to be one of the major apps this robot would be good for. I can see it taking many different forms, too. Hope they get a better video for the next rev.
Even in the video you posted, Ann, you can see that this robot would be able to squeeze through a small area. It has a gummy-worm kind of flexibility. If they can move beyond a tether -- say, with the flexible battery you wrote about last week -- http://www.designnews.com/author.asp?section_id=1392&doc_id=249722 -- this could go through all sorts of small spaces.
Rob, the researchers did say that the next step is to develop this robot so it works without a tether. Whether this guy can take advantage of that flexible battery, who knows, but that sounds like a great idea.
Ann, maybe I missed it, but do you know how the color is determined? Is this a case where the human operator decides how the robot will blend in to its surroundings and then gives a command to trigger the various chemical reactions, or does the unit decide for itself what to do?
Jack, how those colors are determined wasn't specified, but at this point I'm reasonably certain the robot is not doing the choosing. I have several unanswered questions about how the robot will work in the next rev, which is supposed to be untethered. One of them is: where will the multiple fluids used for color changes come from? Will they be pre-resident in different layers? And what about the pumping action? In the video, the pumping, at least, appears to be done by the operator in real time.
Good point, Ann. At first I thought the robot was calling for the chemicals remotely, simply because carrying them all seemed like too much for the package. But after viewing the video again, you're right. It does look like they are pumped by human intervention.
The fun factor continues to draw developers to Linux. This open-source system continues to succeed in the market and in the hearts and minds of developers. Design News will delve into this territory with next week's Continuing Education Class titled, “Introduction to Linux Device Drivers.”
Focus on Fundamentals consists of 45-minute online classes that cover a host of technologies. You learn without leaving the comfort of your desk. All classes are taught by subject-matter experts and all are archived, so if you can't attend live, attend at your convenience.