Heather Knight, a roboticist and founder of Marilyn Monrobots, is trying to bridge the uncanny valley by adding humor to the robotic repertoire. Her robot, Data, can do imitations of Darth Vader, R2D2, and Buzz Lightyear. (Photo courtesy of Freescale Semiconductor.)
The slides show a series of overpriced, makeshift remote-controlled toys used for study.
We need inexpensive, fully dexterous, ambulatory, disposable robots (teleoperated "slave" units) to take humans out of harm's way: e.g., nurses in Ebola hospitals, radiation cleanup workers in Japan, specialist doctors halfway around the world, soldiers and peacekeepers, bomb and landmine disposal personnel, and so much more.
These are possible today, except for the egos getting in the way.
These are not autonomous, but human-controlled. Skynet without the evil computer control.
Ramjet, certainly not everybody would be able to comment on every aspect; that is true. I have worked in a job shop where several co-workers were "experts."
And as for how human the robot needs to appear, your response equates quite closely to the "form follows function" assertion that we have heard for a whole lot of years. It is really an assertion of value engineering: why add stuff not needed to do the job correctly? That certainly makes a lot of sense.
I am amazed that nobody has commented on my remark about human-looking robots in the "adult" entertainment industry. And as for folks not being willing to accept human-looking robots, really, just consider that no matter what they look like, their entire physiology is totally different from ours. We would not be "brothers under the skin"; they would be way more than just a different species.
As I read this phrase "uncanny valley," there's a curious analogy with the synthesis of musical-instrument sounds:
Back in the old days of Moog (and similar) analog synthesis (roughly the late '60s to early '80s), people were able to create surprisingly accurate imitations of orchestral instruments. These were generally regarded as impressive. The technology gradually improved, moving to sampling and physical modeling (actually running a real-time simulation of waves moving through an air column or string). By any objective measure, these simulations of orchestral instruments are *vastly* more accurate than the analog-synthesizer imitations that were generally perceived as impressive.
However, people perceive them (or at least some of them, some people do) quite the opposite way: they sound dreadful! Why? Because these sounds moved from the realm of "impressive imitations" to the realm of "awful-sounding real instruments"! They were enough better that they invited the same kind of scrutiny we would apply to real performers on real instruments. No performer on a violin, clarinet, horn, or bassoon, say, would ever strive to sound like these imitations!
Analogously, if a humanoid robot's face is clearly intended as a cartoon representation of a human, and the robot is otherwise impressive enough in its capabilities, people will react positively. However, if you try to give it realistic skin, realistic expressions, etc., people will start scrutinizing it the same way we scrutinize real people. By that benchmark, these robots look really awful and give a negative impression, even though, by any objective measure, they are *vastly* more lifelike!
Those robots that look human are certainly quite novel, and probably a real source of potential danger. Just as a real animal winds up being thought of in terms of its cartoon counterpart ("Bullwinkle Moose") but in reality is nothing like it, so human-looking robots will constantly send the wrong message. This is why industrial robots look like industrial robots: they are far less likely to accidentally rip your head off, which, by the way, they are really capable of doing.
So a human-looking robot really is a creepy thing, since the actual entity is nothing like the person presented. Probably the most successful application for human-looking robots would be in the "adult entertainment" industry. I am not suggesting that it is a good idea, just pointing out the nature of the problem.
Determining the quantities and location of sensors in an Internet of Things application requires a thorough problem statement and a clear vision of success, an expert will tell engineers at the upcoming Design & Manufacturing Show in Minneapolis.
Focus on Fundamentals consists of 45-minute online classes that cover a host of technologies. You learn without leaving the comfort of your desk. All classes are taught by subject-matter experts, and all are archived, so if you can't attend live, you can attend at your convenience.