DeLaval's voluntary milking system automates milk production to an extreme degree, letting cows decide when they need to be milked, 24/7. Shown are the automatic milking station (1), buffered controlled cooling (2), a storage tank (3) for fresh milk, a milk diversion unit (4), and the vacuum supply system (5). This level of automation boosts productivity and keeps cows happy. (Source: DeLaval)
This is terrific stuff, Ann. But put yourself in the soldier's position for a moment: trapped under heavy debris when the Golem Krang robot rolls out of the smoke carrying a big-ass pipe in its hands, a mechanical voice coming from its grille: "I am Golem Krang. I am here to help you." Talk about SF movies in real life.
That's really funny, TJ! In reading and writing about all these robots, and not having seen any of them up close and personal (yet), I have to say it does creep me out sometimes to think of how sophisticated these machines are becoming. While I appreciate the tasks they can accomplish, there is that whole "Terminator" worry lurking in the background. At what point do robots become smarter than us? (Hopefully never, of course, but the artificial intelligence being created today is getting pretty darned smart!)
Elizabeth, I'm not at all worried about a Terminator takeover. It's the base programming that we should worry about.
Modern airliners are fly-by-wire. Pilots give control inputs that tell the flight computers what the pilot wishes to do. The flight computers interpret that input and adjust the plane's control surfaces, engines, etc. to best meet those wishes.
Flight computers today can override or discard a pilot's inputs if they're outside the acceptable range. Effectively, a computer engineer designing the control laws has overridden the pilot. Granted, it's done with the best of intentions, but it still means a set of rules laid down by someone not in immediate control has more say in what happens than the pilot does.
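To make the idea concrete, here's a minimal sketch of that kind of "control law" filter. Everything here is hypothetical for illustration (the single pitch axis, the limit value, the function name); real flight control laws are vastly more complex, but the principle is the same: the pilot's input is a request, and the computer clamps it to an envelope someone else designed.

```python
# Hypothetical, simplified one-axis control law: the pilot's stick input
# is treated as a request, and the flight computer clamps it to a fixed
# envelope before it ever reaches the control surfaces.

PITCH_CMD_LIMIT_DEG = 15.0  # hypothetical maximum allowed pitch command


def control_law(pilot_input_deg: float) -> float:
    """Return the pitch command actually sent to the actuators."""
    if pilot_input_deg > PITCH_CMD_LIMIT_DEG:
        # Pilot pulls harder than the envelope allows; the computer says no.
        return PITCH_CMD_LIMIT_DEG
    if pilot_input_deg < -PITCH_CMD_LIMIT_DEG:
        return -PITCH_CMD_LIMIT_DEG
    return pilot_input_deg


print(control_law(25.0))   # pilot demands 25 deg, actuators get 15.0
print(control_law(-5.0))   # within the envelope, passed through as -5.0
```

The point of the sketch is that the clamp, not the pilot, has the final word whenever the two disagree.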
There was an airshow crash in France, at Habsheim in 1988, that can be attributed to the pilot and the flight computer having a difference of opinion. The aircraft made a low, slow flyby of the show with flaps down, gear down. The plane kept sinking slowly while the pilot was pulling back (go up!). The plane thought the pilot wanted to land (flaps down, gear down), and so it ignored the excessive pull-up input by the pilot.
Something similar happened to one of the YF-22 prototypes in the early '90s. The pilot was in a similar situation (low, slow, flaps down, gear down) and decided to abort his landing. He went to full power, into afterburner, and that confused the flight computer: flaps down, gear down, but max throttle. The plane eventually crashed (after the landing gear retracted; there were no injuries) because it got into a PIO, a pilot-induced oscillation.
I totally understand, TJ. So if the underlying programming and technology are good, there won't be a problem, and in fact, in some cases, robots know best. It's when the code under the covers is faulty that there could be issues with the behavior of these more sophisticated robots. Let's hope that all those working behind the scenes know what they're doing! (Well, of course they do, or we wouldn't have such clever robots.)
That type of problem is what's being addressed in work done by the University of Aberdeen, which we wrote about in "Humans, Do You Speak !~+V•&T1F0()?" http://www.designnews.com/author.asp?section_id=1386&doc_id=251721 so humans and robots can communicate at a distance about specific tasks the robot is engaged in, and change plans or tactics as necessary.