That is, except for Rethink Robotics' Baxter. I visited the Rethink booth again this year and talked to Eric Foellmer, product marketing and marketing communications manager, who showed me the company's latest Baxter robot demo. The big red guy still looks pretty much the same, but has some new abilities, mostly due to software.
This year Baxter was demoing the 2.0 software released last fall. This software added three main features and performance improvements: Pick and Place Everywhere, defining Waypoints, and a Hold function, plus improvements in speed, repeatability, and vision capabilities. Watching the robot and its handlers demonstrate these new abilities was fascinating. Some of the movements Baxter can now make in picking and packing reminded me of the sophistication of a human's arm and wrist movements. You can watch a video here showing Baxter packing and loading, as well as diagrams demonstrating the simplicity of programming the robot. You can find more videos on this page.
Baxter showed off his 2.0-derived moves at ATX West this year. For example, Pick and Place Everywhere lets Baxter perform pick-and-place at any axis, from any orientation to any orientation, freeing him up for packing and loading. (Source: Rethink Robotics)
Foellmer said one thing customers are doing is integrating Baxter with other automation equipment, as well as with humans, which the new software enables, especially its Waypoint and Hold functions. This is being done via digital I/O to PLCs, for example. Besides pick-and-place, packing/unpacking, and loading/unloading, Baxter is also doing machine tending tasks.
Small companies are Rethink's target market, but they aren't the only ones using the robot. Many recent customers are larger companies, often using one or more Baxters in small offline cells doing QA and inspection tasks, Foellmer told us. Some new features are in gripper technology, specifically a new vacuum platform that gives users more ability to customize the robot's vacuum grippers. Most customers are using vacuum grippers now, but a few still work with linear electric grippers.
I also learned that the Baxter Research Robot with a stripped-down version of Rethink's software isn't just going to university labs. Although some of those applications can be pretty amusing, like the Baxter programmed by Cornell University students to do supermarket checkout tasks, the research version is also being purchased by internal R&D departments, and Rethink has just begun shipping this version globally, said Foellmer. Corporate R&D departments are using the robot with more traditional programming, and some are using it as a design platform, aided by the robot's use of the open-source ROS (Robot Operating System), a Linux-based robotics framework.
Ann, I recall the early recognition algorithm that used a "sliding mask" or sliding template to find shapes and orientations, which is how they did things "back then". But the systems seem to be more powerful currently, and I am more distant from those organizations now. And it certainly is true that nobody would give up secrets willingly.
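The "sliding mask" technique William recalls can be sketched in a few lines. This is an illustrative reconstruction of the general idea, not any vendor's actual algorithm: a small template is slid across every position in an image and scored by normalized cross-correlation, and the highest-scoring position is taken as the match.

```python
import numpy as np

def match_template(image, template):
    """Slide `template` over `image` and return the (row, col) of the
    best match plus its score, using normalized cross-correlation
    (score 1.0 means a perfect match)."""
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    best_score, best_pos = -np.inf, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            patch = image[r:r + th, c:c + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p ** 2).sum()) * t_norm
            if denom == 0:          # flat patch: no correlation defined
                continue
            score = (p * t).sum() / denom
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos, best_score
```

The exhaustive double loop is why early systems were slow; modern packages use pyramids, frequency-domain tricks, and learned features, but the sliding-window idea underneath is the same.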
So what may be possible is a description of using the systems and setting them up for a specific application, such as teaching a robot to pick apples. That's a generic application that would be a challenge, but it would provide a chance to learn how one would program a robot to do it.
William, vision-based object recognition has been around a long time, as have software packages for it from companies like Cognex and Dalsa. I wrote about that during my machine vision coverage days. To what extent the software companies let us see under the hood is a different question, especially as to how all that interfaces with robotics. You might want to check out their websites and/or do some googling for that answer.
Ann, it shows that my robotics experience is a bit old, since at the time that was how it was done. The downside was that it had to be done in the actual work cell, and it took several hours of instruction to become fairly good at it. But those programs did work very well, as long as being perfectly repetitive was the goal. That did mean that parts had to be in exactly the right place, which takes extra effort and expense. Vision-based flexible operation is quite different and would need an extended set of instructions to function at all. It would be very educational to have the exact mechanism of vision-based object recognition explained. How does it know what angle to use, and just where to grip, that object? That does not seem quite intuitive to me.
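One common answer to William's question about angle is to search over orientations as well as positions: score the template at several candidate rotations and keep the best. The sketch below is my own illustration, assuming coarse 90-degree steps and a simple sum-of-squared-differences score (real systems use finer angle grids and richer features). Once the best rotation is found, a grip point defined once in template coordinates is rotated by the same angle, which is how the system "knows" where and at what angle to grip.

```python
import numpy as np

def best_match(image, template):
    """Exhaustively slide `template` over `image`; return ((row, col),
    score), where score is the negative sum of squared differences
    (higher is better, 0 is a perfect fit)."""
    ih, iw = image.shape
    th, tw = template.shape
    best = (-np.inf, (0, 0))
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            score = -((image[r:r + th, c:c + tw] - template) ** 2).sum()
            if score > best[0]:
                best = (score, (r, c))
    return best[1], best[0]

def find_pose(image, template):
    """Try the template at 0/90/180/270 degrees and return the
    (row, col, angle) of the best fit. The grip point, defined in
    template coordinates, rotates with the winning angle."""
    best_score, best_pose = -np.inf, None
    for k in range(4):
        (r, c), score = best_match(image, np.rot90(template, k))
        if score > best_score:
            best_score, best_pose = score, (r, c, 90 * k)
    return best_pose
```

An asymmetric template is essential here: a rotationally symmetric shape gives identical scores at every angle, so the reported orientation would be arbitrary.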
You're right, Baxter's slower pace is one of the things that makes him safe. He's designed to work with people, not to be locked in a cage away from them, so that slower pace is necessary. He's also designed to do simple, repetitive, or difficult tasks that can injure the humans doing them. The kind of programming you're used to is exactly what Rethink wanted to get away from.
The video is quite informative, although I didn't see any programming using the vision feature, or perhaps there isn't one. But I am used to seeing robots work at a pace that humans would not be able to keep up with for very long, while Baxter seems to move at a slow and gentle pace. Of course that is what makes him safe, it seems. At any rate, the programming method that I have used with MotoMan robots was teaching them one point and path at a time, in sequence, with points in the motion path used as trigger points for external system commands. So the programming presented in the video is quite different.
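That teach-pendant style, points taught one at a time with some points firing external commands, maps naturally onto a small data structure. Here's a minimal sketch of the idea; this is my own illustration, not MotoMan's or Rethink's actual software, and `move_to` is a stand-in for the real motion command.

```python
from dataclasses import dataclass
from typing import Callable, Optional, Tuple

@dataclass
class Waypoint:
    """One taught point: a position plus an optional trigger fired on
    arrival (e.g. a digital output to a PLC or conveyor)."""
    name: str
    position: Tuple[float, float, float]   # illustrative (x, y, z)
    trigger: Optional[Callable[[], None]] = None

def run_program(waypoints, move_to):
    """Play back the taught path in sequence, firing each waypoint's
    trigger after its move completes."""
    for wp in waypoints:
        move_to(wp.position)       # motion command (stubbed here)
        if wp.trigger is not None:
            wp.trigger()           # external system command
```

Teaching then amounts to appending waypoints, and swapping a trigger changes the external behavior without re-teaching the path — which hints at why decoupling the taught path from the I/O logic was attractive even before vision-based flexibility arrived.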
Thanks, etmax. Rethink is one of those companies I think really deserves the term "innovative," a hugely overused word if you ask me.
I think your concern about replacing humans' jobs is a good one--I share it. OTOH, I have a relative who has permanent, severe damage to her wrists and hands from the repetitive motions involved in canning work she did for a few years in her youth. It's those types of motions that robots are better suited for. Also, the customer videos seem to bear out the idea that Baxter can be intelligently used with humans--which is what it's designed for.
Hi Ann, thanks for a brilliant post. I've been fascinated by robots since seeing the first "Lost In Space" series in the mid-'60s, and finally they are reaching the dexterity portrayed. Meanwhile, the intelligence is evolving separately.
Still I wonder what all this will mean for the large portion of the population that can only pack boxes or perform other (for humans) non-specialised tasks.
I think you're right Rob. At least, that's the impression I got watching the customer videos on the Rethink site, as well as listening to Foellmer's account of the interface. Rethink has the programming down, as well as the HW design.