The ROS-Industrial ecosystem aims to help develop more specialized industrial robot application software. Gerkey said:
There's only so much you can do with the existing embedded industrial controllers that these companies have spent years developing. Now you can keep that controller box, but add a connection to a PC, for example, with algorithms developed in a research lab.
One big area that could benefit is motion-planning software. To program an industrial robot today, an operator jogs it with a joystick from one configuration to the next and records each one, a tedious and inflexible process. Because the robot can't sense where objects are, such as the bin it's dropping parts into, it performs all of this movement blindly.
In ROS-Industrial, we can add perception and motion-planning software for environments where there's variability in what robots need to do. This can be developed in a university lab, so ROS becomes the pipeline from the lab to the factory. Even if the factory needs to rewrite that software, now they have a new and different capability to use in an industrial environment.
To some industrial robot makers, ROS-Industrial offers the ability to patch in technologies that would otherwise be foreign to them: another software suite they can draw on to create increasingly diverse applications, said Erik Nieves, technology director of Yaskawa Motoman Robotics, in an interview. "Where ROS shines is working in unstructured or semi-structured environments where you need more perception, where vision plays a much bigger role, and where that vision requires a more sophisticated response from the robot."
One example of developing for unstructured environments is the 3D mapping software created by the Massachusetts Institute of Technology's Computer Science and Artificial Intelligence Laboratory to help robots autonomously navigate a constantly changing environment.
Watch videos of robot arms using ROS-Industrial here and here.
I watched the video and was a little surprised at the crudeness of the operation, and also by the large time gaps. I was wondering if this is typical or if it was just a crude demo meant only to show that it could be done. I am a hardware guy and certainly don't appreciate the complexities of the software art included in this demo, but can this be further improved by selecting a different level of accuracy and speed, or is this the state of the software now?
I wonder if open source for robotics will follow a path similar to that of Linux in the embedded world. Linux had tremendous appeal for many developers, and because it often turned out to be more difficult than it looked, a group of commercial versions of embedded Linux sprang up around it. Could we expect to see the same here?
You're quite welcome. Yes, it's great for stimulating creativity in young inventors. Free software tools like CADSoft Eagle make it easy to create circuit schematic diagrams and PCBs. Adafruit and Sparkfun provide tutorials and new library components for today's active and passive semiconductor parts. I'm currently using this software to develop kits for Jameco Electronics. Today it's really cool to be into OSHW. Check out the link for CADSoft Eagle.
Thanks, MrDon, this is really helpful. I knew there were some things going on with Sparkfun, but I hadn't thought of it in terms of open source hardware. This is very encouraging. Does this tend to attract young inventors?
The Open Source Hardware (OSHW) movement is quite big today. Companies like Arduino (yes, the company shares its name with its product), Adafruit, and Sparkfun Electronics are pioneers in providing all source code, BOMs, and Gerber files for anyone to manufacture their designs and products. Of course, they also sell kits for individuals who just want to build some really cool gadgets. Here are three links with additional information about OSHW.
One thing that's interesting to me about this development is hearing about all the applications that industrial robot makers could start helping their robots accomplish, such as finely dexterous movements taken from surgical robot programming, or motion planning for unknown environments. That "pipeline from the lab to the factory" is a good image for how the open source process can work at its best.
Beth, you're right on target. The open source movement has turned into a mega-tech industry. With software being the enabling gate to new product development, hardware has picked up momentum as well (the Open Source Hardware [OSHW] foundation). ROS is a good example of how collaboration between universities and tech industries can produce cost-effective solutions to challenging problems like motion planning.