William, thanks for taking the time to check out what kind of programming is actually being discussed. When I saw the mentions of simulation and virtual robot cells, it looked like offline programming to me, but I was not about to conclude that. My understanding of this whole shift to point and click, which I've also encountered in machine vision, is that it's aimed at simplifying programming so that operators can do it instead of programmers, to save money. Obviously, this can only be aimed at less complex tasks that can be modularized in some way.
@ANN, I did visit the ABB site and read through a large portion of the program manual. What they are describing is offline programming, in which the programmer first builds a virtual robot cell and then puts in a virtual robot with virtual tools. The tricky part that I see with that is building the virtual work cell.
What becomes clear is that offline programming requires accurate dimensions and spatial reference information for every element in the work cell. Of course, it should be possible to achieve that for a cell in which nothing can move relative to anything else. Now I understand how offline programming is done: it would be similar to real-time programming, except that things would not break, and it would require some very good visualization skills. And I can see where point and click would fit in.
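To make that "accurate spatial reference" idea concrete, here's a rough sketch of a work cell modeled as a table of fixed element positions. Everything here (names, coordinates, the translation-only math) is made up for illustration; a real offline tool would use full pose transforms with orientation, not just offsets.

```python
# Hypothetical work cell: each fixed element gets one measured position
# relative to the robot base. If nothing can move relative to anything
# else, one measured transform per element is enough.
cell = {
    "robot_base": (0.0, 0.0, 0.0),
    "fixture":    (850.0, 120.0, 400.0),   # measured, say, to +/- 0.5 mm
    "conveyor":   (1200.0, -300.0, 350.0),
}

def target_in_robot_frame(cell, element, offset):
    """Convert a point given relative to a cell element into robot-base coordinates."""
    ex, ey, ez = cell[element]
    ox, oy, oz = offset
    return (ex + ox, ey + oy, ez + oz)

# A pick point 25 mm above the fixture origin, expressed in robot coordinates:
print(target_in_robot_frame(cell, "fixture", (0.0, 0.0, 25.0)))  # (850.0, 120.0, 425.0)
```

The point is that any error in those measured positions propagates directly into every programmed target, which is why the dimensions have to be so accurate.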
Thanks, William. Let us know what you find out. Meanwhile, I checked ABB's website, and I found two things that may be relevant. First, the RAPID programming language is mentioned in the press release discussing the controller used for the package described in the Little Robots story that mentions point and click programming. Second, the software itself, RobotStudio for the PC, mentioned to me during the interview for that story, is described on their website as using simulation for offline programming: http://www.abb.com/product/seitp327/78fb236cae7e605dc1256f1e002a892c.aspx?productLanguage=us&country=US
William, those are good questions. Please do let us know what you find out about, such as which robotic functions/programming steps have become objects or modules, or automated in some way. If it's anything like machine vision, my guess is that those are low-level function clusters of some kind. Or perhaps it's something entirely different.
Ann, I am trying to imagine what part of robot programming point and click would work for, and I can see that I am going to have to chase that subject quite a bit more in order to see what new things are being used. The point by point programming could be called real slow time, since the motion is usually much slower than normal operation.
Of course, for anything beyond the very simplest program we always need a sequence-of-motions chart, which not only defines all of the moves but also lists all of the qualifying conditions, both for a move to begin and for when the move is done. That allows us to verify that one thing is complete before starting the next thing. When things must happen at the same time it becomes more complex, particularly if they must be synchronized. Of course a robot controller already does that, in that six axes may move in unison to carry the arm smoothly from one position to another. Consider the math required to make six non-orthogonal axes work that well.
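A sequence-of-motions chart can be sketched directly as data. Here's a minimal, hypothetical example (the step names and conditions are my own, not from any real controller): each step carries a condition that must hold before it starts and one that confirms it finished.

```python
# Sketch of a sequence-of-motions chart: each step has a qualifying
# condition to start and a condition verifying it is complete.
def run_sequence(steps, state):
    """Execute steps in order, checking start and completion conditions."""
    for step in steps:
        if not step["start_when"](state):
            raise RuntimeError(f"step {step['name']} not cleared to start")
        step["action"](state)
        if not step["done_when"](state):
            raise RuntimeError(f"step {step['name']} did not complete")
    return state

steps = [
    {"name": "open_gripper",
     "start_when": lambda s: s["at_pick_point"],
     "action":     lambda s: s.update(gripper_open=True),
     "done_when":  lambda s: s["gripper_open"]},
    {"name": "lower_arm",
     "start_when": lambda s: s["gripper_open"],   # previous step must be done
     "action":     lambda s: s.update(arm_lowered=True),
     "done_when":  lambda s: s["arm_lowered"]},
]

state = run_sequence(steps, {"at_pick_point": True,
                             "gripper_open": False,
                             "arm_lowered": False})
print(state["arm_lowered"])  # True
```

The chart is doing the same job on paper: no move starts until its qualifying conditions are verified, and no move is trusted as done until the done-condition is checked.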
To make things work along with a robot move, we can put in an intermediate point where an I/O output is switched on during the move, as the motion passes a specific point. That function is not new, but it certainly can be very useful.
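Here's a rough sketch of that idea, with made-up names and a simple linear interpolation standing in for the real motion planner: the output fires once, as the move passes a fractional trigger point along the path.

```python
# Hypothetical mid-move I/O trigger: interpolate from start to end and
# fire set_output() once the motion passes trigger_at (a 0..1 fraction).
def linear_move(start, end, trigger_at, set_output, n_steps=100):
    fired = False
    pos = start
    for i in range(n_steps + 1):
        t = i / n_steps
        pos = tuple(a + t * (b - a) for a, b in zip(start, end))
        if not fired and t >= trigger_at:
            set_output(pos)   # e.g. open a glue valve halfway along the path
            fired = True
    return pos

events = []
final = linear_move((0.0, 0.0, 0.0), (100.0, 0.0, 0.0), trigger_at=0.5,
                    set_output=lambda p: events.append(p))
print(events[0][0], final[0])  # 50.0 100.0
```

In a real controller the trigger would typically be defined against a named intermediate point rather than a fraction, but the effect is the same: the I/O switches in flight, without stopping the move.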
Alex, thanks for the programming feedback on two-armed robots. I would imagine it must be similar to programming any real-time system, such as machine vision, except probably a lot more complicated than MV. William, the point-and-click reference is to my story on ABB's smaller robot packages with simplified programming interfaces:
That's a great point, Bill, about the programming of two-armed robots constituting a big challenge. Indeed, it's the only programming exercise I can think of that rivals real-time programming. The solution is somewhat similar (except that with RTOSes the timing is handled implicitly, though you still have to test explicitly for the ability to respond to real-time interrupts). Anyway, for two-armed robot programming, I think what the programmer needs to do is set up a timing diagram prior to programming, and then verify both the accuracy of this model and compliance with it throughout all stages, including programming, test, and integration.
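To illustrate what that timing diagram pins down, here's a toy sketch (hypothetical move names, two threads standing in for two arms) where each arm runs its own move list but must rendezvous at agreed sync points before proceeding:

```python
import threading

# Two "arms" as threads; a barrier enforces the rendezvous points that a
# timing diagram would specify on paper. All move names are illustrative.
barrier = threading.Barrier(2)
log = []
lock = threading.Lock()

def arm(name, moves):
    for move, must_sync in moves:
        with lock:
            log.append((name, move))
        if must_sync:
            barrier.wait()   # neither arm proceeds until both reach this point

left = threading.Thread(target=arm, args=("left", [("approach", True), ("insert", False)]))
right = threading.Thread(target=arm, args=("right", [("hold", True), ("release", False)]))
left.start(); right.start()
left.join(); right.join()
print(len(log))  # 4
```

Verifying "compliance with the model" would then amount to checking that the logged order never violates the diagram, e.g., that "insert" never happens before "hold."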
I don't know what sort of programming in a machine control system could be reduced to point and click, unless it would be the creation of the operator interface portion. Machine controls are mostly about "when this and this and that, then do this, unless those," and that is about as simple as logic can get. Of course each different controller (PLC) has a different dialect, as it were, but many of them are close enough that picking up another one would take less than an hour. Some systems, such as Siemens, are totally different and have no similarity to the other languages, which makes choosing them a very large commitment, in that the new programming language is completely different in grammar, syntax, and spelling alike.
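That "when this and this and that, then do this, unless those" pattern maps directly onto a single boolean expression. Here's a sketch with made-up signal names, just to show the shape of the logic:

```python
# The classic machine-control rung as plain booleans (hypothetical signals):
# when part_present AND guard_closed AND motor_ready, then run,
# unless estop OR jam.
def start_conveyor(part_present, guard_closed, motor_ready, estop, jam):
    return (part_present and guard_closed and motor_ready) and not (estop or jam)

print(start_conveyor(True, True, True, False, False))  # True
print(start_conveyor(True, True, True, True, False))   # False: e-stop wins
```

Every PLC dialect, ladder or otherwise, is ultimately expressing rungs like this; the differences are in notation, not in the logic itself.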
Robot programming as "point and click" is even harder to imagine, at least as far as Nachi and Motoman robots are concerned. But there may be something new that I am not aware of. Robot programs are mostly moves from point to point, with each point being described by three coordinates and three angles, at least in rectangular-format programming. The alternative is to set up a value for each robot axis, recalling that there are six non-orthogonal axes. That method could easily become quite tedious, it would seem.
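Here's what that rectangular-format, point-to-point style looks like as data, in a hedged sketch (the pose fields and target values are invented, and real controllers use their own move instructions, not this):

```python
from dataclasses import dataclass

# A target pose in rectangular format: three coordinates plus three angles.
@dataclass
class Pose:
    x: float; y: float; z: float        # position, e.g. in mm
    rx: float; ry: float; rz: float     # orientation angles, e.g. in degrees

# A program is essentially a list of such targets:
program = [
    Pose(400, 0, 300, 0, 180, 0),      # home
    Pose(400, 250, 120, 0, 180, 0),    # above pick point
    Pose(400, 250, 40, 0, 180, 0),     # pick point
]

for p in program:
    print(f"MOVE TO ({p.x}, {p.y}, {p.z}) / ({p.rx}, {p.ry}, {p.rz})")
```

The axis-by-axis alternative would replace each six-number pose with six joint values, which is exactly why it gets tedious: the programmer, not the controller, is then doing the kinematics.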
William, thanks for the points on the programming aspects of two-armed robots. The synchronization problems to solve will be pretty complex. Perhaps we'll get some comments from those with experience in that area.
And Rob, I've noticed a similar trend in machine vision--more point-and-click interfaces where operators can select pre-determined functions.