I read the by-line and immediately thought, "Another story about running a robot slow to keep it from wearing out or breaking." But it seems your issue was the timing of the weld gun in relation to the robot motion.
But to my first point, I have always wondered why technicians (most notably the maintenance guys) want to run a robot (a servo robot, no less) at a greatly reduced speed. I understand that end-of-arm tooling weight is a factor, but if the robot program allows you to run fast, then I expect the robot to be designed to handle that speed. If it wears out or breaks, that's the robot manufacturer's problem. I figure if the manufacturer didn't want it to fall apart from running fast, they should have limited the maximum speed that I can set!
Actually, that makes perfect sense, GTOlover. If it's made to run fast, it should certainly do well running fast. At the manufacturing show in Philly last month, I saw some robots that moved mind-numbingly fast.
GTOlover: End-of-arm tooling weight and robustness are factors to consider. There are two competing parameters: speed and acceleration. A short, fast move with high acceleration can be hard on tooling, and if you are moving a part, the part can shift or even come out of the gripper. Even if there is a problem with a misplaced part only 1% of the time, the extra cycle time of a slower move can be more than paid back by 100% successful placement vs. stopping the line to fix the part. Reduced acceleration limits the maximum speed that can be reached in a short move. Another issue is cycle time. If the longest cycle time in a series process is 50 seconds, there is nothing to be gained by completing previous or following steps in 30 seconds by driving the robot at maximum speed.
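To put rough numbers on the point that reduced acceleration caps the speed a short move can reach: with a symmetric accelerate-then-brake (trapezoidal) velocity profile, the peak speed over a short distance is set by acceleration, not by the drive's rated top speed. A minimal sketch (the 0.5 m move and acceleration values are made-up illustration numbers, not from any particular robot):

```python
import math

def peak_speed(distance_m, v_max, accel):
    """Peak speed actually reached in a stop-to-stop move with a
    symmetric trapezoidal (accelerate / cruise / brake) profile."""
    # Best case: accelerate for half the distance, then brake.
    v_reachable = math.sqrt(accel * distance_m)
    return min(v_max, v_reachable)

# A 0.5 m move on a drive rated for 2 m/s: acceleration decides what you get.
for a in (1.0, 4.0, 16.0):  # m/s^2
    print(f"accel {a:5.1f} m/s^2 -> peak {peak_speed(0.5, 2.0, a):.2f} m/s")
```

At 1 m/s² the move never gets near the 2 m/s rating; only the highest acceleration setting lets the robot actually hit top speed over half a metre.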
I've seen some of the same robots at shows, Rob, and I'm always amazed by the speed. I'm also a little baffled. High speeds combined with heavy end-of-arm tooling combine to apply some pretty big bending moments and torques, which I would imagine puts a lot of wear and tear on the robots.
Being the snide, recalcitrant, irascible OLD_CURMUDGEON that I am, I have a completely different take on this...... The "readjustment" of the operating parameters of the robot may have had nothing to do with the throughput of the production line, BUT may have been inspired by some "union" action. Obviously, the company wherein this incident occurred was of sufficiently high capacity to warrant robotic technology, AND many of these companies also "enjoy" organized labor representation. Just sayin'............
Just as jackrabbit starts can quickly wear out an automobile, running ANY machinery the same way has the same effect. Although the robot (or servomotor, etc.) may be rated for some top speed, you need enough time to reach that speed and decelerate safely. It costs power to speed up, and that energy must be dissipated (usually as heat) to stop. The best advice was given in another reply: don't make the process any faster than the limiting factor (the longest process in a serial chain). The bragging rights you'd get from having the fastest robot will quickly go sour when you have to repair your cell prematurely! As far as the robot being "designed to handle this speed".. please refer to the "Designed by Monkeys" column for enlightenment.
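The heat penalty of jackrabbit starts grows fast, because the kinetic energy dumped on every stop goes with the square of speed. A back-of-the-envelope sketch (the 30 kg arm-plus-payload mass is a hypothetical number for illustration):

```python
def braking_heat_joules(mass_kg, speed_m_s):
    """Kinetic energy that must be dissipated (mostly as heat in the
    drive or brake resistor) each time the load is stopped from speed."""
    return 0.5 * mass_kg * speed_m_s ** 2

# Doubling the move speed quadruples the heat dumped per stop.
for v in (0.5, 1.0, 2.0):  # m/s
    print(f"{v} m/s -> {braking_heat_joules(30.0, v):.1f} J per stop")
```

Multiply that per-stop figure by thousands of cycles per shift and it's easy to see why a "fast" cell runs hot and wears its drives.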
My wife used to run a machine that assembled IC sockets from boxes of plastic bodies and bags of pins, producing about 100,000 sockets per shift. There was an employee incentive program that awarded $.25/hr to the operator that held the highest production record each week. The formula for production record was the number of good sockets produced minus the number of bad sockets produced. If you let the machine run out of components it made fewer parts which cost you points. If you let the machine get dirty or drift out of adjustment it produced bad parts which cost you points twice.
For several years she and the other operators jockeyed for production records. Every so often they would improve the machine to run faster or more reliably which would bump the production standings a bit. One day they made an "improvement" to the machine that made it run much faster, but the count of bad parts skyrocketed. When asked when they were going to fix their fix the operators were told that count of bad parts had been so low for so long that they didn't bother with them anymore. The faster machine made more good parts per hour and that was all that mattered.
My wife happened to hold the production record at the time the machines were "improved" so with the new higher bad part counts no one could ever beat her old record and she kept the $.25/hr bonus for several more years. The incentive program no longer produced healthy competition but animosity instead. Eventually when a wave of layoffs came she was let go because she was costing the company more money for the same job.
This story reveals a few common problems.

First, software is invisible; you can't have too much version control. Modern data-driven / object-oriented controls are more vulnerable because the relationship between a data value and its effect on the controls is often not obvious. I once looked into a problem on an old piece of equipment that was experiencing occasional misfeeds and discovered that someone had changed the I/O configuration to use the wrong sensor as a process input (cases of "nobody changed anything" are fairly frequent: Nobody is a busy boy).

The second important point is that time is money. The practices of OEE and TPM have a dark side, which is that the faster a mechanical system goes, the faster it breaks down, and the relationship is much steeper than linear. I frequently have to explain why the OEE rate factor must not exceed 100% (why you should not give yourself bonus points for operating a system beyond its design limits). Wear-out behavior in general scales at least with the cube of cycle time, and gets worse when mechanical limits are exceeded (fasteners break out, shaft couplers slip, shafts break). The corollary is that mechanisms always wear out. For positioning systems, the limit is reached when the required precision can no longer be obtained, which can be well before any obvious mechanical problem arises.

The third thing is that it is difficult to specify the performance limits of a complex articulated mechanism such as a welding robot. While the datasheet calls out maximum reach, payload, and dynamics, these parameters are for some arbitrary case, and it is typically not reasonable to max out everything at once. What the specs rarely reveal are parameters like stiffness, maximum payload angular moments, and power dissipation capacity, all of which have a bearing on how hard and fast you should go. I once experienced a case where a 1-1/4" steel spline shaft supporting a robot grip would snap off every 50,000 part cycles.
By design, the robot was carrying the maximum spec load at maximum axis extension at maximum acceleration while translating and rotating simultaneously (the fix was to lower the robot base by ~100 mm). A corollary is that parameters are typically spec'd independently, while during complex motions the acceleration of outer links imposes an additional load on inner links.

The fourth point is that welding robots (and other applications) have difficult-to-quantify, nonlinear loads imposed by attached hoses and cables. This is easy to ignore and difficult to properly compensate for. The classic error is to assume that if the extra weight is supported, nulling the weight also nulls the mass (it doesn't).

The fifth thing pointed out by this post is that static behavior, as seen by walking through tool paths, is a nearly meaningless representation of dynamic behavior (at speed, everything is ballistic).
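The at-least-cubic wear rule of thumb above has a blunt consequence: modest speed-ups cost a surprising fraction of service life. A minimal sketch of that scaling (the exponent and cycle times are illustrative; real wear depends on the mechanism):

```python
def relative_life(baseline_cycle_s, new_cycle_s, exponent=3):
    """Rule-of-thumb scaling: wear-out life goes at least with the cube
    of cycle time, so shorter (faster) cycles shorten life sharply."""
    return (new_cycle_s / baseline_cycle_s) ** exponent

# Squeezing a 10 s cycle down to 7 s leaves roughly a third of the life.
print(f"remaining life fraction: {relative_life(10.0, 7.0):.3f}")
```

In other words, a 30% cycle-time gain can trade away roughly two-thirds of the time between overhauls, and that is the *optimistic* (cubic) case.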
Having actually been a robot programmer in my previous job, I'm annoyed at two robots at my current plant. They are ABB robots, and as the original article stated, if you set the accuracy parameter very high for a move, the robot goes to that exact point before performing the next instruction. That is very important for something like making a spot weld, but when the points are just midpoints in a series of moves to reach a final destination, it gives the robot very jerky movements.
I was taught by an ABB guy to lower the accuracy setting for those midpoint moves to make the motion smoother, but apparently the company that programmed these two robots never learned that lesson, and it bothers me to see the robots making such jerky movements when I know that they don't have to.
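The time cost of stopping dead at every midpoint can be estimated with a simple motion-profile comparison. This is a rough sketch, not ABB's actual motion planner: it idealizes a fully blended path as one continuous move, and the segment lengths, speed, and acceleration are arbitrary illustration numbers:

```python
import math

def move_time(distance, v_max, accel):
    """Time for one stop-to-stop move with a trapezoidal velocity profile."""
    v_peak = math.sqrt(accel * distance)
    if v_peak <= v_max:                 # too short to cruise: triangular profile
        return 2.0 * math.sqrt(distance / accel)
    cruise = distance - v_max ** 2 / accel
    return 2.0 * v_max / accel + cruise / v_max

segments = [0.2, 0.2, 0.2, 0.2]         # four short via-to-via moves, metres

# High-accuracy ("fine") midpoints: the robot stops dead at each one.
t_fine = sum(move_time(d, 1.5, 4.0) for d in segments)

# Low-accuracy (zoned) midpoints: idealized as one continuous move.
t_zone = move_time(sum(segments), 1.5, 4.0)

print(f"fine points: {t_fine:.2f} s   blended: {t_zone:.2f} s")
```

Even in this toy case the blended path is roughly twice as fast, and it spares the axes four hard decelerate-accelerate reversals per pass, which is exactly the jerkiness described above.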