Good points about adaptive gains, tuning and resonances. I've often seen that the effect of mechanical resonances and changes over time are not taken into consideration when tuning a servo. The problem is that nine times out of ten you can get away with it, so the tenth time seems like a "mystery" when it occurs.
I have been thinking more about this rubber string analogy and there was an error in my earlier post too.
The system consists of two masses, the ball and your hand, with a spring between them, the rubber band. Your muscles provide the force that moves your hand. In a servo drive the force is the air-gap torque, the hand corresponds to the inertia of the servomotor's rotor, and the ball corresponds to the driven load. The rubber band corresponds to the shafts and the coupling between the motor and the load.
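To make the analogy concrete, here is a minimal sketch of the two-mass model in code. The inertia and stiffness values are purely illustrative assumptions, not taken from the post, and a plain Euler step is used only to keep the sketch short:

```python
# Minimal two-mass model of the hand/ball (motor/load) system: a torsional
# spring k between motor inertia jm and load inertia jl.
# All numeric values are illustrative assumptions.
def two_mass_step(state, torque, dt, jm=0.001, jl=0.002, k=500.0):
    """One Euler step of the two-mass dynamics.
    state = (motor angle, motor speed, load angle, load speed)."""
    th_m, w_m, th_l, w_l = state
    spring = k * (th_m - th_l)       # torque transmitted by shaft/coupling
    a_m = (torque - spring) / jm     # motor acceleration (air-gap torque in)
    a_l = spring / jl                # load acceleration (spring torque only)
    return (th_m + w_m * dt, w_m + a_m * dt,
            th_l + w_l * dt, w_l + a_l * dt)
```

Stepping this from rest with a constant torque shows the motor speeding up first while the load lags behind the spring, exactly the hand-leads-ball behavior described above.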
Now when you slowly use your muscles to move your hand up and down, the ball will follow. When you increase the frequency of the movement you will find that it becomes more and more "stiff", that is, more difficult to move your hand even though the ball is moving up and down. This is the anti-resonance frequency, where it is difficult to get the hand to move. Increasing the frequency further, it again becomes easier to move your hand, and the amplitudes of the hand and ball oscillations grow dramatically as the frequency approaches the resonance frequency. Above the resonance frequency the ball movement decreases and your muscles are moving mainly your hand.
The anti-resonance frequency makes it difficult to control the motor movement near and above it and thus limits the dynamics that can be achieved. Note that the anti-resonance frequency is defined only by the torsional stiffness of the coupling and the shaft and by the inertia of the load. Thus the servo control system cannot improve the control response beyond that. As a rule of thumb, the shortest possible response time of the speed control is the inverse of the anti-resonance frequency.
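As a quick sketch of that rule of thumb, with assumed (not measured) stiffness and inertia values, the anti-resonance frequency is sqrt(k/J_load)/(2*pi), and the minimum response time is its inverse:

```python
import math

# Illustrative example values (assumptions, not from the post):
k_shaft = 4000.0   # combined torsional stiffness of shaft + coupling, N*m/rad
j_load = 0.02      # load inertia, kg*m^2

# Anti-resonance frequency: set only by the load inertia and the torsional
# stiffness, independent of the motor and of the controller gains.
f_ar = math.sqrt(k_shaft / j_load) / (2 * math.pi)   # ~71 Hz here

# Rule of thumb from the post: the shortest achievable speed-control
# response time is roughly the inverse of the anti-resonance frequency.
t_response_min = 1.0 / f_ar   # ~14 ms here

print(f"anti-resonance: {f_ar:.1f} Hz, min response time: {t_response_min*1e3:.1f} ms")
```

Stiffening the coupling or reducing the load inertia raises the anti-resonance frequency and so shortens the achievable response time.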
A reduction gear is often used with servo drives. The reduction gear decreases the load inertia seen at the motor shaft (by the square of the gear ratio). Thus in practice the anti-resonance frequency matters mainly for servo drives that do not have a reduction gear.
Don't forget friction! Friction is a nasty thing that makes accurate position control difficult. Stick-slip itself is one of the reasons for growling at low speeds. However, sometimes a dither (vibration) signal is added to the torque reference to keep the mechanical system in slight motion at all times. This avoids the stick-slip phenomenon but unfortunately produces audible noise as a byproduct.
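A minimal sketch of dither injection, assuming a hypothetical amplitude and frequency (real values depend on the mechanics and on how much acoustic noise is tolerable):

```python
import math

def torque_with_dither(torque_ref, t, dither_amp=0.02, dither_hz=300.0):
    """Add a small high-frequency dither to the torque reference so the
    mechanics never quite settle into static friction (stick-slip).
    Amplitude (N*m) and frequency (Hz) here are illustrative assumptions."""
    return torque_ref + dither_amp * math.sin(2 * math.pi * dither_hz * t)
```

The dither frequency is typically chosen well above the control bandwidth so it does not disturb the position loop, but that also tends to put it squarely in the audible range.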
By the way, there is a slight error in the explanation of the Bode plot (apparently this is a transfer function from motor air-gap torque to motor speed). Below about 20 Hz the plot describes the motor and load moving together as one piece. Around 29 Hz only the motor moves but not the load (this is the anti-resonance frequency). Between 29 Hz and 53 Hz the load accelerates more and more while the motor decelerates, and vice versa, finally reaching the resonance at 53 Hz. Above 53 Hz the plot shows more and more only the inertia of the motor.
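For the standard two-mass model, the resonance and anti-resonance frequencies are linked by f_res = f_ar * sqrt(1 + J_load/J_motor), so the 29 Hz / 53 Hz pair in the plot pins down the inertia ratio. A quick check (assuming that model applies to the plotted system):

```python
import math

f_ar = 29.0   # anti-resonance, Hz: motor moves, load stands still
f_res = 53.0  # resonance, Hz: motor and load swing against each other

# Two-mass model relation: f_res = f_ar * sqrt(1 + J_load/J_motor),
# so the frequency pair fixes the load-to-motor inertia ratio.
inertia_ratio = (f_res / f_ar) ** 2 - 1
print(f"J_load/J_motor = {inertia_ratio:.2f}")
```

A ratio above 2 like this one is a fairly load-heavy axis, which is consistent with the pronounced anti-resonance notch the post describes.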
You can check this with the rubber band and ball. If you move your hand very slowly up and down, the ball will follow. When you increase the frequency you will find there is a frequency where your hand moves but the ball does not. This is the anti-resonance frequency. Increasing the frequency further you will finally reach the resonance frequency, where the ball moves a great deal even when your hand movement is small. Increasing the frequency yet higher (if you can), you will find that the ball is again more or less standing still although your hand moves a lot.
Wish I could afford a driver like this one. I accidentally hit the end of travel on one of my servos on my CNC mill. The momentary stall ended up letting all the "magic smoke" out of one of my servo drivers. A driver that would properly compensate for drive error like the above one would be most handy.
As long as a servo drive permits access to and real time changes in its gains, then it's not strictly necessary to have a modern servo drive in order to take advantage of the concept. The system controller could change the gains and feed them to the drive. It's not as elegant as a self-contained drive, but it does permit use of this neat concept.
Marcus, thank you for an informative article. I was just at a seminar given by a semiconductor vendor on a new microcontroller targeted at the motor control market. These incorporate motor control timers as well as fast A/D converters. All of these are built in to the SoC, so that the measurement and correction strategies you discuss can be implemented.
The legacy endpoint devices that control our critical infrastructure (utility systems, water treatment plants, military networks, industrial control systems, etc.) are some of the most vulnerable devices on the Internet.
For industrial control applications, or even a simple assembly line, a machine can run almost 24/7 without a break. But what happens when the task is a little more complex? That's where the "smart" machine comes in: a machine with some simple (or, in some cases, complex) processing capability that lets it adapt to changing conditions. Such machines suit a host of applications, including automotive, aerospace, defense, medical, computers and electronics, telecommunications, consumer goods, and so on. This radio show will cover what's possible with smart machines, and what tradeoffs need to be made to implement such a solution.