Thanks, Jon. I agree that it is always better for the data to be "real": eliminating noise at the source also eliminates the need for the data to be massaged in software. I was just wondering whether a software approach was an option in this case. Thanks for the information!
A software algorithm would have the same problem as people: it could not distinguish the points sampled from the "wanted" signal from points that could have been produced by any of an almost unlimited number of higher-frequency signals. An anti-aliasing filter will help, but I find it better to eliminate any source of unwanted signals (noise) as close to the source as possible. More about filters and how to choose them in my next column.
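This ambiguity is easy to demonstrate. The sketch below (a minimal illustration, with arbitrary example frequencies and a hypothetical 8 Hz sampling rate) shows that a 1 Hz sine and a 9 Hz sine (1 Hz plus the sampling rate) yield exactly the same sampled points, so no algorithm operating on the samples alone could tell them apart:

```python
import numpy as np

fs = 8.0                      # assumed sampling rate in Hz (illustrative)
t = np.arange(16) / fs        # 16 sample instants

# A 1 Hz sine and a 9 Hz sine (1 Hz + fs) hit identical values at every sample:
low = np.sin(2 * np.pi * 1.0 * t)
alias = np.sin(2 * np.pi * 9.0 * t)

print(np.allclose(low, alias))  # True: the sampled points are indistinguishable
```

Because the two signals are identical at the sample instants, only an analog anti-aliasing filter applied before sampling can remove the higher-frequency component.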
Great information, Jon. I love your series and how it addresses so many relevant issues in test engineering. Since aliasing is a mathematical phenomenon, is it possible to write a software algorithm to eliminate the unwanted data points? If so, which would be more advantageous: a hardware solution (an anti-aliasing filter) or a software solution (an algorithm)? Or is that simply a matter of available resources?
For industrial control applications, or even a simple assembly line, a machine can run almost 24/7 without a break. But what happens when the task is a little more complex? That’s where the “smart” machine comes in. A smart machine has some simple (or in some cases complex) processing capability that lets it adapt to changing conditions. Such machines are suited for a host of applications, including automotive, aerospace, defense, medical, computers and electronics, telecommunications, consumer goods, and so on. This discussion will examine what’s possible with smart machines, and what tradeoffs need to be made to implement such a solution.