The two factors outlined above (linearity and repeatability) are also very useful when choosing other types of sensors, not just pressure sensors. In fact, they apply to most, if not all, of the sensors within an effective flow control system.
Interesting piece that sheds a lot of light on a subject most manufacturers deal with every day. Of the two characteristics to consider when looking for the most accurate pressure sensor, I have always believed that the repeatability (or non-repeatability, as you put it) of the sensor's results should be the more significant factor. The sensor that shows the smallest differences between independent measurements reflects the smallest variation and is therefore, in my opinion, the most accurate.
Interesting comments all. I would like to add my 2 cents. The actual term I learned was measurement uncertainty (MU), usually expressed as a percentage of actual reading or of the instrument's full scale (not the calibrated range), and determined by plotting the instrument's error data on a Gaussian (normal) distribution. The MU comprises two components: the precision (random) error and the bias (systematic) error.
1/ The precision error is that which is inherent to the instrument's circuitry, parts, and design (pressure elements, the manufacturer's calibration facility, etc.).
2/ The systematic error is caused by the installation of that same instrument and results from things like the environment, installation details, ancillary equipment, etc.
The two types of error (and they are always present) are combined using the RSS (root sum square) method, the same way the collected data used to calculate the errors are combined, to express the MU. Naturally, the more data acquired, the more statistically sound (higher confidence level) your results will be. Resolution and repeatability are totally different things and, while extremely important (a highly repeatable instrument can have terrible accuracy, but the converse is not acceptable), a highly accurate (low-MU) instrument must have very good resolution to adequately reflect the results.
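To make the RSS combination concrete, here is a minimal sketch in Python (the percentages are made-up example numbers, not from any datasheet):

```python
import math

def measurement_uncertainty(random_err, systematic_err):
    """Combine the precision (random) and bias (systematic) error
    components using the root-sum-square (RSS) method."""
    return math.sqrt(random_err**2 + systematic_err**2)

# Hypothetical example: 0.25% FS random error, 0.40% FS systematic error
mu = measurement_uncertainty(0.25, 0.40)
print(f"Combined MU: {mu:.2f}% of full scale")  # ~0.47% FS
```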
As almost anyone will tell you, MU (accuracy/inaccuracy) is one of the MOST misunderstood and misapplied terms in instrumentation and automation!
I frequently have this discussion with customers who have trouble understanding resolution and accuracy. I tell them it's the difference between having a ruler with a single, precise mark on it that is NIST traceable, and a ruler with 1,000 marks on it that vary by +/- 10 marks.
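To put rough numbers on that ruler analogy, a small sketch in Python (the lengths and error figures are invented purely for illustration):

```python
import random

true_length = 487.0  # hypothetical true value, arbitrary units

# Ruler A: a single precise, traceable mark at 500 -- you can only
# say which side of the mark you are on, but the mark itself is exact.
ruler_a_reading = "below the 500 mark" if true_length < 500 else "at/above the 500 mark"

# Ruler B: 1000 marks give 1-unit resolution, but the calibration of
# the marks wanders by roughly +/- 10 units.
ruler_b_reading = round(true_length + random.gauss(0, 10))

print(f"Ruler A: {ruler_a_reading}")        # accurate, coarse resolution
print(f"Ruler B: about {ruler_b_reading}")  # fine resolution, uncertain
```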
It is so true that accuracy, precision, and resolution are all quite different. That is why I never attach numbers to the words accuracy or precision when I am writing. The truly definitive terms are "uncertainty" and "resolution", with "precision" used as a descriptor for items that are more accurate than average. Claiming a maximum of 1% measurement uncertainty is much more definitive, but it still needs to be qualified: 1% of what? Both percentage of reading and percentage of range are commonly used. Resolution is much easier to define and is actually independent of both accuracy and uncertainty. A five-digit meter can be quite inaccurate while still providing excellent resolution.
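The practical difference between the two conventions is easy to show. A minimal sketch, assuming a hypothetical 0-100 psi sensor with a 1% spec:

```python
def max_error_pct_of_reading(reading, pct=1.0):
    """1% of reading: the allowed error shrinks with the reading."""
    return reading * pct / 100.0

def max_error_pct_of_range(full_scale, pct=1.0):
    """1% of range (full scale): the allowed error is constant."""
    return full_scale * pct / 100.0

full_scale = 100.0  # psi, hypothetical
for reading in (10.0, 50.0, 100.0):
    print(reading,
          max_error_pct_of_reading(reading),   # 0.1, 0.5, 1.0 psi
          max_error_pct_of_range(full_scale))  # always 1.0 psi
```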
The other term missing from the discussion is repeatability, which is vital to producing useful statistics based on measurements. A source of pain is that repeatability often seems to degrade as uncertainty is reduced and resolution increases.
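Repeatability itself is usually quantified as the scatter of repeated readings of the same input. A minimal sketch, assuming a hypothetical set of repeated readings:

```python
import statistics

# Hypothetical repeated readings of the same applied pressure (psi)
readings = [50.02, 49.98, 50.05, 49.97, 50.01, 50.03]

# Repeatability as the sample standard deviation of the readings;
# the mean minus the true value would be the bias (accuracy) instead.
repeatability = statistics.stdev(readings)
print(f"Repeatability: +/-{repeatability:.3f} psi (1 sigma)")
```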
You are right, Aeroengineer1. Marketing sometimes has a lot to do with the lack of quality information on a pressure sensor. More likely, it's simply a lack of knowledge about the differences you point out. Also, manufacturers don't like to advertise the weaker aspects of their products; I think this second reason applies more often than anyone admits.
Some larger companies have whole departments dedicated to determining all of these key aspects of a pressure sensor through in-house experimental testing, because the vendor information is incomplete or even incorrect. Some applications are critical enough to require that level of understanding. For most, that kind of rigor isn't practical, so we expect the vendor to provide the known limits of their device. Some vendors will work with you to get a better handle on accuracy and precision, but because this information may still be lacking for your particular application, the only fallback is the classical approach: lab or field testing.
This is a good post to remind people that something as simple as a pressure sensor requires some thought about the accuracy and precision of the instrument. Knowing that environmental conditions (temperature being one) can shift the measurement is one more thing to keep in mind. Instrumentation is an important consideration in design.
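On the temperature point, a first-order zero-shift correction is a common mitigation; here is a minimal sketch, assuming a hypothetical coefficient of the kind a datasheet might list:

```python
def compensate(raw_reading, temp_c, ref_temp_c=25.0, tempco=0.02):
    """Subtract a first-order thermal zero shift from a raw reading.
    tempco is a hypothetical coefficient in psi per degree C."""
    return raw_reading - tempco * (temp_c - ref_temp_c)

print(compensate(50.30, temp_c=40.0))  # 50.30 - 0.02*15 = 50.00
```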
If marketing and other people would stop using these terms as if they were synonyms, there would be much less confusion. On that note, I take issue with accuracy being defined as the highest and lowest measurement from a specific point. That is precision. Precision measures the scatter in the data about a known point, while accuracy is specifically how close the average of the given points is to reality. All the points may show very high scatter, but if the average is correct, then you can call the measurement accurate, even though the individual values that make it up are far off. You can have a device that is not very accurate over a single measurement but is accurate over multiple measurements; for a device to be accurate over a single measurement, it also needs to be precise.

Precision is the measure of the error band on the total measurement: the larger the error band, the less precise the sensor. You can also have a sensor that is precise but not accurate. In each of these cases, there are ways of attempting to improve the accuracy. Some involve averaging the data, some just require a simple calibration value, and some require a calibration table along with averaging. Many highly precise sensors will require calibration methods to remove thermal offsets as well as manufacturing differences that cause part-to-part variation.
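To illustrate those improvement strategies, here is a minimal sketch in Python (all numbers hypothetical) of averaging out scatter and then applying a simple calibration offset, with an optional calibration-table correction:

```python
import statistics

def corrected_value(raw_readings, cal_offset=0.0, cal_table=None):
    """Average repeated readings to reduce random scatter, then apply
    a fixed calibration offset and, optionally, a correction from a
    calibration table mapping nominal values to corrections."""
    avg = statistics.fmean(raw_readings)
    value = avg + cal_offset
    if cal_table:
        # Nearest-point lookup; a real system would interpolate.
        nearest = min(cal_table, key=lambda k: abs(k - value))
        value += cal_table[nearest]
    return value

# Precise-but-biased sensor: tight scatter around 49.2 when truth is 50.0
readings = [49.19, 49.21, 49.20, 49.18, 49.22]
print(corrected_value(readings, cal_offset=0.8))  # ~50.0
```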