That atomic clock does sound very cool, Jon – I will definitely check it out. The problem back then was a monetary one...cheaper for engineering to write additional software than to add hardware, especially when you have the staff with the ability to write it. I must come clean though – we had an incredible engineer named Norris Lauer who ultimately tackled those software issues – I could make Testpoint (our software of choice back then) dance, I could program in LabVIEW, and I could even write microcontroller programs in assembly, but Norris was our C guru. My job was easy – I identified the problem, and after Norris was done, all I had to do was call the DLL!
Good point, Nancy. I don't envy you having to write a real-time-clock DLL. Wow. I might have designed a small add-in board with a home-brew programmable real-time clock and used an interrupt on the PC. Still a lot of custom work. It seems there's always some customization needed, even when we can now buy so many products off the shelf. By the way, a company called Symmetricom now sells an atomic-clock module that mounts on a printed circuit board. The QUANTUM Chip-Scale Atomic Clock costs about $1500. The company's specs note "±5.0E-11 accuracy at shipment," and I assume that means in units of seconds. Very cool.
I sure agree with your logic, Jon! Mydesign – I am with you too – I am a huge fan of off-the-shelf boards, having done IEEE-488 programming for most of my career. Throw a GPIB-488 card in a rack, address your instruments, use LabVIEW or Testpoint test-and-measurement software, and you are off and running. But lots of times we had stickier problems (one off the top of my head: having to write a DLL because the Windows clock did not offer enough resolution for what needed to be measured) that required more in-depth engineering knowledge. Having that knowledge also helps engineers choose the right card or instrument for their specific data-acquisition needs and get creative when required...you can't write low-level software for a process you don't understand.
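For anyone curious what that clock-resolution problem looks like, here's a minimal sketch, in Python rather than the C DLL Norris actually wrote, of timing a short operation with the OS high-resolution performance counter. The workload being timed is made up for illustration:

```python
import time

def time_call_us(func):
    """Time one call of func, in microseconds, using the OS
    high-resolution performance counter. A coarse system tick
    (historically about 15.6 ms on Windows) cannot resolve
    intervals this short, which is why a custom timing DLL
    was once necessary."""
    start = time.perf_counter_ns()
    func()
    return (time.perf_counter_ns() - start) / 1000.0

# Example: time a short burst of work (arbitrary workload)
elapsed_us = time_call_us(lambda: sum(range(10_000)))
```

On a modern system `perf_counter_ns()` typically resolves well below a microsecond, so even very short intervals register as nonzero.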
That makes sense, Jon. So the data collector is devoted to capturing the signals. It can then send its data to a computer that can store it in a historian and process it for trends, alerts, and other analysis.
Hi, Rob. In many cases, the data-acquisition equipment connects to a computer, either a desktop PC or some sort of embedded system. The larger computer will handle data analysis, plotting, storage, and so on. A smaller system might perform control operations and report only some of the information. Some DAQ modules can connect to the Internet--as can PCs--and create alerts if a measurement exceeds certain limits, for example. These devices can provide some closed-loop control, too. The choice of how and where you handle the data depends on the specific application.
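As a toy illustration of that limit-alert idea, here's a hedged sketch of the check a DAQ module or host PC might run on each batch of samples. The limit values and readings are invented for the example:

```python
# Hypothetical alarm limits (e.g., degrees C) -- not from any real spec
LOW_LIMIT = 10.0
HIGH_LIMIT = 85.0

def check_limits(samples, low=LOW_LIMIT, high=HIGH_LIMIT):
    """Return (index, value) pairs for samples outside the limits,
    the kind of test a DAQ module might run before raising an alert."""
    return [(i, v) for i, v in enumerate(samples) if v < low or v > high]

readings = [22.5, 91.0, 45.3, 7.8]   # simulated sensor data
alerts = check_limits(readings)      # flags readings 1 and 3
```

In a real system the flagged samples would trigger an e-mail, a network message, or a closed-loop control action rather than just populate a list.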
Hello, Mydesign. Good point about using off-the-shelf data-acquisition cards and modules. For many people, these devices will work just fine. I think we should help engineers and other technical people understand the basics of data acquisition, regardless of the type of equipment they use. Then, when they need something with additional capabilities, they can converse with manufacturers and understand basic terms and specifications. To properly use tools, we should understand how they work.
Hello, Geralda. Thanks for pointing out the errors in the text. I'll contact our Web editor and get the units squared away. Sometimes blog front-end software does not properly convert typed symbols and foreign-language characters. The correct units appeared in the original text and will appear properly in the issue of Design News that carries this information in my Measurements column. Again, thanks.
Thanks for your comments, Christopher. I will talk more about the software-and-memory side of data-acquisition systems later in the series. You're right; it requires thought about what you want to do with the data.
Aside from the typos, this article presents a good starting primer on how to specify a DAQ system.
However, I would go still further. When I specify a DAQ system, I also consider the entire workflow of the data-collection and analysis process. It doesn't do a lot of good to collect a mountain of accurate data at an appropriate resolution without a straightforward means of storing, archiving, and analyzing it. I always stop and consider what I'm going to do with the data, how complex the post-processing is going to be, and, probably most importantly, how often I'm going to repeat this process.
If the effort is a one-off, then you don't need to get too fancy. A stand-alone box like a Fluke Hydra Data Bucket might be suitable. A PCMCIA card or USB thumb drive and sneakernet are fine for moving data around. If all you're doing is plotting the data and looking for a min/max, then Excel is a perfectly useful tool (as long as the dataset has fewer than 32K samples per 'column').
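For a one-off like that, even the min/max scan doesn't strictly require Excel; a few lines of script over the logged file do the same job. This is just an illustrative sketch with a fabricated CSV log:

```python
import csv
import io

# Simulated log file: a header row, then timestamp,value pairs
# (contents are invented for the example)
LOG = "t,volts\n0,1.20\n1,3.75\n2,0.95\n3,2.10\n"

def column_min_max(text, field):
    """Scan one column of a CSV log and return its (min, max)."""
    values = [float(row[field]) for row in csv.DictReader(io.StringIO(text))]
    return min(values), max(values)

lo, hi = column_min_max(LOG, "volts")
```

For a real file you'd pass `open("log.csv")` to `csv.DictReader` instead of the in-memory string.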
If the post-processing is much more complicated, or you'll be collecting and analyzing data repeatedly, then a more automated means of storing and analyzing data is a must. NI LabVIEW and NI hardware are a good default, though I've found that a range of lower-cost alternatives is also available. Tools like these let the engineer collect and fully package the data in near real time.
As far as I'm concerned, this is an integral part of specifying a DAQ system, too.