Test instruments haven't kept pace with the rapid changes in the devices being tested. There have been advances in instrumentation in the past few decades, such as the move from analog meters and scopes to DMMs and digital phosphor oscilloscopes, large increases in scope bandwidth and a proliferation of standard-specific instruments for wireless test. But the fundamental model of how engineers interact with a stand-alone instrument has remained the same: push a button on the front panel or send a command over a control bus, and receive a vendor-defined measurement in response.
While the stand-alone instrument model has remained the same, the devices engineers measure and test hardly resemble their counterparts from decades ago. From passive to active automotive suspension systems, and from the analog phone to the iPhone, device complexity has increased dramatically. Further, the functionality of these devices is increasingly defined by the software embedded in them, so design engineers can add capabilities faster than ever before. The only way to keep up with this pace of change is to use a test system you can reconfigure with software, taking advantage of the rapidly increasing performance of commercial technologies.
While instruments have been around a lot longer than the Web, the Web community is currently focused on a similar trend toward user empowerment that it calls Web 2.0. In the same vein, a software-based approach to instrumentation inherently empowers users to build custom instrumentation to meet their unique application needs. I call this approach Instrumentation 2.0.
With an Instrumentation 2.0 approach, you use generic hardware components to digitize or generate a signal. Raw data is passed over a high-speed bus to a general-purpose processor, where custom software analysis routines turn the raw data into the required measurement. For example, consider an RF measurement system that can be reconfigured to test any wireless system. It might consist of a block downconverter and a digitizer connected to a processor over a bus capable of transferring the digitized data in real time, such as PCI Express. A software tool then performs user-defined analysis to produce the required measurements. NI LabVIEW, for example, includes modulation and decoding routines that can create a software-defined RF test system for measuring Bluetooth, 802.11, GSM and other wireless standards. Because these routines exist in software running on common hardware components, the approach yields significant savings in both cost and size.
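To make the idea concrete, here is a minimal sketch of a user-defined analysis routine in Python with NumPy. It is not the LabVIEW API or any vendor's firmware; the digitizer output is simulated with a synthetic tone, and the routine turns that raw sample block into two measurements, peak frequency and signal power, entirely in software:

```python
import numpy as np

def measure_tone(samples: np.ndarray, sample_rate: float):
    """A user-defined measurement routine: estimate the dominant
    frequency (Hz) and mean power (dB) of a raw sample block from
    a digitizer. Swapping this function reconfigures the 'instrument'
    without touching the hardware."""
    window = np.hanning(len(samples))          # reduce spectral leakage
    spectrum = np.fft.rfft(samples * window)
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    peak_freq = freqs[np.argmax(np.abs(spectrum))]
    power_db = 10 * np.log10(np.mean(samples.astype(float) ** 2))
    return peak_freq, power_db

# Simulated digitizer output: a 1 MHz tone sampled at 10 MS/s.
fs = 10e6
t = np.arange(4096) / fs
raw = np.sin(2 * np.pi * 1e6 * t)

freq, power = measure_tone(raw, fs)
print(f"peak at {freq / 1e6:.2f} MHz, power {power:.1f} dB")
```

A standards-specific measurement (EVM, occupied bandwidth, a demodulated bitstream) is just a more elaborate version of the same pattern: the hardware stays generic, and the measurement is defined by the analysis function you plug in.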
The technology behind Instrumentation 2.0 has been around for many years; you may have heard the term virtual instrumentation, for example. Recently, however, the approach has been gaining momentum. The U.S. Department of Defense has mandated that its future test equipment use software-based, reconfigurable instrumentation, and it has deployed trial systems that demonstrated distinct advantages over traditional instruments in size and flexibility.
The ultimate goal of Instrumentation 2.0, in addition to meeting the performance and flexibility needs of today's complex devices, is to integrate test directly into the design process to support quicker product development. Because mechanical and electrical design and simulation happen in a software-based environment, a software-based test system enables quick import of real-world measurement data into simulations, rapid prototyping of simulated designs through hardware I/O, and automatic generation of test routines from the design process. By upgrading test instrumentation, we can also upgrade design.