When an engineer mentions "design for test," colleagues may think of specific concepts such as providing access to key electrical signals, using boundary-scan chains, and implementing built-in self-test capabilities. But design for test (DFT) can take on a broader dimension, one in which engineers apply the same tests and equipment to most of the measurement tasks throughout a product's design cycle—from R&D to manufacturing test.
To better understand how standard, or common, hardware might apply to product designs, consider the flow of a design from concept to manufacturing. At the modeling or design step, engineers use PCs to produce schematic diagrams and to simulate the performance of devices in software. After they produce a prototype, they move from the PC domain to the physical domain in which they must perform physical measurements to characterize their device. This transition often presents a challenge because much of the design and simulation work isn't reused when engineers make actual measurements.
If, after review, a product concept seems worth pursuing, engineers move to the next step—analysis and simulation.
The analysis and simulation step lets engineers simulate the operation of complex equipment, say a satellite-navigation control system, using software that implements the control algorithms. (Simulation at this step involves systems rather than individual devices or components.) Simulation software can "construct" complex systems and produce data that describes a system's operation, or failure. In the electronic world, languages such as VHDL, a hardware description language for digital circuits, and various dialects of SPICE, which simulates analog circuits, provide the means to test virtual circuits on a PC. This simulation step verifies the engineer's preliminary designs.
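To see what an analog simulator such as SPICE computes, consider a sketch of the simplest case: the step response of an RC low-pass filter, integrated numerically. The component values and time step here are illustrative choices, not from the article.

```python
# Minimal sketch of what an analog circuit simulator computes:
# the step response of an RC low-pass filter via forward-Euler
# integration. Component values are illustrative.

R = 1_000.0   # resistance, ohms
C = 1e-6      # capacitance, farads (so RC = 1 ms)
V_IN = 5.0    # step input, volts
DT = 1e-5     # integration time step, seconds

def rc_step_response(steps: int) -> list[float]:
    """Integrate dVc/dt = (Vin - Vc) / (R * C) one step at a time."""
    v_c = 0.0
    trace = []
    for _ in range(steps):
        v_c += DT * (V_IN - v_c) / (R * C)
        trace.append(v_c)
    return trace

trace = rc_step_response(1_000)  # simulate 10 ms = 10 time constants
print(f"final output: {trace[-1]:.3f} V")  # settles near V_IN
```

A real simulator adds adaptive time steps, nonlinear device models, and netlist parsing, but the output is the same kind of thing: a data trace an engineer would otherwise capture with an oscilloscope.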
But this step, too, lacks a tight link between a simulated situation and the physical world. Ideally, a simulation's stimuli should look just like real-world signals, and its results should match what measurement hardware would report. That's often not the case. Instead, engineers must deal with peculiarities inherent in real hardware—inaccuracies in measurements, differences in measurement methods, and so on.
Based on the results of simulations, engineers can produce a prototype and further test their design. At the prototype stage, engineers have a physical representation of a device or system (or at least part of it), and testing may involve hardware-in-the-loop (HIL) to provide a realistic assessment of a prototype's operations. Traditionally, this type of testing involves standard rack-and-stack test instruments such as oscilloscopes, logic analyzers, and microprocessor emulators. Newer test systems employ PC-based instruments and software.
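The structure of an HIL test can be sketched in a few lines. In a real rig, the sensor read and actuator drive would go through physical I/O hardware; here a simple first-order thermal plant model stands in, and all names and values are illustrative.

```python
# Sketch of a hardware-in-the-loop style control loop. In a real HIL
# rig, read_sensor() and drive_actuator() would talk to DAQ hardware;
# here a first-order thermal model stands in for the plant.

temperature = 20.0    # simulated plant state, degrees C
SETPOINT = 65.0
HEATER_GAIN = 0.5     # degrees added per step at full drive
LOSS_RATE = 0.01      # fraction of excess heat lost per step

def read_sensor() -> float:
    return temperature  # stand-in for a data-acquisition read

def drive_actuator(duty: float) -> None:
    global temperature
    duty = max(0.0, min(1.0, duty))          # actuator saturates
    temperature += HEATER_GAIN * duty        # heater input
    temperature -= LOSS_RATE * (temperature - 20.0)  # heat loss

for _ in range(2_000):
    error = SETPOINT - read_sensor()
    drive_actuator(0.1 * error)  # proportional control

print(f"settled at {read_sensor():.1f} C")
```

The loop settles a few degrees below the setpoint—the classic steady-state droop of purely proportional control—which is exactly the kind of behavior HIL testing is meant to expose before the real hardware is on the bench.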
Finally, a product reaches validation. At this point, it undergoes real-world test conditions, and some tests may require environmental screening for sensitivity to humidity, vibration, and temperature. After making modifications to the design based on the results of these tests, a product moves on to manufacturing and into the hands of consumers.
[Figure caption—Go with the flow: The flow of a new idea from design to production involves several steps, but test results may force a design back a stage for re-evaluation and redesign. Integrating test results and design data with virtual instruments can keep these …]
In each of the five steps above—modeling, simulation, prototype verification, validation, and manufacturing—test results may require several "iterations," or revisions. And the faster the iteration process, the less costly it is. Better for engineers to tweak a design based on quick real-world measurements than to "push" a design back to a previous step after system integration or after a problem gets discovered late in validation. In general, many small, quick iteration loops give better results than a few larger ones.
Imagine if engineers could apply one set of software tools and test instruments from the start of a new design through manufacturing test. The use of common instruments can speed the movement of a concept to a real product in several ways.
First, tests run at one stage of the design chain will use the same equipment and software as those tests run elsewhere. Thus, the iterations take less time. The use of common test equipment reduces the need to duplicate key tests and it ensures tests at one design stage can be run at another stage without changing instruments or test configurations. Also, the use of common equipment and software makes it easy for engineering groups to share test setups and test results.
Second, the same benefits apply across the entire design chain. Tests developed in the R&D lab transfer smoothly up the chain to a manufacturing line. Using the same hardware and software saves time and money, and it eliminates the need to produce new tests and testers specifically for manufacturing. Also, the use of standard equipment reduces maintenance costs for hardware and it simplifies educating employees about new equipment.
Third, common test equipment, along with common software, provides "scalability," so increasing test capabilities across multiple systems takes little time. And the construction of additional test systems requires almost no debugging and setup time.
Fourth, remember that tests evolve during the life of a product. More efficient and faster tests developed in the R&D lab can quickly flow to similar test equipment and software used on a production line.
The widespread use of PC-based instruments and the variety of software for design and test can make the dream of common hardware a reality. But taking advantage of that reality requires that an entire product-development staff—from design engineers to manufacturing engineers—carefully define their needs and requirements. Lest you think that design engineers don't influence manufacturing, or vice versa, consider this: Research performed by Reed Business Information shows that 97% of design engineers support their designs all the way through to the manufacturing lines. (RBI is the parent of Design News.) Although engineers often have different needs—for example, accuracy of tests in the lab vs. speed of tests on a production line—their needs often mesh in requirements for ease of use, programmability, compatibility, network connectivity, and so on.
Today's PC technologies combine the best of bench instruments and racks of "big iron" automated test equipment (ATE). Although older instrumentation bus standards, such as VXI and IEEE 488, still find use in test systems, new test systems take advantage of the computer industry's high-throughput PCI-bus technology. The PCI eXtensions for Instrumentation (PXI) bus, supported by many instrument suppliers, offers over 1,000 measurement and control cards that system designers can use in test equipment. Moderate-cost, high-speed data-acquisition cards work equally well in R&D labs and on production lines. And PXI-bus cards connect to test-related buses that control IEEE 488 instruments, boundary-scan test chains, USB instruments, and others. Tools for boundary scan (IEEE 1149.1), for example, let engineers test connections between digital circuits through a serial four-signal interface.
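The boundary-scan idea is easy to model: test data shifts serially into a chain of scan cells on TDI and falls out the far end on TDO. The sketch below keeps only the shift register; a real IEEE 1149.1 interface also sequences TCK and TMS through a test-access-port state machine.

```python
# Toy model of a boundary-scan chain: a serial shift register where
# bits enter on TDI and exit on TDO. Real JTAG adds a TAP state
# machine driven by TCK and TMS; this keeps only the shift path.

class BoundaryScanChain:
    def __init__(self, length: int):
        self.cells = [0] * length  # one scan cell per pin in the chain

    def shift(self, tdi_bit: int) -> int:
        """Clock one bit in on TDI; the bit falling off the end is TDO."""
        tdo_bit = self.cells[-1]
        self.cells = [tdi_bit] + self.cells[:-1]
        return tdo_bit

chain = BoundaryScanChain(8)
pattern = [1, 0, 1, 1, 0, 0, 1, 0]

# Shift the whole test pattern in (the chain starts at all zeros).
for bit in pattern:
    chain.shift(bit)

# Shift zeros in and capture what emerges: the pattern, oldest bit first.
captured = [chain.shift(0) for _ in range(8)]
print(captured)
```

Comparing the captured bits against the pattern that was shifted in is how boundary-scan tools detect opens and shorts on board-level interconnects.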
A virtual instrumentation approach throughout a design cycle combines general-purpose hardware with software, which defines how a test system functions. Because the results of design and simulation also exist as data within software, engineers have an opportunity to better integrate design and test. In contrast, a test-system vendor defines the firmware and interface for rack-and-stack instruments, so those instruments can't easily take advantage of simulation and design data or other test results.
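The virtual-instrument idea reduces to this: because the measurement is defined in software, the same analysis function can run on simulation output and on samples acquired from general-purpose hardware. In the sketch below, `acquire()` is a hypothetical stand-in for a real DAQ driver call.

```python
# Sketch of a virtual instrument: the measurement is a software
# function, so it applies equally to simulated and acquired data.
# acquire() is a hypothetical stand-in for a real DAQ driver call.

import math

def rms(samples: list[float]) -> float:
    """The 'instrument': an RMS measurement defined purely in software."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def simulate_sine(amplitude: float, n: int = 1_000) -> list[float]:
    """One full cycle of a sine wave, as a simulator might produce."""
    return [amplitude * math.sin(2 * math.pi * k / n) for k in range(n)]

def acquire(n: int = 1_000) -> list[float]:
    # Placeholder: a real system would read n samples from a DAQ card.
    return simulate_sine(3.3, n)

sim_rms = rms(simulate_sine(3.3))   # measurement in the design stage
meas_rms = rms(acquire())           # same measurement on the "bench"
print(f"simulated: {sim_rms:.3f} V  measured: {meas_rms:.3f} V")
```

Because the analysis code is identical at both stages, a discrepancy between the two numbers points at the hardware or the model, never at a difference in measurement method.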