The trends of increasing device complexity and technology convergence are driving test systems to be more flexible. Test systems must accommodate device changes over time, while cost pressures simultaneously demand longer system lifetimes. The only way to accomplish both objectives is through a software-defined, modular architecture. This white paper introduces the software-defined concept through virtual instrumentation, presents options for the hardware platform and software implementation, and discusses how a modular system is ideally suited to meet automated test equipment (ATE) challenges.
Fundamentally, there are two types of instrumentation today, virtual and traditional. Figure 1 illustrates the architectures of these types.
Figure 1. Comparing traditional and virtual instrumentation architectures. Both share similar hardware components; the primary difference between the architectures is where the software resides and whether it is user-accessible.
The diagrams show the similarities in these two approaches. Both have measurement hardware, a chassis, a power supply, a bus, a processor, an OS, and a user interface. Because the approaches use the same basic components, the most obvious difference from a purely hardware standpoint is how the components are packaged. A traditional, or stand-alone, instrument puts all of the components in the same box for every discrete instrument. An example of a stand-alone instrument is a manual instrument controlled over GPIB, USB, or LAN/Ethernet. These instruments are designed as discrete entities and are not primarily intended for system use. Although traditional instruments are plentiful, their software processing and user interface are fixed in the instrument itself and can be updated only when and how the vendor chooses (for example, through a firmware update). As a result, users cannot perform measurements that are not in a traditional instrument's function list, which makes it challenging to address new standards or to modify the system as needs change.
By contrast, a software-defined virtual instrument makes the raw data from the hardware available to users to define their own measurements and user interface. With this software-defined approach, users can make custom measurements, perform measurements for emerging standards, or modify the system if requirements change (for example, to add instruments, channels, or measurements). While user-defined software can be applied to stand-alone, application-specific hardware, it is ideally paired with general-purpose, modular hardware, where the full flexibility and performance of the measurement software can be exploited. This combination of flexible, user-defined software and scalable hardware components is the core of modular instrumentation.
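As a minimal sketch of this software-defined idea, the following Python snippet computes two user-defined measurements directly from raw samples on the host. The data here is simulated in place of a hardware fetch, and no vendor API is assumed:

```python
import numpy as np

def custom_measurements(raw_samples):
    """User-defined measurements computed on the host from raw digitizer data."""
    rms = float(np.sqrt(np.mean(np.square(raw_samples))))
    peak_to_peak = float(np.max(raw_samples) - np.min(raw_samples))
    return {"rms": rms, "peak_to_peak": peak_to_peak}

# Simulated raw data standing in for a hardware fetch:
# a 1 kHz sine sampled at 100 kS/s for 10 ms (10 full cycles)
t = np.arange(0, 0.01, 1 / 100_000)
samples = np.sin(2 * np.pi * 1000 * t)
result = custom_measurements(samples)
```

Because the measurement lives in user code rather than in firmware, supporting a new standard or a custom analysis is an edit to a function like this, not a vendor firmware update.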
Modular instrumentation can take several forms. In a well-designed modular instrumentation system, many of the components – such as the chassis and power supply – are shared across instrument modules instead of being duplicated for every instrument function. These instrument modules can also include different types of hardware, including oscilloscopes, function generators, digital I/O, and RF instruments. In some cases, as shown in Figure 2, the measurement hardware is simply a peripheral that is installed in one of the host computer peripheral ports or peripheral slots. In this case, the host PC provides the processor for performing the measurements in software as well as the chassis for the power supply and I/O.
Figure 2. Examples of measurement hardware choices for modular instrumentation include a USB peripheral module on the left, and a PCI Express plug-in module on the right.
In other cases, such as PXI (PCI eXtensions for Instrumentation) – a rugged platform for test, measurement, and control supported by more than 70 member companies – the measurement hardware is housed in an industrial chassis (see Figure 3).
Figure 3. This example of a modular instrumentation system uses PXI hardware and NI LabVIEW graphical development software.
In a PXI system, the host computer can be embedded in the chassis (as shown in Figure 3) or it can be a separate laptop, desktop, or server that controls the measurement hardware through a cabled interface. Because a PXI system uses the same buses internal to a PC – PCI and PCI Express – and off-the-shelf PC components to control the system, the same modular instrumentation concepts apply equally to a PXI system or a PC. (PXI does provide additional benefits for modular instrumentation not covered here, such as increased channel count, portability, and ruggedness; for more information, see ni.com/pxi.) Regardless of whether the system uses PXI, a desktop PC with internal plug-in modules, or a desktop PC with peripheral I/O modules, sharing the chassis and controller greatly reduces cost and also gives the user control of the measurement-and-analysis software. While there are many configuration choices for modular instrumentation, the differentiator between this approach and traditional instrumentation is that the software is open: users can define their own measurements as test needs change or when needed measurements are unavailable on traditional instruments.
It is important to note that this modular approach does not mean that instrument or channel synchronization suffers when compared to traditional instruments that combine functions into a single box. On the contrary, modular instruments are designed to be integrated for system use. All modular instruments provide timing and synchronization capabilities through shared clocks and triggers. For example, for the highest synchronization accuracy, baseband, IF, and RF instruments can synchronize to one another with less than 100 ps interinstrument skew – better than the skew across multiple channels on the same instrument.
While the term “modular” is sometimes misapplied based on the hardware packaging alone, modular instrumentation is about more than just packaging. Users should expect three things of a modular instrumentation system: reduced cost and size through a shared chassis, backplane, and processor; faster throughput through a high-speed connection to the host processor; and greater flexibility and longevity through user-defined software.
As detailed above, all instruments in a modular instrumentation system share the same power supply, chassis, and controller. Stand-alone instruments duplicate the power supply, chassis, and/or controller for every instrument, adding cost and size and decreasing reliability. In fact, every automated test system requires a PC regardless of the bus used; a modular architecture that shares this controller across all the instruments amortizes that cost across the entire system. In modular instrumentation systems, GHz PC processors analyze the data and make measurements using software. The result is measurements at 10 to 100 times the throughput of a test system built solely on traditional instruments, which use built-in vendor-defined firmware and application-specific processors. For example, a typical vector signal analyzer (VSA) performs 0.13 power-in-band measurements/s, whereas an NI modular VSA can perform 4.18 power-in-band measurements/s – a 33X improvement.
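The power-in-band throughput figures above are quoted from the text; as an illustration of how such a measurement can run in software on a host processor, the sketch below estimates the power of complex baseband samples falling within a frequency band using an FFT. The implementation is illustrative, not NI's algorithm:

```python
import numpy as np

def power_in_band(iq_samples, sample_rate, f_low, f_high):
    """Estimate the power of complex baseband samples that falls within
    [f_low, f_high] Hz, using an FFT computed on the host processor."""
    n = len(iq_samples)
    # Normalize so that summing |bin|^2 yields mean signal power (Parseval)
    spectrum = np.fft.fftshift(np.fft.fft(iq_samples)) / n
    freqs = np.fft.fftshift(np.fft.fftfreq(n, d=1 / sample_rate))
    mask = (freqs >= f_low) & (freqs <= f_high)
    return float(np.sum(np.abs(spectrum[mask]) ** 2))

# A unit-amplitude complex tone at +1 MHz, sampled at 10 MS/s;
# essentially all of its power lands in the 0.9-1.1 MHz band
fs = 10e6
t = np.arange(1000) / fs
tone = np.exp(2j * np.pi * 1e6 * t)
p = power_in_band(tone, fs, 0.9e6, 1.1e6)
```

Because the computation is ordinary host code, a multi-GHz PC processor can repeat it as fast as the bus can deliver samples, which is the source of the throughput advantage described above.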
Modular instruments require a high-bandwidth, low-latency bus to connect the instrument modules to the shared processor for performing user-defined measurements. While USB provides an excellent user experience in terms of ease of use, PCI and PCI Express (and by extension, the PXI platform, which is based on these buses) provide the highest performance in modular instrumentation. PCI Express today provides slots with up to 4 GB/s, and PXI provides slots with bandwidth up to 2 GB/s each – more than 33 times as fast as Hi-Speed USB, 160 times as fast as 100 Mb/s Ethernet, and even 16 times as fast as emerging Gigabit Ethernet (Figure 4). Peripheral buses such as LAN and USB always connect to the PC processor through an internal bus such as PCI Express and are therefore, by definition, lower in performance. As an example of how high-speed buses affect test and measurement, consider a modular RF acquisition system. A PCI Express x4 (2 GB/s) slot in a desktop PC or PXI system can stream two channels of 100 MS/s, 16-bit IF (intermediate frequency) data directly to a processor for computation. Because neither LAN nor USB can meet these requirements, instruments that need this level of performance always include an embedded, vendor-defined processor to perform the measurements – in which case they are no longer modular.
Figure 4. PCI and PCI Express provide the highest bandwidth and lowest latency, decreasing test time and delivering flexibility and longevity through user-defined software.
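The bandwidth arithmetic behind the RF streaming example can be checked directly. The sketch below computes the sustained rate the example requires and compares it against each bus, using the peak rates quoted in the text as approximations:

```python
def stream_rate_bytes_per_s(channels, samples_per_s, bits_per_sample):
    """Sustained throughput needed to stream raw samples to the host."""
    return channels * samples_per_s * bits_per_sample / 8

# Two channels of 100 MS/s, 16-bit IF data, as in the RF streaming example
required = stream_rate_bytes_per_s(2, 100e6, 16)  # 400 MB/s

# Approximate peak rates of the buses compared in the text (bytes/s)
bus_rates = {
    "PCI Express x4": 2e9,       # 2 GB/s per slot
    "Hi-Speed USB": 60e6,        # 480 Mb/s = 60 MB/s
    "Gigabit Ethernet": 125e6,   # 1 Gb/s = 125 MB/s
}
fits = {bus: rate >= required for bus, rate in bus_rates.items()}
```

Only the PCI Express slot clears the 400 MB/s requirement, which is why instruments at this performance level otherwise fall back to an embedded, vendor-defined processor.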
In a modular instrument, the high-speed connection to the host delivers flexibility and longevity because it enables the software to reside on the host instead of on the instrument. With the software running on the host, the user, not the vendor, defines how the instrument operates. This architecture gives you the power to (1) make measurements that are not common enough to be included in the typical vendor-defined, nonmodular approach; (2) create measurements for unreleased standards; and (3) define the algorithms used to make specific measurements. The user-defined nature of the software also means that you can add or modify measurements, and even instruments, as the device under test changes. You can also use the direct software access to monitor or control these modular instruments across the network.
It is worth noting that these hardware implementations do not sacrifice measurement performance. Today, the instruments designed with a modular instrumentation approach include the industry’s highest-resolution digitizer, the highest-bandwidth arbitrary waveform generator, and the most accurate 7½-digit digital multimeter.
The role of software in modular instrumentation cannot be overstated. Software converts the raw bit stream from hardware into a useful measurement. A well-designed modular instrumentation system considers multiple layers of software, including I/O drivers, application development, and test management, as shown in Figure 5.
Figure 5. Software layers are often used in a modular instrumentation system.
The bottom layer, Measurement and Control Services, is one of the most crucial elements of a modular instrumentation system, though it is often overlooked. This layer represents the driver I/O software and hardware configuration tools. The driver software is critical because it provides the connectivity between the test development software and the hardware for measurement and control.
Instrument drivers provide a set of high-level, human-readable functions for interfacing with instruments. Each instrument driver is specifically tailored to a particular model of instrument to provide an interface to its unique capabilities. Of particular importance in an instrument driver is its integration with the development environment so that the instrument commands are a seamless part of the application development. System developers need instrument driver interfaces optimized for their development environment of choice, for example, NI LabVIEW, C, C++, or Microsoft .NET.
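To make the idea of a high-level, human-readable driver interface concrete, here is a hypothetical sketch in Python. The class and method names are illustrative stand-ins, not an actual NI driver API:

```python
# A hypothetical high-level instrument driver interface; the class and
# method names here are illustrative stand-ins, not an actual vendor API.
class Digitizer:
    def __init__(self, resource_name):
        self.resource_name = resource_name   # e.g. a chassis/slot address
        self.sample_rate = None
        self.num_samples = 0

    def configure_timing(self, sample_rate, num_samples):
        """A human-readable call that hides low-level register programming."""
        self.sample_rate = sample_rate
        self.num_samples = num_samples

    def read(self):
        """A real driver would fetch hardware samples over the bus; this
        placeholder returns a zeroed buffer of the configured length."""
        return [0.0] * self.num_samples

# The call sequence reads like the measurement it performs
scope = Digitizer("PXI1Slot2")
scope.configure_timing(sample_rate=1e6, num_samples=1000)
data = scope.read()
```

The value of such an interface is that the instrument commands read as a natural part of the application code, whatever the development environment.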
Also represented in Measurement and Control Services are configuration tools. These configuration tools include resources for configuring and testing I/O, as well as storing scaling, calibration, and channel-aliasing information. These tools are important for quickly building, troubleshooting, and maintaining an instrumentation system.
The software in the Application Development Environment layer provides the tools to develop the code or procedure for the application. Although graphical programming is not a requirement of a modular instrumentation system, these systems often use graphical tools for their ease of use and rapid development. Graphical programming uses “icons,” or symbolic functions, that pictorially represent the action to be performed, as shown in Figure 6. These symbols are connected together through “wires” that pass data and determine order of execution. LabVIEW provides the industry’s most used and most complete graphical development environment.
Some applications also require an additional layer of software management, either for test execution or for visibility into test data. These needs are represented in the System Management Software layer. For highly automated test systems, test management software provides a framework for sequencing, branching/looping, report generation, and database integration, and it must integrate tightly with the development environments where the application-specific code is created. National Instruments' test management software, for example, provides this framework and includes connectivity to all common development environments. Other applications need visibility into large quantities of test data – quick access to large volumes of scattered data, consistent reporting, and data visualization. Software tools addressing these needs aid in managing, analyzing, and reporting data collected during data acquisition and/or generated during simulations.
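As a toy illustration of the sequencing, branching, and reporting concepts described above (not the architecture of any actual product), a minimal test sequencer might look like this:

```python
# A toy sketch of test-management concepts: sequencing, branching on
# failure, and report generation; illustrative only.
def run_sequence(steps, dut):
    """Run test steps in order; 'steps' maps step names to (test_fn, limit)."""
    report = []
    for name, (test_fn, limit) in steps.items():
        value = test_fn(dut)
        passed = value <= limit
        report.append({"step": name, "value": value, "passed": passed})
        if not passed:  # branching: skip remaining steps on first failure
            break
    return report

# Hypothetical device-under-test readings and pass limits
dut = {"offset_v": 0.002, "noise_v": 0.05}
steps = {
    "dc_offset": (lambda d: d["offset_v"], 0.01),   # limit: 10 mV
    "noise":     (lambda d: d["noise_v"], 0.10),    # limit: 100 mV
}
report = run_sequence(steps, dut)
```

A production test executive adds looping, database logging, and connectivity to the development environment, but the core control flow is the same.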
Every layer of this software architecture should be considered for a modular instrumentation system.
As devices become more complex and include more disparate technologies, test systems must become more flexible. While test systems must accommodate devices changing over time, cost pressures demand longer system lifetimes. The only way to accomplish both objectives is through a software-defined, modular architecture. Through shared components, high-speed buses, and open, user-defined software, modular instrumentation is best suited to meet the needs of ATE today and in the future.
National Instruments, a leader in automated test, is committed to providing the hardware and software products engineers need to create these next-generation test systems.
Test System Development Resource Library
National Instruments has developed an extensive collection of technical guides to assist you with all elements of your test system design. The content for these guides is based on best practices shared by industry-leading test engineering teams that participate in NI customer advisory boards and the expertise of the NI test engineering and product research and development teams. Ultimately, these resources teach you test engineering best practices in a practical and reusable manner. Download guides from the Test System Development Resource Library.