Discover and Innovate with the Confidence of Real Measured Test Data


Learn how advancements in high-speed I/O, high-bandwidth data buses, multicore processing, and stream-to-disk technology have made data bottlenecks disappear.

Table of Contents

  1. Using Faster I/O and Buses to Increase the Amount of Data You Collect
  2. Increasing Storage and Processing Capabilities with PC Technology
  3. Removing the Software Bottlenecks
  4. The Value and Challenge of Real Data

Ten years ago, the amount of test data engineers could collect was limited by their acquisition hardware, PCs, and software. I/O was expensive and slow, computer storage was costly, and software for acquiring data could not handle large volumes of engineering data without computer science expertise. As a result, engineers had to rely on models and simulations to predict the results of large test applications. Today, with cheaper storage, faster PC processors, and advancements in I/O, bus speeds, drivers, and software, you no longer face the same bottlenecks in the amount of data you can collect, manage, and analyze. You can move beyond models with the confidence of real measured test data that you can mine for trends and results.


Using Faster I/O and Buses to Increase the Amount of Data You Collect

With the decreasing cost of analog-to-digital converters (ADCs), devices with a dedicated ADC per channel are becoming more prevalent. With this multi-ADC approach, you can collect significantly more data than with comparable multiplexed solutions. Compare an NI PXI-6251 M Series module, which features a single ADC, with an NI PXIe-6368 X Series module, which features an ADC per channel. As Table 1 shows, with the same number of channels, sampling rate, and sample size of 2 B per sample, the multi-ADC design of NI X Series devices yields significantly more data than a single-ADC device.

Table 1. The New Multi-ADC NI X Series Devices Remove the Bottleneck of I/O Speeds
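The data rates behind Table 1 come from simple arithmetic: channels × per-channel rate × bytes per sample, with the caveat that a single-ADC device shares one converter across its channels. A minimal sketch, using hypothetical channel counts and sampling rates rather than actual datasheet values:

```python
def data_rate_bytes_per_sec(channels, rate_per_channel, bytes_per_sample=2, multiplexed=False):
    """Sustained data rate of an acquisition device.

    For a multiplexed (single-ADC) device, the converter rate is the
    aggregate shared across channels; for a multi-ADC device, every
    channel samples at the full rate simultaneously.
    """
    if multiplexed:
        return rate_per_channel * bytes_per_sample
    return channels * rate_per_channel * bytes_per_sample

# Illustrative numbers (hypothetical, not taken from a datasheet):
# a single-ADC device sharing a 1 MS/s converter across 16 channels
mux = data_rate_bytes_per_sec(16, 1_000_000, multiplexed=True)   # 2 MB/s total
# a multi-ADC device sampling all 16 channels at 2 MS/s each
simultaneous = data_rate_bytes_per_sec(16, 2_000_000)            # 64 MB/s total
print(mux, simultaneous)
```

The multi-ADC device in this sketch produces 32 times the data of the multiplexed one, which is the effect Table 1 quantifies for the real hardware.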

However, multi-ADC boards are not new. Engineers could previously achieve similar I/O speeds with a standard NI S Series device but were limited by bus bandwidth: the shared PCI backplane provided only 132 MB/s across all devices. With PCI Express and PXI Express, each device has a dedicated 250 MB/s of bandwidth per lane and up to 16 lanes, or up to 4 GB/s for a single device. Because each device gets dedicated bandwidth, total system throughput also increases as you add instruments to the system. These increases in I/O and bus speeds have removed the bottleneck that once existed for acquiring real-world data, though handling all of this data with your PC and software can still be challenging.

Figure 1. PCI Express Throughput Scales with the Number of Devices Used
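The bandwidth figures above follow directly from the lane count, and the scaling in Figure 1 follows from each device having dedicated lanes rather than a shared bus. A quick sketch of the arithmetic (the lane counts in the example system are illustrative):

```python
LANE_BW_MB_S = 250  # per-lane PCI Express bandwidth cited in the text

def device_bandwidth(lanes):
    """Dedicated bandwidth of one PCI Express device, in MB/s."""
    return lanes * LANE_BW_MB_S

def system_throughput(devices_lanes):
    # Each device gets its own lanes, so total throughput is the sum
    # over devices -- unlike shared PCI, where 132 MB/s was split.
    return sum(device_bandwidth(lanes) for lanes in devices_lanes)

assert device_bandwidth(16) == 4000       # 4 GB/s for a single x16 device
print(system_throughput([4, 4, 8]))       # a hypothetical three-device system
```

With shared PCI, adding a third device would not add bandwidth; with PCI Express, each new device brings its own lanes, so the total grows.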


Increasing Storage and Processing Capabilities with PC Technology

A major benefit of PC-based data acquisition is that you can take advantage of the constant innovation in off-the-shelf PC technology. Fifty years ago, IBM introduced the first hard drive, which cost $250,000 USD per year to lease and provided 5 MB of storage capacity. Today, storage is exponentially cheaper: you can find a 500 GB hard drive for less than $100 USD. With such inexpensive and readily available storage, you no longer have to choose which data to save in the interest of preserving disk space. You can save everything and decide during postprocessing which information is meaningful or needs further analysis.

At the same time, the ability to quickly acquire, analyze, and store data within the constraints of an off-the-shelf PC is also key. Yet processor clock speeds have hit a wall in recent years. Moore's law, which states that the number of transistors on a chip doubles every 18 to 24 months, still holds true as it has for the last 40 years, but it no longer translates directly into faster processors.

Figure 2. Transistor counts continue to follow Moore's law, but clock speeds have plateaued, prompting chip vendors such as Intel and AMD to put multiple cores on a single chip.

Today, raising clock speeds for performance gains is no longer viable because of power consumption and heat dissipation constraints. Chip vendors have instead moved to new architectures with multiple processor cores on a single chip. With multicore processors, you can complete more total work than with a single core, but the key is creating applications that take advantage of the additional cores by running pieces of code in parallel threads.

Traditionally, text-based programmers have had to explicitly define these threads in their applications. Because text-based code is inherently serial, visualizing the parallelism in a multithreaded piece of code is difficult. By contrast, the graphical nature of NI LabVIEW makes it easy to visualize and program parallel applications, as seen in Figure 3. By parallelizing applications, you can generate and process large amounts of data quickly.

Figure 3. Compared to text-based languages, LabVIEW makes it simple to create multithreading applications.
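LabVIEW expresses this parallelism graphically, so there is no exact text equivalent of Figure 3. As a rough analogy, here is what explicitly fanning per-channel work out across workers looks like in Python with its standard concurrent.futures module; the per-channel RMS routine and the dummy data are invented stand-ins:

```python
from concurrent.futures import ThreadPoolExecutor  # ProcessPoolExecutor for CPU-bound work

def process_channel(samples):
    # Stand-in for per-channel analysis (here, an RMS calculation).
    return (sum(s * s for s in samples) / len(samples)) ** 0.5

channels = [[1.0, -1.0] * 500 for _ in range(8)]  # 8 channels of dummy data

# The per-channel work is explicitly distributed across worker threads;
# in LabVIEW, two independent wires on the diagram express the same idea.
with ThreadPoolExecutor() as pool:
    results = list(pool.map(process_channel, channels))

print(results)  # RMS of each channel: 1.0
```

The thread pool must be declared, sized, and mapped over explicitly here, which is the bookkeeping the graphical dataflow model hides.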



Removing the Software Bottlenecks

In the end, the goal of every acquisition system is to analyze and visualize data quickly to get to results. But even with the data collection bottlenecks resolved, you can still lose efficiency when handling the data for analysis and reporting. One major bottleneck is how best to store data to disk. Countless file formats exist, yet none seems to truly match the needs of engineers collecting measurement data at high speeds. This is why National Instruments introduced the Technical Data Management Streaming (TDMS) file format. TDMS is a well-organized, self-documenting format optimized for high-speed streaming applications. Using TDMS, you can store and group channels of data and their properties within a file for easier postprocessing and exchange.

Figure 4. Each TDMS file contains descriptive information on the file, group, and channel levels.
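The file/group/channel hierarchy in Figure 4 can be sketched as a plain Python structure. This models only the descriptive hierarchy, not the binary TDMS layout, and every name and property below is invented for illustration:

```python
# Conceptual model of the TDMS descriptive hierarchy (not the on-disk format).
# Descriptive properties can live at all three levels: file, group, channel.
tdms_file = {
    "properties": {"author": "Test Lab", "description": "Engine run 42"},
    "groups": {
        "Vibration": {
            "properties": {"test_cell": "Cell 3"},
            "channels": {
                "Accelerometer0": {
                    "properties": {"unit": "g", "sample_rate_hz": 51200},
                    "data": [0.01, 0.02, -0.01],
                },
            },
        },
    },
}

# Properties at every level are what make the file self-documenting
# and searchable during postprocessing.
chan = tdms_file["groups"]["Vibration"]["channels"]["Accelerometer0"]
print(chan["properties"]["unit"])  # g
```

Because the metadata travels with the data at each level, a reader can locate and interpret a channel without any external documentation.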

TDMS is also optimized for high-speed streaming, removing the bottleneck that previously existed for saving data to disk. In the past, engineers worked around file format limitations with approaches such as decimating the data being written, but this meant outright data loss. With TDMS, you can reach write speeds of up to 400 MB/s, eliminating the bottleneck introduced by file I/O.
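The decimation workaround is easy to illustrate: keeping every Nth sample cuts the disk rate by a factor of N, but the skipped samples are gone for good. A minimal sketch:

```python
def decimate(samples, factor):
    # Keep every `factor`-th sample; the rest never reach disk.
    return samples[::factor]

acquired = list(range(1000))        # 1000 samples off the device
logged = decimate(acquired, 10)     # only 100 samples are written
print(len(acquired) - len(logged))  # 900 samples irrecoverably discarded
```

A fast streaming format removes the need for this trade-off, so every acquired sample can be preserved for postprocessing.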

For even higher speeds and greater ease of use, you can use the Configure Logging VI installed with the NI-DAQmx driver. This VI stores NI-DAQmx scaling information in the header of the TDMS file so that only the raw data is written to disk. Using just this single VI, you can achieve file write speeds of up to 1.2 GB/s.

Figure 5. The Configure Logging VI is the fastest and easiest way to stream data to disk in LabVIEW.
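One plausible way to see why logging only raw data is faster: a raw 16-bit ADC sample occupies 2 bytes, while a scaled double-precision value occupies 8, so writing raw codes and storing the scaling coefficients once in the header can cut the bytes written per sample by 4x. The arithmetic below is illustrative only, not a description of the NI-DAQmx implementation:

```python
RAW_BYTES = 2     # one 16-bit ADC code
SCALED_BYTES = 8  # one 64-bit float after scaling

def bytes_to_log(samples, raw=True):
    """Bytes written to disk for a given number of samples."""
    return samples * (RAW_BYTES if raw else SCALED_BYTES)

samples = 100_000_000  # e.g. 50 channels at 2 MS/s for 1 s
ratio = bytes_to_log(samples, raw=False) / bytes_to_log(samples, raw=True)
print(ratio)  # 4.0 -- scaled logging writes 4x the bytes of raw logging
```

The scaling coefficients themselves are a few bytes written once per channel, so their cost is negligible next to the per-sample savings; readers apply the scaling on demand when the data is loaded.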

Once you have stored all of this data to disk, you need to turn the raw data into meaningful information. Typically, engineers reach for common analysis packages such as Microsoft Excel to sift through the data for results. However, these spreadsheet programs were not designed for engineers: they cannot handle the large amounts of data being collected, they lack engineering analysis functions, and they do not make it easy to find trends within the data.

NI software such as LabVIEW and DIAdem was designed to help engineers get to results quickly. This software handles the large amounts of data engineers collect and includes the analysis and reporting tools that engineers require. And now, with the integration of NI DataFinder technology, you can quickly find the trends and correlations in your data without being bogged down by the sheer amount of data collected.

Figure 6. Postprocessing tools designed for engineers, such as NI DIAdem, make it easy to quickly analyze and report engineering data.

NI DataFinder is a data index that stores the properties associated with your test data files. You can then create queries in LabVIEW or DIAdem that search these properties to find trends and outliers in your test data. Once you have pinpointed the data of interest, you can quickly process, visualize, and report on it using LabVIEW or DIAdem. With NI DataFinder, you can quickly find the data that matters most and get to results. For more information on NI DataFinder data management technology, see Best Practices for Saving Measurement Data.

Figure 7. Using NI DataFinder in LabVIEW, you can create applications to quickly find trends and outliers and generate results faster.
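The idea behind property-based queries can be sketched in a few lines of Python. DataFinder builds and maintains its index automatically across your files; the toy index and query helper below are invented purely to illustrate the concept of searching on properties rather than scanning raw data:

```python
# A toy index of per-file properties, like those a data index harvests
# from TDMS headers (all records here are fabricated).
index = [
    {"file": "run_001.tdms", "unit": "g", "max": 2.1, "operator": "A"},
    {"file": "run_002.tdms", "unit": "g", "max": 9.7, "operator": "B"},
    {"file": "run_003.tdms", "unit": "V", "max": 1.2, "operator": "A"},
]

def query(index, **criteria):
    # Return records whose properties satisfy every predicate.
    return [rec for rec in index
            if all(pred(rec.get(key)) for key, pred in criteria.items())]

# Find acceleration channels whose peak exceeds 5 g -- candidate outliers.
outliers = query(index, unit=lambda u: u == "g", max=lambda m: m > 5)
print([r["file"] for r in outliers])  # ['run_002.tdms']
```

Because the query runs against a compact index of properties instead of the raw waveforms, it stays fast no matter how large the underlying data files grow.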


The Value and Challenge of Real Data

Over the last 10 years, technology has advanced so that you can easily collect real-world data instead of relying solely on simulations and models. Although real data is enormously valuable, challenges quickly arise unless your hardware, PC, and software can handle it efficiently. With modern I/O, bus, storage, and software technology, you can avoid the major bottlenecks and use real-world data in your test applications to pinpoint results and trends over time.
