LabVIEW for Measurement and Data Analysis

Publish Date: Jan 04, 2017

Overview

Thousands of engineers and scientists rely on LabVIEW for a variety of applications: test and measurement, process control and automation, monitoring, and simulation. LabVIEW is the tool of choice due to its unparalleled connectivity to instruments, powerful data acquisition capabilities, natural dataflow-based graphical programming interface, scalability, and overall completeness of functionality. One need persists regardless of the area of expertise: users must manipulate data and measurements and make decisions based on them. This paper focuses on the capabilities that make LabVIEW the right tool for data and measurement analysis.

Table of Contents

  1. Introduction
  2. Choosing the Correct Method for Analysis
  3. Analysis Categories
  4. Which Analysis Tools are Available for LabVIEW?
  5. Conclusion
  6. For More Information

1. Introduction

Users generally start their work by acquiring data into an application or program, because their tasks typically require interaction with physical processes. In order to extract valuable information from that data, make decisions about the process, and obtain results, the data must be manipulated and analyzed. Unfortunately, combining analysis with data acquisition and data presentation is not always a straightforward process. Application software packages typically address one component of the application, but seldom address all the aspects needed for a complete solution. LabVIEW was designed to address the requirements for a start-to-finish, fully integrated solution, so that customers can seamlessly integrate all phases of their application in a single environment.


Figure 1. LabVIEW Virtual Instrument Block Diagram

While there are many tools that independently address each of these requirements, only LabVIEW combines all of them with the power of graphical programming and state-of-the-art data acquisition hardware, using the power of your PC. It is the combination of data acquisition, data analysis, and presentation of results that truly maximizes the power of virtual instrumentation. A virtual instrument consists of an industry-standard computer or workstation equipped with powerful application software, cost-effective hardware such as plug-in boards, and driver software, which together perform the functions of traditional instruments. This is why applications and programs built with LabVIEW are referred to as VIs (virtual instruments).

As an engineering-focused tool, LabVIEW makes hundreds of analysis functions available for researchers, scientists, and engineers, as well as students and professors. They can build these functions right into their applications to make intelligent measurements and obtain results faster.


2. Choosing the Correct Method for Analysis

Users incorporate analysis into their applications and programs in different ways. There are certain considerations that help determine the way in which analysis should be performed.

Inline vs. Offline analysis

Inline analysis implies that the data is analyzed within the same application where it is acquired. This is generally the case in applications where decisions must be made during run time and the results have direct consequences on the process, typically through changing parameters or executing actions; control applications are a typical example. When dealing with inline analysis, it is important to consider the amount of data acquired and the particular analysis routines performed on that data. A proper balance must be found, because analysis routines can easily become computationally intensive and adversely affect the performance of the application.

Other examples of inline analysis are applications where the parameters of the measurement need to adapt to the characteristics of the measured signal. One case is where one or more signals need to be logged but change very slowly except for sudden bursts of high-speed activity. To reduce the amount of data logged, the application must quickly recognize the need for a higher sampling rate during the burst and reduce the rate again when the burst is over. By measuring and analyzing certain aspects of the signals, the application can adapt to the circumstances and apply the appropriate execution parameters. Although this is only one example, there are thousands of applications where a certain degree of intelligence - the ability to make decisions based on various conditions - and adaptability are required, which can only be provided by adding analysis algorithms to the application.
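Because LabVIEW is a graphical language, the idea is easiest to show here in text form. The following Python sketch illustrates one way such burst-driven rate switching could be structured; read_block, set_sample_rate, and log are hypothetical placeholders for the acquisition, configuration, and logging layers, and the rates and threshold are arbitrary values chosen for illustration.

# Sketch only: adaptive sampling driven by burst detection.
# read_block, set_sample_rate, and log are hypothetical callables supplied by
# the surrounding application; the rates and threshold are illustrative.
import numpy as np

BASE_RATE = 1_000       # Hz, slow background logging
BURST_RATE = 100_000    # Hz, used while a burst is active
THRESHOLD = 0.5         # RMS level that signals a burst (application specific)

def monitor(read_block, set_sample_rate, log):
    rate = BASE_RATE
    set_sample_rate(rate)
    while True:
        block = read_block()                          # one block of samples (NumPy array)
        rms = float(np.sqrt(np.mean(np.square(block))))
        if rms > THRESHOLD and rate == BASE_RATE:
            rate = BURST_RATE                         # burst detected: sample faster
        elif rms <= THRESHOLD and rate == BURST_RATE:
            rate = BASE_RATE                          # burst over: cut the logged data back down
        set_sample_rate(rate)
        log(block, rate)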

Decisions based on acquired data are not always made in an automated manner. Very frequently, those involved with the process need to monitor the execution and determine whether it is performing as expected or if one or more variables need to be adjusted. Although it is not uncommon for users to log data, extract it from a file or database and then analyze it offline to modify the process, many times the changes need to happen during run time. In these cases, the application must handle the data coming from the process, and then manipulate, simplify, format, and present the data in a way that it is most useful to the user. LabVIEW users can then take advantage of the many visualization objects to present that data in the most concise and useful manner.

LabVIEW offers analysis and mathematical routines that natively work together with data acquisition functions and display capabilities, so that they can be easily built into any application. In addition, LabVIEW offers analysis routines for point-by-point execution; these routines are designed specifically to meet the needs of inline analysis in real-time applications. Users should consider certain aspects when deciding whether point-by-point routines are appropriate.

Point-by-point analysis is essential when dealing with control processes where high-speed, deterministic, point-by-point data acquisition is present. Any time resources are dedicated to real-time data acquisition, point-by-point analysis becomes a necessity as acquisition rates and control-loop rates increase by orders of magnitude. The point-by-point approach simplifies the design, implementation, and testing process, because the flow of the application closely matches the natural flow of the real-world processes that the application is monitoring and controlling.


Figure 2. Array-based Analysis vs. Point-by-Point Analysis
 

Real-time data acquisition and analysis continue to demand more streamlined and stable applications. Point-by-point analysis is streamlined and stable, because it ties directly into the acquisition and analysis process. With streamlined, stable point-by-point analysis, the acquisition and analysis process can move closer to the point of control in FPGA (field programmable gate array) chips, DSP chips, embedded controllers, dedicated CPUs, and ASICs.
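The distinction in Figure 2 can be summarized with a simple example. The Python sketch below contrasts an array-based mean, computed on a whole buffer after acquisition, with a point-by-point running mean that is updated as each sample arrives; this is illustrative placeholder logic, not the LabVIEW Point-By-Point VIs themselves.

import numpy as np

# Array-based: acquire an entire buffer, then analyze it in one call.
def array_based_mean(buffer):
    return float(np.mean(buffer))

# Point-by-point: update the result one sample at a time, so the analysis
# keeps pace with a deterministic acquisition or control loop.
class RunningMean:
    def __init__(self):
        self.count = 0
        self.mean = 0.0

    def update(self, sample):
        self.count += 1
        self.mean += (sample - self.mean) / self.count   # incremental mean update
        return self.mean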

To better understand the advantages of point-by-point analysis routines, National Instruments suggests reading the document titled "Getting Started with LabVIEW Point-By-Point VIs."

This document describes how to use the VIs and includes a case study that shows a complete application built in LabVIEW. The application demonstrates the simplicity and flexibility of point-by-point analysis.

By adding these powerful algorithms and routines to their applications, users eliminate guesswork and create intelligent processes that can analyze results during run time, improving efficiency and iteratively correlating input variables to experiment or process performance.

Offline applications typically do not demand that results be obtained in real time in order to make decisions on the process; they require only that sufficient computational resources are available. The main intent of such applications is to identify the cause and effect of variables affecting a process by correlating multiple data sets. These applications generally require importing data from custom binary or ASCII files and from commercial databases such as Oracle, Access, and other SQL/ODBC-enabled databases. Once the data is imported into LabVIEW, users perform several (or hundreds) of the available analysis routines, manipulate the data, and arrange it in a specific format for reporting purposes. LabVIEW provides functions to access any type of file format and database, seamlessly connect to powerful reporting tools such as NI DIAdem and the Report Generation Toolkit for Microsoft Office, and use the latest data-sharing technologies such as XML, Web-enabled data presentation, and ActiveX.
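As a concrete illustration of the offline pattern, the short Python sketch below loads a logged data file, correlates two variables, and writes a small text summary for a report; the file name and column names are assumptions made for the example, not part of any NI API.

import numpy as np

# Assumed CSV log with a header row containing "temperature" and "yield_pct" columns.
data = np.genfromtxt("process_log.csv", delimiter=",", names=True)
temperature = data["temperature"]
yield_pct = data["yield_pct"]

# Screen for cause and effect by correlating the two data sets.
r = np.corrcoef(temperature, yield_pct)[0, 1]

with open("summary.txt", "w") as report:
    report.write(f"Samples analyzed: {temperature.size}\n")
    report.write(f"Correlation(temperature, yield): {r:.3f}\n")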

Programmatic vs. Interactive Analysis

As LabVIEW users, scientists and engineers are very familiar with the many ways in which they can acquire data from hundreds of devices. They build intelligence into their applications to perform inline analysis and present results while the applications are running. They are also aware that acquiring data and processing it for the sake of online visualization is not enough. Users typically store hundreds or thousands of megabytes of data on hard drives and in databases. After anywhere from one to hundreds of runs of the application, users proceed to extract information in order to make decisions, compare results, and make appropriate changes to the process until the desired results are achieved.

It is relatively easy to acquire amounts of data so large that it rapidly becomes unmanageable. In fact, with a fast DAQ board and enough channels, it may only take a few milliseconds to compile thousands of values. It is not a trivial task to make sense out of all that data. Engineers and scientists are typically expected to present reports, create graphs, and ultimately corroborate any assessments and conclusions with empirical data. Without the right tools, this can easily become a daunting task, resulting in lost productivity.

To simplify the process of analyzing measurements, LabVIEW programmers create applications with dialogs and interfaces that other users can operate; depending on the input provided, specific analysis routines are performed on a given data set. Building this type of application adds a degree of interactivity to the analysis. For this to be efficient, the programmer must have extensive knowledge of the information and the types of analysis in which the user is interested.


Figure 3. Time Domain Reflection VI Based on Joint Time-Frequency Analysis Functions


With LabVIEW, users can easily perform significant data reduction and formatting before storing data to disk, so that when the stored data is retrieved for further analysis, it is easier to handle. LabVIEW also provides hundreds of functions for generating reports based on the results and information obtained from the acquired data.
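For example, decimating a high-rate signal before logging it keeps the stored files manageable. The Python sketch below uses SciPy's decimate function (an anti-aliasing low-pass filter plus downsampling) as a generic stand-in for whatever data-reduction step an application performs; the signal, reduction factor, and file name are placeholders.

import numpy as np
from scipy import signal

fs = 100_000
raw = np.random.randn(fs * 10)                      # 10 s of raw data (placeholder signal)
reduced = signal.decimate(raw, q=10)                # anti-aliased 10x data reduction
reduced.astype(np.float32).tofile("run_001.bin")    # compact binary log (assumed file name)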

National Instruments offers additional tools that are highly integrated with LabVIEW and are designed to enhance collaborative engineering. NI DIAdem is one such tool; it provides an easy-to-use environment for interactive, postacquisition analysis and report generation, with powerful technical data management capabilities. To learn more about DIAdem, go to ni.com/diadem.


3. Analysis Categories

LabVIEW offers hundreds of built-in analysis functions that cover different areas and methods of extracting information from acquired data. You can use these functions as is, or modify, customize, and extend them to suit a particular need. These functions are categorized in the following groups: Measurement, Signal Processing, Mathematics, Image Processing, Control, Simulation, and Application Areas.

Measurement
Amplitude and Level
Frequency (Spectral) Analysis
Noise and Distortion
Pulse and Transition
Signal and Waveform Generation
Time Domain Analysis
Tone Measurements

Signal Processing
Digital Filters
Convolution and Correlation
Frequency Domain
Joint Time-Frequency Analysis (Signal Processing Toolset)
Sampling/Resampling
Signal Generation
Superresolution Spectral Analysis (Signal Processing Toolset)
Transforms
Time Domain
Wavelet and Filter Bank Design (Signal Processing Toolset)
Windowing

Mathematics
Basic Math
Curve Fitting and Data Modeling
Differential Equations
Interpolation and Extrapolation
Linear Algebra
Nonlinear Systems
Optimization
Root Finding
Special Functions
Statistics and Random Processes

Image Processing
Blob Analysis and Morphology
Color Pattern Matching
Filters
High-Level Machine Vision Tools
High-Speed Grayscale Pattern Matching
Image Analysis
Image and Pixel Manipulation
Image Processing
Optical Character Recognition
Region-of-Interest Tools

Control
PID and Fuzzy Control

Simulation
Simulation Interface (Simulation Interface Toolkit)

Application Areas
Machine Condition Monitoring (Order Analysis Toolset)
Machine Vision (IMAQ, Vision Builder)
Motion Control
Sound and Vibration (Sound and Vibration Analysis Toolset)


4. Which Analysis Tools are Available for LabVIEW?

NI LabVIEW already includes a powerful set of tools for analysis. These tools encompass a built-in set of libraries and functions designed specifically for analysis, with which users can address a wide range of applications.

LabVIEW analysis tools cover a broad range of applications. Advanced analysis functions can measure such signal characteristics as total harmonic distortion, impulse response, frequency response, and cross-power spectrum. Scientists and engineers can also incorporate mathematics or numerical analysis into their applications for purposes such as solving differential equations, optimization, root finding, and other mathematical problems.
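Two of those measurements, the cross-power spectrum and a frequency-response estimate, are sketched below in Python using SciPy as a stand-in for the corresponding LabVIEW functions; the stimulus, the toy system, and the analysis parameters are assumptions for illustration.

import numpy as np
from scipy import signal

fs = 10_000                                     # Hz, assumed sample rate
t = np.arange(0, 1.0, 1 / fs)
stimulus = np.random.randn(t.size)              # broadband excitation
response = np.convolve(stimulus, [0.2, 0.5, 0.2], mode="same")   # toy system under test

f, Pxy = signal.csd(stimulus, response, fs=fs, nperseg=1024)     # cross-power spectrum
_, Pxx = signal.welch(stimulus, fs=fs, nperseg=1024)             # stimulus power spectrum
H1 = Pxy / Pxx                                                   # H1 frequency-response estimate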

Although users can develop these functions themselves, the built-in functions make it easy to work quickly on the problem instead of on the tools. Using these functions also eliminates the need to understand the underlying theory required to build such algorithms.



Figure 4. Sound Level Meter Application based on the Sound and Vibration Analysis Toolset

Add-On Tools for Analysis

In addition to the built-in analysis libraries, users rely on add-on toolsets and modules to reduce development time for specialized application needs. By incorporating toolset components into custom applications, users reduce the need for the particular expertise commonly associated with developing more specialized, vertical applications such as advanced digital signal processing, sound and vibration measurements, order analysis, image processing, PID control, and simulation.

Advanced Signal Processing

The Signal Processing Toolset provides functions designed specifically for advanced digital signal processing (DSP). The included functions are divided into three categories: Joint Time-Frequency Analysis, Wavelet Analysis, and Superresolution Spectral Analysis. In addition, the toolset provides a graphical utility with which users can interactively design digital filters.

Joint Time-Frequency Analysis

Unlike conventional analysis technologies, the JTFA (joint time-frequency analysis) routines examine signals in both the time and frequency domains simultaneously. JTFA can be applied in almost all applications in which the FFT is used, such as biomedical signals, radar image processing, vibration analysis, machine testing, and dynamic signal analysis. However, with JTFA you get more information by analyzing the time and frequency domains simultaneously.

Like classical Fourier analysis, JTFA comprises two major classes of methods - linear and quadratic. The linear algorithms include the short-time Fourier transform (STFT) and the Gabor expansion (the inverse short-time Fourier transform). LabVIEW users can take advantage of these linear transforms to move a signal from the time domain into the joint time-frequency domain and vice versa; these routines are extremely powerful for noise reduction. The quadratic methods include the adaptive spectrogram, Choi-Williams distribution, cone-shaped distribution, Gabor expansion-based spectrogram (also known as the Gabor spectrogram), STFT-based spectrogram, and Wigner-Ville distribution. Users can apply the quadratic transforms to easily see how the power spectrum of a signal evolves over time. The Gabor spectrogram offers the best compromise between high resolution and cross-term interference.
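A minimal linear-JTFA example is sketched below: an STFT-based spectrogram of a chirp, computed with SciPy rather than the toolset VIs, showing how the dominant frequency evolves over time. The signal and analysis parameters are arbitrary choices for illustration.

import numpy as np
from scipy import signal

fs = 8_000
t = np.arange(0, 2.0, 1 / fs)
x = signal.chirp(t, f0=100, t1=2.0, f1=2_000)       # frequency sweeps from 100 Hz to 2 kHz

f, times, Zxx = signal.stft(x, fs=fs, nperseg=256)
spectrogram = np.abs(Zxx) ** 2                      # power in the joint time-frequency plane
peak_track = f[np.argmax(spectrogram, axis=0)]      # dominant frequency at each time step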

Wavelets

Wavelets are a relatively new signal processing method. A wavelet transform is almost always implemented as a bank of filters that decompose a signal into multiple signal bands. It separates and retains the signal features in one or a few of these subbands. Thus, one of the biggest advantages of using the wavelet transform is that signal features can be easily extracted. In many cases, a wavelet transform outperforms the conventional FFT when it comes to feature extraction and noise reduction. Because the wavelet transform can extract signal features, wavelet transforms find many applications in data compression, echo detection, pattern recognition, edge detection, cancellation, speech recognition, texture analysis, and image compression.
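The sketch below shows the basic idea of wavelet denoising, using the open-source PyWavelets package as a stand-in for the toolset's wavelet and filter bank functions; the wavelet, decomposition level, and threshold are arbitrary choices for illustration.

import numpy as np
import pywt   # PyWavelets, not an NI library

fs = 1_000
t = np.arange(0, 1.0, 1 / fs)
noisy = np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(t.size)

coeffs = pywt.wavedec(noisy, "db4", level=4)        # filter-bank decomposition into subbands
threshold = 0.2                                     # assumed value; often estimated from the data
coeffs[1:] = [pywt.threshold(c, threshold, mode="soft") for c in coeffs[1:]]
denoised = pywt.waverec(coeffs, "db4")              # signal features survive in a few subbands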

Model-based Spectral Analysis

A primary tool for spectral analysis is the fast Fourier transform (FFT). For high-resolution spectra, FFT-based methods need a large number of samples. However, in many cases the data set is limited, either because of a genuine lack of data or because users need to ensure that the spectral characteristics of the signal do not change over the duration of the data record. For cases where the number of data samples is limited, LabVIEW users can use model-based analysis to determine spectral characteristics. Using this technique, users assume a suitable signal model and determine the coefficients of the model. Based on this model, the application can then predict the missing points in the given finite data set to achieve high-resolution spectra. In addition, model-based methods can be used for estimating the amplitude, phase, damping factor, and frequency of damped sinusoids. Superresolution spectral analysis can be used in diverse applications, including biomedical research, economics, geophysics, and noise, vibration, and speech analysis.
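One simple model-based approach is autoregressive (AR) spectral estimation via the Yule-Walker equations, sketched below in Python. This is a generic illustration of fitting a signal model to a short record, not the toolset's superresolution algorithm; the model order and test signal are arbitrary.

import numpy as np
from scipy.linalg import solve_toeplitz

def ar_spectrum(x, order, fs, nfreq=512):
    x = x - np.mean(x)
    n = len(x)
    # Biased autocorrelation estimates r[0] .. r[order]
    r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(order + 1)])
    a = solve_toeplitz(r[:order], r[1:order + 1])     # Yule-Walker AR coefficients
    sigma2 = r[0] - np.dot(a, r[1:order + 1])         # driving-noise variance
    freqs = np.linspace(0, fs / 2, nfreq)
    k = np.arange(1, order + 1)
    denom = 1 - np.exp(-2j * np.pi * np.outer(freqs, k) / fs) @ a
    return freqs, sigma2 / np.abs(denom) ** 2         # model-based power spectrum

# Example: estimate the spectrum of a sinusoid from only 64 samples.
fs = 1_000
t = np.arange(64) / fs
short_record = np.sin(2 * np.pi * 123 * t) + 0.1 * np.random.randn(64)
freqs, spectrum = ar_spectrum(short_record, order=8, fs=fs)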

Sound and Vibration Analysis

NI analysis software enables many common sound and vibration analysis applications, including audio testing, acoustical measurements, environmental noise testing, vibration analysis, and NVH measurements. Specialized analysis capabilities include ANSI- and IEC-compliant fractional-octave analysis and zoom power spectra. In addition, the Sound and Vibration Measurement Suite includes numerous functions for audio measurements such as gain, phase, THD, IMD, dynamic range, phase linearity, and swept-sine analysis.

Functions include full, 1/3, 1/6, 1/12, and 1/24 octave analysis; user-defined sampling frequency; user-defined number of bands; A, B, and C weighting in the time domain; standards compliance; exponential averaging (Slow, Fast, and custom time constants); cross-power spectrum; frequency response (H1, H2, and H3); coherence; and coherent output power. In addition, the toolset provides visualization tools such as waterfall graphs, colormap graphs, octave bar graphs, and octave line graphs that can easily be built into the front panels of LabVIEW applications.
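As one small, concrete example of these capabilities, the sketch below implements exponential (Fast/Slow) time weighting of a sound level in Python; it assumes a calibrated pressure signal in pascals and uses the standard 0.125 s (Fast) and 1 s (Slow) time constants. It is an illustration of the technique, not the Sound and Vibration VIs themselves.

import numpy as np

P_REF = 20e-6                                         # reference pressure, 20 micropascals

def time_weighted_level(pressure, fs, tau=0.125):     # tau = 0.125 s (Fast) or 1.0 s (Slow)
    alpha = 1.0 - np.exp(-1.0 / (fs * tau))           # per-sample smoothing coefficient
    mean_square = 0.0
    levels = np.empty(len(pressure))
    for i, p in enumerate(pressure):
        mean_square += alpha * (p * p - mean_square)  # exponential averaging of squared pressure
        levels[i] = 10 * np.log10(mean_square / P_REF**2 + 1e-30)
    return levels                                     # time-weighted level in dB vs. time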

Order Analysis

The NI Sound and Vibration Measurement Suite provides libraries to build custom LabVIEW-based measurement and automation applications with order analysis capabilities for order tracking, order extraction, and tachometer signal processing.

With the Gabor order tracking algorithm, LabVIEW users can analyze sound, vibration, and other dynamic signals from mechanical systems with rotating or reciprocating components. It offers flexible order energy selection in the joint time-frequency domain. Additional tools include plotting of individual orders versus time or rpm, order extraction tools that separate order-specific signal components from the acquired signal, automatic order selection tools that find and specify the most significant orders, and user-specified order selection for analysis.
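The core idea of order tracking - resampling a vibration signal from the time domain to even shaft-angle increments using the tachometer, so that an FFT yields orders rather than fixed frequencies - can be sketched as follows. This is generic placeholder logic, not the Gabor order tracking algorithm, and it assumes one tachometer pulse per revolution.

import numpy as np

def resample_to_orders(vib, fs, tach_pulse_times, samples_per_rev=64):
    t = np.arange(len(vib)) / fs
    # Shaft angle in revolutions: assume one tach pulse per revolution.
    revs_at_pulse = np.arange(len(tach_pulse_times))
    angle = np.interp(t, tach_pulse_times, revs_at_pulse)        # revolutions vs. time
    # Resample the vibration signal onto a uniform shaft-angle grid.
    angle_grid = np.arange(angle[0], angle[-1], 1.0 / samples_per_rev)
    vib_angle = np.interp(angle_grid, angle, vib)
    spectrum = np.abs(np.fft.rfft(vib_angle))                    # bins are now shaft orders
    orders = np.fft.rfftfreq(len(vib_angle), d=1.0 / samples_per_rev)
    return orders, spectrum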


5. Conclusion

With the power and flexibility of today's computers, engineers and scientists have an unprecedented ability to efficiently measure, control, monitor, diagnose, automate, test, and characterize any process. This, however, is not possible without the ability to look at data and extract useful information.

National Instruments LabVIEW and the integrated analysis functions it provides make up a powerful graphical application development environment designed specifically for engineers and scientists. LabVIEW provides solutions regardless of industry or area of focus within the engineering process, from design and validation to production.

In addition, LabVIEW offers unparalleled connectivity to plug-in DAQ devices and stand-alone instruments for acquiring data. LabVIEW provides powerful analysis libraries, routines, and algorithms that range from basic math to advanced signal processing, for both general-purpose applications and more vertical needs, and these can be seamlessly integrated with all other functions in LabVIEW. These functions, in conjunction with powerful data visualization capabilities, make LabVIEW the ideal tool for any measurement and analysis application.


6. For More Information

  • Learn how to save three weeks with LabVIEW by automating report generation
  • National Instruments home page for Signal Processing and Analysis
  • The Advanced Signal Processing Toolset extends LabVIEW with tools for Wavelets, Joint Time-Frequency Analysis, and Time-Series Analysis.
  • The NI Sound and Vibration Toolkit extends LabVIEW with tools for sound/vibration level with weighting, waterfall plots, fractional-octave analysis, and more.
  • The NI Sound and Vibration Measurement Suite extends LabVIEW with functions for acoustic, vibration, and order analysis.
