Raw data does not always immediately convey useful information. Usually, you must transform the signal, remove noise disturbances, correct for data corrupted by faulty equipment, or compensate for environmental effects, such as temperature and humidity. For that reason, signal processing, which is the analysis, interpretation, and manipulation of signals, is a fundamental need in virtually all engineering applications. With the comprehensive analysis capabilities of LabVIEW software, you can perform all the signal processing you need without wasting time moving data between incompatible tools or writing your own analysis routines.
You can choose from several ways to incorporate analysis into your applications with LabVIEW. Typically, you want to select the method that best fits the decision(s) you make as a result of the analysis.
Inline analysis means that data is acquired and analyzed in the same application. If your application involves monitoring a signal and changing behavior based on the characteristics of the incoming data, you need to perform analysis as you acquire the data. By measuring and analyzing certain aspects of the signals, you can make the application adapt to circumstances and set the appropriate execution parameters – perhaps saving the data to disk or increasing the sampling rate. Although this is only one example, there are thousands of applications that require a degree of intelligence – the ability to make decisions based on various conditions – and adaptability, which you can achieve only by adding analysis algorithms to the application.
Typically, the decisions made based on the data are automated. This implies that the logic is built into the application to handle certain behaviors. For example, a plant monitoring system might light an LED to signify when a temperature passes its threshold or a vibration level is getting too high. However, not all decisions based on acquired data are made in an automated manner. To determine whether the system is performing as expected, you often must monitor the execution. Rather than logging data, extracting it from a file or database, and then analyzing it offline only to discover errors in the acquisition, you should identify problems as you acquire data. In these cases, the application must handle the data coming from the process and then manipulate, simplify, format, and present the data in the way that is most useful. With the built-in suite of dialogs in LabVIEW, you can create an application that presents options to the operator or user. For example, if the temperature is too high, the dialog could force the operator to take a specified action and then press the “OK” or “Continue” button to proceed with the application.
Whether your decisions are made by built-in logic or a human user, LabVIEW offers analysis and mathematical routines that natively work together with data acquisition functions and display capabilities. As a result, you can easily build these routines into any application without the tedious process of massaging data into the different formats required by separate tools. In addition, LabVIEW provides analysis routines for point-by-point execution; these routines are designed specifically to meet inline analysis needs in real-time applications.
Point-by-point analysis is a subset of inline analysis where results are calculated after every individual sample rather than on a group of samples. Such analysis is essential when dealing with control processes featuring high-speed, deterministic, single-point data acquisition. The point-by-point approach simplifies the design, implementation, and testing process because the application flow closely matches the natural flow of the real-world processes that the application is monitoring and controlling.
Figure 1. Array-Based Analysis versus Point-by-Point Analysis
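The contrast in Figure 1 can be sketched in a few lines of Python (a hypothetical illustration, not LabVIEW code): an array-based mean waits for the whole buffer, while a point-by-point mean updates its result with every incoming sample, so a decision is available immediately.

```python
class RunningMean:
    """Point-by-point analysis: the result updates after every sample,
    so a decision can be made before the acquisition finishes."""

    def __init__(self):
        self.n = 0
        self.mean = 0.0

    def update(self, sample):
        self.n += 1
        self.mean += (sample - self.mean) / self.n  # incremental mean update
        return self.mean


def array_mean(samples):
    """Array-based analysis: operates on the complete buffer at once."""
    return sum(samples) / len(samples)


meter = RunningMean()
partial = [meter.update(s) for s in [2.0, 4.0, 6.0, 8.0]]
# partial holds a usable result after every sample: [2.0, 3.0, 4.0, 5.0]
```

The final point-by-point value matches the array-based result, but every intermediate value was available for a control decision along the way.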
With streamlined point-by-point analysis, the acquisition and analysis process can move closer to the point of control because the latency between acquisition and decision is minimized. You can further decrease this acquisition latency by deploying your analysis to field-programmable gate array (FPGA) chips, digital signal processing (DSP) chips, embedded controllers, dedicated CPUs, and ASICs.
When you add powerful algorithms and routines to applications, you eliminate the guesswork and create intelligent processes that can analyze results during run time, improving efficiency and iteratively correlating input variables to experiment or process performance.
Inline analysis is not always the correct methodology for implementing your analysis routines. You may choose to perform offline analysis when you do not need to make decisions as you acquire the data. Typically, the intent of an offline analysis application is to identify the cause and effect of variables by correlating multiple data sets. Because you perform this analysis after you acquire the data, you are not limited by the timing and memory constraints of data acquisition; such analysis requires only that sufficient computational power is available. This approach offers several advantages. First, offline analysis provides far greater data interactivity, giving you the ability to truly explore both the raw data and the results of the analysis. Histograms, trending, and curve fitting are all common offline analysis tasks. Furthermore, compute-intensive signal processing algorithms that take considerable time on large data sets no longer risk becoming the bottleneck for a live acquisition.
Analyzing your acquired data outside the acquisition generally requires you to transfer the data to a file, whether it is in a binary, text, or custom format. LabVIEW is compatible with a wide variety of standard file formats, but DataPlugins extend the data file support capabilities of LabVIEW. You can use DataPlugins to describe any custom file format and tell LabVIEW how to interpret the data files in which your data is stored. The paradigm of applying signal processing to your acquired data is the same when you open the data from a file as when you stream it from hardware.
As you use LabVIEW, you will find yourself solving increasingly larger portions of your application without leaving the LabVIEW environment. Sometimes you still may need to take your data into another tool for offline analysis or dissemination within your company. For instance, moving to a program such as Microsoft Excel requires you to save your acquired data to a file in a format that Excel understands and then open that data in Excel. Typically, the steps necessary to account for formatting and file compatibility differences between applications fall on you as the developer. Fortunately, LabVIEW simplifies these typically burdensome steps with both built-in and add-on tools. LabVIEW features built-in functions that transport data directly to Microsoft Excel in a compatible format and offers the LabVIEW Report Generation Toolkit so you can automatically build reports from your acquisition and analysis code. If you need to perform more interactive analysis on your data, LabVIEW works efficiently alongside NI DIAdem data management and interactive analysis software.
Acquiring data and processing it just to display it on a screen is not always enough. Often you need to store acquired data for future reference, and it is not uncommon to store hundreds or even thousands of megabytes of data on hard drives and in databases. After anywhere from one to hundreds of runs of the application, you can extract information to make decisions, compare results, and make appropriate changes to the process until you achieve the desired results.
Blindly storing all of your acquired data makes it easy to accumulate so much data that it becomes unmanageable. With a fast data acquisition board and enough channels, it may take only a few milliseconds to accumulate thousands of values (the NI PCI-6115 S Series data acquisition (DAQ) board acquires more than 57 MB of raw data a second). Making sense of all that data is not a trivial task. As an engineer or scientist, you are typically expected to present reports, create graphs, and ultimately corroborate any assessments and conclusions with empirical data. Without the right tools, this can be a daunting task, resulting in lost productivity.
With LabVIEW, you can perform significant data reduction and formatting before storing data to disk, so that when you retrieve the stored data for further analysis or review, it is easier to handle. Resampling, averaging, and mathematical transforms such as fast Fourier transforms (FFTs) can convert large amounts of raw data into more useful results for logging and future reference.
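As a minimal illustration of such data reduction (plain Python for clarity, not the LabVIEW functions themselves), block averaging replaces each group of raw samples with its mean, shrinking the logged data by the block size while preserving the overall trend:

```python
def block_average(samples, block_size):
    """Reduce raw data by replacing each block of samples with its average."""
    return [sum(samples[i:i + block_size]) / block_size
            for i in range(0, len(samples) - block_size + 1, block_size)]


raw = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]
reduced = block_average(raw, 4)  # 8 raw samples in, 2 logged values out
```

Averaging also suppresses uncorrelated noise, so the reduced record is often more useful than the raw one for review.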
Combining analysis with data acquisition and data presentation in a single application is not possible in most software development environments. The typical software package is either a general-purpose programming language that lacks signal processing libraries, a dedicated turnkey application that performs only a single task (acquisition, for example), or a numerical analysis tool with limited support for hardware and real-world signals. Few packages address all the requirements of a measurement system, including analysis, which forces you to waste time transferring data between tools and converting between intermediate formats. Unlike software development tools designed only for data acquisition or signal processing, LabVIEW was developed from the beginning to provide a completely integrated solution so you can simultaneously acquire and analyze data in a single environment.
Figure 2. Single VI Showing Acquisition, Analysis, and Logging to File with Express VIs
As an engineering-focused tool, LabVIEW graphical programming and its extensive set of signal processing and measurement functions greatly simplify the development of measurement and inline analysis applications. LabVIEW users can integrate these functions right into their applications to make intelligent measurements and obtain results faster.
LabVIEW contains more than 850 built-in signal processing, analysis, and mathematics functions that simplify development for a broad variety of applications. These functions range from high-level configuration-based assistants to low-level building blocks that you can combine to fully customize your algorithm. This wide range of functional implementations provides the flexibility to apply your required algorithms as needed.
Configuration-based Express VIs are the simplest way to add inline measurement analysis and signal processing to a LabVIEW application. When you add an Express VI to your block diagram, a dialog appears to help you configure the analysis you wish to perform. This reduces the complexity of adding analysis and signal processing algorithms to your application. The many signal analysis Express VIs provide a configuration-based approach to LabVIEW development and encompass much of the lower-level signal processing capability of LabVIEW.
Figure 3. Signal Analysis Palette Showing Extensive Express VIs for Signal Processing
With Express VIs, you can interactively explore the settings of the various analysis algorithms while immediately seeing the results in the configuration dialog. For example, the Amplitude and Level Measurements Express VI performs level measurements such as DC, RMS, maximum and minimum peak, peak-to-peak, cycle average, and cycle RMS calculations.
Figure 4. Configuration Window for Amplitude and Level Measurements Express VI
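Under the hood, measurements like these reduce to simple formulas. The sketch below (plain Python, for illustration only, not the Express VI's implementation) computes three of the quantities the dialog offers: DC (the mean), RMS, and peak-to-peak.

```python
import math


def level_measurements(x):
    """DC level (mean), RMS, and peak-to-peak of a sampled signal."""
    n = len(x)
    dc = sum(x) / n                              # DC: the mean value
    rms = math.sqrt(sum(v * v for v in x) / n)   # root mean square
    peak_to_peak = max(x) - min(x)
    return dc, rms, peak_to_peak


# One full cycle of a unit-amplitude sine: DC ~ 0, RMS ~ 0.707, pk-pk ~ 2
signal = [math.sin(2 * math.pi * i / 1000) for i in range(1000)]
dc, rms, pkpk = level_measurements(signal)
```

Note that DC and RMS are only well-defined over whole cycles of a periodic signal, which is why the Express VI also offers cycle-based variants.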
Similarly, the Filter Express VI provides tools to configure such digital filters as lowpass, highpass, bandpass, and bandstop. The configuration dialog for this Express VI provides controls to interactively configure filter settings such as high and low cutoff frequencies, number of taps for finite impulse response (FIR) filters, topology selection for infinite impulse response (IIR) filters (Butterworth, Chebyshev, inverse Chebyshev, elliptic, and Bessel), and order selection.
Figure 5. Configuration Window for Filter Express VI
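The filters this dialog configures are, at their core, difference equations. As a much-simplified stand-in (illustrative Python, not one of the toolkit's filter topologies), a first-order IIR lowpass shows the pattern: each output mixes the newest input with the previous output, smoothing out rapid changes.

```python
import math


def single_pole_lowpass(x, cutoff_hz, sample_rate_hz):
    """First-order IIR lowpass: y[n] = y[n-1] + a * (x[n] - y[n-1])."""
    a = 1.0 - math.exp(-2.0 * math.pi * cutoff_hz / sample_rate_hz)
    y, state = [], 0.0
    for sample in x:
        state += a * (sample - state)  # smooth toward the current input
        y.append(state)
    return y


# A step input: the output rises gradually and settles at the step level
out = single_pole_lowpass([1.0] * 500, cutoff_hz=10.0, sample_rate_hz=1000.0)
```

Higher-order designs such as Butterworth or Chebyshev cascade stages like this one with carefully chosen coefficients; the Express VI computes those coefficients from the cutoff and order settings you enter.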
One common challenge in analyzing data is correlating multiple signals that were acquired at different sampling rates. To address this, you can use the Align and Resample Express VI, which takes two or more signals acquired with different sampling rates and acquisition parameters and provides tools to align and resample them. This Express VI features tools to select the acquisition type, alignment interval, and resampling characteristics (lowest dt, user-defined dt, or based on a reference signal).
Figure 6. Configuration Window for Align and Resample Express VI
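The core of such resampling is interpolation onto a shared timebase. This sketch (hypothetical Python, far simpler than the Express VI) linearly interpolates a slowly sampled signal onto a faster signal's sample times so the two can be compared point for point:

```python
def resample_linear(t_src, x_src, t_new):
    """Linearly interpolate (t_src, x_src) onto the new timebase t_new.
    Assumes t_src is increasing and t_new lies within its range."""
    out, j = [], 0
    for t in t_new:
        while j < len(t_src) - 2 and t_src[j + 1] < t:
            j += 1  # advance to the interval that brackets t
        t0, t1 = t_src[j], t_src[j + 1]
        x0, x1 = x_src[j], x_src[j + 1]
        out.append(x0 + (x1 - x0) * (t - t0) / (t1 - t0))
    return out


# A 1 kHz-sampled ramp resampled onto a 2 kHz timebase
t_slow = [0.0, 0.001, 0.002]
x_slow = [0.0, 10.0, 20.0]
t_fast = [0.0, 0.0005, 0.001, 0.0015, 0.002]
aligned = resample_linear(t_slow, x_slow, t_fast)
```

After this step, both signals share one timebase, so sample-by-sample operations such as subtraction or correlation become meaningful.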
LabVIEW also includes Express VIs for the following high-level functions:
LabVIEW also contains a comprehensive library of lower-level signal analysis functions that perform specific analysis tasks. These VIs are grouped into two major categories: signal processing and mathematics. Signal processing libraries include functions for filtering, signal generation, signal analysis, transformation, waveform conditioning, waveform generation, waveform measurements, and windowing. Within the filtering VI subset alone, there are filtering VIs for Bessel, Butterworth, Chebyshev, elliptic, FIR windowed, inverse Chebyshev, and others. Mathematics libraries include functions for differential equations, curve fitting, geometry, integrals, interpolation, linear algebra, optimization, polynomials, and probability and statistics.
An example of a low-level signal analysis library is the spectral analysis library (shown in Figure 7).
Figure 7. Spectral Analysis Palette
One commonly used VI from this palette is the Auto Power Spectrum VI, which computes the single-sided, scaled auto power spectrum of a time-domain signal. Instead of developing power spectrum code from scratch, you can use this VI immediately and save substantial time. If you need to view or edit the code behind a VI like Auto Power Spectrum, you can open its block diagram instantly by double-clicking its icon, as shown in Figure 8.
Figure 8. Auto Power Spectrum VI Block Diagram
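The quantity such a VI computes can be expressed directly. The sketch below uses a slow, plain-Python DFT for illustration (the LabVIEW VI uses an FFT and its exact scaling may differ): it forms a single-sided power spectrum in which a sine of amplitude A appears as its mean-square value A²/2.

```python
import math


def auto_power_spectrum(x):
    """Single-sided auto power spectrum via a direct DFT: |X[k]|^2 / N^2,
    doubled for bins that fold in the mirrored negative frequencies."""
    n = len(x)
    spectrum = []
    for k in range(n // 2 + 1):
        re = sum(x[i] * math.cos(2 * math.pi * k * i / n) for i in range(n))
        im = sum(x[i] * math.sin(2 * math.pi * k * i / n) for i in range(n))
        power = (re * re + im * im) / (n * n)
        if 0 < k < n // 2:
            power *= 2.0  # fold the negative-frequency half into this bin
        spectrum.append(power)
    return spectrum


# A unit sine at bin 4 of a 32-point record appears as 0.5 (= RMS squared)
tone = [math.sin(2 * math.pi * 4 * i / 32) for i in range(32)]
spectrum = auto_power_spectrum(tone)
```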
These analysis libraries have 20 years of proven usage, and NI continues to invest extensively in the graphical math and signal processing libraries, adding new functions and improving both single-core and multicore performance.
In LabVIEW, you have the freedom to choose the syntax you prefer for analysis when developing algorithms, analyzing results, or processing signals. Although LabVIEW is well-known as a development environment for a graphical programming language, it also offers math-oriented textual programming through a native compiler for .m files. This compiler, LabVIEW MathScript, uses the .m file script syntax and includes more than 800 commonly used functions for math, signal processing, analysis, and control.
The LabVIEW MathScript RT Module is an add-on for LabVIEW that installs the LabVIEW MathScript compiler along with two interfaces to implement your custom .m files.
The LabVIEW MathScript Window offers an interactive interface in which you can load, save, design, and execute your .m files. It is designed for concept exploration through a command-line interface that you use to enter commands one at a time or through the building of batch scripts in a simple text editor window. Figure 9 displays the LabVIEW MathScript Window, which you can access from the LabVIEW menus by choosing Tools»MathScript Window.
Figure 9. LabVIEW MathScript Window for Interacting with Your Custom .m Files
The LabVIEW MathScript Window provides immediate feedback in a variety of formats, including graphs and text. You can use a variety of plotting commands to generate graphs from the LabVIEW MathScript Window similar to the one shown in Figure 10.
Figure 10. Example Plot Window Generated from LabVIEW MathScript
You can generate a wide variety of plots from MathScript including the following:
These plotting capabilities help you visualize the results of your data to confirm the output of your analysis routines.
See the related resources for more information on using the MathScript Interactive Window for algorithm development.
Combining textual programming with traditional LabVIEW graphical programming is also possible using a script node interface. Script nodes are resizable text entry regions on the LabVIEW block diagram that you can add to your graphical programs. Through the MathScript Node, you can execute scripts during the run-time execution of a VI. Data enters the left border of the node, is used or modified during the sequential execution of the script text, and then exits the node through an output variable on the right border of the node.
Figure 11. MathScript Node Places Your Custom .m File Code Inline with Graphical G Code
You can type your scripts, copy and paste, or import them from a file. With the MathScript Node, you can reuse your custom .m files, even if you developed them outside LabVIEW MathScript, to bring your text-based math routines inline with your data acquisition in the graphical LabVIEW environment.
With LabVIEW MathScript and LabVIEW graphical programming, you have the power to choose the most appropriate syntax, which can often be a combination of the two. Consider the following script from the widely used textbook Digital Signal Processing Laboratory Using MATLAB® by Sanjit Mitra. It generates a test signal and then applies a moving-average filter to it.
% Simulation of an M-point Moving Average Filter
% Generate the input signal
n = 0:100;
s1 = cos(2*pi*0.05*n); % A low-frequency sinusoid
s2 = cos(2*pi*0.47*n); % A high-frequency sinusoid
x = s1+s2;
% Implementation of the moving average filter
M = input('Desired length of the filter = ');
num = ones(1,M);
y = filter(num,1,x)/M;
% Display the input and output signals
clf;
subplot(2,2,1);
plot(n, s1);
axis([0, 100, -2, 2]);
xlabel('Time index n'); ylabel('Amplitude');
title('Signal #1');
subplot(2,2,2);
plot(n, s2);
axis([0, 100, -2, 2]);
xlabel('Time index n'); ylabel('Amplitude');
title('Signal #2');
subplot(2,2,3);
plot(n, x);
axis([0, 100, -2, 2]);
xlabel('Time index n'); ylabel('Amplitude');
title('Input Signal');
subplot(2,2,4);
plot(n, y);
axis([0, 100, -2, 2]);
xlabel('Time index n'); ylabel('Amplitude');
title('Output Signal');
axis;
This script generates two sinusoid signals, adds them together, and then applies a moving-average filter to the sum of the two. Figure 10 is the plot generated from this script. The LabVIEW MathScript Window provides an interface to interact with the script one run at a time. However, combining this script with the LabVIEW graphical programming paradigm provides a powerful method to automate the script, giving you the ability to interact with your input parameters on the fly.
Figure 12. Example Block Diagram Integrating Text-Based Math with G Code in LabVIEW
Figure 12 displays the integration of the script into the LabVIEW block diagram using the MathScript Node. Two changes were made to the script:
Figure 13. Example Front Panel Integrating LabVIEW User Interface Components with Text-Based Math
With this LabVIEW front panel, you can control the low and high frequencies of the generated sinusoids as well as the length of the moving-average filter being implemented. As the LabVIEW VI runs, you can change these values and easily visualize the output of the analysis routine and how it is affected by the changes to the input values. This common procedure of applying the interactivity of a LabVIEW VI to a text-based script is known as instrumenting your algorithm.
The extensive library of built-in routines for LabVIEW, along with the options for implementing these routines, saves you considerable development time. Because National Instruments has developed and tested these routines for more than 20 years, you spend less time validating the correctness of the routines. In many general-purpose programming languages that lack these built-in libraries, you have to not only build your routines from scratch but also validate that the outputs are correct.
The extensive library of built-in routines for LabVIEW gives you the power to use predefined algorithms, such as those in the Express VIs. However, the low-level libraries also offer you the opportunity to customize these routines to best fit your application. Whether it is choosing a point-by-point implementation or handling complex data, you can easily customize these routines to optimize the built-in libraries for your purposes.
Generating data for testing or prototyping purposes is an often overlooked capability of a programming language. LabVIEW can generate a wide variety of signals to represent real-world inputs in your VI.
Figure 14. Signal Generation Functions Palette in LabVIEW
These functions, shown in Figure 14, give you the flexibility to develop applications without a sometimes-burdensome hardware setup by simulating the inputs from the hardware. Useful in both the testing and prototyping phases of most projects, this functionality is a powerful alternative to keeping hardware available at all times.
The generation of general-purpose signals is also available in Express VI form. The Simulate Signal Express VI, shown in Figure 15, can generate sine, square, triangle, sawtooth, and DC signals, and is complemented by the Simulate Arbitrary Signal Express VI, which you can use to define the signal components.
Figure 15. Configuration Window for Simulate Signal Express VI
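A minimal software equivalent (hypothetical Python, not the Express VI itself) shows the idea behind such signal simulation: each waveform is just a function of the phase within one cycle.

```python
import math


def simulate_signal(kind, freq_hz, sample_rate_hz, n, amplitude=1.0):
    """Generate basic test waveforms, loosely mirroring the signal types
    the Simulate Signal Express VI can produce (a hypothetical stand-in)."""
    out = []
    for i in range(n):
        phase = (freq_hz * i / sample_rate_hz) % 1.0  # position in cycle, 0..1
        if kind == "sine":
            v = math.sin(2 * math.pi * phase)
        elif kind == "square":
            v = 1.0 if phase < 0.5 else -1.0
        elif kind == "triangle":
            v = 4 * phase - 1 if phase < 0.5 else 3 - 4 * phase
        elif kind == "dc":
            v = 1.0
        else:
            raise ValueError("unknown waveform: " + kind)
        out.append(amplitude * v)
    return out


square = simulate_signal("square", freq_hz=1.0, sample_rate_hz=8.0, n=8)
triangle = simulate_signal("triangle", freq_hz=1.0, sample_rate_hz=4.0, n=4)
```

Feeding simulated waveforms like these through your analysis chain lets you verify the application logic long before the acquisition hardware is connected.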
The LabVIEW signal processing, analysis, and mathematics libraries were designed for general-purpose applications within the science and engineering fields. In addition to these built-in analysis libraries, you can use add-on toolkits and modules to reduce development time for specialized needs in specific applications or industries. By incorporating toolkit components into custom applications, you reduce the need for the particular expertise commonly associated with the development of more vertical applications such as advanced digital signal processing, sound and vibration measurements, order analysis, image processing, proportional integral derivative (PID) control, and simulation.
The LabVIEW Advanced Signal Processing Toolkit features functions designed specifically for advanced digital signal processing (DSP). These are divided into three categories: joint time-frequency analysis, wavelet analysis, and superresolution spectral analysis. In addition, the toolkit provides a graphical utility with which you can interactively design digital filters.
Jump to complete list of analysis and signal processing functions included in this add-on.
Unlike conventional analysis technologies, joint time-frequency analysis (JTFA) routines examine signals in both the time and frequency domains simultaneously. You can apply JTFA in almost all applications in which you use the FFT, such as biomedical signals, radar image processing, vibration analysis, machine testing, and dynamic signal analysis. However, with JTFA, you obtain more information by analyzing the time and frequency domains simultaneously.
Like classical Fourier analysis, JTFA comprises two major classes of methods: linear and quadratic. The linear algorithms include the short-time Fourier transform (STFT) and the Gabor expansion (inverse short-time Fourier transform). LabVIEW users can take advantage of these linear transforms to move a signal from the time domain into the joint time-frequency domain and back again, which makes them extremely powerful for noise reduction. The quadratic methods include the adaptive spectrogram, Choi-Williams distribution, cone-shaped distribution, Gabor expansion-based spectrogram (also known as the Gabor spectrogram), STFT-based spectrogram, and Wigner-Ville distribution. You can apply the quadratic transforms to easily see how the power spectrum of a signal evolves over time. The Gabor spectrogram offers the best compromise between high resolution and low cross-term interference.
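The linear side of JTFA can be sketched compactly. The plain-Python STFT below (illustrative only; real implementations use FFTs) slides a window along the signal and takes a DFT of each frame, so a frequency change over time shows up as a change in the peak bin from frame to frame:

```python
import math


def stft_magnitude(x, win_len, hop):
    """Short-time Fourier transform magnitudes: a DFT of each windowed frame.
    Returns one list of bin magnitudes per frame."""
    frames = []
    for start in range(0, len(x) - win_len + 1, hop):
        # Hann window tapers the frame edges to reduce spectral leakage
        w = [x[start + i] * 0.5 * (1.0 - math.cos(2 * math.pi * i / (win_len - 1)))
             for i in range(win_len)]
        mags = []
        for k in range(win_len // 2 + 1):
            re = sum(w[i] * math.cos(2 * math.pi * k * i / win_len)
                     for i in range(win_len))
            im = sum(w[i] * math.sin(2 * math.pi * k * i / win_len)
                     for i in range(win_len))
            mags.append(math.hypot(re, im))
        frames.append(mags)
    return frames


# A signal whose frequency jumps halfway: bin 2 in frame 0, bin 6 in frame 1
x = [math.sin(2 * math.pi * 2 * i / 32) for i in range(32)] + \
    [math.sin(2 * math.pi * 6 * i / 32) for i in range(32)]
frames = stft_magnitude(x, win_len=32, hop=32)
```

A single FFT of the whole record would show both tones but not when each occurred; the STFT keeps that timing information.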
Wavelets are a relatively new signal processing method. A wavelet transform is almost always implemented as a bank of filters that decomposes a signal into multiple signal bands, separating and retaining the signal features in one or a few of these subbands. Thus, one of the biggest advantages of using the wavelet transform is that you can easily extract signal features. In many cases, a wavelet transform outperforms the conventional FFT for feature extraction and noise reduction. Because of this ability to extract signal features, wavelet transforms are used in many applications, including data compression, echo detection, pattern recognition, edge detection, noise cancellation, speech recognition, texture analysis, and image compression.
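The subband idea can be seen in one step of the simplest wavelet, the Haar transform (a toy Python sketch, not the toolkit's filter banks): pairwise averages form a low-frequency approximation subband, and pairwise differences form a detail subband that isolates edges and transients.

```python
import math


def haar_step(x):
    """One level of the Haar wavelet transform: pairwise averages
    (approximation) and differences (detail), scaled to preserve energy."""
    s = math.sqrt(2.0)
    approx = [(x[i] + x[i + 1]) / s for i in range(0, len(x), 2)]
    detail = [(x[i] - x[i + 1]) / s for i in range(0, len(x), 2)]
    return approx, detail


# A step edge concentrates into a single detail coefficient: easy to extract
signal = [4.0, 4.0, 4.0, 0.0, 0.0, 0.0]
approx, detail = haar_step(signal)
```

Because the flat regions produce zero detail coefficients while the edge produces one large coefficient, thresholding the detail subband gives simple feature extraction and denoising.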
A primary tool for spectral analysis is the fast Fourier transform (FFT). For high-resolution spectra, FFT-based methods need a large number of samples. However, in many cases, the data set is limited because of a genuine lack of data or because users need to ensure that the spectral characteristics of the signal do not change over the duration of the data record. For cases where the number of data samples is limited, you can use model-based analysis to determine spectral characteristics. Using this technique, you assume a suitable signal model and determine the coefficients of the model. Based on this model, the application can then predict the missing points in the given finite data set to achieve high-resolution spectra. In addition, you can use model-based methods to estimate the amplitude, phase, damping factor, and frequency of damped sinusoids. You also can use superresolution spectral analysis in diverse applications including biomedical research, economics, geophysics, noise, vibration, and speech analysis.
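The model-based idea can be sketched with the simplest case: fitting a second-order autoregressive (AR) model via the Yule-Walker equations, whose coefficients then describe the spectrum analytically rather than bin by bin. This toy Python version (not the toolkit's superresolution algorithms) recovers the known AR coefficients of a pure sinusoid, for which theory gives a1 = 2cos(w) and a2 = -1:

```python
import math


def yule_walker_ar2(x):
    """Estimate 2nd-order AR coefficients from biased autocorrelation
    estimates: solve [[r0, r1], [r1, r0]] @ [a1, a2] = [r1, r2]."""
    n = len(x)

    def r(lag):
        return sum(x[i] * x[i + lag] for i in range(n - lag)) / n

    r0, r1, r2 = r(0), r(1), r(2)
    det = r0 * r0 - r1 * r1
    a1 = (r0 * r1 - r1 * r2) / det  # Cramer's rule on the 2x2 system
    a2 = (r0 * r2 - r1 * r1) / det
    return a1, a2


# A pure sinusoid x[n] = cos(w*n) satisfies x[n] = 2cos(w)x[n-1] - x[n-2]
w = 2 * math.pi * 0.1
x = [math.cos(w * i) for i in range(1000)]
a1, a2 = yule_walker_ar2(x)  # expect a1 near 2cos(w), a2 near -1
```

Once the coefficients are known, the model spectrum is a smooth analytic function of frequency, which is how model-based methods achieve high resolution from short records.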
The importance of digital filters is well-established. Digital filters, and, more generally, digital signal processing (DSP) algorithms are classified as discrete-time systems. They are commonly implemented on a general-purpose computer, on a dedicated DSP chip, or in an FPGA chip. Because of their well-known advantages, digital filters are often used to replace classical analog filters. Highlights of the LabVIEW Digital Filter Design Toolkit include the ability to work with live signals to facilitate real-world filter testing and the ability to automatically generate LabVIEW and ANSI C code for targeting a DSP, an FPGA, or other embedded systems.
NI analysis software helps you implement many common sound and vibration analysis applications including audio testing, acoustical measurements, environmental noise testing, vibration analysis, and noise, vibration, and harshness (NVH) measurements. Specialized analysis capabilities include ANSI- and IEC-compliant fractional-octave analysis and zoom power spectra. In addition, the NI Sound and Vibration Measurement Suite includes numerous functions for audio measurements such as gain, phase, THD, IMD, dynamic range, phase linearity, and swept-sine analysis. The NI Sound and Vibration Assistant provides an easy-to-use, interactive, configuration-based environment for analysis and data logging.
Functions include full, 1/3, 1/6, 1/12, and 1/24 octave; user-defined sampling frequency; user-defined number of bands; A, B, and C weighting in the time domain; standards compliance; exponential averaging (Slow, Fast, and Custom time constant); cross-power spectrum; frequency response (H1, H2, and H3); coherence; and coherent output power. The suite also provides additional visualization tools such as waterfall graph, colormap graph, octave bar graph, and octave line graph, which you can easily build into the front panels of LabVIEW applications.
The Sound and Vibration Measurement Suite offers libraries to build custom measurement and automation applications based on LabVIEW that feature order analysis capabilities for order tracking, order extraction, and tachometer signal processing.
With the Gabor Order Tracking algorithm, you can analyze sound, vibration, and other dynamic signals from mechanical systems with rotating or reciprocating components. It provides flexible order energy selection in the joint time-frequency domain. You also can plot individual order(s) versus time or rpm or use order extraction tools to separate order-specific signal components from the acquired signal, automatic order selection tools to find and specify the most significant orders, and user-specified order selection for analysis.
The NI Vision Development Module is a collection of image processing and machine vision functions for programming languages such as NI LabVIEW, Microsoft C++, Visual Basic, and .NET. With these functions, you can enhance images, check for presence, locate features, identify objects, and measure parts. Along with the programming libraries, the Vision Development Module includes the NI Vision Assistant and NI Vision Acquisition software.
The Vision Development Module delivers:
Today’s complex RF systems require a fast and flexible test platform to deliver reliable measurements from prototype to manufacturing. In fact, NI modular RF instruments incorporate technologies such as multicore processors and PCI Express to achieve measurement speeds that are 5X to 10X faster than traditional instruments in automated test applications. This platform for communications test operates from DC to 6.6 GHz with up to 100 MHz of instantaneous RF bandwidth. Effectively analyzing these signals requires specific functions available in several LabVIEW add-ons.
With the NI Wireless LAN (WLAN) Measurement Suite, you can perform common IEEE 802.11a/b/g measurements with industry-leading speed and accuracy. Combined with high-performance multicore processors, PXI Express WLAN measurement systems can complete most 802.11 measurements 5X to 10X faster than traditional box instruments. In addition, because PXI Express RF instrumentation is software-defined, you can test multiple standards with the same set of instrumentation. Thus, you can use the same hardware to test DVB-T, GPS, WiMAX, WCDMA, ZigBee, Bluetooth, and many other standards.
National Instruments supplies flexible software utilities and modular RF instrumentation for the emulation and measurement of radio frequency identification (RFID) readers and tags. You can combine these tools to perform comprehensive protocol and parametric tests in accordance with international RFID standards. Some benefits of using NI RFID tools include:
MATLAB® is a registered trademark of The MathWorks, Inc.