
Least Mean Square (LMS) Adaptive Filter

Products and Environment

This section reflects the products and operating system used to create the example.

To download NI software, including the products shown below, visit ni.com/downloads.

    Software

  • LabVIEW

Code and Documents


Overview


This article introduces the concept of adaptive filters and the least mean square (LMS) adaptive algorithm, describes how to implement an LMS finite impulse response (FIR) adaptive filter in LabVIEW, and explains the performance indicators of adaptive filters. The article also includes a demo that you can use to learn the basic concepts of adaptive filters. The demo uses an adaptive filter to identify the impulse response of an unknown system. In the demo, you can define the impulse response of the unknown system, select adaptive algorithm parameters, plot learning curves, and compare learning curves. By comparing learning curves from different adaptive filter settings, you can see how the settings affect the performance of adaptive filters.

Least Mean Square (LMS) Adaptive Filter Concepts

An adaptive filter is a computational device that iteratively models the relationship between the input and output signals of a filter. An adaptive filter self-adjusts the filter coefficients according to an adaptive algorithm. Figure 1 shows the diagram of a typical adaptive filter.

Figure 1. Typical Adaptive Filter

where   x(n) is the input signal to a linear filter

            y(n) is the corresponding output signal

            d(n) is an additional input signal to the adaptive filter

            e(n) is the error signal that denotes the difference between d(n) and y(n).

The linear filter can be of different types, such as finite impulse response (FIR) or infinite impulse response (IIR). An adaptive algorithm adjusts the coefficients of the linear filter iteratively to minimize the power of e(n). The LMS algorithm is one such adaptive algorithm; it iteratively adjusts the coefficients of an FIR filter. Other adaptive algorithms include the recursive least squares (RLS) algorithm.

The LMS algorithm performs the following operations to update the coefficients of an adaptive FIR filter:

  1. Calculates the output signal y(n) of the FIR filter by using the following equation: y(n) = w(n)^T u(n)

where    u(n) = [x(n), x(n-1), …, x(n-N+1)]^T is the filter input vector,

             w(n) = [w0(n), w1(n), …, wN-1(n)]^T is the filter coefficients vector, and

             N is the filter length.

  2. Calculates the error signal e(n) by using the following equation: e(n) = d(n) - y(n)
  3. Updates the filter coefficients by using the following equation: w(n+1) = w(n) + μe(n)u(n)

where    μ is the step size of the adaptive filter,

             w(n) is the filter coefficients vector, and

             u(n) is the filter input vector.
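
The article implements these steps as LabVIEW VIs. Purely as a text-based illustration of the same three operations per sample, here is a minimal NumPy sketch; the function name lms_fir and the buffer handling are illustrative choices, not a description of the VIs.

import numpy as np

def lms_fir(x, d, num_taps, mu):
    """Apply an FIR LMS adaptive filter to input x, with desired signal d and step size mu."""
    w = np.zeros(num_taps)                     # w(n): filter coefficients vector
    u = np.zeros(num_taps)                     # u(n): filter input vector, newest sample first
    y = np.zeros(len(x))
    e = np.zeros(len(x))
    for n in range(len(x)):
        u = np.concatenate(([x[n]], u[:-1]))   # shift the new input sample into u(n)
        y[n] = w @ u                           # step 1: y(n) = w(n)^T u(n)
        e[n] = d[n] - y[n]                     # step 2: e(n) = d(n) - y(n)
        w = w + mu * e[n] * u                  # step 3: w(n+1) = w(n) + mu*e(n)*u(n)
    return y, e, w

With a sufficiently small step size, the returned coefficient vector w approaches the impulse response of the system that produced d from x, which is the system identification idea used throughout this article.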

Implementing LMS Adaptive Filters Using LabVIEW

You can implement an LMS adaptive filter using the LabVIEW Adaptive Filter Toolkit. You also can implement an LMS adaptive filter using the LabVIEW graphical development environment without the Adaptive Filter Toolkit. Typically, there are two types of application programming interface (API) designs for an adaptive filter. 

Type I LMS Adaptive Filter API Design

You can design an API that completes the functions in the dashed box in the following figure. In this API, the signals x(n) and e(n) are inputs, and y(n) and the filter coefficients are outputs.  

Figure 2. Type I LMS Adaptive Filter API

The following block diagram shows the FIR LMS 1 VI, which is the LabVIEW implementation of the Type I Adaptive Filter API.

Figure 3. Block Diagram of the FIR LMS 1 VI

Figure 4 shows the icon of the FIR LMS 1 VI with its inputs and outputs. This icon represents the FIR LMS 1 VI in figure 5.

Figure 4. Icon of the FIR LMS 1 VI

To use the Type I LMS Adaptive Filter API, you need to compute the error signal e(n) outside the FIR LMS 1 VI and use a feedback node to connect e(n) to the input of the FIR LMS 1 VI. The following block diagram uses the FIR LMS 1 VI in a system identification application.


Figure 5. System Identification Application Using the FIR LMS 1 VI

The Unknown System VI in figure 5 uses a filter and noise to simulate the behavior of an unknown system. When the adaptive filter converges, e(n) becomes zero-mean white noise with minimal power, and d(n) equals y(n) in a statistical sense. The impulse response of the linear filter converges to the impulse response of the unknown system.

With the Type I Adaptive Filter API design, the application block diagram is analogous to the schematic diagram of adaptive filters in textbooks.
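
As a rough text-based analogy to this dataflow (hypothetical Python rather than the actual FIR LMS 1 VI), the sketch below keeps the error computation outside the filter object and feeds the previous error back into the next call, mirroring the subtraction and feedback node in figure 5. The class name FirLms1Style, the simulated unknown system, and the noise level are illustrative.

import numpy as np

class FirLms1Style:
    """Type I-style step: x(n) and the fed-back error are inputs; y(n) and the coefficients are outputs."""
    def __init__(self, num_taps, mu):
        self.w = np.zeros(num_taps)    # filter coefficients vector
        self.u = np.zeros(num_taps)    # filter input vector
        self.mu = mu

    def step(self, x_n, e_prev):
        self.w = self.w + self.mu * e_prev * self.u    # update using the error fed back by the caller
        self.u = np.concatenate(([x_n], self.u[:-1]))  # shift in the new input sample
        y_n = self.w @ self.u                          # y(n) = w(n)^T u(n)
        return y_n, self.w

# System identification loop: the error is computed outside the filter and fed back.
rng = np.random.default_rng(0)
h = rng.standard_normal(16)                                          # simulated unknown system
x = rng.standard_normal(4000)                                        # stimulus x(n)
d = np.convolve(x, h)[:len(x)] + 0.01 * rng.standard_normal(len(x))  # response d(n) plus noise

filt = FirLms1Style(num_taps=16, mu=0.01)
e_prev = 0.0
for n in range(len(x)):
    y_n, w = filt.step(x[n], e_prev)
    e_prev = d[n] - y_n              # e(n) = d(n) - y(n), connected back on the next iteration

# After convergence, w approximates h, the impulse response of the unknown system.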

Type II LMS Adaptive Filter API Design

You also can design an adaptive filter API according to figure 6. In this API, x(n) and d(n) are inputs. y(n), the filter coefficients, and e(n) are outputs. The Type II LMS Adaptive Filter API computes the error signal e(n) inside an LMS adaptive filter VI.

Figure 6. Type II LMS Adaptive Filter API

The following block diagram shows the FIR LMS 2 VI, which is the LabVIEW implementation of the Type II Adaptive Filter API.


Figure 7. Block Diagram of the FIR LMS 2 VI

Figure 8 shows the icon of the FIR LMS 2 VI with its inputs and outputs. This icon represents the FIR LMS 2 VI in figure 9.

Figure 8. Icon of the FIR LMS 2 VI

To use the Type II Adaptive Filter API, you connect x(n) and d(n) to the FIR LMS 2 VI. The block diagram in figure 9 uses the Type II Adaptive Filter API in a system identification application.

Figure 9. System Identification Application Using the FIR LMS 2 VI

Unlike the Type I Adaptive Filter API, the Type II Adaptive Filter API does not produce an application diagram that is analogous to the schematic diagram in textbooks. However, applications that use the Type II Adaptive Filter API do not need additional nodes, such as subtraction and feedback nodes.
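
In the same spirit as the previous sketch (again hypothetical Python, not the FIR LMS 2 VI itself), a Type II-style step accepts x(n) and d(n) and returns y(n), e(n), and the coefficients, computing the error internally. The class name FirLms2Style and the parameter values are illustrative.

import numpy as np

class FirLms2Style:
    """Type II-style step: x(n) and d(n) are inputs; y(n), e(n), and the coefficients are outputs."""
    def __init__(self, num_taps, mu):
        self.w = np.zeros(num_taps)    # filter coefficients vector
        self.u = np.zeros(num_taps)    # filter input vector
        self.mu = mu

    def step(self, x_n, d_n):
        self.u = np.concatenate(([x_n], self.u[:-1]))  # shift in the new input sample
        y_n = self.w @ self.u                          # y(n) = w(n)^T u(n)
        e_n = d_n - y_n                                # error computed inside the filter
        self.w = self.w + self.mu * e_n * self.u       # w(n+1) = w(n) + mu*e(n)*u(n)
        return y_n, e_n, self.w

filt = FirLms2Style(num_taps=16, mu=0.01)
# y_n, e_n, w = filt.step(x_n, d_n)    # call once per acquired (x(n), d(n)) sample pair

Because the error stays inside the step, the caller only supplies x(n) and d(n), which is the pair of signals you would read back from the unknown system in a real measurement.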

In demonstration or educational applications, where you might want the application diagram to be analogous to schematic diagrams in textbooks, you can choose the Type I Adaptive Filter API. In real applications, where you acquire the stimulus and response signals of the unknown system by using the DAQmx VIs, you obtain x(n) and d(n) from the DAQmx Read VI. Refer to the LabVIEW Help for more information about the DAQmx VIs. The DAQmx VIs in LabVIEW vary according to the NI device driver installed. In most real applications, the Type II Adaptive Filter API is easier to use than the Type I Adaptive Filter API. The VIs in the Adaptive Filter Toolkit use the Type II Adaptive Filter API.

Figure 10 shows an example of a system identification application that uses the VIs from the Adaptive Filter Toolkit. This example acquires the stimulus and response signals of an unknown system and uses the Adaptive Filter VIs to estimate the impulse response of the unknown system. The Type II Adaptive Filter API in the Adaptive Filter Toolkit makes the Adaptive Filter VIs easier to use with the DAQmx VIs.


Figure 10. System Identification Application Using the Adaptive Filter VIs

The Adaptive Filter Toolkit uses a reference object to manage the Adaptive Filter VIs. A reference object is a reference to the memory space that stores information and data about the adaptive filter. In the previous example, you use the AFT Create FIR LMS VI to create a reference object for an LMS adaptive filter. You use the AFT Filter Signal and Update Coefficients VI and the AFT Get Coefficients VI inside the While Loop. You use the AFT Destroy Adaptive Filter VI to close the reference object. These VIs guarantee that no memory is allocated inside the While Loop. Therefore, applications that use these VIs have better performance and determinism in a real-time environment.
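
The create/process/destroy lifecycle can be sketched in Python as follows; the function names below are illustrative stand-ins for the pattern, not the Toolkit API. All filter state is allocated when the reference is created, each loop iteration reuses that state, and the reference is released when processing ends.

import numpy as np

def create_fir_lms(filter_length, step_size):
    """Allocate all filter state up front, like creating a reference object."""
    return {"w": np.zeros(filter_length), "u": np.zeros(filter_length), "mu": step_size}

def filter_and_update(af, x_n, d_n):
    """Filter one sample and update the coefficients held in the reference object."""
    af["u"] = np.concatenate(([x_n], af["u"][:-1]))
    y_n = af["w"] @ af["u"]
    e_n = d_n - y_n
    af["w"] += af["mu"] * e_n * af["u"]
    return y_n, e_n

def get_coefficients(af):
    """Read the current coefficient estimate without disturbing the filter state."""
    return af["w"].copy()

def destroy_adaptive_filter(af):
    """Release the filter state when the application finishes."""
    af.clear()

# Typical lifecycle: create, process samples in a loop, read coefficients, destroy.
rng = np.random.default_rng(0)
x = rng.standard_normal(2000)
d = np.convolve(x, [0.6, -0.4, 0.2])[:len(x)]

af = create_fir_lms(filter_length=8, step_size=0.02)
for x_n, d_n in zip(x, d):
    filter_and_update(af, x_n, d_n)
coefficients = get_coefficients(af)
destroy_adaptive_filter(af)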

Convergence Speed, Adaptation Rate, and Settling Time

Adaptive filters iteratively optimize the filter coefficients to minimize the power of the error signal. This process is known as convergence. Fast convergence indicates that the adaptive filter takes a short time to calculate the filter coefficients that minimize the power of the error signal. Settling time is the time period that an adaptive filter takes to converge; a smaller settling time means a faster convergence speed. Convergence speed is also known as the adaptation rate.

Steady State Error

Steady state is the state in which the adaptive filter has converged and the filter coefficients no longer change significantly. Because signals might include random noise, or because the adaptive filter is not optimal, the error signal output of the adaptive filter is not necessarily zero when the adaptive filter converges. This remaining error is called the steady state error.

Minimum Mean Square Error, Excess Mean Square Error, and Misadjustment

Wiener filters are optimum filters that minimize the mean square error. When an adaptive filter converges to the Wiener filter, the error signal is orthogonal to and uncorrelated with the input signal. The mean square of the Wiener filter's error signal defines the minimum mean square error of the adaptive filter. However, not all adaptive filters converge to the Wiener filter. The filter coefficients of an adaptive filter might fluctuate around the Wiener filter coefficients and never equal them exactly. This fluctuation of the filter coefficients introduces excess error into the error signal. The excess mean square error is the difference between the mean square error produced by the adaptive filter and the minimum mean square error produced by the corresponding Wiener filter [1]. Misadjustment is the ratio of the excess mean square error to the minimum mean square error.
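
Stated compactly (using J(n) to denote the mean square error of the adaptive filter at iteration n, a notation introduced here only for this summary):

    J_min = mean square error of the Wiener filter        (minimum mean square error)
    J_ex  = (limit of J(n) as n grows large) - J_min       (excess mean square error)
    M     = J_ex / J_min                                   (misadjustment)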

Stability

FIR filters are inherently stable, but adaptive FIR filters are not always stable. Because adaptive algorithms adjust the filter coefficients iteratively, the filter coefficients can grow without bound, for example when the step size is too large. When the filter coefficients become infinite, the adaptive filter is unstable.
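
As a quick illustration in hypothetical Python (the signals and the function peak_coefficient are illustrative; a commonly cited rule of thumb is that the LMS step size should satisfy roughly 0 < μ < 2/(L·P), where L is the filter length and P is the input signal power), running the same LMS update with a small and with an overly large step size shows bounded versus runaway coefficients:

import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(2000)                      # unit-power white input
d = np.convolve(x, [0.5, -0.3, 0.2])[:len(x)]      # output of a small FIR "unknown system"

def peak_coefficient(mu, num_taps=8):
    """Run the LMS update and return the largest coefficient magnitude observed."""
    w = np.zeros(num_taps)
    u = np.zeros(num_taps)
    peak = 0.0
    for n in range(len(x)):
        u = np.concatenate(([x[n]], u[:-1]))
        e = d[n] - w @ u
        w = w + mu * e * u
        peak = max(peak, float(np.max(np.abs(w))))
        if peak > 1e6:                             # treat runaway growth as divergence
            return float("inf")
    return peak

print(peak_coefficient(mu=0.02))   # well inside the rule-of-thumb range: coefficients stay bounded
print(peak_coefficient(mu=1.0))    # far outside it: the coefficients blow up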

Learning Curve

A learning curve is a tool that helps you understand the performance of an adaptive filter, including the convergence speed, steady state error, and stability. A learning curve is a plot of the mean square error (MSE) of the adaptive filter versus time or iteration number. Learning curves usually decay exponentially to a constant value. Figure 11 shows the learning curve of an adaptive filter.

Figure 11. Learning Curve of an Adaptive Filter

As the number of iterations increases, the MSE of the adaptive filter decreases and converges to a value of about 1e-8. From the learning curve, you can conclude that the adaptive filter converges at about the 600th sample, because after that point the error no longer decreases and the filter coefficients change little. The steady state error is about 1e-8.

You can compute the learning curve by performing many realizations and averaging the square of each realization’s error signal. Figure 12 shows a block diagram that calculates the learning curve of an LMS adaptive filter. 


Figure 12. Calculation of the Learning Curve from Different Realizations
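
A rough Python sketch of this ensemble-averaging calculation (not the LabVIEW diagram in figure 12; the function learning_curve, the unknown system, the noise level, and the signal lengths are illustrative choices) might look like this:

import numpy as np

def learning_curve(mu, num_taps=32, num_samples=1000, num_realizations=50, seed=0):
    """Estimate the learning curve by averaging the squared error over many realizations."""
    rng = np.random.default_rng(seed)
    h = rng.standard_normal(num_taps)              # impulse response of the "unknown system"
    mse = np.zeros(num_samples)
    for _ in range(num_realizations):
        x = rng.standard_normal(num_samples)       # a new input realization
        d = np.convolve(x, h)[:num_samples] + 0.001 * rng.standard_normal(num_samples)
        w = np.zeros(num_taps)
        u = np.zeros(num_taps)
        for n in range(num_samples):
            u = np.concatenate(([x[n]], u[:-1]))
            e = d[n] - w @ u
            w += mu * e * u
            mse[n] += e * e
    return mse / num_realizations                  # MSE versus iteration: the learning curve

# Comparing curves for several step sizes (for example 0.01, 0.02, and 0.03, as in the demo)
# shows that larger step sizes typically converge in fewer iterations.
curves = {mu: learning_curve(mu) for mu in (0.01, 0.02, 0.03)}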

LMS Adaptive Filter Lab Demo

You can use the demo in this article to learn basic concepts of adaptive filters. This demo uses an adaptive filter to estimate the impulse response of an unknown system. This kind of application is also known as system identification.

You can perform the following tasks by using this demo.

  • Define the impulse response of an unknown system.
  • Define the input signal of the system.
  • Define the adaptive filter’s length, the algorithm type, and the parameters for the adaptive algorithm.
  • Simulate the behavior of the adaptive filter.
  • Plot the learning curve of the adaptive filter.
  • Compare learning curves.

Figure 13 shows the main page of the demo.

Figure 13. Demo Main Screen Displaying the Adaptive Filter Transfer Function

You can use this demo by completing the following steps.

  1. Click the x(n), Unknown System, and Adaptive Filter links on the screen to define x(n), the impulse response of the unknown system, and the parameters of the adaptive filter respectively. 
  2. Click the Simulate button to display the simulation UI shown in figure 14. This page simulates the system you defined in the previous step. By simulating the system, you can observe how the error signal decreases toward zero as the adaptive filter converges.


Figure 14. Simulation UI

  3. Define the parameters for the adaptive filter in the simulation UI and click the Start button to simulate the adaptive filter. Click the Stop button to terminate the simulation process.
  4. Click the Analyze button to display the analysis UI shown in figure 15. This page plots the learning curve of the adaptive filter.


Figure 15. Analysis UI

  5. Define the parameters for the adaptive filter in the analysis UI and click the Compute button to plot the learning curve of the adaptive filter.
  6. Click the Save to Records button to add the current learning curve to the Records list so that you can compare several different learning curves later.
  7. Click the Compare button to compare different learning curves that you previously saved in the analysis UI.

By observing the learning curves, you can see how the adaptive filter parameter settings affect the performance of the adaptive filter. For example, figure 16 shows the learning curves of an LMS adaptive filter with a step size of 0.01 (white), 0.02 (red), and 0.03 (green). From these learning curves, you can see that adaptive filters with a larger step size converge faster than adaptive filters with a smaller step size.


Figure 16. Influence of the Step Size on the Convergence Speed of an Adaptive Filter 

Downloads

LMS Adaptive Filter Lab (Exe).zip is the executable version of the demo. This demo requires the LabVIEW Run-Time Engine 8.6.1. You can download a free copy of the LabVIEW Run-Time Engine.

LMS Adaptive Filter Lab (Source).zip contains the LabVIEW VIs used in this demo. To use the LabVIEW VIs, you need to install LabVIEW 8.6 or later and LabVIEW Adaptive Filter Toolkit 1.0 or later. You can download the evaluation version of LabVIEW.

References

[1] Haykin, Simon S. 2001. Adaptive Filter Theory, 4th ed. Englewood Cliffs, New Jersey: Prentice Hall.

Example code from the Example Code Exchange in the NI Community is licensed with the MIT license.