Resamples the input signal X by interpolating X, passing the interpolated signal through an FIR lowpass filter, and decimating the filtered signal.
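Because the VI itself is graphical, the interpolate-filter-decimate pipeline can be sketched in Python with SciPy, whose `upfirdn` performs exactly these three steps. The factors, filter length, and test signal below are illustrative assumptions, not values taken from this VI.

```python
import numpy as np
from scipy.signal import firwin, upfirdn

# Illustrative factors; in the VI these come from the
# interpolation and decimation inputs.
L, M = 3, 2

# Lowpass FIR cut at the narrower of the two Nyquist rates,
# with gain L to compensate for zero-stuffing.
h = firwin(81, 1.0 / max(L, M)) * L

x = np.sin(2 * np.pi * 0.05 * np.arange(200))  # sampling interval 1

# upfirdn performs the three steps in order: zero-stuff by L,
# FIR-filter, then keep every Mth sample.
y = upfirdn(h, x, up=L, down=M)
```

The output length follows the full-convolution formula `ceil(((len(x) - 1) * L + len(h)) / M)`, so `y` is slightly longer than `len(x) * L / M` because of the filter tail.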
A Boolean that determines whether the input signal undergoes lowpass filtering when LabVIEW downsamples the signal. If anti-aliasing? is TRUE, this node protects the resampled signal from aliasing, but the computational cost of resampling increases.
Initialization of resampling. The default is FALSE. The first time this VI runs, or when reset is TRUE, LabVIEW initializes the internal states of the VI to zero and uses start index to determine where the resampling starts. When reset is FALSE on subsequent calls, LabVIEW initializes the internal states to the final states from the previous call to this VI. To process a large data sequence that consists of smaller blocks, set reset to TRUE for the first block and to FALSE for all remaining blocks for continuous resampling.
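The reset behavior amounts to carrying the FIR filter state and the decimation phase from one block to the next. A minimal sketch, assuming SciPy; the `Resampler` class and `process` method are hypothetical names, not part of this VI's API. Constructing the object corresponds to reset = TRUE; each subsequent `process` call corresponds to reset = FALSE.

```python
import numpy as np
from scipy.signal import firwin, lfilter

class Resampler:
    """Continuous block-by-block resampling with persistent state."""

    def __init__(self, interp, decim, taps=81):
        self.L, self.M = interp, decim
        # Anti-alias lowpass with gain L to offset zero-stuffing.
        self.h = firwin(taps, 1.0 / max(interp, decim)) * interp
        self.zi = np.zeros(taps - 1)   # FIR internal state, zeroed on reset
        self.phase = 0                 # decimation phase carried across blocks

    def process(self, x):
        up = np.zeros(len(x) * self.L)
        up[:: self.L] = x              # interpolate by zero-stuffing
        y, self.zi = lfilter(self.h, 1.0, up, zi=self.zi)
        out = y[self.phase :: self.M]  # decimate, honoring carried phase
        self.phase = (self.phase - len(up)) % self.M
        return out

# Feeding the signal in blocks matches feeding it all at once because the
# filter state and decimation phase persist between calls.
x = np.random.default_rng(0).standard_normal(1000)
whole = Resampler(3, 2).process(x)
stream = Resampler(3, 2)
split = np.concatenate([stream.process(x[:300]), stream.process(x[300:])])
```

This is why the documentation tells you to set reset to TRUE only for the first block: zeroing the state mid-stream would discard the filter history and glitch the output at the block boundary.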
The real input signal to resample. The sampling interval of X is 1.
Determines where the resampling starts for the first call to this VI or when reset is TRUE. Specify start index in terms of samples of the interpolated signal, not samples of X. start index must be greater than or equal to 0.
The interpolation factor and the decimation factor for resampling.
The interpolation factor for resampling.
The decimation factor for resampling.
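A rational factor interpolation/decimation lets you convert between any two integer sampling rates: reduce the ratio of the target and source rates by their greatest common divisor. A hedged sketch using SciPy's `resample_poly` for brevity; the 48 kHz to 44.1 kHz conversion is a common illustration, not a value from this VI.

```python
import numpy as np
from math import gcd
from scipy.signal import resample_poly

fs_in, fs_out = 48_000, 44_100
g = gcd(fs_out, fs_in)
interp, decim = fs_out // g, fs_in // g   # 147 and 160

x = np.sin(2 * np.pi * 1_000 * np.arange(4_800) / fs_in)  # 0.1 s of a 1 kHz tone
y = resample_poly(x, interp, decim)       # now sampled at 44.1 kHz
```

The resampled length is len(x) * interp / decim, matching the statement that the sampling interval of Y is decimation/interpolation.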
Error conditions that occur before this node runs.
The node responds to this input according to standard error behavior.
Default: no error
FIR filter specifications
Minimum values this VI needs to specify the FIR filter.
alias rejection (dB)
Minimum attenuation level of signal components aliased after any resampling operation.
normalized bandwidth
Fraction of the new sampling rate that is not attenuated.
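These two specifications map naturally onto a windowed FIR design: the attenuation level and the transition width together fix the filter length. The VI's internal design method is not documented here, so the Kaiser-window design below is only an illustration, and the band-edge values are assumptions.

```python
import numpy as np
from scipy.signal import kaiserord, firwin, freqz

alias_rejection_db = 80.0   # minimum stopband attenuation (illustrative)
passband_edge = 0.45        # fraction of the new Nyquist rate kept intact
stopband_edge = 0.55        # illustrative start of the rejected band

# kaiserord picks a filter length and Kaiser beta that meet the attenuation
# over the given transition width (frequencies as fractions of Nyquist).
numtaps, beta = kaiserord(alias_rejection_db, stopband_edge - passband_edge)
h = firwin(numtaps, (passband_edge + stopband_edge) / 2,
           window=("kaiser", beta))

# Verify the achieved stopband attenuation from the frequency response.
w, H = freqz(h, worN=8000)
stopband_db = 20 * np.log10(np.abs(H[w / np.pi >= stopband_edge]).max())
```

Raising alias rejection or narrowing the transition band increases `numtaps`, which is the computational cost the anti-aliasing? description warns about.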
Resampled signal. The sampling interval of Y is decimation/interpolation.
The time instant of the first sample of each signal in Y.