Home > Support > NI Product Manuals > LabVIEW Communications System Design Suite 1.0 Manual

Continuously decimates the input sequence x according to the decimating factor and the averaging Boolean control. Wire data to the x input to determine the instance to use.



x

The input sequence to decimate.


decimating factor

The factor by which the node decimates input sequence x. decimating factor must be greater than zero. If decimating factor is greater than the number of elements in x or less than or equal to zero, this node sets decimated array to an empty array and returns an error.

Default: 1



averaging

A Boolean that determines how the node handles the data points in x.

TRUE   Each output point in decimated array is the mean of decimating factor input points.
FALSE  The node keeps every decimating factor-th point from x.

Default: FALSE
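The combined effect of decimating factor and averaging on a single call can be sketched in Python. The function name, signature, and error handling below are illustrative assumptions, not NI's implementation; the node itself sets decimated array to an empty array and returns an error for an out-of-range factor.

```python
def decimate(x, decimating_factor=1, averaging=False, start_index=0):
    """Sketch of one call to the node (illustrative, not NI code)."""
    if decimating_factor <= 0 or decimating_factor > len(x):
        # The node would return an empty decimated array and an error here.
        raise ValueError("decimating factor out of range")
    x = x[start_index:]
    if averaging:
        # Each output point is the mean of decimating_factor input points.
        return [sum(x[i:i + decimating_factor]) / decimating_factor
                for i in range(0, len(x) - decimating_factor + 1,
                               decimating_factor)]
    # Keep every decimating_factor-th point.
    return x[::decimating_factor]

decimate([0, 1, 2, 3, 4, 5], decimating_factor=2)                  # [0, 2, 4]
decimate([0, 1, 2, 3, 4, 5], decimating_factor=2, averaging=True)  # [0.5, 2.5, 4.5]
decimate([0, 1, 2, 3, 4, 5], decimating_factor=2, start_index=1)   # [1, 3, 5]
```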


error in

Error conditions that occur before this node runs. The node responds to this input according to standard error behavior.

Default: no error


start index

Determines from which sample in x the decimation starts if LabVIEW calls the node for the first time or reset is TRUE. start index must be greater than or equal to zero.

Default: 0



reset

A Boolean that determines whether the node restarts the decimation. If reset is TRUE or if the node runs for the first time, LabVIEW initializes the decimation from the sample of x specified by start index. When the node runs again with reset set to FALSE, LabVIEW initializes the decimation from the final states of the previous call to the node.

To process a large data sequence that consists of smaller blocks, set reset to TRUE for the first block and to FALSE for all remaining blocks. You also can set reset to TRUE at regular intervals of blocks to periodically reset the sample from which the decimation begins.

Default: FALSE
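The block-wise behavior above, with state carried between calls, can be sketched as a small stateful Python class. All names here are illustrative assumptions, not NI's implementation.

```python
class ContinuousDecimator:
    """Sketch of block-wise decimation with carried-over state."""

    def __init__(self, factor, averaging=False):
        self.factor = factor
        self.averaging = averaging
        self.buffer = []  # final state carried into the next call

    def process(self, block, reset=False, start_index=0):
        if reset:
            # Restart decimation from start_index of this block.
            self.buffer = list(block[start_index:])
        else:
            # Continue from the final state of the previous call.
            self.buffer.extend(block)
        out = []
        while len(self.buffer) >= self.factor:
            chunk = self.buffer[:self.factor]
            del self.buffer[:self.factor]
            out.append(sum(chunk) / self.factor if self.averaging
                       else chunk[0])
        return out

d = ContinuousDecimator(factor=3)
d.process([0, 1, 2, 3], reset=True)  # [0]; sample 3 stays buffered
d.process([4, 5, 6, 7])              # [3]; samples 6 and 7 stay buffered
```

Setting reset to TRUE on a later call discards the buffered samples and restarts from start index, which is the periodic-reset pattern described above.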


decimated array

Decimated sequence of x.


error out

Error information. The node produces this output according to standard error behavior.