Decimation Algorithm Used to Display Data on a Graph in LabVIEW

Updated Nov 7, 2023

Reported In

Software

  • LabVIEW Full (Legacy)

Issue Details

The average LabVIEW graph is on the order of 300 to 1000 pixels wide, but LabVIEW can accurately display a data set containing over a million points. What type of decimation algorithm does LabVIEW use to display more points than there are pixels?

Solution

LabVIEW uses a max-min decimation algorithm to display data on a graph. In max-min decimation, each decimation interval is represented by its maximum and minimum data points. Simple decimation, by contrast, represents each interval by its first point only. Because simple decimation discards the extremes within an interval, it produces aliasing artifacts, so it should be used only when speed is of the utmost importance and accuracy is not a concern.
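LabVIEW's actual implementation is internal to the graph control, but the idea is straightforward to express in code. The following is a minimal sketch of max-min decimation in Python; the function name max_min_decimate and the use of NumPy are assumptions for illustration, not LabVIEW's API:

```python
import numpy as np

def max_min_decimate(data, num_intervals):
    """Reduce `data` to at most 2 * num_intervals points by keeping the
    minimum and maximum of each decimation interval (illustrative sketch)."""
    data = np.asarray(data)
    decimated = []
    # Split the data into roughly equal decimation intervals.
    for interval in np.array_split(data, num_intervals):
        if interval.size == 0:
            continue  # more intervals than points; nothing to keep here
        lo_idx = interval.argmin()
        hi_idx = interval.argmax()
        # Keep both extremes, preserving their original order so the
        # plotted envelope rises and falls in the right places.
        for idx in sorted({lo_idx, hi_idx}):
            decimated.append(interval[idx])
    return np.array(decimated)
```

Simple decimation would instead keep only `interval[0]` from each interval, which is why it can miss peaks entirely and alias.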

Figure 1 shows an example of max-min decimation. Processing the raw data in the left-hand image using max-min decimation produces the graph displayed on the right:

[Figure 1. Max-min decimation of a high-frequency sine wave: raw data (left) and the decimated graph (right)]

The max-min algorithm ensures that you always see the peaks of the data, producing the solid band that the high-frequency sine wave in Figure 1 should yield. This occurs with far less data plotted to the graph, resulting in much faster drawing.
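As a rough demonstration of both claims, peak preservation and data reduction, the sketch above can be applied to a million-point sine wave (the sample count, frequency, and interval count here are arbitrary values chosen for illustration):

```python
import numpy as np

# A high-frequency sine wave: one million samples, far wider than any graph.
t = np.linspace(0, 1, 1_000_000)
signal = np.sin(2 * np.pi * 5_000 * t)

# Decimate to 500 intervals, i.e. at most 1000 plotted points.
plot_points = max_min_decimate(signal, 500)

# Every interval contributes its true peak and trough, so the plotted
# envelope still spans roughly -1 to +1, like the solid band in Figure 1.
print(len(plot_points), plot_points.min(), plot_points.max())
```

Simple decimation of the same signal would sample the sine wave at an arbitrary stride, and the resulting aliased trace could suggest almost any amplitude and frequency.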