
What Type of Decimation Algorithm Does LabVIEW Use to Display Data on a Graph?



Primary Software:
Primary Software Version: 1.0
Primary Software Fixed Version: N/A
Secondary Software: LabVIEW Development Systems>>LabVIEW Full Development System

Problem:
The average LabVIEW graph is on the order of 300 to 1000 pixels wide, but LabVIEW can accurately display a data set containing over a million points. What type of decimation algorithm does LabVIEW use to display more points than there are pixels?

Solution:
LabVIEW uses a max-min decimation algorithm to display data on a graph. In max-min decimation, each decimation interval is represented by the maximum and minimum data points that fall within it. Simple decimation, by contrast, keeps only the first point of each interval; it produces aliasing artifacts and should be used only when speed is critical and accuracy is not.
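
For illustration, here is a minimal Python sketch of the technique (LabVIEW's actual implementation is internal to the graph control; the function name, NumPy usage, and interval-splitting strategy below are assumptions made for this example):

    import numpy as np

    def max_min_decimate(data, num_intervals):
        """Represent each decimation interval by its minimum and maximum points."""
        decimated = []
        for chunk in np.array_split(np.asarray(data), num_intervals):
            lo, hi = chunk.argmin(), chunk.argmax()
            # Emit the two extremes in their original time order so the
            # plotted trace still rises and falls in the right direction.
            decimated.extend(chunk[i] for i in sorted((lo, hi)))
        return np.array(decimated)

Simple decimation, by contrast, would amount to data[::interval_size], keeping only the first point of each interval and discarding any peaks in between.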

Figure 1 shows an example of max-min decimation: processing the data in the left-hand image produces the graph displayed on the right.

Figure 1. Max-Min Decimation

The max-min algorithm ensures that you always see the peaks of the data, so the high-frequency sine wave in Figure 1 produces the solid band it should. Because far fewer points are plotted to the graph, drawing is also much faster.
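
As a rough demonstration of that effect (assuming the max_min_decimate sketch above is in scope and that matplotlib is available for plotting):

    import numpy as np
    import matplotlib.pyplot as plt

    t = np.linspace(0, 1, 1_000_000)
    signal = np.sin(2 * np.pi * 5_000 * t)      # high-frequency sine wave
    decimated = max_min_decimate(signal, 500)   # ~1000 points, about one graph width

    plt.plot(decimated)   # renders as a solid band, as in Figure 1
    plt.show()

Only about a thousand points are sent to the plot, yet every peak of the million-point signal is preserved.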


Related Links:
NI Developer Zone: Managing Large Data Sets in LabVIEW

Report Date: 07/29/2009
Last Updated: 08/24/2009
Document ID: 4ZSFIO3S
