How is the Convert (Channel) Clock Rate Determined in NI-DAQmx and Traditional NI-DAQ?
Primary Software: Driver Software>>NI-DAQmx
Primary Software Version: 7.4
Primary Software Fixed Version: N/A
Secondary Software: N/A
Hardware: Multifunction DAQ (MIO)
How does NI-DAQ select a Convert (Channel) Clock rate when I only specify the Sample (Scan) Clock rate? How can I override this default rate and specify my own rate?
Note: The terms 'Channel Clock', 'Scan Clock', and 'Scan Rate' used in Traditional NI-DAQ have become 'Convert Clock', 'Sample Clock', and 'Sampling Rate' in NI-DAQmx. In general, NI-DAQmx terminology is used here, with the Traditional NI-DAQ terms in parentheses.
Note: The Convert (Channel) Clock rate is the inverse of the interchannel delay; for example, a Convert (Channel) Clock rate of 80,000 Hz corresponds to an interchannel delay of 12.5 µs.
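The inverse relationship above can be checked with a short Python calculation (plain arithmetic only; no DAQ hardware or driver is required, and the function names are illustrative, not part of any NI API):

```python
# Interchannel delay is the inverse of the Convert (Channel) Clock rate.

def interchannel_delay(convert_clock_rate_hz):
    """Return the interchannel delay in seconds for a given convert clock rate."""
    return 1.0 / convert_clock_rate_hz

def convert_clock_rate(interchannel_delay_s):
    """Return the convert clock rate in Hz for a given interchannel delay."""
    return 1.0 / interchannel_delay_s

print(interchannel_delay(80_000))   # 80 kHz convert clock -> 1.25e-05 s, i.e. 12.5 us
print(convert_clock_rate(12.5e-6))  # 12.5 us delay -> ~80000 Hz
```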
How NI-DAQ chooses the Convert (Channel) Clock rate
The Convert (Channel) Clock is determined differently depending on the version of the NI-DAQ driver you are using.
There may be special cases when you want to override the default Convert (Channel) Clock rate and specify your own rate. For example, you may want to increase the interchannel delay to maximize amplifier settling time because your signal has a high source impedance. Conversely, you may want to decrease the interchannel delay as much as possible to acquire the channels more nearly simultaneously (sacrificing accuracy due to the decreased amplifier settling time).
SCXI - NI-DAQmx 8.5 and greater
Starting in NI-DAQmx 8.5, a total delay of 20 µs was added to the fastest possible channel conversion period on an SCXI chassis connected to a 16-bit or 18-bit E or M Series device. If a sample rate is set fast enough to require a convert clock rate greater than the default rate, warning 200011 is generated, informing the user that the accuracy of the measurement might be compromised. However, since this is a warning, the convert clock used is based on the rate needed by the sample clock set by the user (that is, convert clock = sample clock * number of channels).
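The behavior described above can be sketched in Python. This is a minimal model of the documented logic, not driver code: the constant and function names are hypothetical, and the 20 µs padded period applies to the 16-/18-bit E/M Series case described here.

```python
# Model of the NI-DAQmx 8.5+ convert clock selection for an SCXI chassis
# connected to a 16-bit or 18-bit E or M Series device.

PADDED_CONVERT_PERIOD_S = 20e-6  # fastest conversion period plus the 20 us padding

def required_convert_rate(sample_rate_hz, num_channels):
    """Convert clock rate the driver must use: sample clock * number of channels."""
    return sample_rate_hz * num_channels

def expect_warning_200011(sample_rate_hz, num_channels):
    """True when the required convert rate exceeds the padded default rate."""
    default_rate = 1.0 / PADDED_CONVERT_PERIOD_S  # 50 kHz with 20 us padding
    return required_convert_rate(sample_rate_hz, num_channels) > default_rate

# Example: 8 channels at 10 kS/s needs an 80 kHz convert clock, which is
# faster than the 50 kHz padded default, so warning 200011 is expected.
print(expect_warning_200011(10_000, 8))  # True
```

Note that even when the warning is generated, the acquisition still runs at the requested sample rate; the warning only flags the reduced settling time.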
For SCXI chassis containing a module with track-and-hold circuitry, 10 µs of padding is now added to the fastest possible conversion rate when connected to a 12-bit E Series device.
These changes ensure that the most accurate data is acquired by maximizing the settling time between channels. However, if the effect of this change on performance is too great (for example, on an RT system using hardware-timed single-point acquisition in a control loop), the convert clock rate attribute can be set explicitly, overriding the default rate.
To calculate the fastest sample rate at which no accuracy is lost due to settling time (and no warning is generated), use the following formula: Sample Rate = 1 / (0.000020 * Number_of_Channels). If you need to run your application at a faster sample rate, check your SCXI module's user manual for the "minimum scan interval" accuracy specifications.
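The formula above can be wrapped in a small helper for quick checks (the function name is illustrative; the 20 µs default settling time comes from the padding described earlier):

```python
# Fastest per-channel sample rate that preserves the full settling time,
# per the formula: Sample Rate = 1 / (0.000020 * Number_of_Channels).

def max_accurate_sample_rate(num_channels, settling_time_s=20e-6):
    """Fastest sample rate (S/s) that avoids warning 200011."""
    return 1.0 / (settling_time_s * num_channels)

print(max_accurate_sample_rate(8))   # ~6250 S/s for an 8-channel task
print(max_accurate_sample_rate(16))  # ~3125 S/s for a 16-channel task
```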
This change affects only multichannel acquisitions. In NI-DAQmx 8.4 and earlier, only 10 µs of delay was added to the fastest possible channel conversion rate on an SCXI chassis without a track-and-hold module, regardless of the connected DAQ device. In those versions, no warning was generated when the sample rate required a faster convert clock than the default. When an SCXI chassis with a track-and-hold module was present, the convert clock rate selected was the fastest rate the DAQ device could handle, up to 333 kHz.
Data Acquisition Support Homepage
Developer Zone Tutorial: Is Your Data Inaccurate Because of Instrumentation Amplifier Settling Time?
Developer Zone Community: Setting AI Convert Clock Rate for the Longest Settling Time in DAQmx.
KnowledgeBase 2D6CTML8: Data Acquisition Sampling Terminology
KnowledgeBase 30LDURMV: Difference Between the Sample Clock (Scan Clock) and the Convert Clock (Channel Clock)
Report Date: 05/26/2003
Last Updated: 10/12/2011
Document ID: 2XPE1QCW