How is the Convert (Channel) Clock Rate Determined in NI-DAQmx and Traditional NI-DAQ?
Multifunction DAQ (MIO)
How does NI-DAQ select a Convert (Channel) Clock rate when I only specify the Sample (Scan) Clock rate? How can I override this default rate and specify my own rate?
The terms 'Channel Clock', 'Scan Clock', and 'Scan Rate' used in Traditional NI-DAQ have become 'Convert Clock', 'Sample Clock', and 'Sampling Rate' in NI-DAQmx. In general, NI-DAQmx terminology is used here, with the Traditional NI-DAQ terms in parentheses.
Note: The Convert (Channel) Clock rate is the inverse of the interchannel delay. For example, a Convert (Channel) Clock rate of 80,000 Hz corresponds to an interchannel delay of 12.5 µs.
How NI-DAQ chooses the Convert (Channel) Clock rate
The Convert (Channel) Clock is determined differently depending on the version of the NI-DAQ driver you are using.
- NI-DAQmx 7.4 and greater
Starting with NI-DAQmx 7.4, the Convert Clock is determined in the same way as in Traditional NI-DAQ (described below).
- With NI-DAQmx, the driver chooses the fastest conversion rate possible based on the speed of the A/D converter and adds 10 µs of padding between each channel to allow for adequate settling time. This scheme lets the channels approximate simultaneous sampling while still preserving accuracy. If the Sample Clock rate is too fast to allow this 10 µs of padding, NI-DAQmx instead spaces the Convert Clock pulses evenly throughout the sample interval.
Here is an example:
The PCI-6220 M Series board has a maximum sampling rate of 250 kS/s, which is the maximum speed of its A/D converter. A single conversion at this rate takes 4 µs. Adding the 10 µs of padding, each channel requires a 14 µs period, corresponding to a Convert Clock rate of about 71.4 kHz.
At a slower acquisition rate, such as 10 kHz on 2 channels, the Convert Clock runs faster than the aggregate rate of 20 kS/s (10 kS/s/ch × 2 channels). The Convert Clock pulses to trigger the first conversion (4 µs), waits 10 µs for settling, pulses to trigger the second conversion, then waits another 10 µs. At this point, the Convert Clock stays low until the next rising edge of the Sample Clock.
Once your aggregate rate exceeds 71.4 kS/s (this threshold depends on the maximum A/D conversion rate, as above), there is not enough time between samples to acquire both channels and still add 10 µs of delay per channel. In this case, the Convert Clock rate equals the aggregate sample rate.
Remember also that the maximum sampling rate depends on the board. For a board like the PCI-6250, with a maximum sampling rate of 1 MS/s, the interchannel delay at slower sampling rates would be 11 µs = 1 µs (the fastest channel conversion period for a 1 MS/s board) + 10 µs, giving a Convert Clock rate of about 90,909 Hz.
Giving the amplifier maximum settling time is important. For example, to ensure accuracy to within ±1 LSB, the PCI-6220 requires a minimum amplifier settling time of 7 µs, even though the board's fastest channel conversion period is 4 µs. Higher source impedance further increases the required amplifier settling time.
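The selection rule described above can be sketched in a few lines of Python. This is an illustrative model of the documented behavior, not driver code; the function name and parameters are invented for this example, and hardware limits (such as the board's absolute maximum aggregate rate) are not modeled.

```python
def default_convert_clock(sample_rate_hz, n_channels, max_ad_rate_hz, padding_s=10e-6):
    """Model of the NI-DAQmx >= 7.4 default Convert Clock selection.

    The driver prefers the fastest conversion period plus 10 us of padding;
    if the aggregate sample rate is too fast for that, the Convert Clock
    pulses are spaced evenly, so its rate equals the aggregate rate.
    """
    padded_rate = 1.0 / (1.0 / max_ad_rate_hz + padding_s)  # e.g. 1/14 us ~ 71.4 kHz
    aggregate_rate = sample_rate_hz * n_channels
    return max(padded_rate, aggregate_rate)

# PCI-6220 (250 kS/s): 2 channels at 10 kS/s/ch -> padded rate wins (~71.4 kHz)
print(round(default_convert_clock(10_000, 2, 250_000)))  # 71429
# 2 channels at 50 kS/s/ch -> aggregate 100 kS/s exceeds 71.4 kS/s, so it wins
print(round(default_convert_clock(50_000, 2, 250_000)))  # 100000
```

The same function reproduces the PCI-6250 case from above: with a 1 MS/s converter, the padded period is 1 µs + 10 µs = 11 µs, or roughly 90,909 Hz.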
- NI-DAQmx 7.0 to 7.3
In these versions of the NI-DAQ driver, the Convert Clock is always chosen to maximize amplifier settling time. The interchannel delay is simply the time between samples divided by the number of channels, so the Convert Clock rate is the Sample Clock rate multiplied by the number of channels being acquired. The two images attached below illustrate the difference between the Convert (Channel) Clock rates selected by Traditional NI-DAQ and by NI-DAQmx 7.0 to 7.3.
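Under this scheme the calculation is a single multiplication. The sketch below uses an invented function name purely for illustration:

```python
def convert_clock_7_0_to_7_3(sample_rate_hz, n_channels):
    """NI-DAQmx 7.0 to 7.3: Convert Clock = Sample Clock x number of channels,
    which spreads conversions evenly and maximizes amplifier settling time."""
    return sample_rate_hz * n_channels

# 10 kS/s/ch on 4 channels -> 40 kHz Convert Clock, 25 us interchannel delay
rate = convert_clock_7_0_to_7_3(10_000, 4)
print(rate, 1.0 / rate)  # 40000 2.5e-05
```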
- Traditional NI-DAQ
By default, the Traditional NI-DAQ driver chooses the fastest Channel Clock rate that still allows adequate amplifier settling. At slower scan rates, 10 µs of delay is added to the board's fastest channel conversion period (the inverse of its maximum scan rate) to derive the Channel Clock.
As the scan rate increases, there comes a point where there is not enough time to insert the full 10 µs of additional delay between channel conversions and still finish acquiring all channels before the next edge of the Scan Clock. At that point the driver uses round-robin channel sampling, evenly dividing the time between scans by the number of channels to obtain the interchannel delay. The Channel Clock rate can then be calculated by multiplying the scan rate by the number of channels being acquired.
Overriding the default Convert (Channel) Clock with your own rate
There may be special cases where you want to override the default Convert (Channel) Clock rate and specify your own. For example, you may want to increase the interchannel delay to maximize amplifier settling time because you have a high source impedance. Conversely, you may want to decrease the interchannel delay as much as possible to achieve a more nearly simultaneous acquisition of channels (sacrificing accuracy because of the reduced amplifier settling time).
- NI-DAQmx
Using NI-DAQmx, you can manually set your Convert Clock rate using the AIConv.Rate property in the DAQmx Timing property node. The attached VI shows how this is done.
- NI-DAQ Function Calls
If you are using the NI-DAQmx function calls, you can get, set, or reset the Convert Clock rate with the following functions: DAQmxGetAIConvRate, DAQmxSetAIConvRate, and DAQmxResetAIConvRate.
- Traditional NI-DAQ
Using Traditional NI-DAQ, you can set your Channel Clock rate manually with the interchannel delay input of the AI Config VI, which calls the Advanced AI Clock Config VI to configure the channel clock. This overrides the default Channel Clock rate.
SCXI - NI-DAQmx 8.5 and greater
Starting in NI-DAQmx 8.5, a total delay of 20 µs is added to the fastest possible channel conversion period on an SCXI chassis connected to a 16-bit or 18-bit E or M Series device. If the sample rate is set fast enough to require a Convert Clock rate greater than this default, warning 200011 is generated, informing you that the accuracy of the measurement might be compromised. However, because this is only a warning, the Convert Clock actually used is the rate required by the user-specified Sample Clock (that is, Convert Clock = Sample Clock × number of channels).
For SCXI chassis containing a module with track-and-hold circuitry, 10 µs of padding is now added to the fastest possible conversion rate when connected to a 12-bit E Series device.
These changes ensure the most accurate data is acquired by maximizing the settling time between channels. However, if the effect of this change on performance is too great (for example, on an RT system using hardware-timed single-point acquisition in a control loop), you can explicitly set the Convert Clock rate attribute to override the default.
To calculate the fastest sample rate at which no accuracy is lost to settling time (and no warning is generated), use the following formula: Sample Rate = 1 / (0.000020 × Number_of_Channels). If you need to run your application at a faster sample rate, check your SCXI module's user manual for its "minimum scan interval" accuracy specifications.
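Applying the formula for a few channel counts gives a quick feel for the limit; the sketch below uses an invented helper name (`max_scxi_sample_rate`) to express the calculation:

```python
def max_scxi_sample_rate(n_channels, conversion_period_s=20e-6):
    """Fastest per-channel sample rate on an SCXI chassis (NI-DAQmx >= 8.5)
    that keeps the full 20 us settling period and raises no warning 200011."""
    return 1.0 / (conversion_period_s * n_channels)

# 4 channels: 1 / (20 us * 4) -> 12.5 kS/s per channel
print(round(max_scxi_sample_rate(4)))  # 12500
# 1 channel: 1 / 20 us -> 50 kS/s
print(round(max_scxi_sample_rate(1)))  # 50000
```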
This change affects only multichannel acquisitions. In NI-DAQmx 8.4 and earlier, only 10 µs of delay was added to the fastest possible channel conversion period on an SCXI chassis without a track-and-hold module, regardless of the connected DAQ device, and no warning was generated when the sample rate required a faster Convert Clock than the default. When an SCXI chassis with a track-and-hold module was present, the Convert Clock rate selected was the fastest rate the DAQ device could handle, up to 333 kHz.
Related Links:
Data Acquisition Support Homepage
Developer Zone Tutorial: Is Your Data Inaccurate Because of Instrumentation Amplifier Settling Time?
Developer Zone Community: Setting AI Convert Clock Rate for the Longest Settling Time in DAQmx
KnowledgeBase 2D6CTML8: Data Acquisition Sampling Terminology
KnowledgeBase 30LDURMV: Difference Between the Sample Clock (Scan Clock) and the Convert Clock (Channel Clock)