
Archived: How Is the Channel Clock Rate Determined in My Data Acquisition VI Using Traditional NI-DAQ?

This document has been archived and is no longer updated by National Instruments

Note: This page is about NI-DAQ also known as Traditional NI-DAQ (Legacy). NI-DAQmx replaced Traditional NI-DAQ (Legacy) in 2003. NI strongly recommends using NI-DAQmx for new designs and migrating existing Traditional NI-DAQ (Legacy) applications to NI-DAQmx. Please review the Getting Started with NI-DAQmx guide for more information on migration.

Primary Software: Driver Software>>NI-DAQ
Primary Software Version: 7.2
Primary Software Fixed Version: N/A
Secondary Software: LabVIEW Development Systems

How is the channel clock rate determined in my data acquisition VI using Traditional NI-DAQ when I:
  1. Use an external scan clock?
  2. Use an internal timer?
  3. "Scan" one channel using an external scan clock?

Using an external scan clock:

If you do not set up the channel clock, Traditional NI-DAQ automatically sets the channel clock interval to [1/maximum sampling rate of the board] + 10 microseconds. The 10 microseconds added above the minimum interval the board allows accommodate settling time for weak signals.
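The default interval is simple arithmetic. Below is a minimal sketch of the calculation; the 1.25 MS/s maximum sampling rate is a hypothetical E Series value chosen for illustration, not a figure from this document.

```python
def default_channel_interval_us(max_rate_hz):
    """Default interval between A/D conversions, in microseconds:
    [1 / maximum sampling rate of the board] + 10 microseconds."""
    return 1e6 / max_rate_hz + 10.0

# Hypothetical board with a 1.25 MS/s maximum sampling rate:
# 1/1.25e6 s = 0.8 us, plus 10 us of settling margin = 10.8 us per conversion.
print(default_channel_interval_us(1.25e6))  # 10.8
```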
Using an internal timer:

For low scan rates, the same default applies. At higher scan rates, Traditional NI-DAQ shortens the channel clock interval so that every channel in the scan list can be converted within each scan interval.
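One plausible reading of the behavior described above can be sketched as follows: keep the default channel interval while all channels fit within one scan interval, and shorten it once they no longer do. This is an illustrative model, not the driver's actual algorithm.

```python
def channel_interval_us(scan_rate_hz, num_channels, max_rate_hz):
    """Illustrative model of the default-vs-adjusted channel clock interval."""
    default_us = 1e6 / max_rate_hz + 10.0      # default: 1/max rate + 10 us
    scan_interval_us = 1e6 / scan_rate_hz      # time available per scan
    # Low scan rates: the default fits inside one scan interval and is kept.
    if num_channels * default_us <= scan_interval_us:
        return default_us
    # High scan rates: shorten the interval so all channels fit in the scan.
    return scan_interval_us / num_channels

# Hypothetical 1.25 MS/s board, 8 channels:
print(channel_interval_us(1000.0, 8, 1.25e6))   # low rate  -> default 10.8 us
print(channel_interval_us(20000.0, 8, 1.25e6))  # high rate -> shortened 6.25 us
```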

What if I do not want to use the default?
If the default behavior of Traditional NI-DAQ is inappropriate for your application, use the AI Clock Config VI to set the channel clock explicitly.

What if I want to "scan" only one channel so I can externally time my acquisition?
If you want to externally time an acquisition on a single channel, you must derive both the scan clock and the channel clock from the same external signal. On E Series boards, the channel clock is used even when you are "scanning" only one channel: one channel clock interval elapses between the start of a scan and the first conversion within that scan. So when an external scan clock pulse arrives and the channel clock is left at its default, the board waits [1/maximum sampling rate of the board] + 10 microseconds before performing an A/D (analog-to-digital) conversion.
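The timing described above can be sketched numerically: each external scan pulse is followed by one channel interval before the first conversion, then one more interval per additional channel. The 10.8 µs interval below assumes the same hypothetical 1.25 MS/s board used earlier.

```python
def conversion_times_us(scan_pulse_us, num_channels, channel_interval_us):
    """Times (us) of the A/D conversions triggered by one external scan pulse.
    One channel interval elapses before the first conversion, and one more
    channel interval between each subsequent conversion in the scan."""
    return [scan_pulse_us + (i + 1) * channel_interval_us
            for i in range(num_channels)]

# "Scanning" a single channel with the default channel clock on a
# hypothetical 1.25 MS/s board: the conversion lags the pulse by 10.8 us.
print(conversion_times_us(0.0, 1, 10.8))
```

This is why the scan and channel clocks must share the same external source: the conversion is delayed relative to each scan pulse, not simultaneous with it.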

Related Links:
KnowledgeBase 2XPE1QCW: How is the Convert (Channel) Clock Rate Determined in NI-DAQmx and Traditional NI-DAQ?
KnowledgeBase 2D6CTML8: Data Acquisition Definitions: Scan Rate, Channel Rate, Sampling Rate, Scan List, Interchannel Delay
Products and Services: Data Acquisition


Report Date: 06/13/1997
Last Updated: 08/07/2009
Document ID: 0YCBDERO
