02-15-2010 03:49 AM
Hi
I have a question about the different dt values I get when using "Tick Count" and "Get Date/Time In Seconds" in a loop.
I have attached a picture that should describe the problem.
Why is there a difference, and what would be the best way to do this?
Yves
02-17-2010 01:29 PM
Tick Count and Get Date/Time In Seconds do pretty much the same thing, but if you use them to calculate time differences, be aware that Tick Count will eventually roll over.
Regarding your block diagram: you have an array that calculates the elapsed time for each loop iteration, and it is always going to be 150 ms, since it is the timed loop's job to make sure each iteration takes exactly the specified amount of time, 150 ms in your case. However, the code inside the loop can execute at any point during the 150 ms you have allocated to the iteration; i.e., you have 150 ms to run whatever is in the loop, but exactly when within that window the code runs is undetermined.
That said, Tick Count seems to give you a pretty consistent result of 150 ms. It is possible that its execution time is more consistent than that of the other function.
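The timestamp-resolution effect can be sketched in Python (not LabVIEW, just an analogue). The 15.6 ms quantum below is an assumption, a typical Windows system-clock update interval; the point is only that a millisecond tick counter and a coarsely updated time-of-day clock report different dt values for the same 150 ms loop:

```python
# Hypothetical illustration: a 1 ms tick counter vs. a time-of-day clock
# whose timestamps only update every ~15.6 ms (assumed OS clock quantum).

QUANTUM_MS = 15.6  # assumed OS timestamp resolution, not a LabVIEW constant

def quantize(t_ms):
    """Round a true time down to the last OS clock update."""
    return int(t_ms / QUANTUM_MS) * QUANTUM_MS

true_times = [i * 150.0 for i in range(6)]            # ideal 150 ms loop starts
tick_dts = [b - a for a, b in zip(true_times, true_times[1:])]
tod_times = [quantize(t) for t in true_times]          # what the coarse clock reports
tod_dts = [b - a for a, b in zip(tod_times, tod_times[1:])]

print(tick_dts)  # steady 150 ms per iteration
print(tod_dts)   # values scatter around 150 ms because of the coarse timestamps
```

The tick-count column stays flat while the time-of-day column wobbles, even though both measure the same loop.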
02-17-2010 01:40 PM
02-17-2010 01:54 PM
jyang72211 wrote: Tick Count and Get Date/Time In Seconds do pretty much the same thing, but if you use them to calculate time differences, be aware that Tick Count will eventually roll over.
Regarding your block diagram: you have an array that calculates the elapsed time for each loop iteration, and it is always going to be 150 ms, since it is the timed loop's job to make sure that it [completes] starts each loop at the exact amount of time specified, 150 ms in your case. However, the code inside the loop can execute at any point during the 150 ms you have allocated to the iteration; i.e., you have 150 ms to run whatever is in the loop, but exactly when within that window the code runs is undetermined.
That said, Tick Count seems to give you a pretty consistent result of 150 ms. It is possible that its execution time is more consistent than that of the other function.
Trivia:
The difference between two tick counts will always be correct, even when the counter rolls over.
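That rollover behavior comes from modular arithmetic on the unsigned 32-bit counter, sketched here in Python (an analogue, not LabVIEW code):

```python
# Why tick-count differences survive a rollover: the counter is an unsigned
# 32-bit millisecond value, and subtraction modulo 2**32 still yields the true
# elapsed time as long as less than ~49.7 days (2**32 ms) passed between reads.

MOD = 2**32

def tick_diff(start, end):
    """Elapsed ms between two u32 tick-count reads, rollover-safe."""
    return (end - start) % MOD

before = MOD - 100            # read taken 100 ms before the counter wraps
after = 50                    # read taken 50 ms after the wrap
print(tick_diff(before, after))  # 150
```

A naive `after - before` would go hugely negative; the modulo brings it back to the true 150 ms.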
The Timed Loop determines when each iteration starts; the code executing in it determines when it completes. If it were the other way around, I would be slamming code that takes 5 minutes to run into a Timed Loop set to iterate once a second.
I agree with Yair that you are seeing the resolution of the OS time stamps.
Ben
02-17-2010 02:20 PM
Although it is not too likely to show up in a short run such as in your images, it is important to remember that the tick count and the time-of-day clock run off two different, unsynchronized oscillators. This means their timing accuracies will differ at the parts-per-million level, and they probably have different drift characteristics. If you wait long enough, you will always see differences between the two timing systems. If the time-of-day clock is periodically reset by a network time server, you may also see jumps when the resets occur.
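This two-clock situation has a direct Python analogue (a sketch, not LabVIEW): `time.monotonic()` behaves like Tick Count (never stepped by the system), while `time.time()` is the time-of-day clock that an NTP sync can adjust mid-measurement:

```python
import time

# Measure the same interval with both clock sources. Over a short run they
# agree; over days, oscillator drift and NTP resets show up only in the
# wall-clock measurement, so intervals should be taken from the monotonic one.

mono_start = time.monotonic()
wall_start = time.time()
time.sleep(0.15)                         # the 150 ms loop period from the thread
mono_dt = time.monotonic() - mono_start  # tick-count-style elapsed time
wall_dt = time.time() - wall_start       # time-of-day elapsed time

print(f"monotonic dt: {mono_dt:.3f} s, wall-clock dt: {wall_dt:.3f} s")
```

If a time server steps the system clock during the sleep, `wall_dt` jumps while `mono_dt` stays honest.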
Lynn