Strange results while measuring delta time on Linux

Posted by pachanga on Stack Overflow, 2010-05-21.

Folks, could you please explain why I'm getting very strange results from time to time with the following code:

#include <unistd.h>
#include <sys/time.h>
#include <time.h>
#include <stdio.h>

int main()
{
  struct timeval start, end;
  long mtime, seconds, useconds;    

  while(1)
  {
    gettimeofday(&start, NULL);

    usleep(2000);

    gettimeofday(&end, NULL);

    seconds  = end.tv_sec  - start.tv_sec;
    useconds = end.tv_usec - start.tv_usec;

    mtime = ((seconds) * 1000 + useconds/1000.0) + 0.5;  /* elapsed time in milliseconds, rounded */
    if(mtime > 10) 
      printf("WTF: %ld\n", mtime);
  }

  return 0;
}

(You can compile and run it with: gcc test.c -o out -lrt && ./out)
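For clarity, here is how the delta arithmetic behaves when tv_usec wraps around a second boundary; the timestamps below are made up purely to illustrate the rounding:

#include <stdio.h>
#include <sys/time.h>

int main()
{
  /* hypothetical timestamps: start = 5.998000 s, end = 6.012500 s */
  struct timeval start = { 5, 998000 };
  struct timeval end   = { 6,  12500 };

  long seconds  = end.tv_sec  - start.tv_sec;   /*  1       */
  long useconds = end.tv_usec - start.tv_usec;  /* -985500  */

  /* 1*1000 + (-985500)/1000.0 + 0.5 = 1000 - 985.5 + 0.5 = 15.0 -> 15 ms */
  long mtime = (seconds * 1000 + useconds/1000.0) + 0.5;

  printf("%ld\n", mtime);  /* prints 15 */
  return 0;
}

So the negative useconds case is handled correctly, and a printed value like 15 really does mean the usleep(2000) call took about 15 ms.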

What I'm experiencing are sporadic large values of the mtime variable almost every second, or even more often, e.g.:

$ gcc test.c -o out -lrt && ./out 
WTF: 14
WTF: 11
WTF: 11
WTF: 11
WTF: 14
WTF: 13
WTF: 13
WTF: 11
WTF: 16

How can this be possible? Is the OS to blame? Is it doing too much context switching? My box is idle, though (load average: 0.02, 0.02, 0.3).

Here is my Linux kernel version:

$ uname -a
Linux kurluka 2.6.31-21-generic #59-Ubuntu SMP Wed Mar 24 07:28:56 UTC 2010 i686 GNU/Linux
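
In case it is relevant, here is the same loop rewritten against the monotonic clock instead of the wall clock (just a minimal sketch; the -lrt already on the compile line above is what clock_gettime needs anyway). If the big values show up there too, wall-clock adjustments can at least be ruled out:

#include <unistd.h>
#include <time.h>
#include <stdio.h>

int main()
{
  struct timespec start, end;
  long mtime;

  while(1)
  {
    clock_gettime(CLOCK_MONOTONIC, &start);

    usleep(2000);

    clock_gettime(CLOCK_MONOTONIC, &end);

    /* convert the timespec delta to milliseconds, rounded */
    mtime = (end.tv_sec - start.tv_sec) * 1000
          + (end.tv_nsec - start.tv_nsec) / 1000000.0 + 0.5;

    if(mtime > 10)
      printf("WTF: %ld\n", mtime);
  }

  return 0;
}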
