It appears that on my system (WinXP, MinGW) the resolution of QTime::currentTime() is between 15 and 16 milliseconds.

The output of the following loop increases in steps of 15 and 16:
Qt Code:
  QTime tt = QTime::currentTime();
  for (;;) {
      std::cout << tt.msecsTo(QTime::currentTime()) << std::endl;
  }

I believe the resolution of the intervals between QObject::timerEvent() calls to be the same (i.e. ~15 ms).

I noticed this when I was using QObject::timerEvent() to draw the frames of an animation. I had implemented the animation in Processing earlier, and strangely the Processing version ran faster than the Qt version, even though the Qt version's CPU usage stayed under 10%. It turned out that the frame rates were not the same: because of the ~15 ms resolution, in Qt I could only get frame rates of 1000/(N*15) fps with N = 1, 2, 3, ...

My question is: is it possible to remedy this and get more precise frame-rate control? Can I do anything to increase the resolution of Qt's timers? More precise time measurement is clearly possible on Windows (the Java-based Processing appears to manage it), but can it be used together with Qt?