Bypass the 15ms inaccuracy

Written by Ingmar Verheij on April 21st, 2011. Posted in Performance testing

As Tim Mangan recently blogged, the system timer on a multiprocessor Windows machine ticks roughly every 15ms. The effect is that a measurement or calculation in your program can vary by up to 15ms. This is fine for most operations, but not when you need accuracy to the millisecond.

Although the best solution, as Tim Mangan wrote, is to change the system timer itself, a workaround is available.

In winmm.dll (the multimedia functions library) an API is available to request a minimum resolution for periodic timers. When this API is called with an argument of 1, the timer resolution changes from 15ms to 1ms.
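Before requesting a resolution it is worth checking what the machine actually supports. A minimal sketch using timeGetDevCaps from the same winmm.dll (Windows-only; link against winmm.lib):

```c
/* Query the range of timer resolutions supported by this machine.
   Windows-only: compile with a Windows toolchain and link winmm.lib. */
#include <windows.h>
#include <mmsystem.h>
#include <stdio.h>

int main(void)
{
    TIMECAPS tc;
    if (timeGetDevCaps(&tc, sizeof(TIMECAPS)) == TIMERR_NOERROR) {
        /* wPeriodMin is the smallest period (in ms) you may request;
           on desktop Windows this is typically 1. */
        printf("Supported resolution: %u ms to %u ms\n",
               tc.wPeriodMin, tc.wPeriodMax);
    }
    return 0;
}
```

Requesting a value below wPeriodMin would fail, so clamping the request to this minimum is the safe pattern.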

When accuracy is needed, the API “timeBeginPeriod” is called with the minimum resolution in milliseconds as its argument. The timers will now be more accurate.

Next, when the system time needs to be read, the API “timeGetTime” is called. This API will return the system time (the number of milliseconds since Windows was started).

When the accuracy is no longer needed, the request is ended with the API “timeEndPeriod”, called with the same resolution that was passed to “timeBeginPeriod”.
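The three steps above can be sketched as follows. The Sleep(50) stands in for whatever work is being measured; this is a minimal Windows-only example (link against winmm.lib), not a full program:

```c
/* Measure elapsed time with 1 ms accuracy using the winmm timer APIs.
   Windows-only: compile with a Windows toolchain and link winmm.lib. */
#include <windows.h>
#include <mmsystem.h>
#include <stdio.h>

int main(void)
{
    const UINT resolution = 1;       /* request 1 ms timer resolution */

    timeBeginPeriod(resolution);     /* step 1: raise the accuracy    */

    DWORD start = timeGetTime();     /* step 2: ms since Windows boot */
    Sleep(50);                       /* ...the work being measured... */
    DWORD elapsed = timeGetTime() - start;
    printf("Elapsed: %lu ms\n", (unsigned long)elapsed);

    timeEndPeriod(resolution);       /* step 3: release the request;  */
                                     /* must match timeBeginPeriod    */
    return 0;
}
```

Every timeBeginPeriod call must be paired with a matching timeEndPeriod call, so wrapping the measured work between the two, as above, keeps the high-resolution window as short as possible.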


Setting the timer resolution is a system-wide setting, so it affects the whole machine. This is important because it costs more CPU cycles, which consume more power, so it will drain your laptop battery a lot faster!


Ingmar Verheij

At the time Ingmar wrote this article he worked for PepperByte as a Senior Consultant (up to May 2014). His work consisted of designing, migrating and troubleshooting Microsoft and Citrix infrastructures. He was working with technologies like Microsoft RDS, user environment management and (performance) monitoring. Ingmar is User Group leader of the Dutch Citrix User Group (DuCUG). RES Software named Ingmar RES Software Valued Professional in 2014.
