# High Resolution Clock in C#

Clocks in computers have (among others) the following three properties: accuracy, precision, and resolution.

People generally agree on the difference between accuracy and precision/resolution, but opinions differ on how precision and resolution relate and which is which. So I'm going to shamelessly copy a definition I found on Stack Overflow that I agree with.

• Precision: the amount of information, i.e. the number of significant digits you report. (E.g. I’m 2m, 1.8m, 1.83m, 1.8322m tall. All those measurements are accurate, but increasingly precise.)
• Accuracy: the relation between the reported information and the truth. (E.g. “I’m 1.70m tall” is more precise than “1.8m”, but not actually accurate.)
• Resolution (or Granularity): the smallest time interval that a clock can measure. For example, if you have 1 ms resolution, there’s little point reporting the result with nanosecond precision, since the clock cannot possibly be accurate to that level of precision.

### DateTime

C# provides the `DateTime` type (MSDN) that allows you to:

• store a certain point in time
• get the current date and time (via `Now` or `UtcNow`)

First, let's take a look at precision: the `DateTime` type is basically just a 64-bit integer that counts "ticks". One tick is 100 nanoseconds (or 0.0001 milliseconds) long (MSDN), so `DateTime`'s precision can be up to 0.0001 milliseconds.
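The tick arithmetic can be checked directly from the framework's own constants; this small sketch just prints the documented tick sizes:

```csharp
using System;

// DateTime counts in ticks of 100 ns; there are 10,000 ticks per
// millisecond and 10,000,000 per second.
Console.WriteLine(TimeSpan.TicksPerMillisecond);              // 10000
Console.WriteLine(TimeSpan.TicksPerSecond);                   // 10000000

// The smallest step a DateTime can represent is therefore 0.0001 ms:
Console.WriteLine(TimeSpan.FromTicks(1).TotalMilliseconds);   // 0.0001
```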

Next, resolution. Basically, we're asking: "How long does it take for the value of `DateTime.UtcNow` to change?" Let's find out.

The following C# code measures the resolution of `DateTime.UtcNow`:

```
using System;
using System.Collections.Generic;
using System.Diagnostics;

Console.WriteLine("Running for 5 seconds...");

var distinctValues = new HashSet<DateTime>();
var sw = Stopwatch.StartNew();

while (sw.Elapsed.TotalSeconds < 5)
{
    distinctValues.Add(DateTime.UtcNow);
}

sw.Stop();

Console.WriteLine("Resolution: {0:0.000000} ms ({1} samples)",
    sw.Elapsed.TotalMilliseconds / distinctValues.Count,
    distinctValues.Count);
```

This program records all the distinct values `DateTime.UtcNow` returns over the course of 5 seconds. Dividing the elapsed time by the number of distinct values gives the average time between value changes, and that's the resolution. For example, around 310 distinct values in 5 seconds corresponds to a resolution of about 16 ms (5000 ms / 310).

According to MSDN, the resolution depends on the operating system, but in my tests the resolution also seems to depend on the hardware (unless newer OS versions have worse resolution).

| Machine | OS | Resolution |
|---------|-----|------------|
| Dev Box | Windows 7 x64 | 1 ms |
| Laptop | Windows 8 x64 | 16 ms |

### High Resolution Clock

On Windows 8 (or Windows Server 2012) and higher, there's a new API called `GetSystemTimePreciseAsFileTime()` that returns the current time with a much higher resolution.

Here’s how to use it in C#:

```
using System;
using System.Runtime.InteropServices;

public static class HighResolutionDateTime
{
    public static bool IsAvailable { get; private set; }

    [DllImport("Kernel32.dll", CallingConvention = CallingConvention.Winapi)]
    private static extern void GetSystemTimePreciseAsFileTime(out long filetime);

    public static DateTime UtcNow
    {
        get
        {
            if (!IsAvailable)
            {
                throw new InvalidOperationException(
                    "High resolution clock isn't available.");
            }

            long filetime;
            GetSystemTimePreciseAsFileTime(out filetime);

            return DateTime.FromFileTimeUtc(filetime);
        }
    }

    static HighResolutionDateTime()
    {
        try
        {
            long filetime;
            GetSystemTimePreciseAsFileTime(out filetime);
            IsAvailable = true;
        }
        catch (EntryPointNotFoundException)
        {
            // Not running Windows 8 or higher.
            IsAvailable = false;
        }
    }
}
```

Using the same test code as above but using `HighResolutionDateTime.UtcNow` as input (instead of `DateTime.UtcNow`) leads to:

| Machine | OS | Resolution |
|---------|-----|------------|
| Dev Box | Windows 7 x64 | n/a |
| Laptop | Windows 8 x64 | 0.0004 ms |

So, on my laptop the resolution improved by a factor of 40,000.

Note: The resolution can never be better/smaller than 0.0001 ms because this is the highest precision supported by `DateTime` (see above).

### Accuracy

`DateTime.UtcNow` and `HighResolutionDateTime.UtcNow` are both very accurate; they just differ in resolution, as shown above.

There's also `Stopwatch` in C#, which has a high resolution. Using `Stopwatch.ElapsedTicks` as input for the resolution-measuring code from above, I got these results:

| Machine | OS | Resolution |
|---------|-----|------------|
| Dev Box | Windows 7 x64 | 0.0004 ms |
| Laptop | Windows 8 x64 | 0.0004 ms |

However, `Stopwatch` is not very accurate. On my laptop it drifts by 0.2 ms per second, i.e. it gets less accurate over time.

Here’s how to measure the drift/accuracy loss:

```
using System;
using System.Diagnostics;
using System.Threading;

var start = HighResolutionDateTime.UtcNow;
var sw = Stopwatch.StartNew();

while (sw.Elapsed.TotalSeconds < 10)
{
    DateTime nowBasedOnStopwatch = start + sw.Elapsed;
    TimeSpan diff = HighResolutionDateTime.UtcNow - nowBasedOnStopwatch;

    Console.WriteLine("Diff: {0:0.000} ms", diff.TotalMilliseconds);

    // Print once per second.
    Thread.Sleep(1000);
}
```

This gives me an output like this:

```
Diff: 0,075 ms
Diff: 0,414 ms
Diff: 0,754 ms
Diff: 0,924 ms
Diff: 1,084 ms
Diff: 1,247 ms
Diff: 1,409 ms
Diff: 1,571 ms
Diff: 1,734 ms
Diff: 1,898 ms
```

As you can see, the difference increases over time. Thus, `Stopwatch` becomes less accurate over time.

1. heathi said:

For further reference, see the FAQ question "How can I convert QPC to 100 nanosecond ticks so I can add it to a FILETIME?" on the page [Acquiring high-resolution time stamps](http://msdn.microsoft.com/zh-cn/vstudio/dn553408%28v=vs.71%29).

The summary is: `GetSystemTimePreciseAsFileTime`'s tick period is 100 ns, while the tick sizes of `QueryPerformanceCounter` and `Stopwatch` are 1/`QueryPerformanceFrequency` and 1/`Stopwatch.Frequency` seconds, respectively. There has to be a conversion to compensate for the difference in tick periods.
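A minimal sketch of that conversion (the helper name is mine, not from the Windows API): raw `Stopwatch` ticks are scaled by the ratio of the two tick frequencies.

```csharp
using System;
using System.Diagnostics;

// Stopwatch ticks are 1/Stopwatch.Frequency seconds each; FILETIME and
// DateTime ticks are 100 ns, i.e. 10,000,000 per second. Multiplying
// before dividing keeps the integer math exact for short intervals.
static long StopwatchTicksTo100nsTicks(long stopwatchTicks) =>
    stopwatchTicks * 10_000_000 / Stopwatch.Frequency;

// One second's worth of stopwatch ticks is exactly 10,000,000 100-ns ticks:
Console.WriteLine(StopwatchTicksTo100nsTicks(Stopwatch.Frequency));  // 10000000
```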

I modified your code to convert between tick sizes, and I've also included the results from my run.

```
static void Main(string[] args)
{
    var start = HighResolutionDateTime.UtcNow;
    var sw = Stopwatch.StartNew();

    long frequency = Stopwatch.Frequency;
    Console.WriteLine(" Timer frequency in ticks per second = {0}", frequency);
    Decimal ticksPer100ns = frequency / 10000000.0m;
    Console.WriteLine(" Ticks per 100 ns = {0}", ticksPer100ns);

    while (sw.Elapsed.TotalSeconds < 10)
    {
        var swTicks = sw.ElapsedTicks;
        var highResNow = HighResolutionDateTime.UtcNow;

        long ticks = Convert.ToInt64(start.Ticks + (swTicks / ticksPer100ns));
        DateTime nowBasedOnStopwatch = new DateTime(ticks);
        TimeSpan diff = highResNow - nowBasedOnStopwatch;

        Console.WriteLine("Diff: {0:0.000} ms", diff.TotalMilliseconds);

        Thread.Sleep(1000);
    }

    Console.ReadLine();
}
```
Results:

```
Timer frequency in ticks per second = 3117917
Ticks per 100 ns = 0.3117917

Diff: 0.025 ms
Diff: 0.027 ms
Diff: 0.027 ms
Diff: 0.027 ms
Diff: 0.027 ms
Diff: 0.027 ms
Diff: 0.034 ms
Diff: 0.027 ms
Diff: 0.027 ms
Diff: 0.027 ms
```

2. heathi said:

Looking back at your original code, `Stopwatch.Elapsed` already does the conversion to 100 ns ticks, so the previously suggested modifications weren't needed.

Maybe your computer doesn't have a high resolution timer and there is some divergence between the two different ways of calculating the time, or power saving features are causing the problems. I ran the test on a desktop, and my results show that, at least on the hardware I tested, the two were really close and the error didn't increase.

• Sebastian Krysmanski (post author) replied:

Yeah, maybe the high resolution timer is better on some systems and worse on others.

What’s the resolution of the high resolution timer on your computer? (You can use the first code for this.)

3. heathi said:

Did you round your resolution values to 0.0004?

I tested the resolution code with a few different values, changing the `HashSet` type as needed:

• Tested with `DateTime.UtcNow`: 15.576336 ms
• Tested with `HighResolutionDateTime.UtcNow`: 0.000355 ms
• Tested with `Stopwatch.Elapsed`: 0.000365 ms
• Tested with `Stopwatch.ElapsedTicks`: 0.000361 ms

I also calculated the resolution for my system from `Stopwatch.Frequency`: 0.000321 ms.
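The figure calculated from `Stopwatch.Frequency` is just the duration of a single stopwatch tick expressed in milliseconds; a minimal sketch of that calculation:

```csharp
using System;
using System.Diagnostics;

// Resolution implied by the timer frequency: one stopwatch tick,
// expressed in milliseconds.
Console.WriteLine("{0:0.000000} ms", 1000.0 / Stopwatch.Frequency);

// With the frequency reported in the first comment (3,117,917 ticks/s):
Console.WriteLine("{0:0.000000} ms", 1000.0 / 3117917);  // 0.000321 ms
```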

I tried a modification to the while loop in your resolution code:

```
var lengthOfTest = new TimeSpan(0, 0, 5).Ticks;

while (sw.ElapsedTicks < lengthOfTest)
```

The results:

• Tested with `HighResolutionDateTime.UtcNow`: 0.000342 ms
• Tested with `Stopwatch.Elapsed`: 0.000347 ms
• Tested with `Stopwatch.ElapsedTicks`: 0.000346 ms

I tried another modification that only counts the different values instead of putting them in the `HashSet`. I basically got the same results for the different methods with this approach, and they are very close to the calculated resolution.

```
DateTime last = HighResolutionDateTime.UtcNow;
long count = 0;

while (sw.ElapsedTicks < lengthOfTest)
{
    if (HighResolutionDateTime.UtcNow != last)
    {
        last = HighResolutionDateTime.UtcNow;
        count++;
    }
}
```

The results:

• Tested with `HighResolutionDateTime.UtcNow`: 0.000323 ms
• Tested with `Stopwatch.Elapsed`: 0.000323 ms
• Tested with `Stopwatch.ElapsedTicks`: 0.000324 ms

4. Henry Ho said:

```
// run once to eliminate effect of P/Invoke, L1 cache
var start = HighResolutionDateTime.UtcNow;
var sw = Stopwatch.StartNew();
sw.Stop();

// real run
start = HighResolutionDateTime.UtcNow;
sw.Restart();
```

My result is 0.005 ms.

5. Jake said:

The calls to `Thread.Sleep` might be the source of the claimed `Stopwatch` inaccuracy.
Also, you write to the console, which of course takes time and skews the measurement.

You should find a way to repeat this test without those calls.

6. Jason said:

Hi.

There are a number of issues in the original code. One is the wait time after `Thread.Sleep`: the thread wakes up and only then tests the while condition. Also, the timestamp and the elapsed time should be taken as close together as possible in the loop, as below, for the moment when the loop actually passes the 10 seconds in your test (60 seconds in mine).

I have also ensured all types are loaded and ready in the example below. I run for 60 seconds.

```
TimeSpan elapsed = TimeSpan.Zero;
DateTime dt = DateTime.MinValue;
var sw = Stopwatch.StartNew();
var warmStart = HighResolutionDateTime.UtcNow;
var actualStart = HighResolutionDateTime.UtcNow;
sw.Restart();

while (sw.Elapsed.TotalSeconds < 60)
{
    elapsed = sw.Elapsed;
    dt = HighResolutionDateTime.UtcNow;
}

DateTime nowBasedOnStopwatch = actualStart + elapsed;
TimeSpan diff = dt - nowBasedOnStopwatch;
Console.WriteLine("Diff: {0:0.000} ms", diff.TotalMilliseconds);
```

7. Herr Herrmann Mann said:

Your drift calculation code is definitely off, although, it seems, I am not smart enough to figure out how exactly. One thing is clear, though: you leave out a variable in your explanation of what is happening in that snippet. `DateTime` and `HighResolutionDateTime` are not interchangeable, yet you use both, together with the `Stopwatch.Elapsed` `TimeSpan`, to calculate a new "diff" `TimeSpan`. If you calculate the "diff" `TimeSpan` using only `HighResolutionDateTime`, you will get vastly different results. So it seems to me that what is really happening is that it returns the difference between whatever times `DateTime.UtcNow` and `HighResolutionDateTime.UtcNow` return, plus the overhead of variable initialization and the `Stopwatch.Elapsed` `TimeSpan`.

I also do not get incrementing differences; the diff values all stay in a pretty narrow range: roughly 1 ms when using `DateTime`, and around 0.007 ms when using `HighResolutionDateTime`.

Nevertheless, this article gave me some nice insights – and I will dwell on them. So thanks for that.

8. Andrew Scott said:

This doesn't seem to be true anymore. I ran the code in .NET Core 3 and got the following results:

```
Running for 5 seconds...
DateTime Resolution: 0.000128 ms (39076239 samples)
Running for 5 seconds...
Stopwatch Resolution: 0.000116 ms (43265023 samples)
Running for 5 seconds...
High Resolution DateTime Resolution: 0.000149 ms (33502333 samples)
```
