# High Resolution Clock in C#

Clocks in computers have (among others) the following three properties: accuracy, precision, and resolution.

People generally agree on the difference between accuracy and precision/resolution, but there seem to be lots of opinions on the difference between precision and resolution and which is which. So I’m going to shamelessly copy a definition I found on Stack Overflow that I agree with.

• Precision: the amount of information, i.e. the number of significant digits you report. (E.g. I’m 2m, 1.8m, 1.83m, 1.8322m tall. All those measurements are accurate, but increasingly precise.)
• Accuracy: the relation between the reported information and the truth. (E.g. “I’m 1.70m tall” is more precise than “1.8m”, but not actually accurate.)
• Resolution (or Granularity): the smallest time interval that a clock can measure. For example, if you have 1 ms resolution, there’s little point reporting the result with nanosecond precision, since the clock cannot possibly be accurate to that level of precision.

This article will be mainly about resolution (and precision and accuracy to some extent).

### DateTime

C# provides the `DateTime` type (MSDN) that allows you to:

• store a certain point in time
• get the current date and time (via `Now` or `UtcNow`)

First, let’s take a look at precision: The `DateTime` type is basically just a 64 bit integer that counts “ticks”. One tick is 100 nanoseconds (or 0.0001 milliseconds) long (MSDN). So `DateTime`’s precision can be up to 0.0001 milliseconds.
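These numbers can be verified directly in code: `TimeSpan` (which uses the same tick length as `DateTime`) exposes the tick constants.

```
// A DateTime/TimeSpan tick is 100 ns, i.e. there are 10,000 ticks per millisecond.
Console.WriteLine(TimeSpan.TicksPerMillisecond); // 10000

// So the duration of one tick, expressed in milliseconds:
Console.WriteLine(1.0 / TimeSpan.TicksPerMillisecond); // 0.0001
```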

Next, resolution. Basically, we’re asking: “How long does it take for the value of `DateTime.UtcNow` to change?” Let’s find out.

The following C# code measures the resolution of `DateTime.UtcNow`:

```
Console.WriteLine("Running for 5 seconds...");

var distinctValues = new HashSet<DateTime>();
var sw = Stopwatch.StartNew();

while (sw.Elapsed.TotalSeconds < 5)
{
    distinctValues.Add(DateTime.UtcNow);
}

sw.Stop();

Console.WriteLine("Resolution: {0:0.000000} ms ({1} samples)",
    sw.Elapsed.TotalMilliseconds / distinctValues.Count,
    distinctValues.Count);
```

This program records all the different values `DateTime.UtcNow` returns over the course of 5 seconds. Dividing the elapsed time by the number of distinct values gives the average time between changes, and that’s the resolution.

According to MSDN, the resolution depends on the operating system, but in my tests I found that the resolution also seems to depend on the hardware (unless newer OS versions simply have a worse resolution).

| Machine | OS            | Resolution |
|---------|---------------|------------|
| Dev Box | Windows 7 x64 | 1 ms       |
| Laptop  | Windows 8 x64 | 16 ms      |

### High Resolution Clock

On Windows 8 (or Windows Server 2012) and higher there’s a new API, `GetSystemTimePreciseAsFileTime()`, that returns the current time with a much higher resolution.

Here’s how to use it in C#:

```
using System;
using System.Runtime.InteropServices;

public static class HighResolutionDateTime
{
    public static bool IsAvailable { get; private set; }

    [DllImport("Kernel32.dll", CallingConvention = CallingConvention.Winapi)]
    private static extern void GetSystemTimePreciseAsFileTime(out long filetime);

    public static DateTime UtcNow
    {
        get
        {
            if (!IsAvailable)
            {
                throw new InvalidOperationException(
                    "High resolution clock isn't available.");
            }

            long filetime;
            GetSystemTimePreciseAsFileTime(out filetime);

            return DateTime.FromFileTimeUtc(filetime);
        }
    }

    static HighResolutionDateTime()
    {
        try
        {
            long filetime;
            GetSystemTimePreciseAsFileTime(out filetime);
            IsAvailable = true;
        }
        catch (EntryPointNotFoundException)
        {
            // Not running Windows 8 or higher.
            IsAvailable = false;
        }
    }
}
```
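A minimal way to consume this class is to fall back to the regular clock when the precise API isn’t available. This sketch assumes the `HighResolutionDateTime` class from above is in scope:

```
// Use the precise clock where supported (Windows 8+), else fall back.
DateTime now = HighResolutionDateTime.IsAvailable
    ? HighResolutionDateTime.UtcNow
    : DateTime.UtcNow;

Console.WriteLine("Current UTC time: {0:o}", now);
```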

Running the same test code as above, but with `HighResolutionDateTime.UtcNow` as input (instead of `DateTime.UtcNow`), leads to:

| Machine | OS            | Resolution |
|---------|---------------|------------|
| Dev Box | Windows 7 x64 | n/a        |
| Laptop  | Windows 8 x64 | 0.0004 ms  |

So, on my laptop the resolution improved by a factor of 40,000 (from 16 ms down to 0.0004 ms).

Note: The resolution can never be better/smaller than 0.0001 ms because this is the highest precision supported by `DateTime` (see above).

### Accuracy

`DateTime.UtcNow` and `HighResolutionDateTime.UtcNow` are both very accurate. The first one has lower resolution, the second one has higher resolution.

C# also provides the `Stopwatch` class, which has a high resolution. Using `Stopwatch.ElapsedTicks` as input for the resolution measuring code from above, I got these results:

| Machine | OS            | Resolution |
|---------|---------------|------------|
| Dev Box | Windows 7 x64 | 0.0004 ms  |
| Laptop  | Windows 8 x64 | 0.0004 ms  |
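Note that `Stopwatch` ticks are not the same as `DateTime` ticks: their duration is machine-dependent and given by `Stopwatch.Frequency` (ticks per second). This means the theoretical resolution can also be computed directly, without the sampling loop:

```
// Stopwatch.Frequency is the number of Stopwatch ticks per second,
// so a single tick (the best possible resolution) lasts this many ms:
double msPerTick = 1000.0 / Stopwatch.Frequency;

Console.WriteLine("IsHighResolution: {0}", Stopwatch.IsHighResolution);
Console.WriteLine("Resolution: {0:0.000000} ms", msPerTick);
```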

However, `Stopwatch` is not very accurate. On my laptop it drifts by 0.2 ms per second, i.e. it gets less accurate over time.

Here’s how to measure the drift/accuracy loss:

```
var start = HighResolutionDateTime.UtcNow;
var sw = Stopwatch.StartNew();

while (sw.Elapsed.TotalSeconds < 10)
{
    DateTime nowBasedOnStopwatch = start + sw.Elapsed;
    TimeSpan diff = HighResolutionDateTime.UtcNow - nowBasedOnStopwatch;

    Console.WriteLine("Diff: {0:0.000} ms", diff.TotalMilliseconds);

    // Print once per second.
    Thread.Sleep(1000);
}
```

This gives me an output like this:

```
Diff: 0,075 ms
Diff: 0,414 ms
Diff: 0,754 ms
Diff: 0,924 ms
Diff: 1,084 ms
Diff: 1,247 ms
Diff: 1,409 ms
Diff: 1,571 ms
Diff: 1,734 ms
Diff: 1,898 ms
```

As you can see, the difference increases over time. Thus, `Stopwatch` becomes less accurate over time.

# Switching OpenID providers through delegation

Back in the day, when I decided to join StackOverflow, I had to create an OpenID – because that’s how you log in on StackOverflow.

I decided to use an independent OpenID provider, called myOpenID. I also set up OpenID delegation. This way I could use my own domain name as my OpenID. (OpenID uses URLs as user names, like `http://manski.net`.)

Now, myOpenID is shutting down on February 1, 2014. Thus, I had to switch my OpenID provider.

Fortunately, OpenID delegation makes this easy – you just replace the two delegation `<link>` tags and you’re done.
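For reference, delegation works by putting two `<link>` tags into the `<head>` of the page served at your OpenID URL. The URLs below are placeholders, not real endpoints; your provider’s documentation has the actual values:

```
<!-- OpenID 1.x delegation -->
<link rel="openid.server" href="https://openid.example.com/server" />
<link rel="openid.delegate" href="https://openid.example.com/username" />

<!-- OpenID 2.0 equivalents -->
<link rel="openid2.provider" href="https://openid.example.com/server" />
<link rel="openid2.local_id" href="https://openid.example.com/username" />
```

Switching providers then only means pointing these tags at the new provider.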

Unfortunately, not all OpenID providers seem to support this. I tried Google (which should work according to this), but StackOverflow always wanted to create a new account for me. (May also be StackOverflow’s fault, I don’t know.)

Fortunately, StackOverflow provides its own OpenID service.

So I created a new OpenID there, replaced the `<link>` tags (details), done. Works like a charm.

# Disable UAC in Windows 8

In Windows 8, Microsoft changed the UAC slider’s lowest setting from “Disable UAC” to “Hide UAC”.

So, even with the lowest setting, programs will still not run with Administrator privileges (as they did in Windows 7).

Windows’ "Run" dialog with UAC still active.

To disable UAC, execute this PowerShell script as Administrator (e.g. via `powershell` from an Admin Command Prompt):

`Set-ItemProperty -Path "HKLM:\Software\Microsoft\Windows\CurrentVersion\Policies\System" -Name "EnableLUA" -Value "0"`

After that, restart the computer and UAC is disabled.

Windows’ "Run" dialog with UAC disabled.

Notes:

• Only do this, if you’re aware of the consequences. Disabling UAC may make the system less secure.
• The Windows 8 Store can’t be used anymore if UAC is disabled. (In particular, you can no longer install Windows 8.1 this way.)
• To reenable UAC, use `-Value "1"` in the command above.
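To check the current state of the setting, you can read back the same registry value the command above writes (Windows only; `1` means UAC is enabled, `0` disabled):

```
Get-ItemProperty -Path "HKLM:\Software\Microsoft\Windows\CurrentVersion\Policies\System" -Name "EnableLUA"
```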

# LINQ to SQL – bits and pieces

In a project I’m currently working on we’re using LINQ to SQL. While most of it is straightforward, there are some quirks that are not that obvious (at least to me).

This article is mostly a FAQ but I will explain some of the not-so-obvious features in more detail.

Note: I’m not going to explain how to set up the connection to the database in this article. I’m assuming that this already works.


# P/Invoke Tutorial: Passing strings (Part 2)

In the previous tutorial we passed a single string to a native C/C++ function by using P/Invoke.

This function was defined like this:

```
// C++
void print_line(const char* str);
```

```
// C#
[DllImport("NativeLib.dll")]
private static extern void print_line(string str);
```
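For context: when no `CharSet` is specified, `DllImport` defaults to `CharSet.Ansi`, so the C# declaration above is equivalent to:

```
// C#
[DllImport("NativeLib.dll", CharSet = CharSet.Ansi)]
private static extern void print_line(string str);
```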

However, there exists a hidden pitfall here:

What happens when the user passes a non-ASCII character to this function?