Updating MAMP

Working with MAMP is quite easy. However, the process of updating MAMP to a newer version involves some uncertainties.

The following table lists things you typically want to keep when upgrading to a newer version, and whether they are migrated correctly.

What                                          Migrated?
server ports (as specified in settings)       no (tracked as #4796)
htdocs directory                              yes (moved)
htdocs location (as specified in settings)    no (tracked as #4796)
Databases (db directory)                      yes (moved)

The process was tested with MAMP 3.0.7.3.

.NET Serializers Comparison Chart

There are many object serializers in C#/.NET but their details are often not so obvious, for example:

  • Does my class need a parameterless constructor?
  • Can I serialize private fields?
  • Can I serialize readonly fields?

So, I’ve compiled a chart in this article that compares the various serializers and their capabilities.

Azure WAT150: This assembly is not in the package.

When publishing to Windows Azure, you may get the warning WAT150:

warning WAT150: The project ‘MyProject’ is dependent on the following assembly: J:\Sources\MyProject\packages\Microsoft.Data.Services.Client.5.6.0\lib\net40\Microsoft.Data.Services.Client.dll. This assembly is not in the package. To make sure that the role starts, add this assembly as a reference to the project and set the Copy Local property to true.

This warning means that one of your projects uses an assembly that’s installed locally on your computer (e.g. when using ASP.NET in a worker role) but is not installed on the Azure instance you want to run the package on. So you usually can’t ignore this warning, as the resulting package won’t run.

The usual fix is to use Nuget packages instead of globally installed assemblies. However, if this doesn’t work (like in my example above where the package clearly comes from Nuget), there’s another culprit:

The Global Assembly Cache (GAC)

Basically, if the referenced assembly is registered in the GAC with exactly the same version, then Visual Studio will always use the assembly from the GAC. Because of this, the assembly won’t be copied to the output directory and thus won’t be in the Azure package – resulting in WAT150 warnings.

This seems to be by design. Quoting from MSDN:

If you deploy an application that contains a reference to a custom component that is registered in the GAC, the component will not be deployed with the application, regardless of the CopyLocal setting. In previous versions of Visual Studio, you could set the CopyLocal property on a reference to ensure that the assembly was deployed. Now, you must manually add the assembly to the \Bin folder. This puts all custom code under scrutiny, reducing the risk of publishing custom code with which you are not familiar.

This statement is not entirely correct. Setting CopyLocal to true works – but only for the project directly referencing the assembly. Unfortunately, CopyLocal is not transitive, i.e. if project A references assembly X.dll (registered in the GAC; with CopyLocal set to true) and project B references project A, then X.dll will only appear in the output directory of project A but not in the output directory of project B. (It would appear in the output directory of project B if X.dll wasn’t in the GAC.)
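
For reference, here’s roughly what a direct reference with Copy Local enabled looks like in a role project’s .csproj. This is just a sketch: the hint path is taken from the warning above and made relative, and Private is the MSBuild name for the Copy Local setting.

<!-- Sketch: a direct reference in the role project's .csproj.
     The Private element is what Visual Studio shows as "Copy Local". -->
<Reference Include="Microsoft.Data.Services.Client">
  <HintPath>..\packages\Microsoft.Data.Services.Client.5.6.0\lib\net40\Microsoft.Data.Services.Client.dll</HintPath>
  <Private>True</Private>
</Reference>

Remember that, as described above, this only helps the project that directly contains the reference.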

How to fix

There are a couple of ways to fix this problem, but they are all workarounds:

  • Delete the assembly from the GAC: Navigate to %WINDIR%\Microsoft.NET\assembly\GAC_MSIL and delete the assembly from there. Note though that these assemblies may still be needed and some software may stop working. (See the gacutil sketch after this list.)
  • Switch to a different assembly version that’s not in the GAC.
  • Reference the necessary assemblies directly in all role projects.
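
To check whether the offending assembly is actually registered in the GAC (and, per the first option above, to remove it), you can use the gacutil tool from a Visual Studio developer command prompt. A sketch:

rem List GAC entries for the assembly (no matches means it's not in the GAC).
gacutil /l Microsoft.Data.Services.Client

rem Uninstall it from the GAC (use with care; other software may still need it).
gacutil /u Microsoft.Data.Services.Client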

High Resolution Clock in C#

Clocks in computers have (among others) the following three properties: accuracy, precision, and resolution.

People generally agree on the difference between accuracy and precision/resolution, but there seem to be lots of opinions on the difference between precision and resolution and which is which. So I’m going to shamelessly copy a definition I found on Stack Overflow that I agree with.

  • Precision: the amount of information, i.e. the number of significant digits you report. (E.g. I’m 2m, 1.8m, 1.83m, 1.8322m tall. All those measurements are accurate, but increasingly precise.)
  • Accuracy: the relation between the reported information and the truth. (E.g. “I’m 1.70m tall” is more precise than “1.8m”, but not actually accurate.)
  • Resolution (or Granularity): the smallest time interval that a clock can measure. For example, if you have 1 ms resolution, there’s little point reporting the result with nanosecond precision, since the clock cannot possibly be accurate to that level of precision.

This article will be mainly about resolution (and, to some extent, precision and accuracy).

DateTime

C# provides the DateTime type (MSDN) that allows you to:

  • store a certain point in time
  • get the current date and time (via Now or UtcNow)

First, let’s take a look at precision: The DateTime type is basically just a 64-bit integer that counts “ticks”. One tick is 100 nanoseconds (or 0.0001 milliseconds) long (MSDN). So DateTime’s precision can be up to 0.0001 milliseconds.
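
As a quick sanity check, the tick size can be confirmed with the standard TimeSpan constants:

// One tick is 100 nanoseconds, i.e. 10,000 ticks per millisecond.
Console.WriteLine(TimeSpan.TicksPerMillisecond);  // 10000
Console.WriteLine(TimeSpan.TicksPerSecond);       // 10000000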

Next, resolution. Basically, we’re asking: “How long does it take for the value of DateTime.UtcNow to change?” Let’s find out.

The following C# code measures the resolution of DateTime.UtcNow:

Console.WriteLine("Running for 5 seconds...");

var distinctValues = new HashSet<DateTime>();
var sw = Stopwatch.StartNew();

while (sw.Elapsed.TotalSeconds < 5)
{
    distinctValues.Add(DateTime.UtcNow);
}

sw.Stop();

Console.WriteLine("Precision: {0:0.000000} ms ({1} samples)",
                  sw.Elapsed.TotalMilliseconds / distinctValues.Count,
                  distinctValues.Count);

This program records all the distinct values DateTime.UtcNow returns over the course of 5 seconds. Dividing the elapsed time by the number of distinct values gives the average time between value changes, and that’s the resolution.

According to MSDN, the resolution depends on the operating system, but in my tests the resolution also seemed to depend on the hardware (unless newer OS versions simply have a worse resolution).

Machine    OS               Resolution
Dev Box    Windows 7 x64    1 ms
Laptop     Windows 8 x64    16 ms

High Resolution Clock

On Windows 8 (or Windows Server 2012) or higher there’s a new API that returns the current time with a much higher resolution:

GetSystemTimePreciseAsFileTime()

Here’s how to use it in C#:

using System;
using System.Runtime.InteropServices;

public static class HighResolutionDateTime
{
    public static bool IsAvailable { get; private set; }

    [DllImport("Kernel32.dll", CallingConvention = CallingConvention.Winapi)]
    private static extern void GetSystemTimePreciseAsFileTime(out long filetime);

    public static DateTime UtcNow
    {
        get
        {
            if (!IsAvailable)
            {
                throw new InvalidOperationException(
                    "High resolution clock isn't available.");
            }

            long filetime;
            GetSystemTimePreciseAsFileTime(out filetime);

            return DateTime.FromFileTimeUtc(filetime);
        }
    }

    static HighResolutionDateTime()
    {
        try
        {
            long filetime;
            GetSystemTimePreciseAsFileTime(out filetime);
            IsAvailable = true;
        }
        catch (EntryPointNotFoundException)
        {
            // Not running Windows 8 or higher.
            IsAvailable = false;
        }
    }
}
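
A minimal usage sketch, falling back to DateTime.UtcNow on systems where the API isn’t available:

// Prefer the high resolution clock when the OS supports it.
DateTime now = HighResolutionDateTime.IsAvailable
                   ? HighResolutionDateTime.UtcNow
                   : DateTime.UtcNow;

Console.WriteLine("{0:HH:mm:ss.fffffff}", now);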

Using the same test code as above, but with HighResolutionDateTime.UtcNow as input (instead of DateTime.UtcNow), leads to:

Machine    OS               Resolution
Dev Box    Windows 7 x64    n/a
Laptop     Windows 8 x64    0.0004 ms

So, on my laptop the resolution increased by a factor of 40000.

Note: The resolution can never be better/smaller than 0.0001 ms because this is the highest precision supported by DateTime (see above).

Accuracy

To complete this article, let’s also talk about accuracy.

DateTime.UtcNow and HighResolutionDateTime.UtcNow are both very accurate. The first one has lower resolution, the second one has higher resolution.

There’s also Stopwatch in C#, which has a high resolution. Using Stopwatch.ElapsedTicks as input for the resolution measurement code from above, I got these results:

Machine    OS               Resolution
Dev Box    Windows 7 x64    0.0004 ms
Laptop     Windows 8 x64    0.0004 ms

However, Stopwatch is not very accurate. On my laptop it drifts by 0.2 ms per second, i.e. it gets less accurate over time.

Here’s how to measure the drift/accuracy loss:

var start = HighResolutionDateTime.UtcNow;
var sw = Stopwatch.StartNew();

while (sw.Elapsed.TotalSeconds < 10)
{
    DateTime nowBasedOnStopwatch = start + sw.Elapsed;
    TimeSpan diff = HighResolutionDateTime.UtcNow - nowBasedOnStopwatch;

    Console.WriteLine("Diff: {0:0.000} ms", diff.TotalMilliseconds);

    Thread.Sleep(1000);
}

This gives me an output like this:

Diff: 0,075 ms
Diff: 0,414 ms
Diff: 0,754 ms
Diff: 0,924 ms
Diff: 1,084 ms
Diff: 1,247 ms
Diff: 1,409 ms
Diff: 1,571 ms
Diff: 1,734 ms
Diff: 1,898 ms

As you can see, the difference increases over time. Thus, Stopwatch becomes less accurate over time.