Maarten Balliauw {blog}

Web development, NuGet, Microsoft Azure, PHP, ...


And there it is - MvcSiteMapProvider v4 (beta)

It has been a while since the last major update to the MvcSiteMapProvider project, but today is the day! MvcSiteMapProvider is a tool that provides flexible menus, breadcrumb trails, and SEO features for the ASP.NET MVC framework, similar to the ASP.NET SiteMapProvider model.

To be honest, I have not done a lot of the work myself. Thanks to the power of open source (and Shad, who did a massive job refactoring the whole thing, thanks!), MvcSiteMapProvider v4 is just around the corner.

A lot of things have changed. And by a lot, I mean A LOT! The most important change is that we’ve stepped away from the ASP.NET SiteMapProvider dependency. It has been a massive pain in the behind and the source of a lot of issues. I initially planned to ditch this dependency in v3; it has now finally happened.

Other improvements have been made around dependency injection: every component in MvcSiteMapProvider can now be replaced with a custom implementation. A simple IoC container is used inside MvcSiteMapProvider, but you can easily use your preferred one instead. We’ve created several NuGet packages for popular containers: Ninject, StructureMap, Unity, Autofac and Windsor. Note that we also have packages containing only the modules, so you can keep using your own container setup.

The sitemap building pipeline has changed as well. A collection of sitemap builders is used to build the sitemap hierarchy from one or more sources. The default configuration of sitemap builders includes an XML parser builder, a reflection-based builder, and a builder that implements the visitor pattern, which is used to resolve URLs before they are cached. Both the builders and the visitors can be replaced with one or more custom implementations, opening up the door to alternate data sources and alternate visitor actions. In other words, you can build the tree any way you see fit. The only constraint is that one of the builders must decide which node is the root node of the tree (although subsequent builders may change that decision if needed).
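To give an idea of what a custom source could look like, here is a minimal, hypothetical sketch. The interface and node type below are simplified stand-ins invented for illustration only; the actual v4 builder contracts differ, so check the project documentation before implementing one.

using System;
using System.Collections.Generic;

// Simplified, made-up contracts for illustration only; the real
// MvcSiteMapProvider v4 builder interfaces are richer than this.
public interface ISimpleSiteMapSource
{
    SimpleNode BuildRoot();
}

public class SimpleNode
{
    public SimpleNode(string title, string url)
    {
        Title = title;
        Url = url;
        Children = new List<SimpleNode>();
    }

    public string Title { get; private set; }
    public string Url { get; private set; }
    public List<SimpleNode> Children { get; private set; }
}

// A builder that reads its nodes from a database (or any other source)
// instead of the Mvc.sitemap XML file.
public class DatabaseSiteMapSource : ISimpleSiteMapSource
{
    private readonly Func<IEnumerable<KeyValuePair<string, string>>> _loadPages;

    public DatabaseSiteMapSource(Func<IEnumerable<KeyValuePair<string, string>>> loadPages)
    {
        _loadPages = loadPages;
    }

    public SimpleNode BuildRoot()
    {
        // One builder in the pipeline decides on the root node...
        var root = new SimpleNode("Home", "/");

        // ...and the remaining nodes are hung underneath it.
        foreach (var page in _loadPages())
        {
            root.Children.Add(new SimpleNode(page.Key, page.Value));
        }

        return root;
    }
}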

Next to that, a series of new helpers has been added, bugs have been fixed, the security model has been made more performant, and lots more. Consider v4 almost a complete rewrite of the project!

We’ve tried to make the upgrade path as smooth as possible but there may be some breaking changes in the provider. If you currently have the ASP.NET MVC SiteMapProvider installed in your project, feel free to give the new version a try using the NuGet package of your choice (only one is needed for your ASP.NET MVC version).

Install-Package MvcSiteMapProvider.MVC2 -Pre
Install-Package MvcSiteMapProvider.MVC3 -Pre
Install-Package MvcSiteMapProvider.MVC4 -Pre

Speaking of NuGet packages: by popular demand, the core of MvcSiteMapProvider has been extracted into a separate package (MvcSiteMapProvider.MVC<version>.Core) so that you don’t have to include views and so on in your library projects.

Please give the beta a try and let us know your thoughts on GitHub (or the comments below). Pull requests currently go in the v4 branch.

Create a list of favorite ReSharper plugins

With the latest version of the ReSharper 8 EAP, JetBrains shipped an extension manager for plugins, annotations and settings. Where it previously was a hassle and a suboptimal experience to install plugins into ReSharper, it’s really easy to do now. And what is really nice is that this extension manager is built on top of NuGet! Which means we can do all sorts of tricks…

The first thing that comes to mind is creating a personal NuGet feed containing just those plugins that are of interest to me. And where better to create such a feed than MyGet? Create a new feed, navigate to the Package Sources pane and add a new package source. There’s a preset available for using the ReSharper extension gallery!

Add package source on MyGet - R# plugins

After adding the ReSharper extension gallery as a package source, we can start adding our favorite plugins, annotations and extensions to our own feed.

Add ReSharper plugins to MyGet

Of course there are some other things we can do as well:

  • “Proxy” the plugins from the ReSharper extension gallery and post your project/team/organization specific plugins, annotations and settings to your private feed. Check this post for more information.
  • Push prerelease versions of your own plugins, annotations and settings to a MyGet feed. Once stable, push them “upstream” to the ReSharper extension gallery.


Using Amazon Login (and LinkedIn and …) with Windows Azure Access Control

One of the services provided by the Windows Azure cloud computing platform is the Windows Azure Access Control Service (ACS). It is a service that provides federated authentication and rules-driven, claims-based authorization. It supports several social identity providers, such as Microsoft Account, Google Account, Yahoo! and Facebook. But what about the other social identity providers out there, for example the newly introduced Login with Amazon, or LinkedIn? As these are OAuth2 implementations, they don’t really fit into ACS.

Meet SocialSTS. It’s a service I created which does a protocol conversion and allows integrating ACS with other social identity providers. Currently it supports integrating ACS with Twitter, GitHub, LinkedIn, BitBucket, StackExchange and Amazon. Let’s see how this works. There are two steps we have to take:

  • Link SocialSTS with the social identity provider
  • Link our ACS namespace with SocialSTS

Link SocialSTS with the social identity provider

Once an account has been created with SocialSTS, we are presented with a dashboard in which we can configure the social identities. Most of them require that you register your application with them; in turn, you will receive some identifiers which allow for integration.

SocialSTS - Register social identity provider

As you can see, instructions for registering with the social identity provider are listed on the configuration page. For Amazon, we have to register an application with Amazon and configure it as instructed there.

If we do this, Amazon will give us a client ID and client secret in return, which we can enter in the SocialSTS dashboard.

Amazon Login with Access Control on Windows Azure

That’s basically all configuration there is to it. We can now add our Amazon, LinkedIn, Twitter or GitHub login page to Windows Azure Access Control Service!

Link our ACS namespace with SocialSTS

In the Windows Azure Access Control Service management dashboard, we can register SocialSTS as an identity provider. SocialSTS will provide us with a FederationMetadata.xml URL which we can copy into ACS:

Add LinkedIn to ACS

We can now save this new identity provider, add some claims transformation rules through the rule groups (important!) and then start using it in our application:

Windows Identity Foundation claims from Amazon, LinkedIn and so on

Enjoy! And let me know your thoughts on this service.

Throttling ASP.NET Web API calls

Many APIs out there, such as GitHub’s, have a concept called “rate limiting” or “throttling” in place. Rate limiting is used to prevent clients from issuing too many requests over a short amount of time to your API. For example, we can limit anonymous API clients to a maximum of 60 requests per hour, whereas we can allow more requests for authenticated clients. But how can we implement this?

Intercepting API calls to enforce throttling

Just like ASP.NET MVC, ASP.NET Web API allows us to write action filters. An action filter is an attribute that you can apply to a controller action, an entire controller and even to all controllers in a project. The attribute modifies the way in which the action is executed by intercepting calls to it. Sounds like a great approach, right?

Well… yes! Implementing throttling as an action filter would make sense, although in my opinion it has some disadvantages:

  • We have to implement it as an IAuthorizationFilter to make sure it hooks into the pipeline before most other action filters. This feels kind of dirty, but it would do the trick, as throttling is some sort of “authorization” to make a number of requests to the API (a rough sketch of this approach follows after this list).
  • It gets executed quite late in the overall ASP.NET Web API pipeline. While not a big problem, perhaps we want to skip executing certain portions of code whenever throttling occurs.
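For reference, here’s roughly what such a filter could look like. It’s a naive sketch (a static in-memory counter without any time window) built on Web API’s AuthorizationFilterAttribute, which implements IAuthorizationFilter; the message handler shown further down is the nicer approach.

using System.Collections.Concurrent;
using System.Net;
using System.Net.Http;
using System.Threading;
using System.Web.Http.Controllers;
using System.Web.Http.Filters;

// Naive sketch: throttling implemented as an authorization filter.
public class ThrottleAttribute : AuthorizationFilterAttribute
{
    private const long MaxRequestsPerHour = 60;

    // In-memory counters keyed by caller; there is no expiration or time
    // window here, which is one of the reasons this stays a sketch.
    private static readonly ConcurrentDictionary<string, long> Counters =
        new ConcurrentDictionary<string, long>();

    public override void OnAuthorization(HttpActionContext actionContext)
    {
        // Identify the caller: authenticated users by name, everyone else
        // shares a single "anonymous" bucket in this simplified example.
        var principal = Thread.CurrentPrincipal;
        var identifier = (principal != null && principal.Identity != null && principal.Identity.IsAuthenticated)
            ? principal.Identity.Name
            : "anonymous";

        var currentRequests = Counters.AddOrUpdate(identifier, 1, (key, count) => count + 1);
        if (currentRequests > MaxRequestsPerHour)
        {
            // Short-circuit the request before the action ever runs.
            actionContext.Response = actionContext.Request.CreateResponse(
                HttpStatusCode.Conflict, "You are being throttled.");
        }
    }
}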

So while it makes sense to implement throttling as an action filter, I would prefer plugging it earlier in the pipeline. Luckily for us, ASP.NET Web API also provides the concept of message handlers. They accept an HTTP request and return an HTTP response and plug into the pipeline quite early. Here’s a sample throttling message handler:

public class ThrottlingHandler
    : DelegatingHandler
{
    protected override Task<HttpResponseMessage> SendAsync(HttpRequestMessage request, CancellationToken cancellationToken)
    {
        var identifier = request.GetClientIpAddress();

        long currentRequests = 1;
        long maxRequestsPerHour = 60;

        if (HttpContext.Current.Cache[string.Format("throttling_{0}", identifier)] != null)
        {
            currentRequests = (long)System.Web.HttpContext.Current.Cache[string.Format("throttling_{0}", identifier)] + 1;
            HttpContext.Current.Cache[string.Format("throttling_{0}", identifier)] = currentRequests;
        }
        else
        {
            HttpContext.Current.Cache.Add(string.Format("throttling_{0}", identifier), currentRequests,
                null, Cache.NoAbsoluteExpiration, TimeSpan.FromHours(1),
                CacheItemPriority.Low, null);
        }

        Task<HttpResponseMessage> response = null;
        if (currentRequests > maxRequestsPerHour)
        {
            response = CreateResponse(request, HttpStatusCode.Conflict, "You are being throttled.");
        }
        else
        {
            response = base.SendAsync(request, cancellationToken);
        }

        return response;
    }

    protected Task<HttpResponseMessage> CreateResponse(HttpRequestMessage request, HttpStatusCode statusCode, string message)
    {
        var tsc = new TaskCompletionSource<HttpResponseMessage>();
        var response = request.CreateResponse(statusCode);
        response.ReasonPhrase = message;
        response.Content = new StringContent(message);
        tsc.SetResult(response);
        return tsc.Task;
    }
}

We have to register it as well, which we can do when our application starts:

config.MessageHandlers.Add(new ThrottlingHandler());

The throttling handler above isn’t ideal. It’s not very extensible nor does it allow scaling out on a web farm. And it’s bound to being hosted in ASP.NET on IIS. It’s bad! Since there’s already a great project called WebApiContrib, I decided to contribute a better throttling handler to it.

Using the WebApiContrib ThrottlingHandler

The easiest way of using the ThrottlingHandler is by registering it using simple parameters like the following, which throttles every user at 60 requests per hour:

config.MessageHandlers.Add(new ThrottlingHandler(
    new InMemoryThrottleStore(),
    id => 60,
    TimeSpan.FromHours(1)));

The IThrottleStore interface stores the identifier together with its current number of requests. There’s only an in-memory store available out of the box, but you can easily implement your own to store this data in a distributed cache or a database.

What’s interesting is we can change how our ThrottlingHandler behaves quite easily. Let’s give a specific IP address a better rate limit:

config.MessageHandlers.Add(new ThrottlingHandler(
    new InMemoryThrottleStore(),
    id =>
    {
        if (id == "")
        {
            return 5000;
        }
        return 60;
    },
    TimeSpan.FromHours(1)));

Wait… Are you telling me this is all IP based? Well yes, by default. But overriding the ThrottlingHandler allows you to do funky things! Here’s a wireframe:

public class MyThrottlingHandler : ThrottlingHandler
{
    // ...

    protected override string GetUserIdentifier(HttpRequestMessage request)
    {
        // your user id generation logic here
    }
}

By implementing the GetUserIdentifier method, we can for example return an IP address for unauthenticated users and their username for authenticated users. We can then decide on the throttling quota based on username.
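As a sketch of that idea, it could look something like the following. The base constructor call simply mirrors the registration shown earlier; Thread.CurrentPrincipal is assumed to be populated by your authentication mechanism, and GetClientIpAddress() stands in for the same IP-resolution helper used in the first handler (it is not part of this snippet).

using System;
using System.Net.Http;
using System.Threading;
// plus the namespace that WebApiContrib's ThrottlingHandler and InMemoryThrottleStore live in

public class MyThrottlingHandler : ThrottlingHandler
{
    public MyThrottlingHandler()
        : base(new InMemoryThrottleStore(),
               id => id.StartsWith("user:") ? 5000 : 60,   // authenticated users get a bigger quota
               TimeSpan.FromHours(1))
    {
    }

    protected override string GetUserIdentifier(HttpRequestMessage request)
    {
        // Authenticated callers are identified (and throttled) by user name...
        var principal = Thread.CurrentPrincipal;
        if (principal != null && principal.Identity != null && principal.Identity.IsAuthenticated)
        {
            return "user:" + principal.Identity.Name;
        }

        // ...anonymous callers fall back to their IP address.
        return "ip:" + request.GetClientIpAddress();
    }
}

The "user:" and "ip:" prefixes simply keep both kinds of identifiers from colliding in the throttle store.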

Once using it, the ThrottlingHandler will inject two HTTP headers in every response, informing the client about the rate limit:


Enjoy! And do check out WebApiContrib; it contains almost every extension to ASP.NET Web API you will ever need!

SymbolSource support for NuGet Package Source Discovery

A couple of weeks ago, I told you about NuGet Package Source Discovery. In short, it allows you to add some meta information to your website and use your website as a discovery document for NuGet feeds. And thanks to a contribution to the spec by Marcin from SymbolSource, Package Source Discovery (PSD) now supports configuring Visual Studio for consuming symbols as well. Nifty!

An example

Let’s go with an example. If we discover packages from my blog, some feeds will be added to NuGet in Visual Studio.

Install-Package DiscoverPackageSources
Discover-PackageSources -Url ""

Because my blog links to my feeds on MyGet, I can provide my MyGet credentials with it:

Install-Package DiscoverPackageSources
Discover-PackageSources -Url "" -Username maarten -Password s3cr3t

Note I’ve stripped out some of the secrets in the examples but I’m sure you get the idea.

What’s interesting is that because I provided credentials, MyGet also returned the SymbolSource URL for my feeds and it registered them automatically in Visual Studio.

Symbol server

Now that’s what I call being lazy in a professional manner!

On a side note… NuGet Feed Discovery

While not completely related to SymbolSource support, it’s worth mentioning that Package Source Discovery also got support for that other NuGet discovery protocol by the guys at Inedo, NuGet Feed Discovery (NFD). NFD and PSD differ in intent:

  • NFD is a convention-based API endpoint for listing feeds on a server
  • PSD is a means of discovering feeds from any URL given

The fun thing is: if you add an NFD URL to your website’s metadata, the feeds it lists will also be added to Visual Studio through NuGet Package Source Discovery. For reference, here’s an example where I add my local NuGet feeds to my blog for discovery:

<link rel="nuget"
      type="application/atom+xml"
      title="Local feeds"
      href="http://localhost:8888/nugetext/discover-feeds" />


Running unit tests when deploying ASP.NET to Windows Azure Web Sites

One of the well-loved features of Windows Azure Web Sites is the fact that you can simply push your ASP.NET application’s source code to the platform using Git (or TFS, or Dropbox) and that the sources are compiled and deployed to your Windows Azure Web Site. If you’ve checked the management portal, you may have noticed that a number of deployment steps are executed: the deployment process searches for the project file to compile, compiles it, copies the build artifacts to the web root and leaves your website running. But did you know you can customize this process?

[update] MSTest seems to work now as well, using the console runner from VS2012.

Customizing the build process

To understand how to customize the build process, let me first explain how it works. In the root of your repository, you can add a .deployment file containing a simple directive: the command that should be run upon deployment.

[config]
command = build.bat

This command can be a batch file, a PHP script, a bash script and so on, as long as we can tell Windows Azure Web Sites what to execute. Let’s go with a batch file.

@echo off
echo This is a custom deployment script, yay!

When pushing this to Windows Azure Web Sites, here’s what you’ll see:

Windows Azure Web Sites custom build

In this batch file, we can use some environment variables to further customize the script:

  • DEPLOYMENT_SOURCE - The initial "working directory"
  • DEPLOYMENT_TARGET - The wwwroot path (deployment destination)
  • DEPLOYMENT_TEMP - Path to a temporary directory (removed after the deployment)
  • MSBUILD_PATH - Path to msbuild

After compiling, you can simply xcopy your application to the path in the %DEPLOYMENT_TARGET% variable and have your website live.

Generating deployment scripts

Creating deployment scripts can be a tedious job; good thing the azure-cli tools are there! Once those are installed, simply invoke the following command to have both the .deployment file and a batch (or bash) file generated:

azure site deploymentscript --aspWAP "path\to\project.csproj"

For reference, here’s what is generated:

@echo off

:: ----------------------
:: KUDU Deployment Script
:: ----------------------

:: Prerequisites
:: -------------

:: Verify node.js installed
where node 2>nul >nul
IF %ERRORLEVEL% NEQ 0 (
  echo Missing node.js executable, please install node.js, if already installed make sure it can be reached from current environment.
  goto error
)

:: Setup
:: -----

setlocal enabledelayedexpansion

SET ARTIFACTS=%~dp0%artifacts

IF NOT DEFINED DEPLOYMENT_SOURCE (
  SET DEPLOYMENT_SOURCE=%~dp0%.
)

IF NOT DEFINED DEPLOYMENT_TARGET (
  SET DEPLOYMENT_TARGET=%ARTIFACTS%\wwwroot
)

IF NOT DEFINED NEXT_MANIFEST_PATH (
  SET NEXT_MANIFEST_PATH=%ARTIFACTS%\manifest

  IF NOT DEFINED PREVIOUS_MANIFEST_PATH (
    SET PREVIOUS_MANIFEST_PATH=%ARTIFACTS%\manifest
  )
)

IF NOT DEFINED KUDU_SYNC_COMMAND (
  :: Install kudu sync
  echo Installing Kudu Sync
  call npm install kudusync -g --silent
  IF !ERRORLEVEL! NEQ 0 goto error

  :: Locally just running "kuduSync" would also work
  SET KUDU_SYNC_COMMAND=node "%appdata%\npm\node_modules\kuduSync\bin\kuduSync"
)
IF NOT DEFINED DEPLOYMENT_TEMP (
  SET DEPLOYMENT_TEMP=%temp%\___deployTemp%random%
  SET CLEAN_LOCAL_DEPLOYMENT_TEMP=true
)

IF DEFINED CLEAN_LOCAL_DEPLOYMENT_TEMP (
  IF EXIST "%DEPLOYMENT_TEMP%" rd /s /q "%DEPLOYMENT_TEMP%"
  mkdir "%DEPLOYMENT_TEMP%"
)

IF NOT DEFINED MSBUILD_PATH (
  SET MSBUILD_PATH=%WINDIR%\Microsoft.NET\Framework\v4.0.30319\msbuild.exe
)

::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::
:: Deployment
:: ----------

echo Handling .NET Web Application deployment.

:: 1. Build to the temporary path
%MSBUILD_PATH% "%DEPLOYMENT_SOURCE%\path.csproj" /nologo /verbosity:m /t:pipelinePreDeployCopyAllFilesToOneFolder /p:_PackageTempDir="%DEPLOYMENT_TEMP%";AutoParameterizationWebConfigConnectionStrings=false;Configuration=Release
IF !ERRORLEVEL! NEQ 0 goto error

:: 2. KuduSync
echo Kudu Sync from "%DEPLOYMENT_TEMP%" to "%DEPLOYMENT_TARGET%"
call %KUDU_SYNC_COMMAND% -q -f "%DEPLOYMENT_TEMP%" -t "%DEPLOYMENT_TARGET%" -n "%NEXT_MANIFEST_PATH%" -p "%PREVIOUS_MANIFEST_PATH%" -i ".git;.deployment;deploy.cmd" 2>nul
IF !ERRORLEVEL! NEQ 0 goto error

::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::

goto end

:error
echo An error has occured during web site deployment.
exit /b 1

:end
echo Finished successfully.

This script does a couple of things:

  • Ensure node.js is installed on Windows Azure Web Sites (needed later on for synchronizing files)
  • Set up a bunch of environment variables
  • Run msbuild on the project file we specified
  • Use kudusync (a node.js-based tool, hence the node.js check) to synchronize modified files to the wwwroot of our site

Try it: after pushing this to Windows Azure Web Sites, you’ll see the custom script being used. Not much added value so far, but it gives us a script we can start customizing.

Unit testing before deploying

Unit tests would be nice! All you need is a couple of unit tests and a test runner. You can add the runner to your repository and store it there, or simply download it during the deployment. In my example, I’m using the Gallio test runner because it runs almost all test frameworks, but feel free to use the test runner for NUnit or xUnit instead.

Somewhere before the line that invokes msbuild and ideally in the “setup” region of the deployment script, add the following:

IF NOT DEFINED GALLIO_COMMAND (
  IF NOT EXIST "%appdata%\Gallio\bin\Gallio.Echo.exe" (
    :: Downloading unzip
    echo Downloading unzip
    curl -O
    IF !ERRORLEVEL! NEQ 0 goto error

    :: Downloading Gallio
    echo Downloading Gallio
    curl -O
    IF !ERRORLEVEL! NEQ 0 goto error

    :: Extracting Gallio
    echo Extracting Gallio
    unzip -q -n -d %appdata%\Gallio
    IF !ERRORLEVEL! NEQ 0 goto error
  )

  :: Set Gallio runner path
  SET GALLIO_COMMAND=%appdata%\Gallio\bin\Gallio.Echo.exe
)

See what happens there? We check whether the machine on which your files are stored in Windows Azure Web Sites already has a copy of the Gallio.Echo.exe test runner. If not, we first download a tool that allows us to unzip. Next, the entire Gallio test runner is downloaded and extracted. As a final step, the %GALLIO_COMMAND% variable is populated with the full path to the test runner executable.

Right before the line that calls “kudusync”, add the following:

echo Running unit tests
"%GALLIO_COMMAND%" "%DEPLOYMENT_SOURCE%\SampleApp.Tests\bin\Release\SampleApp.Tests.dll"
IF !ERRORLEVEL! NEQ 0 goto error

Yes, the name of your test assembly will be different; you should obviously change that. What happens here? Well, we’re invoking the test runner on our unit tests. If it fails, we abort the deployment. Push it to Windows Azure and see for yourself. Here’s what is displayed on success:

Windows Azure Web Site unit tests

All green! And on failure, we get:

Gallio test runner Windows Azure

In the portal, you can clearly see that deployment was aborted:

Deployment fail when unit tests fail

That’s it. Enjoy!

NuGet Package Source Discovery

It’s already been two years since NuGet was introduced. This .NET package manager features the concept of feeds, or “package sources”, on which packages containing .NET libraries and tools can be hosted. In fact, support for feeds inspired us to build MyGet. While not everyone is aware of this, Microsoft started out with two feeds as well, one of them for the Orchard CMS.

More and more feeds are being created daily, both by Microsoft as well as others. Here’s a list of feeds Microsoft has that I know of (there are probably more):

Wouldn’t it be nice if we could add them all to our Visual Studio package sources without having to know these URLs? Meet the NuGet Package Source Discovery specification, or in short: PSD, a specification Xavier, Scott, Phil, Jeff, Howard and myself have been working on (thanks, guys!).

Package Source Discovery

Because PowerShell says more than words, try the following. Open Visual Studio and open any solution. Then issue the following in the Package Manager Console:

Install-Package DiscoverPackageSources
Discover-PackageSources -Url ""

While we’re at it, perhaps the Glimpse project has something to discover as well.

Discover-PackageSources -Url ""

Close and re-open Visual Studio and check your package sources. Notice anything new? My blog has provided you with two feeds. And you’ve also been subscribed to Glimpse’s nightly builds feed.

But there’s more. If you had been authenticated when connecting to my blog, it would have yielded API keys as well. This allows the PSD client to set up everything needed to work with my personal feeds, both consuming and producing, just by remembering the URL of my blog.

Package Source Discovery boils down to trust. Since you apparently trust me, you can discover feeds from my blog. If you trust Microsoft, discover feeds from their sites. Do you trust Windows Azure? Get their packages by discovering their feeds. Need your company feeds? Discover them from your company’s website. A lot of options and possibilities there!

Recycling standards

If you are a blogger and are using Windows Live Writer, you’ve already used this before. We’ve written the NuGet Package Source Discovery specification based on what happens with blogs: when a simple <link /> element is added to your HTML, you are compatible with feed discovery. Here are the two elements that are listed in the source code for my blog:

<link rel="nuget" type="application/atom+xml" title="Maarten Balliauw NuGet feed" href="" />
<link rel="nuget" type="application/rsd+xml" href="" />

The first one points directly to a feed. Using the URL and the title attribute, we can add this one to our NuGet package sources with ease. The second one points to an RSD file, long known as the Really Simple Discovery format. We’ve recycled it to enable a lot of things on the client side. Since not all required metadata can be obtained from the RSD format, the Dublin Core schema is present in the PSD response as well.

Here’s an example:

<?xml version="1.0" encoding="utf-8"?>
<rsd version="1.0" xmlns:dc="">
  <service>
    <engineName>MyGet</engineName>
    <engineLink></engineLink>

    <dc:identifier></dc:identifier>
    <dc:creator>maartenba</dc:creator>
    <dc:owner>maartenba</dc:owner>
    <dc:title>Staging feed for GoogleAnalyticsTracker</dc:title>
    <dc:description>Staging feed for GoogleAnalyticsTracker</dc:description>
    <homePageLink></homePageLink>

    <apis>
      <api name="nuget-v2-packages" preferred="true" apiLink="" blogID="" />
      <api name="nuget-v2-push" preferred="true" apiLink="" blogID="">
        <settings>
          <setting name="apiKey">abcdefghijkl</setting>
        </settings>
      </api>
      <api name="nuget-v1-packages" preferred="false" apiLink="" blogID="" />
    </apis>
  </service>
</rsd>

As you can see, using RSD we can embed a lot more information about a feed in this document. If we wanted to add a link to someone’s GitHub account, and a client wants to use that, we could simply add another <api /> element here.

Who is using this?

I am, Xavier is, Glimpse is, NancyFX is, and MyGet has implemented several endpoints as well. Why don't you join the wonderful world of package source discovery?

Feedback needed!

This is not part of NuGet out of the box yet. We need your feedback, comments, implementations and so on. Head over to our GitHub repository, read through the spec and all examples and provide us with your thoughts. Try the two clients we’ve crafted (more on Xavier's blog) and make your NuGet repositories discoverable. Feel free to post a link to your blog below.

Enjoy and let the commenting begin!

Remote profiling Windows Azure Cloud Services with dotTrace

Here’s another cross-post from our JetBrains .NET blog. It’s focused around dotTrace but there are a lot of tips and tricks around Windows Azure Cloud Services in it as well, especially around working with the load balancer. Enjoy the read!

With dotTrace Performance, we can profile applications running on our local computer as well as on remote machines. The latter can be very useful when some performance problems only occur on the staging server (or even worse: only in production). And what if that remote server is a Windows Azure Cloud Service?

Note: in this post we’ll be exploring how to set up a Windows Azure Cloud Service (the “platform-as-a-service” side of Windows Azure) for remote profiling using dotTrace. If you are working with regular virtual machines (“infrastructure-as-a-service”), the only thing you have to do is open up any port on the load balancer, redirect it to the machine’s port 9000 (dotTrace’s default) and follow the regular remote profiling workflow.

Preparing your Windows Azure Cloud Service for remote profiling

Since we don’t have system administrators at hand when working with cloud services, we have to do some of their work ourselves. The most important piece of work is making sure the load balancer in Windows Azure lets dotTrace’s traffic through to the server instance we want to profile.

We can do this by adding an InstanceInput endpoint type in the web- or worker role’s configuration:

Windows Azure InstanceInput endpoint

By default, the Windows Azure load balancer uses a round-robin approach to routing traffic to role instances. In essence, every request gets routed to a random instance. When profiling later on, we want to target a specific machine. And that’s what the InstanceInput endpoint allows us to do: it opens up a range of ports on the load balancer and forwards traffic to a local port. In the example above, we’re opening ports 9000-9019 in the load balancer and forwarding them to port 9000 on the server. If we want to connect to a specific instance, we can use a port number from this range. Port 9000 will connect to port 9000 on role instance 0, port 9001 will connect to port 9000 on role instance 1, and so on.

When deploying, make sure to enable remote desktop for the role as well. This will allow us to connect to a specific machine and start dotTrace’s remote agent there.

Windows Azure Remote Desktop RDP

That’s it. Whenever we want to start remote profiling on a specific role instance, we can now connect to the machine directly.

Starting a remote profiling session with a specific instance

And then that moment is there: we need to profile production!

First of all, we want to open a remote desktop connection to one of our role instances. In the Windows Azure management portal, we can connect to a specific instance by selecting it and clicking the Connect button. Save the file that’s being downloaded somewhere on your system: we need to change it before connecting.

Windows Azure connect to specific role instance

The reason for saving and not immediately opening the .rdp file is that we have to copy the dotTrace Remote Agent to the machine. In order to do that we want to enable access to our local drives. Right-click the downloaded .rdp file and select Edit from the context menu. Under the Local Resources tab, check the Drives option to allow access to our local filesystem.

Windows Azure access local filesystem

Save the changes and connect to the remote machine. We can now copy the dotTrace Remote Agent to the role instance by copying all files from our local dotTrace installation. The Remote Agent can be found in C:\Program Files (x86)\JetBrains\dotTrace\v5.3\Bin\Remote, but since the machine in Windows Azure has no clue about that path we have to specify \\tsclient\C\Program Files (x86)\JetBrains\dotTrace\v5.3\Bin\Remote instead.

From the copied folder, launch the RemoteAgent.exe. A console window similar to the one below will appear:


Not there yet: we did open the load balancer in Windows Azure to allow traffic to flow to our machine, but the machine’s own firewall will be blocking our incoming connection. To solve this, configure Windows Firewall to allow access on port 9000. A one-liner which can be run in a command prompt would be the following:

netsh advfirewall firewall add rule name="Profiler" dir=in action=allow protocol=TCP localport=9000


Since we’ve opened ports 9000 through 9019 in the Windows Azure load balancer and every role instance gets its own port number from that range, we can now connect to the machine using dotTrace. We’ve connected to instance 1, which means we have to connect to port 9001 in dotTrace’s Attach to Process window. The Remote Agent URL will look like http://<yourservice>

Attach to process

Next, we can select the process we want to do performance tracing on. I’ve deployed a web application so I’ll be connecting to IIS’s w3wp.exe.

Profile application dotTrace

We can now use our application and try to reproduce the performance issues. Once we feel we have enough data, the Get Snapshot button will download all required data from the server for local inspection.

dotTrace get performance snapshot

We can now perform our performance analysis tasks and hunt for performance issues. We can analyze the snapshot data just as if we had recorded the snapshot locally. After determining the root cause and deploying a fix, we can repeat the process to collect another snapshot and verify that we have resolved the performance problem. Note that all steps in this post should be executed again in the next profiling session: Windows Azure’s Cloud Service machines are stateless and will probably discard everything we’ve done with them so far.

Analyze snapshot data

Bonus tip: get the instance being profiled out of the load balancer

Since we are profiling a production application, we may get in the way of our users by collecting profiling data. Another issue we have is that our own test data and our live user’s data will show up in the performance snapshot. And if we’re running a lot of instances, not every action we do in the application will be performed by the role instance we’ve connected to because of Windows Azure’s round-robin load balancing.

Ideally we want to temporarily remove the role instance we’re profiling from the load balancer to overcome these issues. The good news is: we can do this! The only thing we have to do is add a small piece of code in our WebRole.cs or WorkerRole.cs class.

public class WebRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        // For information on handling configuration changes
        // see the MSDN topic at

        RoleEnvironment.StatusCheck += (sender, args) =>
        {
            if (File.Exists("C:\\Config\\profiling.txt"))
            {
                args.SetBusy();
            }
        };

        return base.OnStart();
    }
}

Essentially what we’re doing here is intercepting the load balancer’s probes that check whether our node is still healthy. We can choose to tell the load balancer that our current instance is busy and should not receive any new requests. In the example code above we’re checking if the file C:\Config\profiling.txt exists. If it does, we respond to the load balancer with a busy status.

When we start profiling, we can now create the C:\Config\profiling.txt file to take the instance we’re profiling out of the server pool. After about a minute, the management portal will report the instance is “Busy”.

Role instance marked Busy

The best thing is we can still attach to the instance-specific endpoint and attach dotTrace to this instance. Just keep in mind that using the application should now happen in the remote desktop session we opened earlier, since we no longer have the current machine available from the Internet.


Once finished, we can simply remove the C:\Config\profiling.txt file and Windows Azure will add the machine back to the server pool. Don't forget this as otherwise you'll be paying for the machine without being able to serve the application from it. Reimaging the machine will also add it to the pool again.


Custom media types for ASP.NET Web API versioning

There is a raging discussion on the interwebs on whether to version API’s by using their URL or by using a custom media type. Some argue that doing it in the URL breaks REST (since a different URL is a different resource while versions don’t necessarily mean a new resource is available). While I still feel good about both approaches, I guess it depends on the domain you are working with.

But that is not the topic of this post. I recently found a sample on CodePlex providing support for routing versioned URLs to different namespaces. In short, it maps /api/v1/values to MyApp.V1.Controllers and /api/v2/values to MyApp.V2.Controllers. Great! But that only supports the URL-versioning side of the discussion. Let’s implement this sample and build ASP.NET Web API support for versioning an API using custom media types…

Custom Media Types

If you have no clue about what I am talking about, no worries. I’ll give you a quick primer using the GitHub API as an example. Since version 3 of their API, endpoints (or “resource addresses”) no longer change with every version of the API. Instead, they parse the Accept HTTP header to determine the incoming message version and the expected response version.

Getting a list of repositories from the API? The URL will always be /users/repos. However, different incoming and outgoing message formats are possible, varying by media type. Want to use the V3 message format in JSON? Use application/vnd.github.v3+json. Prefer the V3 message format in XML? Use application/vnd.github.v3+xml. Whenever they update their messages, they can add a new media type such as application/vnd.github.v4 without changing any URL. Nifty trick, aye? Let’s do this for our own API.
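To make this concrete from the consumer side, here’s a small client sketch. The public /users/octocat/repos endpoint and the User-Agent value are illustration choices of mine; the /users/repos endpoint from the text behaves the same way but requires authentication.

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class GitHubMediaTypeDemo
{
    static async Task Main()
    {
        using (var client = new HttpClient())
        {
            // GitHub requires a User-Agent header on every request.
            client.DefaultRequestHeaders.UserAgent.ParseAdd("media-type-versioning-demo");

            // Ask for the V3 message format in JSON purely through the media type;
            // the URL itself carries no version information.
            client.DefaultRequestHeaders.Accept.Add(
                new MediaTypeWithQualityHeaderValue("application/vnd.github.v3+json"));

            var response = await client.GetAsync("https://api.github.com/users/octocat/repos");
            Console.WriteLine("{0}: {1}", (int)response.StatusCode, response.Content.Headers.ContentType);
        }
    }
}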


The IHttpControllerSelector interface allows you to interfere in selecting the right controller for the current request. This is an ideal location for grabbing all contextual information and providing ASP.NET Web API with a controller based on that context.

public class AcceptHeaderControllerSelector : IHttpControllerSelector
{
    private const string ControllerKey = "controller";

    private readonly HttpConfiguration _configuration;
    private readonly Func<MediaTypeHeaderValue, string> _namespaceResolver;
    private readonly Lazy<Dictionary<string, HttpControllerDescriptor>> _controllers;
    private readonly HashSet<string> _duplicates;

    public AcceptHeaderControllerSelector(HttpConfiguration config, Func<MediaTypeHeaderValue, string> namespaceResolver)
    {
        _configuration = config;
        _namespaceResolver = namespaceResolver;
        _duplicates = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
        _controllers = new Lazy<Dictionary<string, HttpControllerDescriptor>>(InitializeControllerDictionary);
    }

    private Dictionary<string, HttpControllerDescriptor> InitializeControllerDictionary()
    {
        var dictionary = new Dictionary<string, HttpControllerDescriptor>(StringComparer.OrdinalIgnoreCase);

        // Create a lookup table where key is "namespace.controller". The value of "namespace" is the last
        // segment of the full namespace. For example:
        // MyApplication.Controllers.V1.ProductsController => "V1.Products"
        IAssembliesResolver assembliesResolver = _configuration.Services.GetAssembliesResolver();
        IHttpControllerTypeResolver controllersResolver = _configuration.Services.GetHttpControllerTypeResolver();

        ICollection<Type> controllerTypes = controllersResolver.GetControllerTypes(assembliesResolver);

        foreach (Type t in controllerTypes)
        {
            var segments = t.Namespace.Split(Type.Delimiter);

            // For the dictionary key, strip "Controller" from the end of the type name.
            // This matches the behavior of DefaultHttpControllerSelector.
            var controllerName = t.Name.Remove(t.Name.Length - DefaultHttpControllerSelector.ControllerSuffix.Length);

            var key = String.Format(CultureInfo.InvariantCulture, "{0}.{1}", segments[segments.Length - 1], controllerName);

            // Check for duplicate keys.
            if (dictionary.Keys.Contains(key))
            {
                _duplicates.Add(key);
            }
            else
            {
                dictionary[key] = new HttpControllerDescriptor(_configuration, t.Name, t);
            }
        }

        // Remove any duplicates from the dictionary, because these create ambiguous matches.
        // For example, "Foo.V1.ProductsController" and "Bar.V1.ProductsController" both map to "v1.products".
        foreach (string s in _duplicates)
        {
            dictionary.Remove(s);
        }
        return dictionary;
    }

    // Get a value from the route data, if present.
    private static T GetRouteVariable<T>(IHttpRouteData routeData, string name)
    {
        object result = null;
        if (routeData.Values.TryGetValue(name, out result))
        {
            return (T)result;
        }
        return default(T);
    }

    public HttpControllerDescriptor SelectController(HttpRequestMessage request)
    {
        IHttpRouteData routeData = request.GetRouteData();
        if (routeData == null)
        {
            throw new HttpResponseException(HttpStatusCode.NotFound);
        }

        // Get the namespace and controller variables from the route data.
        string namespaceName = null;
        foreach (var accepts in request.Headers.Accept)
        {
            namespaceName = _namespaceResolver(accepts);
            if (namespaceName != null)
            {
                break;
            }
        }
        if (namespaceName == null)
        {
            throw new HttpResponseException(HttpStatusCode.NotFound);
        }

        string controllerName = GetRouteVariable<string>(routeData, ControllerKey);
        if (controllerName == null)
        {
            throw new HttpResponseException(HttpStatusCode.NotFound);
        }

        // Find a matching controller.
        string key = String.Format(CultureInfo.InvariantCulture, "{0}.{1}", namespaceName, controllerName);

        HttpControllerDescriptor controllerDescriptor;
        if (_controllers.Value.TryGetValue(key, out controllerDescriptor))
        {
            return controllerDescriptor;
        }
        else if (_duplicates.Contains(key))
        {
            throw new HttpResponseException(
                request.CreateErrorResponse(HttpStatusCode.InternalServerError,
                    "Multiple controllers were found that match this request."));
        }
        else
        {
            throw new HttpResponseException(HttpStatusCode.NotFound);
        }
    }

    public IDictionary<string, HttpControllerDescriptor> GetControllerMapping()
    {
        return _controllers.Value;
    }
}

To be honest, I did not write much code in this. I grabbed the IHttpControllerSelector implementation from the sample on CodePlex and added just these lines to check the Accept header instead.

// Get the namespace from the Accept header.
string namespaceName = null;
foreach (var accepts in request.Headers.Accept)
{
    namespaceName = _namespaceResolver(accepts);
    if (namespaceName != null)
    {
        break;
    }
}
if (namespaceName == null)
{
    throw new HttpResponseException(HttpStatusCode.NotFound);
}

The real logic for determining which version is being called is delegated to the user of this IHttpControllerSelector. Let’s wire it up!

Wiring it up

ASP.NET Web API has a lot of “plugs”, among which there’s one where we can plug in our custom IHttpControllerSelector. Let’s replace the default one with our own:

config.Services.Replace(typeof(IHttpControllerSelector),
    new AcceptHeaderControllerSelector(config, accept =>
    {
        foreach (var parameter in accept.Parameters)
        {
            if (parameter.Name.Equals("version", StringComparison.InvariantCultureIgnoreCase))
            {
                switch (parameter.Value)
                {
                    case "1.0": return "v1";
                    case "2.0": return "v2";
                }
            }
        }

        return "v2"; // default namespace, return null to throw 404 when namespace not given
    }));

As you can see, we can pass in a lambda which gets called with the contents of the Accept header and must return the namespace obtained from it. The above example works with the version parameter of a media type, e.g. application/json;version=1.0 and application/json;version=2.0. The last statement returns “v2” as the default version when no specific version is given. Return null if you want this to result in a 404 Page Not Found instead.

Using this header scheme is recommended but of course other options are possible. It’s your lambda!
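Here is a quick client-side sketch of that scheme in action. The base address and the api/values route are assumptions for a default Web API project; with the selector registered above, the version parameter in the Accept header decides which controller namespace answers.

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class VersionParameterDemo
{
    static async Task Main()
    {
        using (var client = new HttpClient { BaseAddress = new Uri("http://localhost:12345/") })
        {
            foreach (var version in new[] { "1.0", "2.0" })
            {
                var request = new HttpRequestMessage(HttpMethod.Get, "api/values");

                // application/json;version=1.0 routes to the V1 namespace,
                // application/json;version=2.0 to the V2 namespace.
                request.Headers.Accept.Add(
                    MediaTypeWithQualityHeaderValue.Parse("application/json;version=" + version));

                var response = await client.SendAsync(request);
                Console.WriteLine("version {0} -> {1}", version, await response.Content.ReadAsStringAsync());
            }
        }
    }
}

With the two controllers shown further down in place, the first request is answered from the V1 namespace and the second from V2.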

Another approach would be going “GitHub style” and using media types like application/vnd.api.v1+json:

config.Services.Replace(typeof(IHttpControllerSelector),
    new AcceptHeaderControllerSelector(config, accept =>
    {
        var matches = Regex.Match(accept.MediaType, @"application\/vnd.api.(.*)\+.*");
        if (matches.Groups.Count >= 2)
        {
            return matches.Groups[1].Value;
        }
        return "v2"; // default namespace, return null to throw 404 when namespace not given
    }));

Note that when using the GitHub-style media type, it’s best to also configure the default media type formatters to recognize these new types. That way you can even use different media type formats for each API version.

// Add custom media types as supported to their default formatters
config.Formatters.JsonFormatter.SupportedMediaTypes.Add(new MediaTypeWithQualityHeaderValue("application/vnd.api.v1+json"));
config.Formatters.JsonFormatter.SupportedMediaTypes.Add(new MediaTypeWithQualityHeaderValue("application/vnd.api.v2+json"));

config.Formatters.XmlFormatter.SupportedMediaTypes.Add(new MediaTypeWithQualityHeaderValue("application/vnd.api.v1+xml"));
config.Formatters.XmlFormatter.SupportedMediaTypes.Add(new MediaTypeWithQualityHeaderValue("application/vnd.api.v2+xml"));

That’s basically it. We can now implement our controllers in different namespaces, like so:

namespace TestSelector.Controllers.V1
{
    public class ValuesController : ApiController
    {
        public string Get()
        {
            return "This is a V1 response.";
        }
    }
}

namespace TestSelector.Controllers.V2
{
    public class ValuesController : ApiController
    {
        public string Get()
        {
            return "This is a V2 response.";
        }
    }
}

When providing different Accept headers, we now get routed to the correct namespace depending on our custom media type. REST maturity level up!

I’ve issued a pull request to the official samples page; in the meantime, here’s the download: (238.43 kb)


[edit] There's a project on GitHub containing other implementations as well; check it out.

Taking over the @msdnbelux Twitter account

Just a quick post to let you know I’ll be taking over the @msdnbelux Twitter account for the next two weeks. This is the official Twitter account for MSDN BeLux. It’s not hacked, I did not steal the password: they gave it to me!


The best thing about this takeover is that there are no constraints: I can tweet whatever I want to tweet! So far it's been fun to do, and I've seen a lot of reactions to my tweets as well. Let me know how I do! Who knows, I might just change the password and keep this account for myself after these two weeks :-)

Follow @msdnbelux and I’ll provide you with great ASP.NET MVC, ASP.NET Web API, JavaScript and Windows Azure related content.