Maarten Balliauw {blog}

ASP.NET MVC, Microsoft Azure, PHP, web development ...


Building future .NET projects is quite pleasant

You may remember my ranty post from a couple of months back. If not, read about how building .NET projects is a world of pain and here’s how we should solve it. With Project K / ASP.NET vNext / ASP.NET 5 around the corner, I thought I had to look into it again and see if things will actually get better… So here goes!

Setting up a build agent is no longer a world of pain

There, the title says it all. For all .NET development we currently do, that world of pain is still there. No way around it: you will want to commit random murders when doing builds targeting .NET 2.0 – .NET 4.5. A billion SDK’s, all packaged in MSI’s with weird silent installs so you cannot really script their setup, will still be required. The reason is that the dependencies we have are all informal: we build against some SDK and assume it will be there. Our application does not define what it needs, so we have to provide the whole world on our build machines…

But if we forget all that and focus just on ASP.NET 5 and the new runtime, this new world is bliss. What do we need on the build agent? A few things, still.

  • An operating system (Windows, Linux and even Mac OS X will do the job)
  • PowerShell, or any other shell like Bash
  • Some form of .NET installed, for example Mono

Sounds pretty standard out-of-the-box to me. So that’s all good! What else do we need installed permanently on the machine? Nothing! That’s right: NOTHING! Builds for ASP.NET 5 are self-contained and will make sure they can run anytime, anywhere. Every project specifies its dependencies, which will all be downloaded when needed so they are available to the build. Let’s see how builds now work…

How ASP.NET 5 projects work…

As an example for this post, I will use the Entity Framework repository on GitHub, which is built against ASP.NET 5. When building a project in Visual Studio 2015, there will be .sln files that represent the solution, as well as new .kproj files that represent our project. For Visual Studio. That’s right: you can ignore these files, they are just so Visual Studio can figure out how it all fits together. “But that .kproj file is like a project file, it’s msbuild-like and I can add custom tasks in there!” – Crack! That was the sound of a whip on your fingers. Yes, you can, and the new project system actually adds some things in there to make building the project in Visual Studio work, but stay away from the .kproj files. Don’t touch them.

The real project files are these: global.json and project.json. The first one, global.json, may look like this:

{ "sources": [ "src" ] }

It defines the structure of our project, where we say that source code is in the folder named src. Multiple folders could be listed there, for example src and test, so we can distinguish where each type of project is stored. For every project we want to create, we can add a folder under the sources folder and, in there, add a project.json file. It could look like this:

{ "version": "7.0.0-*", "description": "Entity Framework is Microsoft's recommended data access technology for new applications.", "compilationOptions": { "warningsAsErrors": true }, "dependencies": { "Ix-Async": "1.2.3-beta", "Microsoft.Framework.Logging": "1.0.0-*", "Microsoft.Framework.OptionsModel": "1.0.0-*", "Remotion.Linq": "1.15.15", "System.Collections.Immutable": "1.1.32-beta" }, "code": [ "**\\*.cs", "..\\Shared\\*.cs" ], "frameworks": { "net45": { "frameworkAssemblies": { "System.Collections": { "version": "", "type": "build" }, "System.Diagnostics.Debug": { "version": "", "type": "build" }, "System.Diagnostics.Tools": { "version": "", "type": "build" }, "System.Globalization": { "version": "", "type": "build" }, "System.Linq": { "version": "", "type": "build" }, "System.Linq.Expressions": { "version": "", "type": "build" }, "System.Linq.Queryable": { "version": "", "type": "build" }, "System.ObjectModel": { "version": "", "type": "build" }, "System.Reflection": { "version": "", "type": "build" }, "System.Reflection.Extensions": { "version": "", "type": "build" }, "System.Resources.ResourceManager": { "version": "", "type": "build" }, "System.Runtime": { "version": "", "type": "build" }, "System.Runtime.Extensions": { "version": "", "type": "build" }, "System.Runtime.InteropServices": { "version": "", "type": "build" }, "System.Threading": { "version": "", "type": "build" } } }, "aspnet50": { "frameworkAssemblies": { "System.Collections": "", "System.Diagnostics.Debug": "", "System.Diagnostics.Tools": "", "System.Globalization": "", "System.Linq": "", "System.Linq.Expressions": "", "System.Linq.Queryable": "", "System.ObjectModel": "", "System.Reflection": "", "System.Reflection.Extensions": "", "System.Resources.ResourceManager": "", "System.Runtime": "", "System.Runtime.Extensions": "", "System.Runtime.InteropServices": "", "System.Threading": "" } }, "aspnetcore50": { "dependencies": { "System.Diagnostics.Contracts": "4.0.0-beta-*", "System.Linq.Queryable": "4.0.0-beta-*", "System.ObjectModel": "4.0.10-beta-*", "System.Reflection.Extensions": "4.0.0-beta-*" } } } }

Whoa! My eyes! Well, it’s not so bad. A couple of things are in here:

  • The version of our project (yes, we have to version properly, woohoo!)
  • A description (as I have been preaching a long time: every project is now a package!)
  • Where is our source code stored? In this case, all .cs files in all folders and some in a shared folder one level up.
  • Dependencies of our project. These are identifiers of other packages, which will be searched for either on NuGet or on the filesystem. Since every project is a package, there is no difference between a project and a NuGet package. During development, you can depend on a project. When released, you can depend on a package. Convenient!
  • The frameworks supported and the framework components we require.

That’s the project system. These are not all the supported elements, there are more. But generally speaking: our project now defines what it needs. One thing I like is the option to run scripts at various stages of the project’s lifecycle and build lifecycle, such as restoring npm or bower packages. A slight thorn in my eye there is that the examples out there all assume npm and bower are on the build machine. Yes, that’s a hidden dependency right there…

The good things?

  • Everything is a package
  • Everything specifies its dependencies explicitly (well, almost everything)
  • It’s human readable and machine readable

So let’s see what we would have to do if we want to automate a build of, say, the Entity Framework repository on GitHub.

Automated building of ASP.NET 5 projects

This is going to be so disappointing when you read it: to build Entity Framework, you run build.cmd (or build.sh on non-Windows operating systems). That’s it. It will compile everything into assemblies packaged in NuGet packages and run the tests. But what does this build.cmd do, exactly? Let’s dissect it! Here’s the source code that’s in there at the time of writing this blog post:

@echo off
cd %~dp0

SETLOCAL
SET CACHED_NUGET=%LocalAppData%\NuGet\NuGet.exe

IF EXIST %CACHED_NUGET% goto copynuget
echo Downloading latest version of NuGet.exe...
IF NOT EXIST %LocalAppData%\NuGet md %LocalAppData%\NuGet
@powershell -NoProfile -ExecutionPolicy unrestricted -Command "$ProgressPreference = 'SilentlyContinue'; Invoke-WebRequest 'https://www.nuget.org/nuget.exe' -OutFile '%CACHED_NUGET%'"

:copynuget
IF EXIST .nuget\nuget.exe goto restore
md .nuget
copy %CACHED_NUGET% .nuget\nuget.exe > nul

:restore
IF EXIST packages\KoreBuild goto run
.nuget\NuGet.exe install KoreBuild -ExcludeVersion -o packages -nocache -pre
.nuget\NuGet.exe install Sake -version 0.2 -o packages -ExcludeVersion

IF "%SKIP_KRE_INSTALL%"=="1" goto run
CALL packages\KoreBuild\build\kvm upgrade -runtime CLR -x86
CALL packages\KoreBuild\build\kvm install default -runtime CoreCLR -x86

:run
CALL packages\KoreBuild\build\kvm use default -runtime CLR -x86
packages\Sake\tools\Sake.exe -I packages\KoreBuild\build -f makefile.shade %*

Did I ever mention my dream was to have fully self-contained builds? This is one. Here’s what happens:

  • A NuGet.exe is required: if one is found it is reused, if not, it’s downloaded on the fly.
  • Using NuGet, two packages are installed (currently from the alpha feed the ASP.NET team has on MyGet, but I assume these will end up on NuGet.org someday):
    • KoreBuild
    • Sake
  • The KoreBuild package contains a few things (go on, use NuGet Package Explorer and see, I’ll wait)
    • A kvm.ps1, which is the bootstrapper for the ASP.NET 5 runtime that installs a specific runtime version and kpm, the package manager.
    • A bunch of .shade files
  • Using that kvm.ps1, the latest CoreCLR runtime is installed and activated
  • Sake.exe is run from the Sake package

Disappointment, I can feel it! This file does bootstrap getting the CoreCLR runtime on the build machine, but how is the actual build performed? The answer lies in the .shade files from that KoreBuild package. A lot of information is in there, but distilling it all, here’s how a build is done using Sake:

  • All bin folders underneath the current directory are removed. Consider this the old-fashioned “clean” target in msbuild.
  • The kpm restore command is run from the folder where the global.json file is. This will ensure that all dependencies for all project files are downloaded and made available on the machine the build is running on.
  • In every folder containing a project.json file, the kpm build command is run, which compiles it all and generates a NuGet package for every project.
  • In every folder containing a project.json file where a command element named test is found, the k test command is run to execute unit tests.

This is a simplified version, as it also cleans and restores npm and bower, but you get the idea. A build is pretty easy now. KoreBuild and Sake do this, but we could also just run all steps in the same order to achieve a fully working build. So that’s what I did…
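Before wiring this into a CI server, here is roughly what that distilled sequence looks like as a plain PowerShell sketch (assuming kvm has already put kpm and k on the PATH, and that the folder layout follows the src/test convention from global.json):

# Rough, manual equivalent of what KoreBuild/Sake do (simplified sketch).

# "Clean": remove all bin folders underneath the repository
Get-ChildItem -Path . -Include bin -Recurse -Directory | Remove-Item -Recurse -Force

# Restore dependencies from the folder that contains global.json
kpm restore

# Build every folder that contains a project.json (produces a NuGet package per project)
Get-ChildItem -Path . -Filter project.json -Recurse | ForEach-Object {
    kpm build $_.FullName
}

# Run unit tests for every test project that defines a "test" command
Get-ChildItem -Path .\test -Filter project.json -Recurse | ForEach-Object {
    Push-Location $_.DirectoryName
    k test
    Pop-Location
}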

Automated building of ASP.NET 5 projects with TeamCity

To see if it all was true, I decided to try and automate things using TeamCity. Entity Framework would be too easy, as that’s just calling build.cmd. Which is awesome!

I crafted a little project on GitHub that has a website, a library project and a test project. The goal I set out was automating a build of all this using TeamCity, and then making sure tests are run and reported. On a clean build agent with no .NET SDK’s installed at all. I also decided to not use the Sake approach, to see if my theory about the build process was right.

So… Installing the runtime, running a clean, build and test, right? Here goes:

@echo off
cd %teamcity.build.workingDir%

SETLOCAL

IF EXIST packages\KoreBuild goto run
%teamcity.tool.NuGet.CommandLine.DEFAULT.nupkg%\tools\nuget.exe install KoreBuild -ExcludeVersion -o packages -nocache -pre -Source https://www.myget.org/F/aspnetvnext/api/v2

:run
CALL packages\KoreBuild\build\kvm upgrade -runtime CLR -x86
CALL packages\KoreBuild\build\kvm install default -runtime CoreCLR -x86
CALL packages\KoreBuild\build\kvm use default -runtime CLR -x86

:clean
@powershell -NoProfile -ExecutionPolicy unrestricted -Command "Get-ChildItem %mr.SourceFolder% "bin" -Directory -rec -erroraction 'silentlycontinue' | Remove-Item -Recurse; exit $Lastexitcode"

:restore
@powershell -NoProfile -ExecutionPolicy unrestricted -Command "Get-ChildItem %mr.SourceFolder% global.json -rec -erroraction 'silentlycontinue' | Select-Object -Expand DirectoryName | Foreach { cmd /C cd $_ `&`& CALL kpm restore }; exit $Lastexitcode"

:buildall
@powershell -NoProfile -ExecutionPolicy unrestricted -Command "Get-ChildItem %mr.SourceFolder% project.json -rec -erroraction 'silentlycontinue' | Foreach { kpm build $_.FullName --configuration %mr.Configuration% }; exit $Lastexitcode"
@powershell -NoProfile -ExecutionPolicy unrestricted -Command "Get-ChildItem %mr.SourceFolder% *.nupkg -rec -erroraction 'silentlycontinue' | Where-Object {$_.FullName -match 'bin'} | Select-Object -Expand FullName | Foreach { Write-Host `#`#teamcity`[publishArtifacts `'$_`'`] }; exit $Lastexitcode"

:testall
@powershell -NoProfile -ExecutionPolicy unrestricted -Command "Get-ChildItem %mr.SourceFolder% project.json -rec -erroraction 'silentlycontinue' | Where-Object { $_.FullName -like '*test*' } | Select-Object -Expand DirectoryName | Foreach { cmd /C cd $_ `&`& k test -teamcity }; exit $Lastexitcode"

(note: this may not be optimal, it’s as experimental as it gets, but it does the job – feel free to rewrite this in Ant or Maven to make it cross platform on TeamCity agents, too)

TeamCity will now run the build and provide us with the artifacts generated during build (all the NuGet packages), and expose them in the UI after each build:

Project K ASP.NET vNext TeamCity

Even better: since TeamCity has a built-in NuGet server, these packages now show up on that feed as well, allowing me to consume these in other projects:

NuGet feed in TeamCity for ASP.NET vNext

Running tests held an unexpected surprise: it seems the ASP.NET 5 xUnit runner still emits TeamCity service messages and reports results back to the server:

Test results from xUnit vNext in TeamCity

But how to set the build number, you ask? Well, it turns out this comes from the project.json. The version number in there is leading, but we can add a suffix by creating a K_VERSION_NUMBER environment variable. On TeamCity, we could use our build counter for it, or run GitVersion and use that as the version suffix.

TeamCity set ASP.NET 5 version number
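If you want the suffix to be dynamic, here is a hedged sketch of a small PowerShell build step that runs before the others: it derives a suffix from TeamCity’s build counter and exposes it as the K_VERSION_NUMBER environment variable via a service message (the parameter names are TeamCity’s, the suffix format is just an example):

# %build.counter% is TeamCity's built-in build counter parameter.
$suffix = "beta-%build.counter%"

# This service message makes TeamCity set env.K_VERSION_NUMBER for all
# subsequent build steps, where kpm build picks it up as the version suffix.
Write-Host "##teamcity[setParameter name='env.K_VERSION_NUMBER' value='$suffix']"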

Going a step further, running kpm pack even allows us to build our web applications and add the entire generated artifact to our build, ready for deployment:

ASP.NET 5 application build on TeamCity
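A sketch of what such an extra build step could look like; the project path and output folder are made up, and I’m assuming kpm pack’s --out switch here:

# Package the web application into a deployable folder...
kpm pack src\MyWebApplication --out artifacts\MyWebApplication

# ...and tell TeamCity to publish that folder as a build artifact.
Write-Host "##teamcity[publishArtifacts 'artifacts\MyWebApplication']"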

Very, very nice! I’m liking where ASP.NET 5 is going, and forgetting everything that came before gives me high hopes for this incarnation.

Conclusion

This is really nice, and the way I dreamt it would all work. Everything is a package, and builds are self-contained. It’s all still in beta, but it gives a great view of what we’ll soon all be doing. I hope a lot of projects will use builds like the Entity Framework one, having one or two build.cmd files in there that do the entire thing. But even if not and you have a boilerplate VS2015 project, using the steps outlined in this blog post gets the job done. In fact, I created some TeamCity meta runners for you to enjoy (contributions welcome). How about adding one build step to your ASP.NET 5 builds in TeamCity…

TeamCity ASP.NET build by convention

Go grab these meta runners now! I have created quite a few:

  • Install KRE
  • Convention-based build
  • Clean sources
  • Restore packages
  • Build one project
  • Build all projects
  • Test one project
  • Test all projects
  • Package application

PS: Thanks Mike for helping me out with some PowerShell goodness!

Could not load file or assembly… NuGet Assembly Redirects

When working in larger projects, you will sometimes encounter errors similar to this one: “Could not load file or assembly 'Newtonsoft.Json, Version=4.5.0.0, Culture=neutral, PublicKeyToken=30ad4fe6b2a6aeed' or one of its dependencies. The system cannot find the file specified.” Or how about this one? “System.IO.FileLoadException : Could not load file or assembly 'Moq, Version=3.1.416.3, Culture=neutral, PublicKeyToken=69f491c39445e920' or one of its dependencies. The located assembly's manifest definition does not match the assembly reference. (Exception from HRESULT: 0x80131040)”

Search all you want, most things you find on the Internet are from the pre-NuGet era and don’t really help. What now? In this post, let’s go over why this error sometimes happens. And I’ll end with a beautiful little trick that fixes this issue when you encounter it. Let’s go!

Redirecting Assembly Versions

In 90% of the cases, the errors mentioned earlier are caused by faulty assembly redirects. What are those, you ask? A long answer is on MSDN, a short answer is that assembly redirects let us trick .NET into believing that assembly A is actually assembly B. Or in other words, we can tell .NET to work with Newtonsoft.Json 6.0.0.4 whenever any other reference requires an older version of Newtonsoft.Json.

Assembly redirects are often created by NuGet, to solve versioning issues. Here’s an example which I took from an application’s Web.config:

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <!-- ... -->
  <runtime>
    <legacyHMACWarning enabled="0" />
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <dependentAssembly>
        <assemblyIdentity name="Newtonsoft.Json" publicKeyToken="30ad4fe6b2a6aeed" culture="neutral" />
        <bindingRedirect oldVersion="0.0.0.0-6.0.0.0" newVersion="6.0.0.0" />
      </dependentAssembly>
    </assemblyBinding>
  </runtime>
</configuration>

When running an application with this config file, it will trick any assembly that wants to use any version < 6.0.0.0 of Newtonsoft.Json into working with the latest 6.0.0.0 version. Neat, as it solves dependency hell where two assemblies require a different version of a common assembly dependency. But… does it solve that?

NuGet and Assembly Redirects

The cool thing about NuGet is that it auto-detects whenever assembly redirects are needed and adds them to the Web.config or App.config file of your project. However, this does not always work well. Sometimes, old binding redirects are not removed. Sometimes, none are added at all. Resulting in fine errors like the ones I opened this post with. At compile time. Or worse! When running the application.

One way of solving this is manually checking all binding redirects in all configuration files you have in your project, checking assembly versions and so on. But here comes the trick: we can let NuGet do this for us!

All we have to do is this:

  1. From any .config file, remove the <assemblyBinding> element and its child elements. In other words: strip your app from assembly binding redirects.
  2. Open the Package Manager Console in Visual Studio. This can be done from the View | Other Windows | Package Manager Console menu.
  3. Type this one, magical command that solves it all: Get-Project -All | Add-BindingRedirect. I repeat: Get-Project -All | Add-BindingRedirect

NuGet Add Binding Redirect

NuGet will get all projects and for every project, add the correct assembly binding redirects again. Compile, run, and continue your day without rage. Enjoy!

PS: For the other cases where this trick does not help, check Damir Dobric’s post on troubleshooting NuGet references.

Automatically strong name signing NuGet packages

Some developers prefer to strong name sign their assemblies. Signing them also means that the dependencies being consumed must be signed. Not all third-party dependencies are signed, though, for example when consuming packages from NuGet. Some are signed, some are unsigned, and the only way to know is at compile time, when we see this:

Referenced assembly does not have a strong name

That’s right: a signed assembly cannot consume an unsigned one. Now what if we really need that dependency but don’t want to lose the fact that we can easily update it using NuGet… Turns out there is a NuGet package for that!

The Assembly Strong Naming Toolkit can be installed into our project, after which we can use the NuGet Package Manager Console to sign referenced assemblies. There is also the .NET Assembly Strong-Name Signer by Werner van Deventer, which provides us with a nice UI as well.

The problem is that the above tools only sign the assemblies once we already consumed the NuGet package. With package restore enabled, that’s pretty annoying as the assemblies will be restored when we don’t have them on our system, thus restoring unsigned assemblies…

NuGet Signature

Based on the .NET Assembly Strong-Name Signer, I decided to create a small utility that can sign all assemblies in a NuGet package and creates a new package out of those. This “signed package” can then be used instead of the original, making sure we can simply consume the package in Visual Studio and be done with it. Here’s some sample code which signs the package “MyPackage” and creates “MyPackage.Signed”:

var signer = new PackageSigner();
signer.SignPackage(
    "MyPackage.1.0.0.nupkg",
    "MyPackage.Signed.1.0.0.nupkg",
    "SampleKey.pfx",
    "password",
    "MyPackage.Signed");

This is pretty neat, if I may say so, but still requires manual intervention for the packages we consume. Unless the NuGet feed we’re consuming could sign the assemblies in the packages for us?

NuGet Signature meets MyGet Webhooks

Earlier this week, MyGet announced their webhooks feature. After enabling the feature on our feed, we could pipe events, such as “package added”, into software of our own and perform an action based on this event. Such as, signing a package.

MyGet automatically sign NuGet package

Sweet! From the GitHub repository here, download the Web API project and deploy it to Microsoft Azure Websites. I wrote the Web API project with some configuration options, which we can either specify before deploying or through the management dashboard. The application needs these:

  • Signature:KeyFile - path to the PFX file to use when signing (defaults to a sample key file)
  • Signature:KeyFilePassword - private key/password for using the PFX file
  • Signature:PackageIdSuffix - suffix for signed package id's. Can be empty or something like ".Signed"
  • Signature:NuGetFeedUrl - NuGet feed to push signed packages to
  • Signature:NuGetFeedApiKey - API key for pushing packages to the above feed

The configuration in the Microsoft Azure Management Dashboard could look like this:

Azure Websites
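If you prefer scripting over clicking, something along these lines should work with the (classic) Azure PowerShell module; the site name and values are placeholders, not the real ones:

# Hypothetical site name and placeholder values; adjust to your own deployment.
Set-AzureWebsite -Name "nuget-signer" -AppSettings @{
    "Signature:KeyFile"         = "SampleKey.pfx";
    "Signature:KeyFilePassword" = "password";
    "Signature:PackageIdSuffix" = ".Signed";
    "Signature:NuGetFeedUrl"    = "https://www.myget.org/F/mysignedfeed/api/v2/package";
    "Signature:NuGetFeedApiKey" = "00000000-0000-0000-0000-000000000000"
}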

Once that runs, we can configure the webhook on the MyGet side. Be sure to add an HTTP POST hook that posts to <url to your deployed app>/api/sign, and only for the package added event.

MyGet Webhook configuration

From now on, all packages that are added to our feed will be signed when the webhook is triggered. Here’s an example where I pushed several packages to the feed and the NuGet Signature application signed the packages themselves.

NuGet list signed packages

The nice thing is that in Visual Studio we can now consume the “.Signed” packages and no longer have to worry about strong name signing.

Thanks to Werner for the .NET Assembly Strong-Name Signer I was able to base this on.

Enjoy!

Optimizing calls to Azure storage using Fiddler

Last week, Xavier and I were really happy about achieving a milestone. After having spent quite some evenings on bringing Visual Studio Online integration to MyGet, we were happy to be mentioned in the TechEd keynote and even pop up in quite some sessions. We also learned ASP.NET vNext was coming and that it would leverage NuGet as an important part of it. What we did not know, however, was that the ASP.NET team would host all vNext preview packages on MyGet. But we soon noticed, and found our evening hours were going to be very focused for another few days…

On May 12th, we all of a sudden saw usage of our service double in an instant. Ouch! Here’s what Google Analytics told us:

image

Luckily for us, we are hosted on Azure and could just pull the slider to the right and add more servers. Scale out for the win! Apart from some hiccups when we enabled auto scaling (we thought traffic would go down at some points during the day), MyGet handled the traffic pretty well. But still, we had to double our server capacity just to be able to host one high-traffic NuGet feed. And even though we doubled server capacity, response times went up as well.

image

Time for action! But what…

Some background on our application

When we started MyGet, our idea was to leverage table storage and blob storage, and avoid SQL completely. The reason for that is that back then MyGet was a simple proof-of-concept and we wanted to play with new technology. Growing, getting traction and onboarding users, we found that what we had in place to work with this back-end was very nice to work with, and we’ve since evolved to a more CQRS-ish and event-driven(-ish) architecture.

But with all good things come some bad things as well. Adding features, improving code, and implementing quotas so we could actually meter what our users were doing and put a price tag on it had added quite some calls to table storage. And while table storage is blazingly fast if you know what you are doing, these are still calls to an external system that have to open up a TCP connection, do an SSL handshake and so on. Not that many milliseconds each, but summed together they do add up. So how do you find out what is happening? Let’s see…

Analyzing Azure storage traffic

There is no profiler out there that currently allows you to easily hook into the traffic going over the wire to Azure storage. Fortunately for us, the Azure team added a way of hooking a proxy server between your application and storage itself. Using the development storage emulator, we can simply change our storage connection string to the following and hook Fiddler in:

UseDevelopmentStorage=true;DevelopmentStorageProxyUri=http://ipv4.fiddler

Great! Now we have Fiddler to analyze our traffic to Azure storage. All requests going to blob, table or queue storage services are now visible: URL, result, timing and so forth.

image

The picture above is not coming from MyGet, it’s just to illustrate the idea. You can clear the list of requests, load a specific page or action in your application and see the calls going out to storage. We had some critical paths where we did over 7 requests. If each of them takes 30ms on average, that is 210ms just to grab some data. And then you’ve not even done anything with it… So we decided to tackle that in our code.

Another thing we noticed by looking at the URLs here is that some of those requests filtered only on the table storage RowKey, resulting in a roundtrip of roughly 2 seconds on some requests. That is bad. And so we also fixed that (on some occasions by adding some caching, on others by restructuring the way data is stored and moving our filter to the PartitionKey, or a combination of PartitionKey and RowKey, as you should).

The result of this? Well, have a look at the picture above where our response times are shown: we managed to drastically reduce response times, making ourselves happy (we could kill some VMs), and our users as well, because everything is faster.

A simple trick with great results!

Speeding up ASP.NET vNext package restore

TL;DR: If you have multiple NuGet feeds configured on your machine, it may be worth doing some tweaking in the NuGet.config file shipping with your project.

Last week, the ASP.NET team released a preview of “ASP.NET vNext”, a first step in the right direction towards solving the pain that building .NET projects is, but more than that, a great step towards an open and cross-platform ASP.NET that is super developer friendly. If you haven’t checked it out yet, do so now.

One of the things ASP.NET vNext leans on heavily is NuGet. In fact, every application comes with a project.json file that describes the application’s dependencies. Only when running kpm restore are these dependencies downloaded so the application can be run. Running this package restore (it’s NuGet after all) is usually pretty fast, but if you, like me, are a heavy NuGet user, chances are the restore is not happening in the most optimal way. Have a look at the output of my kpm restore command right after I installed ASP.NET vNext on my system:

Project K package restore

It’s not easy to capture a screenshot that proves the point I'm about to make, but if you try this yourself and have multiple NuGet feeds configured on your system, you’ll see that ASP.NET vNext tries to restore packages from all configured feeds. In my case, I’m using a personal feed on MyGet, a feed hosted on my TeamCity server, a feed on my local filesystem (testing purposes), and then the ASP.NET vNext MyGet feed as well as NuGet.org. That’s five feeds being checked over and over again for the dependencies listed in my project.json… Let’s see if we can reduce this a bit.

If we look at the samples shipped with ASP.NET vNext, we can find a NuGet.config file in there. And as we know, NuGet has this thing called configuration file inheritance. This means that the feeds defined in here will be enriched with the feeds configured at the machine level, in my case the five mentioned above. But that also means we can easily fix this: adding a <clear /> element under the <packageSources> element will do the trick of removing all previously defined feeds and using just the ones defined for the project I’m working on:

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <clear />
    <add key="AspNetVNext" value="https://www.myget.org/F/aspnetvnext/" />
    <add key="NuGet.org" value="https://nuget.org/api/v2/" />
  </packageSources>
  <!-- ... -->
</configuration>

Use this trick for your own ASP.NET vNext projects as well: specify the feeds you want to use explicitly and everything will be faster for you and other developers working with your code. It ensures that kpm (or NuGet, for that matter) only checks the feeds that are relevant to your project and not every feed that is configured on your system.
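A quick way to verify which feeds are effective after adding the <clear /> element is to ask NuGet itself — a small sketch, assuming nuget.exe is on the PATH and you run it from the folder containing the project’s NuGet.config:

# Lists the package sources NuGet resolves for this folder, after applying
# configuration file inheritance (project NuGet.config first, then its parents).
nuget sources list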

Building .NET projects is a world of pain and here’s how we should solve it

During the past few weeks, I’ve been working on and off on setting up a build agent that can build as many open-source .NET projects as possible in an effort to learn how hard it is to do. Allow me to open this blog post with a rant… One which will feel very familiar if you’ve recently installed a build agent yourself.

Setting up a .NET build machine is insane

As a minimal installation of components, I started by installing the .NET framework 2.0, 3.0, 3.5, 4.0, 4.0.1 (yes, that exists), 4.0.2, 4.0.3, 4.5, 4.5.1 and their multi-targeting packs on the build agent. Next, I took 100 random C# projects from GitHub that had activity in the last year or so and started building and reading build logs. Great news! There are a lot of self-contained open source projects out there that build happily on this minimal install. Most of these seem to be class libraries, often depending on some NuGet packages that are installed using NuGet package restore.

Unfortunately, there are a great number of projects that do not build on this minimal setup: those that require specific SDK’s and components installed. So I started delving deeper into build logs and tackled project by project with the necessary “headless installs” of SDK’s. In practice, this sometimes means running an installer with specific commands to only install what is required to build projects on it. In other cases it means copying .targets and reference assemblies from my Windows 8.1 machine to the Windows Server 2012 R2 machine that was my build agent (yes, you can build Windows Store apps on Windows Server if you are persistent…). And in other cases (looking at you, Windows Phone SDK!) it meant running the installer in compatibility mode with some registry keys changed to overcome installer checks that do not allow installing that SDK on Windows Server.

In the end, I had to install pretty much the entire world on the build agent, or at least all SDK’s and tools that have been released between Visual Studio 2010 and the latest 2013. Here’s 17.6 GB of sh… dependencies for you.

Installed programs and features

What is the issue?

Well, there isn’t just “one” issue. There are several. Here’s a quick list of issues and questions:

  • There is no way to clearly specify dependencies on SDK’s and tooling in .NET projects. The only way to know what is required is to build, read the build log, build, read the build log and someday succeed in finding the right SDK for the job. These dependencies are all implicit and there is no good way of finding out what they are, except trial and error.
  • The fact that I need this amount of SDK’s installed is crazy in itself. Why is this? Most builds simply need a .targets file and some DLLs, not all the other stuff that comes with the download of such an SDK.
  • Some SDK’s don’t install on every platform. Why is that? Why can’t SDK X install on platform Y?
  • Will I be able to install future versions of the SDK side-by-side so “older” projects build on my machine? Or will I need a machine for every Visual Studio version separately? How to isolate these things?

This is not only about Microsoft tooling and SDK’s. Various other SDK’s also require installs, prerequisites, configuration, … If only that picture above allowed scrolling, you would see Amazon, Xamarin and many others in that list.

How should we solve this?

Let’s look at the Node.js community and how they manage to do things. Every project, whether an actual application, a library or a component, contains an important file: package.json. It contains a description of the project itself, as well as the dependencies it requires, both id and version. All you need to build or run most of such projects is a node executable and an Internet connection to download dependencies on the fly. Sounds familiar? It does!

We’ve been using NuGet for quite a while now in the .NET space (if you haven’t, look into it now, even for in-house frameworks hosted on private feeds!). We’re distributing open source projects as NuGet packages that we can depend on in our own software. We can publish our own software as a NuGet package so others can depend on it. Awesome! Then why aren’t we doing this with the 17.6 GB of SDK madness we have to install on a build machine?

I do not think we can solve this quickly and change history. But I do think that from now on we have to start building SDK’s differently. Most projects only require an MSBuild .targets file and some assemblies, either containing MSBuild tasks or reference assemblies, to do their compilation work. What if… we shipped the minimum files required to successfully build a project as NuGet packages? The NuGet gallery contains some examples of this, but there are only a few. Another example is the ReSharper SDK, which is shipped as a NuGet package. Need a test runner? Wrap the executable in a NuGet package and I’ll bring it down and run it during the build. My takeaway: if you have a .targets file and are wrapping it in an MSI, you are doing it wrong.
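To make that concrete, here is a hedged sketch of what shipping a .targets file as a NuGet package could look like, relying on the build folder convention NuGet introduced in 2.5; the package id, file names and layout are made up for illustration:

# Expected folder layout before packing (NuGet automatically imports
# build\<PackageId>.targets into projects that install the package):
#
#   MyCompany.BuildTools.nuspec
#   build\MyCompany.BuildTools.targets
#   build\MyCompany.BuildTools.Tasks.dll
#
# Pack it up and the SDK bits travel with the project instead of an MSI:
nuget pack MyCompany.BuildTools.nuspec -Version 1.0.0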

Does that mean MSI's should disappear? No! They can exist and add tooling or whatever they need to add to a developer machine. All I want is for the .targets file and supporting assemblies to be distributed separately as a self-contained package which I can reference explicitly, rather than the implicit way it is done now.

In my ideal world, all .NET projects would have a packages.config file in their root folder in which library dependencies as well as MSBuild dependencies can be described. My build machine would contain the .NET framework and Mono. And during build, all dependencies would be magically brought down for just that build.

P.S.: A lot of the new packages, like ASP.NET MVC and WebApi, the OData packages and such, are being shipped as NuGet packages, which is awesome. The ones that I am missing are those that require additional build targets that are typically shipped in SDK's. Examples are the Windows Azure SDK, database tools and targets, ... I would like those to come aboard the NuGet train and ship their Visual Studio tooling separately from the artifacts required to run a build.

NuGet Configuration File inheritance is awesome

One way to remove friction from using NuGet in multiple projects is by making use of NuGet Configuration File inheritance, probably the awesomest unknown feature in there.

By default, all NuGet clients (the command-line tool, the Visual Studio extension and the Package Manager Console) make use of the default NuGet configuration file, which lives under %AppData%\NuGet\NuGet.config. But NuGet can make use of other configuration files as well! In fact, NuGet can walk an entire tree of configuration files and fetch settings from those.

Which configuration file will be used?

Good question, and happy you asked! The standard answer I always give to any question is: it depends. In this case, it depends on the client you are using. But ignoring that fact, here’s a generalized version of the tree that is walked for building the configuration the client will work with.

  • The current directory and all its parents
  • The user-specific config file located under %AppData%\NuGet\NuGet.config
  • IDE-specific configuration files, for example:
        %ProgramData%\NuGet\Config\{IDE}\{Version}\{SKU}\*.config (e.g. %ProgramData%\NuGet\Config\VisualStudio\12.0\Pro\NuGet.config)
        %ProgramData%\NuGet\Config\{IDE}\{Version}\*.config
        %ProgramData%\NuGet\Config\{IDE}\*.config
        %ProgramData%\NuGet\Config\*.config
  • The machine-wide config file located under %ProgramData%\NuGet\NuGetDefaults.config (which, as a sysadmin, is a good one to put default configuration options in using an Active Directory Group Policy, just saying)

Full details can be found in the NuGet docs; just keep in mind that first item of the list: all clients start with a NuGet.config in the current directory and then walk up to the drive root, and only then are the standard files checked. Wow. Just WOW! This means every parent folder of a project or solution can contain additional configuration details that will be applied (note: the file that is consulted first wins).

So in short, if I have a solution file C:\Projects\CustomerA\AwesomeSolution\AwesomeSolution.sln, all NuGet clients will load configuration values from:

  • C:\Projects\CustomerA\AwesomeSolution\NuGet.config
  • C:\Projects\CustomerA\NuGet.config
  • C:\Projects\NuGet.config
  • C:\NuGet.config
  • All the other locations mentioned above
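To make that walk a bit more tangible, here is a small PowerShell sketch that lists the NuGet.config candidates for a given solution folder. It only mimics the lookup order for illustration; it is not how the NuGet clients are actually implemented:

$folder = "C:\Projects\CustomerA\AwesomeSolution"

# Walk from the solution folder up to the drive root...
while ($folder) {
    $candidate = Join-Path $folder "NuGet.config"
    if (Test-Path $candidate) { Write-Host $candidate }
    $folder = Split-Path $folder -Parent
}

# ...and only then is the user-specific configuration file consulted.
Write-Host (Join-Path $env:AppData "NuGet\NuGet.config")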

This gives some pretty interesting scenarios! Let’s cover a few. But again, check the NuGet docs for more information on possible entries in a NuGet.config.

Example 1: a project-specific configuration

So you are using a private feed? That’s a good thing! (I do hope it’s on MyGet ;-)) It’s the default for your current project? Even better! But why should all your developers have to add this feed to their NuGet configuration if a NuGet.config can be shipped in source control? Simply putting the following file right next to your .sln file will do the job:

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <add key="Chuck Norris Feed" value="https://www.myget.org/F/chucknorris" />
  </packageSources>
</configuration>

Want to block access to NuGet.org and simply use the private feed all the time? Here’s some more:

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <add key="Chuck Norris Feed" value="https://www.myget.org/F/chucknorris" />
  </packageSources>
  <disabledPackageSources>
    <add key="nuget.org" value="true" />
  </disabledPackageSources>
  <activePackageSource>
    <add key="Chuck Norris Feed" value="https://www.myget.org/F/chucknorris" />
  </activePackageSource>
</configuration>

Example 2: help, my devs are pushing our internal framework to NuGet.org!

Good one, good one. We don’t want that to happen. Probably they forgot the -Source parameter to NuGet.exe, but still: accidental pushes are not fun! Place this one next to the .sln file and you should be good:

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <config>
    <add key="DefaultPushSource" value="https://www.myget.org/F/chucknorris/api/v2/package" />
  </config>
</configuration>
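With that file in place next to the solution, a plain push without a -Source argument should end up on the private feed — a quick sketch with a made-up package name:

# No -Source specified: NuGet uses the DefaultPushSource from the closest NuGet.config.
nuget push OurCoolFramework.1.0.0.nupkg -ApiKey 00000000-0000-0000-0000-000000000000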

Feel free to combine it with example 1, it may make sense!

Example 3: NuGet.exe always asks me for proxy credentials

That is not funny. Proxies are like printers: the idea is great, but when you need them things don’t always go well. The good thing is that we can configure default proxy credentials. While it is possible to put this one in a project, it’s probably better to do this in the default %AppData%\NuGet\NuGet.config:

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <config>
    <add key="http_proxy" value="host" />
    <add key="http_proxy.user" value="username" />
    <add key="http_proxy.password" value="encrypted_password" />
  </config>
</configuration>
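Instead of editing the XML by hand, you can also let NuGet.exe write these values for you — a sketch, assuming a reasonably recent NuGet.exe on the PATH (the proxy host and credentials are placeholders, and the password should then end up stored in encrypted form):

nuget config -Set http_proxy=http://proxy.corp.local:8080
nuget config -Set http_proxy.user=username
nuget config -Set http_proxy.password=secret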

Example 4: feed inheritance and package restore

We have multiple customers, each with a specific feed they can use. Awesome! Every customer project can contain the following NuGet.config:

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <add key="Customer X" value="https://www.myget.org/F/customerx" />
  </packageSources>
</configuration>

In the C:\Projects folder, we can add another configuration file which adds in another feed for every project located under C:\Projects. All customer projects typically use both of these feeds: customer-specific components as well as that framework built in-house, each on their own feed. But help! All of a sudden, package restore starts complaining that no package named X can be found!

The reason for that is probably that the active package source is set to one specific feed and not the “aggregate” of all configured feeds. Here’s a solution to that which can go in C:\Projects\NuGet.config:

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <add key="Our Cool Framework" value="https://www.myget.org/F/ourcoolframework" />
  </packageSources>
  <activePackageSource>
    <add key="All" value="(Aggregate source)" />
  </activePackageSource>
</configuration>

All sorts of fancy combinations are possible; the only thing you have to do is find an approach that works for you.

Enjoy!

Source Control considered harmful

TL;DR: Using source control is a really bad idea. Or is it? Skip to Conclusion for the meat of this post.

One of the first things I do with a new project in Visual Studio is not add it to source control. There are many reasons, but it all boils down to this: Source Control introduces more problems than it solves.

Before I dive into this, I'll share the solution with you. Put your sources on a USB drive. Yes, it's that simple.

Implications

If you're like most other people, you don't like that solution, because it feels inefficient:

  • USB drives can get lost
  • USB drives can end up in the dishwasher
  • I have to buy a USB drive for every developer on the team
  • Sharing sources with distributed teams is more difficult: USB drives have to be shipped by snail mail

All of that is true, but then again...

  • You can always make a copy of a USB drive to safeguard against loss
  • Sharing USB drives is really easy: plug and play! Ease of use!
  • You can have lots of coffee waiting for a USB drive to arrive with that contribution to your OSS project

Still, many people go for source control: Source Control and a central repository solve all implications of using a USB drive, so why not use source control?

Fragility

Have you ever let a junior developer loose on a git repository? I can promise you, it's not pretty.

  • Merges will go wrong
  • They will find out about rebasing and mess up the entire system
  • Pull requests on GitHub? One click to merge, no need to test or review!
  • Developers will forget to check in specific files

Again: all of this is easy with a USB drive: one location to store the project. Yes, merging is slightly more difficult too, but then again, replaying history in source control is much worse.

And I haven't even talked about having to have a network share or a GitHub account in which you can have private repositories. That's all extra cost and extra risk. What if the Internet connection goes down? What if a dev's laptop breaks? You might even say a USB drive is too advanced and a typewriter is an even better way to write code!

Cost

Did I mention the cost of USB drives? At most conferences and shops you will get them for free. Even if you buy them, they are probably around $0.10 per GB. USB drives are very inexpensive.

Compare that with source control: we need an Internet connection, a GitHub repository, and most importantly, devs will have to read documentation on using git or be coached by someone on the team. That's really inefficient and costs a lot of time!

Conclusion

You may have noted that this is a slightly strange post. You are correct, it is. I’m responding to some of the outrage regarding yesterday’s NuGet.org outage. Tweets and blogs mention not to use NuGet, or to use NuGet but definitely not use package restore. That’s perfectly fine, but I don’t think the reasons for not using it are well founded, hence the above sarcasm. In case it wasn’t clear: you should be using source control.

Should you use NuGet package restore? I think it depends mostly on your preference. It should not depend on NuGet.org outages, nor on the microwave destroying your WiFi signal and failing the builds that utilize package restore. Should you add packages to your repository or use package restore? It depends on what you want to achieve and how you want to work. I prefer not to add them because they are dependencies that are already versioned (package version and packages.config), so why version them again? We don’t add the issues from our issue tracker to source control either, right?

We put issues in a specialized system for managing issues. In my opinion, the same should be true for software and component dependencies. But then again: if you want to add packages to source control, fine by me. As some tweets said, you don’t have to do it for the minimal disk space optimizations. All that matters is whether it makes sense to your process.

Just like with source control, issue trackers and other things (like package restore) in your build process, you should read up on them, play with them and know the risks. Do we know that our Internet connection can break during solar storms? Well, yes. It’s a minor risk, but if it’s important to your shop, do mitigate that risk. Do laptops break? Yes. If it’s important that you can keep working even if a laptop crashes, buy some more and keep them up to date with your main development machine. If you rely on GitHub and want to get work done when they have issues, make sure you have an up-to-date fork somewhere on a file share. Make that two file shares!

And if you rely on NuGet package restore… you get the point, right? For NuGet, there are private repositories available that can host your in-house packages and the ones you are using from upstream sources like NuGet.org. Use them, if they matter for your development process. Know about NuGet 2.8’s automatic fallback to the local cache you have on disk and if something goes wrong, use that cache until the package source is back up.

The development process and the tools are part of your system. Know your tools. Even if it requires you to read crazy books like how to work with git. Or Pro NuGet 2.

Pro NuGet second edition is out

Pro NuGet will learn you all there is to know about NuGet

Pfew! Around February 2013, Xavier and I started planning work on an update of our book. Eight months later, we’re proud to present you with Pro NuGet (second edition). It’s been a tough couple of months writing this: Xavier has become a father for the second time (congratulations!), we’ve had two massive updates to NuGet that we had to work into our book, … But here it is!

What’s new?

  • A number of NuGet workflows have changed and new ones have been added. Expect all of these to be covered, including NuGet’s old and new package restore functionality.
  • Want to work with NuGet and Windows Azure Websites, TeamCity, Visual Studio Online, OctopusDeploy, NuGet Gallery, ProGet or MyGet? We have a bunch of recipes for you!
  • Pitfalls of package versioning
  • Building a plugin system based on NuGet

Next to that, there is a lot more meat in there!

  • Understand how NuGet fits into the big picture of your software development process to save you time and money.
  • How to keep your team working when your project depends on an external resource (such as a web service or cloud) which suddenly becomes unavailable.
  • Whether or not to auto-update NuGet packages within a continuous integration process for maximum reliability and speed.
  • How to combine NuGet with PowerShell to create your own Cmdlets and extend the base toolset in an extremely powerful manner.
  • Evaluate the pros and cons of hosting your own NuGet repository.
  • How to incorporate NuGet seamlessly within your continuous integration process.
  • Much much more!

We would love to get your feedback! E-mail us or write a review on your blog or Amazon. Enjoy the read!

PS: Thanks to our excellent reviewers (the NuGet team) and everyone at Apress! There are a lot of people involved in getting a quality book out there. Thanks!

A new year's present: introducing Glimpse plugins for Windows Azure

Glimpse plugin for Windows Azure

Have you tried Glimpse before? It shows you server-side information like execution times, server configuration, request data and such in your browser. At the February MVP Summit this year, Anthony, Nik and I had a chat about what information would be useful to display in Glimpse when working on Windows Azure. Some beers and a bit of coding later, we had a proof-of-concept showing Windows Azure runtime configuration data in a Glimpse tab.

Today, we are happy to announce a first public preview of two Windows Azure tabs in Glimpse: the Glimpse.WindowsAzure package displaying runtime information, and Glimpse.WindowsAzure.Storage collecting information about traffic from and to storage.

Want to give it a try? You can install these two NuGet packages from NuGet.org (prerelease packages for now). Sources can be found on GitHub. And all comments, remarks and suggestions can go in the comments to this blog post.

Now let’s have a look at what these packages have to offer!

Glimpse.WindowsAzure

The Glimpse.WindowsAzure package adds a new tab to Glimpse, displaying environment information when the web application is hosted on Windows Azure. It does this for Cloud Services as well as for Windows Azure Web Sites.

Installation is easy: simply add the Glimpse.WindowsAzure package to your project and you’re done. If you are running on .NET 4.5, you will have to add the following setting to your Web.config:

<appSettings>
  <add key="Glimpse:DisableAsyncSupport" value="true"/>
</appSettings>

When hosting in a Windows Azure Cloud Service (or the full emulator available in the Windows Azure SDK), the Azure Environment tab will provide information gathered from the RoleEnvironment class. You can see the deployment ID, current role instance information, a list of configured endpoints, which fault and update domain our application is running in, and so on.

Windows Azure Role Environment

When the web application is hosted on Windows Azure Web Sites, we get information like Compute Mode (Shared or Reserved) as well as Site Mode (Limited in the screenshot below means the application is running on a Free web site).

Glimpse Windows Azure Web Sites

The Azure Environment tab will also provide a link to the Kudu Remote Console, a feature in Windows Azure Web Sites where you can run commands on the box hosting the web site.

Kudu Console

Pretty handy if you ask me!

Glimpse.WindowsAzure.Storage

The Glimpse.WindowsAzure.Storage package adds an “Azure Storage” tab to Glimpse, displaying all sorts of information about traffic from and to Windows Azure storage. It will also estimate the cost of loading the current page, depending on the number of transactions and the traffic to blobs, tables and/or queues. Note that this package can also be used in ASP.NET web sites that are not hosted on Windows Azure but do make use of Windows Azure Storage.

Once the package is installed into your project, you can almost start inspecting all this information. Almost? Well, see the caveat further down…

 

Number of transactions and a cost estimate

The first type of data displayed in the Azure Storage tab is the total number of transactions, the traffic consumed and a cost estimate for 10,000 pageviews. This information can be used for several scenarios:

  • Know how many calls are made to storage. Maybe you can reduce the number of calls to reduce the total number of transactions, one of the billing metrics for Windows Azure.
  • Another billing metric is the amount of traffic consumed. When running in the same datacenter as the storage account it’s less important for cost, but still: reducing the traffic can reduce the page load time.

Windows Azure Storage Transactions and bandwidth consumed

Now where do we get the price per 10,000 pageviews? Well, this is a very rough estimate, based on the pay-per-use pricing in Windows Azure. It is very likely that the actual price will be lower if you are running on an MSDN subscription, a pre-paid plan or an Enterprise Agreement.

Warnings and analysis of requests

One feature we’re particularly proud of is this one: warnings and analysis of requests to Windows Azure Storage. First of all, we’ll analyse the settings for communicating over the network. In the screenshot below, you can see several general hints to optimize throughput by disabling the Nagle algorithm or disabling HTTP 100 Continue.

Another analysis we’ll do is verifying the requests themselves. In the example below, Glimpse is giving a warning about the fact that I’m querying table storage on properties that are not indexed, potentially causing timeouts in my application.

There are several more inspections in there, if you have suggestions for others feel free to let us know!

Analysis of requests

List of requests and Timeline

When using Windows Azure Storage, Glimpse will show you all requests that have been made together with the status code and total duration of the request.

image

Since a plain list is often not that easy to analyze, the Timeline tab is extended with this information as well. It shows you a summary of when calls to Windows Azure Storage have been made, as well as full details of the requests:

Timeline tracing Windows Azure Storage

One caveat

Because of a current limitation in the Windows Azure Storage SDK, you will have to explicitly add one parameter to every call that is made to Windows Azure Storage.

The idea is that the OperationContext parameter for calls to storage has to be a special Glimpse OperationContext, obtained by calling OperationContextFactory.Current.Create(). This Glimpse-specific implementation provides us with all the information required to display data in the Azure Storage tab. Here’s an example of how to wire it in for a call that creates a blob storage container:

var account = CloudStorageAccount.DevelopmentStorageAccount;
var blobclient = account.CreateCloudBlobClient();
var container1 = blobclient.GetContainerReference("glimpse1");
container1.CreateIfNotExists(operationContext: OperationContextFactory.Current.Create());

We are talking with Microsoft about this and are pretty sure this shortcoming will be addressed in the future.

What’s next?

It would be great if you could give these two packages a try! NuGet packages are available from NuGet.org (prerelease packages for now). Sources can be found on GitHub. And all comments, remarks and suggestions can go in the comments to this blog post.

We’re still looking at load-balanced environments. You can implement Glimpse’s IPersistenceStore, but we would like to have a zero-configuration setup.

Once we’re confident Glimpse.WindowsAzure and Glimpse.WindowsAzure.Storage are working properly, we’ll have a look at Windows Azure Caching and Service Bus.

Enjoy!