Maarten Balliauw {blog}

ASP.NET MVC, Microsoft Azure, PHP, web development ...


Setting up a webfarm using Windows Azure Virtual Machines

With the release of Microsoft’s Windows Azure Virtual Machines, a bunch of new scenarios became available on the cloud platform. If you plan to host multiple web applications, you can either go with Windows Azure Web Sites or build a webfarm yourself using the new IaaS capabilities. The former is fine for many types of application; the latter may be more suitable for a large-scale web application that cannot easily be deployed in the PaaS offering. In this blog post, I’ll show you how to build a webfarm with (free!) load balancing.

Note: I’ll be using the built-in Windows Azure load balancer. If required, you can also deploy your own load balancer VM or reverse proxy. But since the Windows Azure load balancer comes with no extra cost, I think it’s the better choice for a lot of scenarios.

Creating a first virtual machine

After logging in to the new Windows Azure management portal, create a new virtual machine. You can choose to create a Linux or a Windows machine from a template or upload your own VM. I’ll go with a Windows machine but everything explained in this post is valid for a Linux webfarm, too.

Creating a Windows Azure Virtual Machine

Navigate through the wizard, selecting the VM size and administrator username of your choice. In step 3, where you specify the DNS name and some other settings, be sure to choose an affinity group (this gives better networking performance, since the machines end up on the same network within the Windows Azure datacenter). The DNS name can be anything you want to name your webfarm.

Windows Azure Virtual Machine Windows Linux

Before finishing the wizard, there is one important thing to do: in step 4, make sure to create an availability set in which all machines of the webfarm will reside. An availability set ensures that whenever maintenance occurs in the datacenter, it only affects one or some of your webfarm machines and never all of them at once.

Windows Azure VM options

Adding an HTTP endpoint to the first machine

After the first virtual machine has been created, navigate to its configuration dashboard in the Windows Azure management portal. In order to have port 80 connected to this machine, a new endpoint should be added to it. Add the endpoints of your choice; I chose to open port 80.

Windows Azure configure VM endpoints

It is important to understand that the endpoints added here are only opened at the load balancer level. That’s right: even a single machine will be behind a load balancer. This is incredibly powerful, as you’ll see when we add a new machine to our IaaS webfarm. It also poses an extra configuration step for single machines though: you’ll have to open port 80 on the machine’s firewall, too. You can safely use remote desktop (Windows) or SSH (Linux) to do so:

Install IIS on Windows Azure Virtual Machine

Cloning the first machine

To make things easy, I’ve first configured IIS on the first machine. I simply enabled the webserver and made sure Windows Firewall allows connections to IIS. From this point on, I simply want to clone this machine and add it to my webfarm.

The first thing to do when cloning (or “capturing”) a VM is “sysprepping” it; on Linux, there’s a similar option in the Windows Azure agent. Sysprep ensures the machine can be cloned into a new machine that gets its own settings, such as a hostname and IP address. A non-sysprepped machine can thus never be cloned.

Windows Azure virtual machine requires sysprep

After sysprepping the machine, shut it down. If you selected the option during sysprep, the machine will shut down automatically. Otherwise you can do so through remote desktop or SSH, or simply through the Windows Azure portal.

Shutdown virtual machine on Windows Azure

Next, click the “Capture” button to create a disk image from this machine. Give it a name and check the “Yes, I’ve sysprepped the machine” checkbox to be able to continue.

Windows Azure Capture virtual machine

After clicking the “OK” button, Windows Azure will create an image of our first webserver.

After the image has been created, you’ll notice that your first webserver has disappeared! This is normal: the machine has been disemboweled in order to create a template from it. You can now simply re-create this machine using the same settings as before, except you can now base it on this newly created VM image instead of basing it off a VM template Microsoft provides.

In the endpoints configuration, make sure to add the HTTP endpoint again listening on port 80.

Creating a second virtual machine

To create the second machine in your webfarm, create a fresh virtual machine. As the base disk, choose the image we’ve created earlier:

Windows Azure create your own virtual machine image

In step 3 of the machine creation, make sure to connect this machine to our existing web server. In step 4, place the VM in the same availability set.

Connect to an existing virtual machine in Windows Azure

You now have two machines running, yet they aren’t load balanced at this point. You’ll notice that both machines already sit behind the same hostname (http://webfarm.cloudapp.net) and that they share the same public virtual IP address. This is because we “linked” the machines earlier; if you don’t do this, you will never be able to use the out-of-the-box load balancer that comes with Windows Azure. It also means that the public remote desktop endpoint for each machine has to be different: there’s only one IP address exposed to the outside world, so you’ll have to think about endpoints.

Don’t add the HTTP endpoint to this machine just yet.

Configuring the Windows Azure load balancer

The last part of setting up our webfarm is load balancing. This is in fact really, really easy. Simply go to the second machine’s dashboard in the Windows Azure portal and navigate to the Endpoints tab. We’ve already added public HTTP endpoints on our first machine, which means for our second machine we can just subscribe to load balancing:

Windows Azure comes with free load balancing

Easy, huh? You now have free round-robin load balancing, with health checks every few seconds to ensure that all machines are up and running. And since we linked these machines through an availability set, they are on different fault domains in the datacenter, reducing the chance of downtime due to malfunctioning hardware or maintenance. You can safely shut down a machine, too. In short: anything you’d expect from a load balancer (except sticky sessions).

Final words

There is of course more to it. In ASP.NET, for instance, you’ll have to configure machine keys and such in the same way you would on-premises. But at the infrastructure level, we’re covered. Enjoy! And be sure to brag about this adventure to any IT pro you know :-)
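To illustrate that machine key remark: every machine in the farm should share the same machineKey in web.config, so that view state and forms authentication tickets created on one server validate on the other. A minimal sketch (the key values below are placeholders; generate your own):

<system.web>
  <!-- Both webfarm machines must use identical keys; these values are placeholders. -->
  <machineKey validationKey="A1B2C3...generate-your-own..."
              decryptionKey="D4E5F6...generate-your-own..."
              validation="SHA1"
              decryption="AES" />
</system.web>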

Use NuGet Package Restore to avoid pushing assemblies to Windows Azure Websites

Windows Azure Websites allows you to publish a web site in ASP.NET, PHP, Node, … to Windows Azure by simply pushing your source code to a TFS or Git repository. But how does Windows Azure Websites manage dependencies? Do you have to check your assemblies and NuGet packages into source control? How about no…

NuGet 1.6 shipped with a great feature called NuGet Package Restore. This feature lets you use NuGet packages without adding them to your source code repository. When your solution is built by Visual Studio (or MSBuild, which is used in Windows Azure Websites), a build target calls nuget.exe to make sure any missing packages are automatically fetched and installed before the code is compiled. This helps you keep your source repo small by keeping large packages out of version control.

Enabling NuGet Package Restore

Enabling NuGet package restore can be done from within Visual Studio. Simply right-click your solution and click the “Enable NuGet Package Restore” menu item.

NuGet package restore Windows Azure Websites Antares

Visual Studio will now do the following with the projects in your solution:

  • Create a .nuget folder at the root of your solution, containing a NuGet.exe and a NuGet build target
  • Import this NuGet target into all your projects so that MSBuild can find, download and install NuGet packages on-the-fly when creating a build (see the snippet after this list)
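For reference, here’s roughly what ends up in each project file after enabling package restore (a sketch; the exact markup Visual Studio generates may differ slightly between NuGet versions):

<!-- inside the .csproj / .vbproj -->
<PropertyGroup>
  <RestorePackages>true</RestorePackages>
</PropertyGroup>
<Import Project="$(SolutionDir)\.nuget\NuGet.targets" />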

Be sure to push the files in the .nuget folder to your source control system. The packages folder is not needed, except for the repositories.config file that sits in there.
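In case you’re wondering what that repositories.config looks like: it’s just a small XML file pointing NuGet at the packages.config files in your solution, along these lines (the project paths here are hypothetical):

<?xml version="1.0" encoding="utf-8"?>
<repositories>
  <repository path="..\MyWebApplication\packages.config" />
  <repository path="..\MyWebApplication.Tests\packages.config" />
</repositories>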

But what about my non-public assembly references? What if I don't trust auto-updating from NuGet.org?

Good question. What about them? A simple answer would be to create NuGet packages for them. And if you already have NuGet packages for them, things get even easier. Just make sure that you host these packages in an online feed other than the public NuGet repository at www.nuget.org, unless you want your custom assemblies out there in public. A good choice would be to check out www.myget.org and host your packages there.

But then a new question surfaces: how do I link a custom feed to my projects? The answer is pretty simple: in the .nuget folder, edit the NuGet.targets file. In the PackageSources element, you can supply a semicolon (;) separated list of feeds to check for packages:

<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
    <PropertyGroup>
        <!-- ... -->

        <!-- Package sources used to restore packages. By default will used the registered sources under %APPDATA%\NuGet\NuGet.Config -->
        <PackageSources>"http://www.myget.org/F/chucknorris;http://www.nuget.org/api/v2"</PackageSources>

        <!-- ... -->
    </PropertyGroup>

    <!-- ... -->
</Project>

By doing this and pushing the targets file to your Windows Azure Websites Git or TFS repo, the build system backing Windows Azure Websites will go ahead and download your packages from an external location instead of cluttering your sources. Which makes for one happy cloud.

Windows Azure Git Deploy

GitHub for Windows Azure Websites

With the new release of Windows Azure and Windows Azure Websites, a lot of new scenarios just became possible. One I like a lot, especially since AppHarbor and Heroku have similar offers, is the possibility to push source code (ASP.NET or PHP) to Windows Azure Websites instead of deploying binaries.

Not everyone out there is a command-line hero though: if you want to use Git as a mechanism for pushing sources to Windows Azure Websites, chances are you’ll go crazy if you are unfamiliar with Git’s command-line commands. Luckily, a couple of weeks ago GitHub released GitHub for Windows. It features an easy-to-use GUI on top of GitHub repositories and, with a small trick, also on top of Windows Azure Websites.

Setting up a Windows Azure Website

Since you’re probably still unfamiliar with Windows Azure Websites, let me guide you through the setup. It’s a simple process. First of all, navigate to the new Windows Azure portal. It looks different than the one you’re used to but it’s way easier to use. In the toolbar at the bottom, click New, select Web site, Quick Create and enter a hostname of choice. I chose “websiteswithgit”:

Creating a Windows Azure Website

After a couple of seconds, you’ll be presented with the dashboard of your newly created Windows Azure Website. This dashboard features a lot of interesting metrics about your website such as data traffic, CPU usage, errors, … It also displays the available means for publishing a site to Windows Azure Websites: TFS deploy, Git deploy, Webdeploy and FTP publishing. That’s it: your website has been set up and if you navigate to the newly created URL, you’ll be greeted with the default Windows Azure Websites landing page.

Setting up Git publishing

Since we’ll be using Git, click the Set up Git Publishing option.

Windows Azure Websites Dashboard

If you haven’t noticed already: Windows Azure Websites makes Windows Azure a lot easier. After a couple of seconds, Git publishing is configured, and all it takes to deploy your website is committing your source code, whether ASP.NET, ASP.NET Web Pages or PHP, to the newly created Git repository. Windows Azure Websites will take care of the build process (cool!) and will deploy it to Windows Azure in just a couple of seconds. Whoever told you deploying to Windows Azure takes ages lied to you!

Connecting GitHub for Windows to Windows Azure Websites

After setting up Git publishing, you probably have noticed that there’s a Git repository URL being displayed. Copy this one to your clipboard as we’ll be needing it in a minute. Open GitHub for Windows, right-click the UI and choose to “open a shell here”. Make sure you’re in the folder of choice. Next, issue a “git clone <url>” command, where <url> of course is the Git repository URL you’ve just copied.

Windows Azure Git Repository Build

The (currently empty) Windows Azure Website Git repository will be cloned onto your system. Now close this command-line (I promised we would use GitHub for Windows instead).

Git folder

Open the folder in which you cloned the Git repo and drag it onto GitHub for Windows. It will look kind of empty, still:

A Windows Azure Websites repository in GitHub for Windows

Next, add any file you want. A PHP file, a plain HTML file or a complete ASP.NET or ASP.NET MVC Web Application. GitHub for Windows will detect these changes and you can commit them to your local repository:

GitHub commit Windows Azure

All that’s left to do after a commit is clicking the Publish button. GitHub for Windows will now push all changesets to the Windows Azure Websites Git repository, which will in turn trigger a build process for your web site. The result? A happy Windows Azure Websites dashboard and a site up and running. Rinse, repeat, commit. Happy deployments to Windows Azure Websites using GitHub for Windows!

Antares Windows Azure Websites Deployment History Build

Ready for a Windows Azure roadtrip?


A while ago, Michelangelo van Dam (a hardcore PHP guy), François Hertay (a hardcore Java guru) and myself (a .NET guy) were asked if we wanted to write about some of our experiences with the Windows Azure platform.

Starting next week, you’ll find about three blog posts and/or videos per week on Windows Azure (both getting-started level and some fun advanced posts) on a freshly released website: www.azure-roadtrip.be

I have the honour of kicking off a series of blog posts on .NET, starting with a level 100 one: the what & why of Windows Azure for .NET developers.

Let’s knock at an open door: Windows Azure is a cloud platform. It’s a group of services offered by Microsoft that enable you to build your applications using the typical characteristics of a cloud platform: everything is self-service, on-demand and pay-per-use. You are the one demanding resources like virtual machines or storage and you pay for what you use, depending on how much you use for how long. No contract periods – well, there is one: you always pay for at least one hour if you use these resources.

Read more on www.azure-roadtrip.be! And keep an eye on that site as I have some fun posts coming there :-)

Social meet up on Twitter for MEET Windows Azure on June 7th

Here’s a perhaps rather redundant event for you, but it should be kind of fun: MEET Windows Azure on Twitter (+ Beer). The idea is to list people who have a Twitter account and intend to follow the MEET Windows Azure event via live streaming on June 7th (1pm PDT).

So see you online for the event on the 7th! My Twitter handle is @maartenballiauw

MEET Windows Azure Blog Relay:

+ Beer? Since you are watching the event from your couch (or the toilet, or your bed, or actually, from wherever you want), feel free to open up a beer.

Call to action: Link to this blog post on your blog and I will update this post to link to you!

Using the Windows Azure Content Delivery Network

As you know, Windows Azure is a very rich platform. Besides compute and storage, it offers a series of building blocks that simplify your life as a cloud developer. One of these building blocks is the content delivery network (CDN), which can be used to offload content to a globally distributed network of servers, ensuring faster throughput to your end users.

I’ve been asked to write an article on this matter, which I did, and which went live at ACloudyPlace.com today. As a small teaser, here’s the first section of it:

Reasons for using a CDN
There are a number of reasons to use a CDN. One of the obvious reasons lies in the nature of the CDN itself: a CDN is globally distributed and caches static content on edge nodes, closer to the end user. If a user accesses your web application and some of the files are cached on the CDN, the end user will download those files directly from the CDN, experiencing less latency in their request.

Another reason for using the CDN is throughput. If you look at a typical webpage, about 20% of it is HTML that was dynamically rendered based on the user’s request. The other 80% goes to static files like images, CSS, JavaScript, and so forth. Your server has to read those static files from disk and write them to the response stream, both of which take away some of the resources available on your virtual machine. By moving static content to the CDN, your virtual machine will have more capacity available for generating dynamic content.
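A practical aside that isn’t part of the article teaser: the CDN (and the browser) decide how long to keep your static files cached based on the cache headers your application emits. On IIS, a web.config sketch along these lines is a common way to set them; the seven-day max-age is just an example value:

<system.webServer>
  <staticContent>
    <!-- Cache static files for 7 days on clients and intermediaries such as CDN edge nodes -->
    <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="7.00:00:00" />
  </staticContent>
</system.webServer>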

Here’s the full article: Using the Windows Azure Content Delivery Network

Protecting Windows Azure Web and Worker roles from malware

Most IT administrators will install some sort of virus scanner on their precious servers. Since the cloud, from a technical perspective, is just a server, why not follow that security best practice on Windows Azure too? It went by almost unnoticed, but last week Microsoft released the Microsoft Endpoint Protection for Windows Azure Customer Technology Preview. For the sake of bandwidth, I’ll be referring to it as EP.

EP offers real-time protection, scheduled scanning, malware remediation (a fancy word for quarantining), active protection and automatic signature updates. Sounds a lot like Microsoft Endpoint Protection or Microsoft Security Essentials? That’s no coincidence: EP is a Windows Azure-ified version of it.

Enabling anti-malware on Windows Azure

After installing the Microsoft Endpoint Protection for Windows Azure Customer Technology Preview, sorry, EP, a new Windows Azure import will be available. As with remote desktop or diagnostics, EP can be enabled with a simple XML one-liner:

<Import moduleName="Antimalware" />

Here’s a sample web role ServiceDefinition.csdef file containing this new import:

<?xml version="1.0" encoding="utf-8"?>
<ServiceDefinition name="ChuckProject"
                   xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
  <WebRole name="ChuckNorris" vmsize="Small">
    <Sites>
      <Site name="Web">
        <Bindings>
          <Binding name="Endpoint1" endpointName="Endpoint1" />
        </Bindings>
      </Site>
    </Sites>
    <Endpoints>
      <InputEndpoint name="Endpoint1" protocol="http" port="80" />
    </Endpoints>
    <Imports>
      <Import moduleName="Antimalware" />
      <Import moduleName="Diagnostics" />
    </Imports>
  </WebRole>
</ServiceDefinition>

That’s it! When you now deploy your Windows Azure solution, Microsoft Endpoint Protection will be installed, enabled and configured on your Windows Azure virtual machines.

Now since I started this blog post with “IT administrators”, chances are you want to fine-tune this plugin a little. No problem! The ServiceConfiguration.cscfg file has some options waiting to be, eh, touched. And since these are in the service configuration, you can also modify them through the management portal, the management API, or sysadmin-style using PowerShell. The following options are available (a sketch of how they might look in ServiceConfiguration.cscfg follows the list):

  • Microsoft.WindowsAzure.Plugins.Antimalware.ServiceLocation – Specify the datacenter region where your application is deployed, for example “West Europe” or “East Asia”. This will speed up deployment time.
  • Microsoft.WindowsAzure.Plugins.Antimalware.EnableAntimalware – Should EP be enabled or not?
  • Microsoft.WindowsAzure.Plugins.Antimalware.EnableRealtimeProtection – Should real-time protection be enabled?
  • Microsoft.WindowsAzure.Plugins.Antimalware.EnableWeeklyScheduledScans – Weekly scheduled scans enabled?
  • Microsoft.WindowsAzure.Plugins.Antimalware.DayForWeeklyScheduledScans – Which day of the week (0 – 7 where 0 means daily)
  • Microsoft.WindowsAzure.Plugins.Antimalware.TimeForWeeklyScheduledScans – What time should the scheduled scan run?
  • Microsoft.WindowsAzure.Plugins.Antimalware.ExcludedExtensions – Specify file extensions to exclude from scanning (pipe-delimited)
  • Microsoft.WindowsAzure.Plugins.Antimalware.ExcludedPaths – Specify paths to exclude from scanning (pipe-delimited)
  • Microsoft.WindowsAzure.Plugins.Antimalware.ExcludedProcesses – Specify processes to exclude from scanning (pipe-delimited)
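Put together, the settings block in ServiceConfiguration.cscfg could look something like the sketch below. The values are illustrative only; check the plugin’s documentation for the exact formats it expects (how the scan time is encoded, for instance):

<Role name="ChuckNorris">
  <Instances count="2" />
  <ConfigurationSettings>
    <!-- Illustrative values only -->
    <Setting name="Microsoft.WindowsAzure.Plugins.Antimalware.ServiceLocation" value="West Europe" />
    <Setting name="Microsoft.WindowsAzure.Plugins.Antimalware.EnableAntimalware" value="true" />
    <Setting name="Microsoft.WindowsAzure.Plugins.Antimalware.EnableRealtimeProtection" value="true" />
    <Setting name="Microsoft.WindowsAzure.Plugins.Antimalware.EnableWeeklyScheduledScans" value="true" />
    <Setting name="Microsoft.WindowsAzure.Plugins.Antimalware.DayForWeeklyScheduledScans" value="1" />
    <Setting name="Microsoft.WindowsAzure.Plugins.Antimalware.TimeForWeeklyScheduledScans" value="120" />
    <Setting name="Microsoft.WindowsAzure.Plugins.Antimalware.ExcludedExtensions" value="" />
    <Setting name="Microsoft.WindowsAzure.Plugins.Antimalware.ExcludedPaths" value="" />
    <Setting name="Microsoft.WindowsAzure.Plugins.Antimalware.ExcludedProcesses" value="" />
    <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString" value="UseDevelopmentStorage=true" />
  </ConfigurationSettings>
</Role>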

Monitoring anti-malware on Windows Azure

How will you know if a threat has been detected? Well, luckily for us, Microsoft Endpoint Protection writes its logs to the System event log. This means you can simply add a specific data source to your diagnostics monitor and you’re done:

var configuration = DiagnosticMonitor.GetDefaultInitialConfiguration();

// Note: if you need informational / verbose, also subscribe to levels 4 and 5
configuration.WindowsEventLog.DataSources.Add(
    "System!*[System[Provider[@Name='Microsoft Antimalware'] and (Level=1 or Level=2 or Level=3)]]");

configuration.WindowsEventLog.ScheduledTransferPeriod
    = System.TimeSpan.FromMinutes(1);

DiagnosticMonitor.Start(
    "Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString",
    configuration);

In addition, EP also logs its inner workings to its installation folders. You can also include these in your diagnostics configuration:

var configuration = DiagnosticMonitor.GetDefaultInitialConfiguration();

// ...add the event logs like in the previous code sample...

var mep1 = new DirectoryConfiguration();
mep1.Container = "wad-endpointprotection-container";
mep1.DirectoryQuotaInMB = 5;
mep1.Path = @"%programdata%\Microsoft Endpoint Protection";

var mep2 = new DirectoryConfiguration();
mep2.Container = "wad-endpointprotection-container";
mep2.DirectoryQuotaInMB = 5;
mep2.Path = @"%programdata%\Microsoft\Microsoft Security Client";

configuration.Directories.ScheduledTransferPeriod = TimeSpan.FromMinutes(1.0);
configuration.Directories.DataSources.Add(mep1);
configuration.Directories.DataSources.Add(mep2);

DiagnosticMonitor.Start(
    "Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString",
    configuration);

From this moment on, you can use a tool like Cerebrata’s Diagnostics Monitor to check the event logs of all your Windows Azure instances that have anti-malware enabled.

Pro NuGet is finally there!

Short version: Install-Package ProNuget or http://amzn.to/pronuget

It’s been a while since I wrote my first book. After telling everyone that writing a book is horrendous (try writing a chapter per week after your office hours…) and that I would never write one again, my partner-in-crime Xavier Decoster and I had the same idea at the same time: what about a book on NuGet? So here it is: Pro NuGet is fresh off the presses (or on Kindle).

Special thanks go out to Scott Hanselman and Phil Haack for writing our foreword. Also big kudos to all who’ve helped us out now and then and did some small reviews. Yes Rob, Paul, David, Phil, Hadi: that’s you guys.

Why a book on NuGet?

Why not? At the time we decided to start writing a book (September 2011), NuGet had already been out there for a while. Yet most users then (and still today) were using NuGet only as a means of installing packages, with some also creating packages. But NuGet is much more! And that’s what we wanted to write about. We did not want to create a reference guide on which NuGet commands are available. We wanted to focus on the best practices we’ve learned from using NuGet over the past few months.

Some scenarios covered in our book:

  • What’s the big picture on package management?
  • Flashback last week: NuGet.org was down. How do you keep your team working if you depend on that external resource?
  • Is it a good idea to auto-update NuGet packages in a continuous integration process?
  • Use the PowerShell console in VS2010/11. How do I write my own NuGet PowerShell Cmdlets? What can I do in there?
  • Why would you host your own NuGet repository?
  • Using NuGet for continuous delivery
  • More!

I feel we’ve managed to cover a lot of concepts that go beyond “how to use NuGet vX” and instead have given as much guidance as possible. Questions, suggestions, remarks, … are all welcome. And a click on “Add to cart” is also a good idea ;-)

I’m an ASP Insider

Cool! I’ve just learned that I’ve been invited to join the ASPInsiders. I’m really excited and honored to be part of this group of great ASP.NET experts, and very much looking forward to learning the secret handshake and being able to provide feedback that helps the ASP.NET team move forward.

If you don’t know who the ASPInsiders are, here’s their elevator pitch:

“The ASPInsiders is a select group of international professionals who have demonstrated expertise in ASP.NET technologies and who provide valuable, early feedback on related developing technologies and publications to their peers, the Microsoft ASP.NET team and others.”

Some more info is available in the Who are the ASPInsiders? post by one of the insiders.

TechDays Finland - Architectural Patterns for the Cloud - NuGet

As promised, here are the slide decks for the two sessions delivered at TechDays Finland last week.

Architectural Patterns for the Cloud

The promise of all cloud vendors out there is that they can run your applications without changes. While that claim is true, it’s better to optimize existing software, or design specifically for the cloud, when moving or building an application. Architectural optimization will speed up your application, make it more scalable and even make it cheaper to run on Windows Azure. This session will walk you through some common patterns that are easy to implement and will make your cloud sunnier.

Organize your Chickens - NuGet for the Enterprise

Managing software dependencies, whether created in-house or obtained from third parties, can be a pain in the behind. Whether dependencies feel like wild chickens or people run around like chickens dealing with dependencies, the NuGet package manager can be a cure. Let us guide you through creating enterprise (chicken) NuGets and dealing with them in a structured, easy-to-maintain manner. From developer workstation to build server, NuGet tastes great! We’ll provide the dip sauce.

Enjoy! And if there’s any feedback or questions, I would love to hear it.