Maarten Balliauw {blog}

ASP.NET, ASP.NET MVC, Windows Azure, PHP, ...


NuGet push... to Windows Azure

When looking at how people like to deploy their applications to a cloud environment, a large faction seems to prefer using their source control system as the source for their production deployment. While interesting, I see a lot of problems there: your source code may not run immediately because it probably has to be compiled first. You don’t want to maintain compiled assemblies in source control, right? Also, maybe some QA process is in place where a deployment can only occur after approval. Why not use source control for what it’s there for: source control? And how about using a NuGet repository as the source for our deployment? Meet the Windows Azure NuGetRole.

Disclaimer/Warning: this is demo material and should probably not be used for real-life deployments without making it bulletproof!

Download the sample code: NuGetRole.zip (262.22 kb)

How to use it

If you compile the source code (download), you have three steps left to get your NuGetRole running on Windows Azure:

  • Specify the package source to use
  • Add some packages to the package source feed (which you can easily host on MyGet)
  • Deploy to Windows Azure

When all these steps have been taken care of, the NuGetRole will download all latest package versions from the package source specified in ServiceConfiguration.cscfg:

<?xml version="1.0" encoding="utf-8"?>
<ServiceConfiguration serviceName="NuGetRole.Azure"
                      xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration"
                      osFamily="1"
                      osVersion="*">
  <Role name="NuGetRole.Web">
    <Instances count="1" />
    <ConfigurationSettings>
      <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString" value="UseDevelopmentStorage=true" />
      <Setting name="PackageSource" value="http://www.myget.org/F/nugetrole/" />
    </ConfigurationSettings>
  </Role>
</ServiceConfiguration>

Packages you publish should only contain a content and/or lib folder. Other package contents will currently be ignored by the NuGetRole. If you want to add some web content like a default page to your role, simply publish the following package:

[Image: the package opened in NuGet Package Explorer]
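For reference, here’s a minimal .nuspec sketch of what such a content-only package could look like (the package id and file names are illustrative, not taken from the sample):

<?xml version="1.0"?>
<package>
  <metadata>
    <id>NuGetRole.SampleContent</id>
    <version>1.0.0</version>
    <authors>Maarten Balliauw</authors>
    <description>Default page and binaries for the NuGetRole demo.</description>
  </metadata>
  <files>
    <!-- content\* ends up under the web root -->
    <file src="Default.aspx" target="content" />
    <!-- lib\* is mapped to bin\ by the NuGetRole -->
    <file src="MyApp.dll" target="lib" />
  </files>
</package>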

Just push, and watch your Windows Azure web role farm update its contents. Or have your build server push a NuGet package containing your application and have your server farm update itself. Whatever pleases you.
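If you’re scripting this from a build server, the push itself is a NuGet.exe one-liner along these lines (API key and package name are placeholders):

NuGet.exe push MyApp.1.0.0.nupkg <your-api-key> -Source http://www.myget.org/F/nugetrole/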

How it works

What I did was create a fairly empty Windows Azure project (download). In this project, one web role exists. This web role consists of nothing but a Web.config file and a WebRole.cs class which looks like the following:

public class WebRole : RoleEntryPoint
{
    private bool _isSynchronizing;
    private PackageSynchronizer _packageSynchronizer = null;

    public override bool OnStart()
    {
        var localPath = Path.Combine(Environment.GetEnvironmentVariable("RdRoleRoot") + "\\approot");

        _packageSynchronizer = new PackageSynchronizer(
            new Uri(RoleEnvironment.GetConfigurationSettingValue("PackageSource")), localPath);

        _packageSynchronizer.SynchronizationStarted += sender => _isSynchronizing = true;
        _packageSynchronizer.SynchronizationCompleted += sender => _isSynchronizing = false;

        RoleEnvironment.StatusCheck += (sender, args) =>
        {
            if (_isSynchronizing)
            {
                args.SetBusy();
            }
        };

        return base.OnStart();
    }

    public override void Run()
    {
        _packageSynchronizer.SynchronizeForever(TimeSpan.FromSeconds(30));

        base.Run();
    }
}

The above code essentially wires some configuration values, like the local web root and the NuGet package source to use, into a second class in this project: the PackageSynchronizer. This class checks the specified NuGet package source every few minutes for the latest package versions and, if required, updates content and bin files. Each synchronization run does the following:

public void SynchronizeOnce()
{
    var packages = _packageRepository.GetPackages()
        .Where(p => p.IsLatestVersion).ToList();

    var touchedFiles = new List<string>();

    // Deploy new content
    foreach (var package in packages)
    {
        var packageHash = package.GetHash();
        var packageFiles = package.GetFiles();
        foreach (var packageFile in packageFiles)
        {
            // Keep filename
            var packageFileName = packageFile.Path.Replace("content\\", "").Replace("lib\\", "bin\\");

            // Mark file as touched
            touchedFiles.Add(packageFileName);

            // Do not overwrite content that has not been updated
            if (!_packageFileHash.ContainsKey(packageFileName) || _packageFileHash[packageFileName] != packageHash)
            {
                _packageFileHash[packageFileName] = packageHash;

                Deploy(packageFile.GetStream(), packageFileName);
            }
        }
    }

    // Remove obsolete content (after all packages have been processed)
    var obsoleteFiles = _packageFileHash.Keys.Except(touchedFiles).ToList();
    foreach (var obsoletePath in obsoleteFiles)
    {
        _packageFileHash.Remove(obsoletePath);
        Undeploy(obsoletePath);
    }
}

Or in human language:

  • The specified NuGet package source is checked for packages
  • Every package marked “IsLatest” is downloaded and deployed onto the machine
  • Files that have not been used in the current synchronization step are deleted
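The Deploy and Undeploy helpers called in SynchronizeOnce live in the sample download; a minimal sketch of what they could look like, assuming a hypothetical _localPath field holding the web root passed into the PackageSynchronizer, would be:

// Hypothetical sketch: write a package file under the local web root.
private void Deploy(Stream contents, string relativePath)
{
    var targetPath = Path.Combine(_localPath, relativePath);
    Directory.CreateDirectory(Path.GetDirectoryName(targetPath));

    using (var fileStream = File.Create(targetPath))
    {
        contents.CopyTo(fileStream);
    }
}

// Hypothetical sketch: delete a previously deployed file.
private void Undeploy(string relativePath)
{
    var targetPath = Path.Combine(_localPath, relativePath);
    if (File.Exists(targetPath))
    {
        File.Delete(targetPath);
    }
}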

This is probably not a bulletproof solution, but I wanted to show you how easy it is to use NuGet not only as a package manager inside Visual Studio, but also from your own code: NuGet is not just a package manager but in essence a package management protocol, which you can easily extend.

One thing to note: I also made the Windows Azure load balancer ignore a role that’s updating itself. A role instance that is synchronizing its contents will never be available in the load balancing pool, so no traffic is sent to that instance during an update.

ASP.NET MVC dynamic view sections

Earlier today, a colleague of mine asked for advice on how he could create a “dynamic” view. To elaborate, he wanted to create a change settings page on which various sections would be rendered based on which plugins are loaded in the application.

Intrigued by the question and having no clue on how to do this, I quickly hacked together a SettingsViewModel, to which he could add all section view models no matter what type they are:

public class SettingsViewModel
{
    public List<dynamic> SettingsSections = new List<dynamic>();
}

To my surprise, when looping this collection in the view it just works as expected: every section is rendered using its own DisplayTemplate. Simple and slick.

@model MvcApplication.ViewModels.SettingsViewModel

@{
    ViewBag.Title = "Index";
}

<h2>Settings</h2>

@foreach (var item in Model.SettingsSections)
{
    @Html.DisplayFor(model => item);
}
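To see why this works: Html.DisplayFor resolves a display template based on the runtime type of each item. So for a hypothetical mail settings section (not part of the original sample):

public class MailSettingsViewModel
{
    public string SmtpServer { get; set; }
}

...a template named after the type, placed in Views/Shared/DisplayTemplates/MailSettingsViewModel.cshtml, is picked up automatically:

@model MvcApplication.ViewModels.MailSettingsViewModel

<fieldset>
    <legend>Mail settings</legend>
    <p>SMTP server: @Model.SmtpServer</p>
</fieldset>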

Why MyGet uses Windows Azure

Recently one of the Tweeps following me started fooling around and hit one of my sweet spots: Windows Azure. Basically, he mocked me for using Windows Azure for MyGet, a website with enough users but not enough to justify the “scalability” aspect he thought Windows Azure was offering. Since Windows Azure is much, much more than scalability alone, I decided to do a quick write-up about the various reasons why we use Windows Azure for MyGet. And scalability is not one of them.

First of all, here’s a high-level overview of our deployment, which may illustrate some of the aspects below:

[Image: high-level overview of the MyGet deployment]

Costs

Windows Azure is cheap. Cheap as in cost-effective, not as in, well, sleazy. Many will disagree with me, but the cost perspective of Windows Azure can be really cheap in some cases and very expensive in others. For example, if someone asks me if they should move to Windows Azure and they currently have one server running 300 small sites, I’d probably tell them not to move, as it will be a tough price comparison.

With MyGet we run 2 Windows Azure instances in 2 datacenters across the globe (one in the US and one in the EU). For $180.00 per month this means 2 great machines at two very distant regions of the globe. You can probably find those with other hosters as well, but will they manage your machines? Patch and update them? Probably not, for that amount. In our scenario, Windows Azure is cheap.

Feel free to look at the cost calculator tool to estimate usage costs.

Traffic Manager

Traffic Manager, a great (beta) product in the Windows Azure offering, allows us to build geographically distributed applications. For example, US users of MyGet will end up in the US datacenter, European users will end up in the EU datacenter. This is great, and we can easily add extra locations to this policy and have, for example, a third location in Asia.

Next to geographically distributing MyGet, Traffic Manager also ensures that if one datacenter goes down, the DNS pool will consist of only “live” datacenters and thus provides datacenter fail-over. Not ideal, since the web application is served faster from a server that’s closer to the end user, but at least the application will not go down.

One problem we have with this is storage. We use Windows Azure storage (blobs, tables and queues) as those only cost $0.12 per GB. Distributing the application does mean that our US datacenter server has to access storage in the EU datacenter, which of course adds some latency. We try to reduce this with extensive caching on all sides, but it’d be nicer if Traffic Manager allowed us to set up georeplication for storage as well. This only affects storing package metadata and packages; reading packages is not affected because we’re using the Windows Azure CDN for that.

CDN

The Windows Azure Content Delivery Network allows us to serve users fast. The main use case for MyGet is accessing and downloading packages. OK, updating has some latency due to the restrictions mentioned above, but if you download a package from MyGet it will always come from a CDN node near the end user, ensuring low latency and fast access. Since the CDN is just a checkbox on the management pages, integrating with it is a breeze. The only thing we’ve struggled with is finding an acceptable caching policy that limits stale data.

Windows Azure AppFabric Access Control

MyGet is not one application. MyGet is three applications: our development environment, staging and production. We even plan for tenants, so every tenant is in fact its own application. To streamline, manage and maintain a clear overview of which user can authenticate to which application via which identity provider, we use ACS to facilitate MyGet authentication.

To give you an example: our dev environment allows logging in via OpenID on a development machine, while production allows OpenID on the live environment. In staging, we only use Windows Live ID and Facebook, whereas our production website uses different identity providers. Tenants will, in the future, be given the option to authenticate against their own ADFS server; we’re pretty sure ACS will allow us to simply configure that and ensure only tenant X can use that ADFS server.

ACS has been a great time saver and is definitely something we want to use in future projects. It really eases common authentication pains and acts as a service bus between users, identity providers and our applications.

Windows Azure AppFabric Caching

Currently we don’t use Windows Azure AppFabric Caching in our application. We use the ASP.NET in-memory cache on all machines, but do feel the need for a distributed caching solution. While AppFabric Caching is appealing, we are considering deploying Memcached in our application because of the cost structure involved. But we might as well end up with Windows Azure AppFabric Caching anyway, as it integrates nicely with our current codebase.

Conclusion

In short, Windows Azure is much more than hosting and scalability. It’s the building blocks available such as Traffic Manager, CDN and Access Control Service that make our lives easier. The pricing structure is not always that transparent but if you dig a little into it you’ll find affordable solutions that are really easy to use because you don’t have to roll your own.

Book review: Microsoft Windows Azure Development Cookbook

Over the past few months, I’ve been doing technical reviewing for a great Windows Azure book: the Windows Azure Development Cookbook published by Packt. During the review I had no idea who the author of the book was, but after publication it turns out the author is no one less than my fellow Windows Azure MVP Neil Mackenzie! If you read his blog, you know you should immediately buy this book.

Why? Well, Neil usually goes both broad and deep: all required context for understanding a recipe is given, and the recipe itself goes deep enough to cover most of the ins and outs of a specific feature of Windows Azure. Well written, to the point and clear to every reader, both novice and expert.

The book is one of a series of cookbooks published by Packt. They are intended to provide “recipes” showing how to implement specific techniques in a particular technology. They don’t cover getting started scenarios, but do cover some basic techniques, some more advanced techniques and usually one or two expert techniques. From the cookbooks I’ve read, this approach works and should get you up to speed real quick. And that’s no different with this one.

Here’s a chapter overview:

  1. Controlling Access in the Windows Azure Platform
  2. Handling Blobs in Windows Azure
  3. Going NoSQL with Windows Azure Tables
  4. Disconnecting with Windows Azure Queues
  5. Developing Hosted Services for Windows Azure
  6. Digging into Windows Azure Diagnostics
  7. Managing Hosted Services with the Service Management API
  8. Using SQL Azure
  9. Looking at the Windows Azure AppFabric

An interesting sample chapter on the Service Management API can be found here.

Oh and before I forget: Neil, congratulations on your book!  It was a pleasure doing the reviewing!

A client side Glimpse to your PHP application

A few months ago, the .NET world was surprised with a magnificent tool called “Glimpse”. Today I’m pleased to release a first draft of a PHP version of Glimpse! Now what is this Glimpse thing? Well: “what Firebug is for the client, Glimpse does for the server... in other words, a client side Glimpse into what’s going on in your server.”

For a quick demonstration of what this means, check the video at http://getglimpse.com/. Yes, it’s a .NET based video but the idea behind Glimpse for PHP is the same. And if you do need a PHP-based one, check http://screenr.com/27ds (warning: unedited :-))

Fundamentally Glimpse is made up of 3 different parts, all of which are extensible and customizable for any platform:

  • Glimpse Server Module
  • Glimpse Client Side Viewer
  • Glimpse Protocol

This means any server technology that supports the Glimpse protocol can feed the Glimpse Client Side Viewer with information. And that’s what I’ve done.

What can I do with Glimpse?

A lot of things. The most basic usage of Glimpse would be enabling it and inspecting your requests by hand. Here’s a small view of the information provided:

[Image: Glimpse showing phpinfo() data]

By default, Glimpse offers you a glimpse into the current Ajax requests being made, your PHP Configuration, environment info, request variables, server variables, session variables and a trace viewer. And then there’s the remote tab, Glimpse’s killer feature.

When configuring Glimpse through www.yoursite.com/?glimpseFile=Config, you can specify a Glimpse session name. If you do that on a separate device, for example a customer’s browser or a mobile device you are working with, you can distinguish remote sessions in the remote tab. This allows debugging requests that are being made live on other devices! A full description is over at http://getglimpse.com/Help/Plugin/Remote.

[Image: debugging a PHP request from a mobile browser]

Adding Glimpse to your PHP project

Installing Glimpse in a PHP application is very straightforward. Glimpse supports PHP 5.2 and higher.

  • For PHP 5.2, copy the source folder of the repository to your server and add <?php include '/path/to/glimpse/index.php'; ?> as early as possible in your PHP script.
  • For PHP 5.3, copy the glimpse.phar file from the build folder of the repository to your server and add <?php include 'phar://path/to/glimpse.phar'; ?> as early as possible in your PHP script.

Here’s an example of the Hello World page shown above:

<?php
require_once 'phar://../build/Glimpse.phar';
?>
<html>
<head>
    <title>Hello world!</title>
</head>

<?php Glimpse_Trace::info('Rendering body...'); ?>
<body>
    <h1>Hello world!</h1>
    <p>This is just a test.</p>
</body>
<?php Glimpse_Trace::info('Rendered body.'); ?>
</html>

Enabling Glimpse

Once Glimpse is installed in your web application, navigate to it and append the ?glimpseFile=Config query string to enable or disable Glimpse. Optionally, a client name can also be specified to distinguish remote requests.

[Image: configuring Glimpse for PHP]

After enabling Glimpse, a small “eye” icon will appear in the bottom-right corner of your browser. Click it and behold the magic!

Now of course: anyone can potentially enable Glimpse. If you don’t want that, ensure you have some conditional mechanism around the <?php require_once 'phar://../build/Glimpse.phar'; ?> statement.
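A minimal sketch of such a guard, here based on a hypothetical allow-list of IP addresses, could be:

<?php
// Only load Glimpse for trusted IP addresses (list is illustrative).
$trustedIps = array('127.0.0.1', '192.168.0.10');
if (in_array($_SERVER['REMOTE_ADDR'], $trustedIps)) {
    require_once 'phar://../build/Glimpse.phar';
}
?>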

Creating a first Glimpse plugin

Not enough information on your screen? Working with Zend Framework and want to have a look at route values? Working with WordPress and want to view some hidden details about a post through Glimpse? The sky is the limit. All there is to it is creating a Glimpse plugin and registering it. Implementing Glimpse_Plugin_Interface is enough:

<?php
class MyGlimpsePlugin
    implements Glimpse_Plugin_Interface
{
    public function getData(Glimpse $glimpse) {
        $data = array(
            array('Included file path')
        );

        foreach (get_included_files() as $includedFile) {
            $data[] = array($includedFile);
        }

        return array(
            "MyGlimpsePlugin" => count($data) > 0 ? $data : null
        );
    }

    public function getHelpUrl() {
        return null; // or the URL to a help page
    }
}
?>

To register the plugin, add a call to $glimpse->registerPlugin():

<?php
$glimpse->registerPlugin(new MyGlimpsePlugin());
?>

And Bob’s your uncle:

[Image: the custom plugin rendered in Glimpse]

Now what?

Well, it’s up to you. First of all: all feedback is welcome. Second of all: this is on GitHub (https://github.com/Glimpse/Glimpse.PHP). Feel free to fork and extend! Feel free to contribute plugins, core features, whatever you like! Have a lot of CakePHP projects? Why not contribute a plugin that provides a Glimpse at CakePHP diagnostics?

‘Till next time!

Version 4 of the Windows Azure SDK for PHP released

Only a few months after the Windows Azure SDK for PHP 3.0.0, Microsoft and RealDolmen are proud to present the next version of the most complete SDK for Windows Azure out there (yes, that is a rant against the .NET SDK!): Windows Azure SDK for PHP. We’ve been working very hard with an expanding, globally distributed team on getting this version out.

The Windows Azure SDK 4 contains some significant feature enhancements. For example, it now incorporates a PHP library for accessing Windows Azure storage, a logging component, a session sharing component and clients for both the Windows Azure and SQL Azure Management APIs. On top of that, all of these APIs are now also available from the command line, both under Windows and Linux. This means you can batch-script a complete datacenter setup including servers, storage, SQL Azure, firewalls, … If that’s not cool, move to the North Pole.

Here’s the official change log:

  • New feature: Service Management API support for SQL Azure
  • New feature: Service Management APIs exposed as command-line tools
  • New feature: Microsoft_WindowsAzure_RoleEnvironment for retrieving environment details
  • New feature: Package scaffolders
  • Integration of the Windows Azure command-line packaging tool
  • Expansion of the autoloader class increasing performance
  • Several minor bugfixes and performance tweaks


Also keep an eye on www.sdn.nl where I’ll be posting an article on scripting a complete application deployment to Windows Azure, including SQL Azure, storage and firewalls.

And finally: keep an eye on http://azurephp.interoperabilitybridges.com and http://blogs.technet.com/b/port25/. I have a feeling some cool stuff may be coming following this release...

Copy packages from one NuGet feed to another

[Image: copying packages from one NuGet feed to another with MyGet]

Yesterday, a funny discussion was going on at the NuGet Discussion Forum on CodePlex. Funny, you say? Well yes. Funny because it was about a feature we envisioned as a must-have for the NuGet ecosystem: copying packages from one NuGet feed to another. And funny because we already have that feature present in MyGet. You may wonder why anyone would want to do that. Allow me to explain.

Scenarios where copying packages makes sense

The first scenario is feed stability. Imagine you are building a project and expect to always reference a NuGet package from the official feed. That’s OK as long as that package is present in the NuGet feed, but what happens if someone removes it or updates it without respecting proper versioning? This should not happen, but it can be an unpleasant surprise if it does. Copying the package to another feed provides stability: the specific package version is available on that other feed and will never change unless you update or remove it yourself. It puts you in control, not the package owner.

A second scenario: enhanced speed! It’s still much faster to pull packages from a local feed or a feed that’s geographically distributed, like the one MyGet offers (US and Europe at the moment). This is not to bash any carriers or network providers, it’s just physics: electrons don’t travel that fast and it’s better to have them coming from a closer location.

But… how to do it? Client side

There are some solutions to this problem. The first one is a hard one: write a script that just pulls packages from the official feed. You’ll find a suggestion on how to do that here. This approach, however, does not pull along dependencies and forces you to do ugly, user-unfriendly things. Let’s go for beauty :-)

Rob Reynolds (aka @ferventcoder) added some extension sauce to the NuGet.exe:

NuGet.exe Install /ExcludeVersion /OutputDir %LocalAppData%\NuGet\Commands AddConsoleExtension
NuGet.exe addextension nuget.copy.extension
NuGet.exe copy castle.windsor -destination http://myget.org/F/somefeed

Sweet! And Rob also shared how he created this extension (warning: interesting read!).

But… how to do it? Server side

The easiest solution is to just use MyGet! We have a nifty feature in there named “Mirror packages”. It copies the selected package to your private feed, distributes it across our CDN nodes for a fast download and it pulls along all dependencies.

[Image: mirroring a NuGet package in MyGet]

Enjoy making NuGet a component of your enterprise workflow! And MyGet of course as well!

Windows Azure Accelerator for Web Roles

One of the questions I often get around Windows Azure is: “Is Windows Azure interesting for me?” It’s a tough one, because most of the time the person asking already has a server somewhere that hosts 100 websites. In the full-fledged Windows Azure model, that would mean 100 x 2 (we want the SLA) = 200 Windows Azure instances. And a stroke at the end of the month when the bill arrives. Microsoft’s DPE team has released something very interesting for those situations though: the Windows Azure Accelerator for Web Roles.

In short, the WAAWR (no way I’m going to write Windows Azure Accelerator for Web Roles out every time!) is a means of creating virtual web sites on the IIS server running on a Windows Azure instance. Add “multi-instance” to that and you have a free tool that creates a server farm for you!

The features on paper:

  • Deploy sites to Windows Azure in less than 30 seconds
  • Enables deployments to multiple Web Role instances using Web Deploy
  • Saves Web Deploy packages & IIS configuration in Windows Azure storage to provide durability
  • A web administrator portal for managing web sites deployed to the role
  • The ability to upload and manage SSL certificates
  • Simple logging and diagnostics tools

Interesting… Let’s go for a ride!

Obtaining & installing the Windows Azure Accelerator for Web Roles

Installing the WAAWR is as easy as download, extract, buildme.cmd and you’re done. After that, Visual Studio 2010 (or Visual Studio Web Developer Express!) features a new project template:

[Image: creating a new Windows Azure Accelerator for Web Roles project]

Click OK and enter the required information (such as a storage account that will be used for synchronizing the different server instances, and an administrator account). After that, enable remote desktop and publish. That’s it. I’ve never ever set up a web farm more quickly than that.

Creating a web site

After deploying the solution you created in Visual Studio, browse to the live deployment and log in with the administrator credentials you created when creating the project. This will give you a nice-looking web interface which allows you to create virtual web sites and gives some insight into what’s happening in your server farm.

I’ll create a new virtual website on my server farm:

[Image: creating a site in the Windows Azure Accelerator for Web Roles portal]

After clicking Create, we can try publishing an ASP.NET MVC application.

Publishing a web site

For testing purposes I created a simple ASP.NET MVC application. Since the default project template already has a high enough “Hello World factor”, let’s immediately right-click the project name and hit Publish. Deploying an application to the newly created Windows Azure webfarm is as easy as specifying the following parameters:

[Image: Web Deploy publish settings]

One click on Publish later, you are done. Now that it’s deployed on a web farm instance, I can see the website itself but also… some statistics :-)

[Image: the deployed website and its statistics]

Conclusion

The newly released Windows Azure Accelerator for Web Roles is, IMHO, the easiest and fastest way to deploy a multi-site web farm on Windows Azure. Other options like the ones proposed by Steve Marx on his blog do work, but are only suitable if you are billing your customer by the hour.

The fact that it uses Web Deploy to publish applications to the system, and that this just works behind a corporate firewall and an annoying proxy, is just fabulous!

This also has a downside: if I want to push my PHP applications to Windows Azure in a jiffy, chances are this will be a problem. Not on Windows (though it’s not ideal there either), but certainly when publishing apps from other platforms. Is that a self-imposed challenge? Nope. Web Deploy does not seem to have an open protocol (that I know of) and while reverse engineering it is possible, I will not risk the legal consequences :-) However, some reverse engineering of the WAAWR itself taught me that websites are stored as a ZIP package on blob storage, and there’s a PHP SDK for that. A nifty workaround is possible as such, if you get your head around the ZIP file folder structure.
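As a rough sketch of that workaround using the Windows Azure SDK for PHP (account credentials and container/blob names are placeholders; the real names require inspecting the accelerator’s storage account):

<?php
require_once 'Microsoft/WindowsAzure/Storage/Blob.php';

$blobClient = new Microsoft_WindowsAzure_Storage_Blob(
    'blob.core.windows.net', 'myaccount', '<account-key>');

// Download the site package the accelerator stored, modify it, upload it again.
$blobClient->getBlob('sites', 'mysite.zip', 'mysite.zip');
// ... rewrite the ZIP contents locally ...
$blobClient->putBlob('sites', 'mysite.zip', 'mysite.zip');
?>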

My conclusion in short: if you ever receive the question “Is Windows Azure interesting for me?” from someone who wants to host a bunch of websites on it? It is. And it’s easy.

A hidden gem in the Windows Azure SDK for PHP: command line parsing

It’s always fun to dive into frameworks: often you’ll find little hidden gems that can be of great use in your own projects. A dive into the Windows Azure SDK for PHP taught me that there’s a nifty command line parsing tool in there which makes your life easier when writing command line scripts.

Usually when creating a command line script you would parse $_SERVER['argv'], validate values and check whether required switches are available or not. With the Microsoft_Console_Command class from the Windows Azure SDK for PHP, you can ease this task. Let’s compare writing a simple “hello” command.

Command-line hello world the ugly way

Let’s start creating a script that can be invoked from the command line. The first argument will be the command to perform, in this case “hello”. The second argument will be the name of the person we want to say hello to.

$command = null;
$name = null;

if (isset($_SERVER['argv'])) {
    $command = $_SERVER['argv'][1];
}

// Process "hello"
if ($command == "hello") {
    $name = $_SERVER['argv'][2];
    echo "Hello $name";
}

Pretty obvious, no? Now let’s add some “help” as well:

$command = null;
$name = null;

if (isset($_SERVER['argv'])) {
    $command = $_SERVER['argv'][1];
}

// Process "hello"
if ($command == "hello") {
    $name = $_SERVER['argv'][2];
    echo "Hello $name";
}

if ($command == "") {
    echo "Help for this command\r\n";
    echo "Possible commands:\r\n";
    echo "  hello - Says hello.\r\n";
}

To be honest: I find this utter clutter. And it’s how many command line scripts for PHP are written today. Imagine this script having multiple commands and some parameters that come from arguments, some from environment variables, …

Command-line hello world the easy way

With the Windows Azure SDK for PHP tooling, I can replace the first check (“which command do you want”) by creating a class that extends Microsoft_Console_Command. Note that I also decorated the class with some special docblock annotations which will be used later on by the built-in help generator. Bear with me :-)

/**
 * Hello world
 *
 * @command-handler hello
 * @command-handler-description Hello world.
 * @command-handler-header (C) Maarten Balliauw
 */
class Hello extends Microsoft_Console_Command
{
}

Microsoft_Console_Command::bootstrap($_SERVER['argv']);

Also notice that in the example above, the last line actually bootstraps the command. Which is done in an interesting way: the arguments for the script are passed in as an array. This means that you can also abuse this class to create “subcommands” to which you pass a different array of parameters.
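For example, nothing stops you from invoking a command programmatically with a hand-crafted argument array instead of $_SERVER['argv']:

<?php
// Run the "hello" command with a fabricated argv-style array.
Microsoft_Console_Command::bootstrap(array('script.php', 'hello', '-n=Maarten'));
?>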

To add a command implementation, just create a method and annotate it again:

/**
 * @command-name hello
 * @command-description Say hello to someone
 * @command-parameter-for $name Microsoft_Console_Command_ParameterSource_Argv --name|-n Required. Name to say hello to.
 * @command-example Print "Hello, Maarten":
 * @command-example   hello -n="Maarten"
 */
public function helloCommand($name) {
    echo "Hello, $name";
}

Easy, no? I think this is pretty self-descriptive:

  • I have a command named “hello”
  • It has a description
  • It takes one parameter $name, for which the value can be provided from arguments (Microsoft_Console_Command_ParameterSource_Argv). If passed as an argument, it’s called “--name” or “-n”. And there’s a description as well.

Besides arguments, I’ve found that there are other sources for parameter values as well:

  • Microsoft_Console_Command_ParameterSource_Argv – Gets the value from the command arguments
  • Microsoft_Console_Command_ParameterSource_StdIn – Gets the value from StdIn, which enables you to create “piped” commands
  • Microsoft_Console_Command_ParameterSource_Env – Gets the value from an environment variable

The best part: help is generated for you! Just run the script without any further arguments:

(C) Maarten Balliauw
Hello world.

Available commands:

  hello                       Say hello to someone
    --name, -n                Required. Name to say hello to.
    Example usage:
      Print "Hello, Maarten":
        hello -n="Maarten"

  <default>, -h, -help, help  Displays the current help information.

Magic at its best! Enjoy!

A first look at Windows Azure AppFabric Applications

After the Windows Azure AppFabric team announced the availability of Windows Azure AppFabric Applications (preview), I signed up for early access immediately and got in. After installing the tools and creating a namespace through the portal, I decided to give it a try to see what it’s all about. Note that Neil Mackenzie also has an extensive post on “WAAFapps” which I recommend you read as well.

So what is this Windows Azure AppFabric Applications thing?

Before answering that question, let’s have a brief look at what Windows Azure is today. According to Microsoft, Windows Azure is a “PaaS” (Platform-as-a-Service) offering. What that means is that Windows Azure offers a series of platform components like compute, storage, caching, authentication, a service bus, a database, a CDN, … to your applications.

Consuming those components is pretty low level though: in order to use, let’s say, caching, one has to add the required references, make some web.config changes and open up a connection to these things. OK, an API is provided, but you can’t seamlessly integrate caching into an application in seconds (the way you can integrate file system access, which literally takes seconds).
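To illustrate, here’s roughly what that “low level” route looks like with the Windows Azure AppFabric Caching client (the endpoint and authentication normally live in a dataCacheClients section in web.config; the cache name here is illustrative):

// Requires a reference to Microsoft.ApplicationServer.Caching.Client
// and cache endpoint/authentication settings in web.config.
using Microsoft.ApplicationServer.Caching;

public static class CacheClient
{
    // DataCacheFactory is expensive to create, so keep a single instance around.
    private static readonly DataCacheFactory Factory = new DataCacheFactory();

    public static DataCache GetCache()
    {
        return Factory.GetCache("default"); // illustrative cache name
    }
}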

Meet Windows Azure AppFabric Applications. Windows Azure AppFabric Applications (why such long names, Microsoft!) redefines the concept of Platform-as-a-Service: where Windows Azure out of the box is more like a “Platform API-as-a-Service”, Windows Azure AppFabric Applications offers tools and platform support for easily integrating the various Windows Azure components.

This “redefinition” of Windows Azure introduces some new concepts: in Windows Azure you have roles and role instances. In AppFabric Applications you don’t have that concept: AFA (yes, I managed to abbreviate it!) uses so-called Containers. A Container is a logical unit in which one or more services of an application are hosted. For example, if you have 2 web applications, caching and SQL Azure, you will (by default) have one Container containing 2 web applications + 2 service references: one for caching, one for SQL Azure.

Containers are not limited to one role or role instance: a container is a set of predeployed role instances on which your applications will run. For example, if you add a WCF service, chances are that this will be part of the same container. Or a different one if you specify otherwise.

It’s pretty interesting that you can scale containers separately. For example, one can have 2 scale units for the container holding the web applications and 3 for the WCF container. A scale unit is not necessarily just one extra instance: it depends on how many services are in a container. In fact, you shouldn’t care anymore about role instances and virtual machines: with AFA (my abbreviation for Windows Azure AppFabric Applications, remember) one can now truly care about only one thing: the application you are building.

Hello, Windows Azure AppFabric Applications

Visual Studio tooling support

To demonstrate a few concepts, I decided to create a simple web application that uses caching to store the number of visits to the website. After installing the Visual Studio tooling, I started with one of the templates contained in the SDK:

[Image: creating a Windows Azure AppFabric Application project]

This template creates a few things. To start with, 2 projects are created in Visual Studio: one MVC application in which I’ll create my web application, and one Windows Azure AppFabric Application containing a file App.cs which seems to be a DSL for building Windows Azure AppFabric Applications. Opening this DSL gives the following canvas in Visual Studio:

[Image: the App.cs canvas for a Windows Azure AppFabric Application]

As you can see, this is an overview of my application’s components as well as how they interact with each other. For example, the “MVCWebApp” has 1 endpoint (to serve HTTP requests) + 2 service references (to Windows Azure AppFabric caching and SQL Azure). This is an important notion, as it will generate integration code for you. For example, in my MVC web application I can find the ServiceReferences.g.cs file containing the following code:

class ServiceReferences
{
    public static Microsoft.ApplicationServer.Caching.DataCache CreateImport1()
    {
        return Service.ExecutingService.ResolveImport<Microsoft.ApplicationServer.Caching.DataCache>("Import1");
    }

    public static System.Data.SqlClient.SqlConnection CreateImport2()
    {
        return Service.ExecutingService.ResolveImport<System.Data.SqlClient.SqlConnection>("Import2");
    }
}

Wait a minute… This looks like a cool thing! It’s basically a factory for components that may be hosted elsewhere! Calling ServiceReferences.CreateImport1() gives me a caching client that I can immediately work with! ServiceReferences.CreateImport2() (you can change these names by the way) gives me a connection to SQL Azure. No need to add connection strings to the application itself, no need to configure caching in the application itself. Instead, I can configure these things on the Windows Azure AppFabric Application canvas and just consume them blindly in my code. Awesome!
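The same goes for the SQL Azure import: the generated factory hands you a plain SqlConnection, so consuming it is business as usual (a sketch; the Visits table is illustrative):

// using System.Data.SqlClient;
var connection = ServiceReferences.CreateImport2();
using (connection)
{
    connection.Open();
    using (var command = new SqlCommand("SELECT COUNT(*) FROM Visits", connection))
    {
        var visits = (int)command.ExecuteScalar();
    }
}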

Here’s the code for my HomeController, where I consume the cache. Even my grandmother can write this!

[HandleError]
public class HomeController : Controller
{
    public ActionResult Index()
    {
        var count = 1;
        var cache = ServiceReferences.CreateImport1();
        var countItem = cache.GetCacheItem("visits");
        if (countItem != null)
        {
            count = ((int)countItem.Value) + 1;
        }
        cache.Put("visits", count);

        ViewData["Message"] = string.Format("You are visitor number {0}.", count);

        return View();
    }

    public ActionResult About()
    {
        return View();
    }
}

Now let’s go back to the Windows Azure AppFabric Application canvas, where I can switch to “Deployment View”:

[Image: Deployment View of a Windows Azure AppFabric Application]

Deployment View basically lets you decide in which container one or more applications will be running and how many scale units a container should span (see the properties window in Visual Studio for this).

Right-clicking and selecting “Deploy…” deploys my Windows Azure AppFabric Application to the production environment.

The management portal

After logging in to http://portal.appfabriclabs.com, I can manage the application I just published:

[Image: the Windows Azure AppFabric Applications management portal]

I’m not going to go into much detail but will highlight some features. The portal enables you to manage your application: deploy/undeploy, scale, monitor, change configuration, … Basically everything you would expect to be able to do. And more! If you look at the monitoring part, for example, you will see some KPIs for your application. Here’s what my sample application shows after being deployed for a few minutes:

[Image: monitoring and latency statistics for the sample application]

Pretty slick. It even monitors average latencies etc.!

Conclusion

As you can read in this blog post, I’ve been exploring this product and trying out the basics of it. I’m not sure yet if this model will fit every application, but I’m sure a solution like this is where the future of PaaS should be: no longer caring about servers, VMs or instances, just deploy and let the platform figure everything out. My business value is my application, not the fact that it spans 2 VMs.

Now when I say “future of PaaS”, I’m also a bit skeptical… Most customers I work with use COM, require startup scripts to configure the environment, and care about the server their application runs on. In fact, some applications will never be able to be deployed on this solution because of that. Where Windows Azure already represents a major shift in development paradigm (too large a step for many!), I think the step to Windows Azure AppFabric Applications is a bridge too far for most people. At present.

But then there are corporations… As corporations are always 10 steps behind, I foresee that this will only become mainstream in the enterprise within the next 5 to 8 years. Too bad! I wish most corporate environments moved faster…

If Microsoft wants this thing to succeed, I think they need to work even more on shifting minds to the cloud paradigm, and more specifically to the PaaS paradigm. Perhaps Windows 8 can be a vehicle for this: if Windows 8 shifts from “programming for a Windows environment” to “programming for a PaaS environment”, people will start following that direction. What the heck, maybe this is even a great model for Joe Average to create “apps” for Windows 8! Just like one submits an app to the App Store or Marketplace today, one could submit an app to a “Windows Marketplace” which, in the background, just drops everything on a technology like Windows Azure AppFabric Applications.