Maarten Balliauw {blog}

ASP.NET MVC, Microsoft Azure, PHP, web development ...


A first look at Windows Azure AppFabric Applications

After the Windows Azure AppFabric team announced the availability of Windows Azure AppFabric Applications (preview), I signed up for early access immediately and got in. After installing the tools and creating a namespace through the portal, I decided to give it a try to see what it’s all about. Note that Neil Mackenzie also has an extensive post on “WAAFapps”, which I recommend you read as well.

So what is this Windows Azure AppFabric Applications thing?

Before answering that question, let’s have a brief look at what Windows Azure is today. According to Microsoft, Windows Azure is a “PaaS” (Platform-as-a-Service) offering. What that means is that Windows Azure offers a series of platform components like compute, storage, caching, authentication, a service bus, a database, a CDN, … to your applications.

Consuming those components is pretty low level though: in order to use, say, caching, one has to add the required references, make some web.config changes and open a connection to it. Sure, an API is provided, but you can’t seamlessly integrate caching into an application in seconds, the way you can integrate file system access in seconds.
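
For contrast, here is roughly what that “low level” route looks like today: a minimal sketch of wiring up Windows Azure AppFabric caching by hand, where the cache namespace, port and authentication token are placeholders rather than values from any real subscription. And that is before the accompanying web.config changes.

using System.Security;
using Microsoft.ApplicationServer.Caching;

public static class ManualCacheSetup
{
    // Rough sketch: configure the cache client by hand.
    // "mynamespace" and the authentication token are placeholders.
    public static DataCache CreateCache(SecureString authenticationToken)
    {
        var configuration = new DataCacheFactoryConfiguration
        {
            Servers = new[] { new DataCacheServerEndpoint("mynamespace.cache.windows.net", 22233) },
            SecurityProperties = new DataCacheSecurity(authenticationToken)
        };

        var factory = new DataCacheFactory(configuration);
        return factory.GetDefaultCache();
    }
}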

Meet Windows Azure AppFabric Applications. Windows Azure AppFabric Applications (why such long names, Microsoft!) redefines the concept of Platform-as-a-Service: where Windows Azure out of the box is more like a “Platform API-as-a-Service”, Windows Azure AppFabric Applications offers tools and platform support for easily integrating the various Windows Azure components.

This “redefinition” of Windows Azure introduces some new concepts: in Windows Azure you have roles and role instances. In AppFabric Applications you don’t have that concept: AFA (yes, I managed to abbreviate it!) uses so-called Containers. A Container is a logical unit in which one or more services of an application are hosted. For example, if you have 2 web applications, caching and SQL Azure, you will (by default) have one Container containing 2 web applications + 2 service references: one for caching, one for SQL Azure.

Containers are not limited to one role or role instance: a container is a set of predeployed role instances on which your applications will run. For example, if you add a WCF service, chances are that this will be part of the same container. Or a different one if you specify otherwise.

It’s pretty interesting that you can scale containers separately. For example, one can have 2 scale units for the container containing web applications, 3 for the WCF container, … A scale unit is not necessarily just one extra instance: it depends on how many services are in the container. In fact, you shouldn’t care about role instances and virtual machines anymore: with AFA (my abbreviation for Windows Azure AppFabric Applications, remember) one can now truly care about only one thing: the application you are building.

Hello, Windows Azure AppFabric Applications

Visual Studio tooling support

To demonstrate a few concepts, I decided to create a simple web application that uses caching to store the number of visits to the website. After installing the Visual Studio tooling, I started with one of the templates contained in the SDK:

Creating a Windows Azure AppFabric Application

This template creates a few things. To start with, 2 projects are created in Visual Studio: one MVC application in which I’ll create my web application, and one Windows Azure AppFabric Application project containing a file App.cs, which is essentially a DSL for describing a Windows Azure AppFabric Application. Opening this DSL gives the following canvas in Visual Studio:

App.cs Windows Azure AppFabric Applications

As you can see, this is the overview of my application’s components as well as how they interact with each other. For example, the “MVCWebApp” has 1 endpoint (to serve HTTP requests) + 2 service references (to Windows Azure AppFabric caching and SQL Azure). This is an important notion, as it will generate integration code for you. For example, in my MVC web application I can find the ServiceReferences.g.cs file containing the following code:

class ServiceReferences
{
    public static Microsoft.ApplicationServer.Caching.DataCache CreateImport1()
    {
        return Service.ExecutingService.ResolveImport<Microsoft.ApplicationServer.Caching.DataCache>("Import1");
    }

    public static System.Data.SqlClient.SqlConnection CreateImport2()
    {
        return Service.ExecutingService.ResolveImport<System.Data.SqlClient.SqlConnection>("Import2");
    }
}

Wait a minute… This looks like a cool thing! It’s basically a factory for components that may be hosted elsewhere! Calling ServiceReferences.CreateImport1() will give me a caching client that I can immediately work with! ServiceReferences.CreateImport2() (you can change these names by the way) gives me a connection to SQL Azure. No need to add connection strings in the application itself, no need to configure caching in the application itself. Instead, I can configure these things in the Windows Azure AppFabric Application canvas and just consume them blindly in my code. Awesome!

Here’s the code for my HomeController, where I consume the cache. Even my grandmother can write this!

[HandleError]
public class HomeController : Controller
{
    public ActionResult Index()
    {
        var count = 1;
        var cache = ServiceReferences.CreateImport1();
        var countItem = cache.GetCacheItem("visits");
        if (countItem != null)
        {
            count = ((int)countItem.Value) + 1;
        }
        cache.Put("visits", count);

        ViewData["Message"] = string.Format("You are visitor number {0}.", count);

        return View();
    }

    public ActionResult About()
    {
        return View();
    }
}

Now let’s go back to the Windows Azure AppFabric Application canvas, where I can switch to “Deployment View”:

Windows Azure AppFabric Application Deployment View

Deployment View basically lets you decide in which container one or more applications will be running and how many scale units a container should span (see the properties window in Visual Studio for this).

Right-clicking and selecting “Deploy…” deploys my Windows Azure AppFabric Application to the production environment.

The management portal

After logging in to http://portal.appfabriclabs.com, I can manage the application I just published:

Windows Azure AppFabric Application Management Portal

I’m not going to go into much detail but will highlight some features. The portal enables you to manage your application: deploy/undeploy, scale, monitor, change configuration, … Basically everything you would expect to be able to do. And more! If you look at the monitoring part, for example, you will see some KPIs for your application. Here’s what my sample application shows after being deployed for a few minutes:

Windows Azure AppFabric Applications monitoring and latency

Pretty slick. It even monitors average latencies etc.!

Conclusion

As you can read in this blog post, I’ve been exploring this product and trying out the basics of it. I’m not sure yet if this model will fit every application, but I’m sure a solution like this is where the future of PaaS should be: no longer caring about servers, VMs or instances, just deploy and let the platform figure everything out. My business value is my application, not the fact that it spans 2 VMs.

Now when I say “future of PaaS”, I’m also a bit skeptical… Most customers I work with use COM, require startup scripts to configure the environment and care about the server their application runs on. In fact, some applications will never be able to be deployed on this solution because of that. Where Windows Azure already represents a major shift in development paradigm (too large a step for many!), I think the step to Windows Azure AppFabric Applications is a bridge too far for most people. At present.

And then there are corporations… As corporations are always 10 steps behind, I foresee that this will only become mainstream in the enterprise within the next 5-8 years. Too bad! I wish most corporate environments moved faster…

If Microsoft wants this thing to succeed, I think they need to work even more on shifting minds to the cloud paradigm, and more specifically to the PaaS paradigm. Perhaps Windows 8 can be a vehicle for this: if Windows 8 shifts from “programming for a Windows environment” to “programming for a PaaS environment”, people will start following that direction. What the heck, maybe this is even a great model for Joe Average to create “apps” for Windows 8! Just like one submits an app to the AppStore or Marketplace today, he/she could submit an app to a “Windows Marketplace” which in the background just drops everything on a technology like Windows Azure AppFabric Applications.

Officially a cloudhead now! (or: re-awarded MVP)

Woohoo! I just received the great mail I expect yearly on the first of July:

Dear Maarten Balliauw,

Congratulations! We are pleased to present you with the 2011 Microsoft® MVP Award! This award is given to exceptional technical community leaders who actively share their high quality, real world expertise with others. We appreciate your outstanding contributions in Windows Azure technical communities during the past year.

The Microsoft MVP Award provides us the unique opportunity to celebrate and honor your significant contributions and say "Thank you for your technical leadership."

Toby Richards
General Manager
Community & Online Support

This e-mail is not that clear about which technology one is an MVP for, so I dug in… It seems that I will be leaving the largest group of MVPs around, ASP.NET (sorry guys, you were great!), and joining the cloudheads of the Windows Azure group.

Thanks everyone for keeping me motivated in working with the community, sharing knowledge and providing me time to do all this. That last one means: thank you, boss, and thank you to my lovely wife!

Let’s start working on earning the award for next year…

Delegate feed privileges to other users on MyGet

One of the first features we had envisioned for MyGet and which seemed increasingly popular was the ability to provide other users a means of managing packages on another user’s feed.

As of today, we’re proud to announce the following new features:

  • Delegating feed privileges to other users – This allows you to make another MyGet user “co-admin” or “contributor” to a feed. This eases management of a private feed as that work can be spread across multiple people.
  • Making private feeds private by requiring authentication – It’s now possible to configure a feed so that nobody can consult its list of packages unless a valid login is provided. This feature is not yet available for use with NuGet 1.4.
  • Global deployment – We’ve updated our deployment so managing feeds can now be done on a server that’s closer to you.

Now when is Microsoft going to buy us out :-)

Delegating feed privileges to other users

MyGet now allows you to make another MyGet user “co-admin” or “contributor” to a feed. This eases management of a private feed, as that work can be spread across multiple people. If combined with the “private feeds” option, it’s also possible to give some users read access to the feed while unauthenticated users cannot access the feed at all.

To delegate privileges to a user, navigate to the feed details and click the Feed security tab. This tab allows you to change feed privileges for different users. Adding feed privileges can be done by clicking the Add feed privileges… button (duh!).

Add MyGet feed privileges

Available privileges are:

  • Has no access to this feed (speaks for itself)
  • Can consume this feed (allows the user to use the feed in Visual Studio / NuGet)
  • Can manage packages for this feed (allows the user to add packages to the feed via the website and via the NuGet push API)
  • Can manage users and packages for this feed (extends the above with feed privilege management capabilities)

After selecting the privileges, the user receives an e-mail in which he/she can claim the acquired privileges:

Claim MyGet feed privileges

Privileges are not granted immediately: after assigning privileges, the user has to claim them by clicking a link in an automated e-mail that has been sent.

Making private feeds private by requiring authentication

It’s now possible to configure a feed so that nobody can consult its list of packages unless a valid login is provided. Combined with the feed privilege delegation feature, one can granularly control who can and who cannot consume a feed from MyGet. Note that this feature is not yet available for use with NuGet 1.4; we hope to see support for this shipping with NuGet 1.5.

In order to enable this feature, on the Feed security tab change feed privileges for Everyone to Has no access to this feed.

NuGet feed authentication

This will instruct MyGet to request basic authentication when someone accesses a MyGet feed. For example, try our sample feed: http://www.myget.org/F/mygetsample/
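
If you want to verify that behavior from code rather than a browser, a plain HTTP request with credentials attached will do. A minimal sketch using WebClient, where the username and password are placeholders for a MyGet account that has consume rights on the feed:

using System;
using System.Net;

class FeedAuthenticationCheck
{
    static void Main()
    {
        // Placeholder credentials for an account that can consume the feed.
        var client = new WebClient
        {
            Credentials = new NetworkCredential("someuser", "somepassword")
        };

        // Without credentials the request is answered with a 401 challenge;
        // with valid credentials the OData package feed is returned.
        string feedXml = client.DownloadString("http://www.myget.org/F/mygetsample/");
        Console.WriteLine(feedXml);
    }
}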

Global deployment

We’ve updated our deployment so managing feeds can now be done on a server that’s closer to you. Currently we have a deployment running in a European datacenter and one in the US. We hope to expand this further as well as leverage a content delivery network for high-speed distribution of packages.


We need your opinion!

Features keep popping into our heads, and the spare time we have to work on MyGet simply isn’t enough. To support some extra development, we are thinking along the lines of introducing a premium version which you can host in your own datacenter or on a dedicated cloud environment. We would love some feedback on the following survey:

Enabling conditional Basic HTTP authentication on a WCF OData service

Yes, a long title, but also something I was not able to find too easily using Google. Here’s the situation: for MyGet, we are implementing basic authentication on the OData feed serving available NuGet packages. If you recall my post Using dynamic WCF service routes, you may have deduced that MyGet uses that technique to have one WCF OData service serving the feeds of all our users. It’s just convenient! Unless you want basic HTTP authentication for some feeds and not for others…

After doing some research, I thought the easiest way to solve this would be WCF message interception. Convenient, but how would you go about this? And moreover: how do you make it extensible so we can use this for other WCF OData (or Web API) services in the future?

The solution was to create a message inspector (IDispatchMessageInspector). Here’s the implementation we created for MyGet (disclaimer: this will only work for OData services and Web API!):

 

public class ConditionalBasicAuthenticationMessageInspector : IDispatchMessageInspector
{
    protected IBasicAuthenticationCondition Condition { get; private set; }
    protected IBasicAuthenticationProvider Provider { get; private set; }

    public ConditionalBasicAuthenticationMessageInspector(
        IBasicAuthenticationCondition condition, IBasicAuthenticationProvider provider)
    {
        Condition = condition;
        Provider = provider;
    }

    public object AfterReceiveRequest(ref Message request, IClientChannel channel, InstanceContext instanceContext)
    {
        // Determine HttpContextBase
        if (HttpContext.Current == null)
        {
            return null;
        }
        HttpContextBase httpContext = new HttpContextWrapper(HttpContext.Current);

        // Is basic authentication required?
        if (Condition.Evaluate(httpContext))
        {
            // Extract credentials
            string[] credentials = ExtractCredentials(request);

            // Are credentials present? If so, is the user authenticated?
            if (credentials.Length > 0 && Provider.Authenticate(httpContext, credentials[0], credentials[1]))
            {
                httpContext.User = new GenericPrincipal(
                    new GenericIdentity(credentials[0]), new string[] { });
                return null;
            }

            // Require authentication
            HttpContext.Current.Response.StatusCode = 401;
            HttpContext.Current.Response.StatusDescription = "Unauthorized";
            HttpContext.Current.Response.Headers.Add("WWW-Authenticate", string.Format("Basic realm=\"{0}\"", Provider.Realm));
            HttpContext.Current.Response.End();
        }

        return null;
    }

    public void BeforeSendReply(ref Message reply, object correlationState)
    {
        // Noop
    }

    private string[] ExtractCredentials(Message requestMessage)
    {
        HttpRequestMessageProperty request = (HttpRequestMessageProperty)requestMessage.Properties[HttpRequestMessageProperty.Name];

        string authHeader = request.Headers["Authorization"];

        if (authHeader != null && authHeader.StartsWith("Basic"))
        {
            string encodedUserPass = authHeader.Substring(6).Trim();

            Encoding encoding = Encoding.GetEncoding("iso-8859-1");
            string userPass = encoding.GetString(Convert.FromBase64String(encodedUserPass));
            int separator = userPass.IndexOf(':');

            string[] credentials = new string[2];
            credentials[0] = userPass.Substring(0, separator);
            credentials[1] = userPass.Substring(separator + 1);

            return credentials;
        }

        return new string[] { };
    }
}

Our ConditionalBasicAuthenticationMessageInspector implements a WCF message inspector that, once a request has been received, checks the HTTP authentication headers for a basic username/password. One extra thing: since we wanted conditional authentication, there is also an IBasicAuthenticationCondition interface, which decides whether or not to invoke authentication for a request. The authentication itself is done by calling into our IBasicAuthenticationProvider. Implementations of both can be found on our CodePlex site.
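
For reference, here is roughly what those two contracts look like, inferred from how the inspector above uses them (the exact definitions live in the MyGet sources on CodePlex). The sample condition at the end is purely illustrative and not MyGet’s actual implementation:

using System;
using System.Web;

public interface IBasicAuthenticationCondition
{
    // Returns true when the current request should be subject to basic authentication.
    bool Evaluate(HttpContextBase context);
}

public interface IBasicAuthenticationProvider
{
    // Realm reported in the WWW-Authenticate challenge.
    string Realm { get; }

    // Validates the supplied credentials for the current request.
    bool Authenticate(HttpContextBase context, string username, string password);
}

// Illustrative condition: only requests under a hypothetical "/F/private/" route require authentication.
public class PrivateFeedBasicAuthenticationCondition : IBasicAuthenticationCondition
{
    public bool Evaluate(HttpContextBase context)
    {
        return context.Request.Path.StartsWith("/F/private/", StringComparison.OrdinalIgnoreCase);
    }
}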

If you are getting optimistic: great! But how do you apply this message inspector to a WCF service? No worries: you can create a behavior for that. All you have to do is create a new attribute that implements IServiceBehavior and, in that implementation, register the ConditionalBasicAuthenticationMessageInspector on the service endpoints. Here’s the implementation:

[AttributeUsage(AttributeTargets.Class)]
public class ConditionalBasicAuthenticationInspectionBehaviorAttribute
    : Attribute, IServiceBehavior
{
    protected IBasicAuthenticationCondition Condition { get; private set; }
    protected IBasicAuthenticationProvider Provider { get; private set; }

    public ConditionalBasicAuthenticationInspectionBehaviorAttribute(
        IBasicAuthenticationCondition condition, IBasicAuthenticationProvider provider)
    {
        Condition = condition;
        Provider = provider;
    }

    public ConditionalBasicAuthenticationInspectionBehaviorAttribute(
        Type condition, Type provider)
    {
        Condition = Activator.CreateInstance(condition) as IBasicAuthenticationCondition;
        Provider = Activator.CreateInstance(provider) as IBasicAuthenticationProvider;
    }

    public void AddBindingParameters(ServiceDescription serviceDescription, ServiceHostBase serviceHostBase, Collection<ServiceEndpoint> endpoints, BindingParameterCollection bindingParameters)
    {
        // Noop
    }

    public void ApplyDispatchBehavior(ServiceDescription serviceDescription, ServiceHostBase serviceHostBase)
    {
        foreach (ChannelDispatcher channelDispatcher in serviceHostBase.ChannelDispatchers)
        {
            foreach (EndpointDispatcher endpointDispatcher in channelDispatcher.Endpoints)
            {
                endpointDispatcher.DispatchRuntime.MessageInspectors.Add(
                    new ConditionalBasicAuthenticationMessageInspector(Condition, Provider));
            }
        }
    }

    public void Validate(ServiceDescription serviceDescription, ServiceHostBase serviceHostBase)
    {
        // Noop
    }
}

One step left to do: apply this service behavior to our OData service. Easy! Just add the attribute to the service class and you’re done. Note that we specify the IBasicAuthenticationCondition and IBasicAuthenticationProvider types on the attribute.

[ConditionalBasicAuthenticationInspectionBehavior(
    typeof(MyGetBasicAuthenticationCondition),
    typeof(MyGetBasicAuthenticationProvider))]
public class PackageFeedHandler
    : DataService<PackageEntities>,
      IDataServiceStreamProvider,
      IServiceProvider
{
}

Enjoy!

Community Day 2011 - Fun with ASP.NET MVC, MEF and NuGet

To start the blog post: AWESOME! That’s what I have to say about the latest edition of Community Day 2011. I had the privilege of doing a session on ASP.NET MVC 3, MEF and NuGet, and as promised to the audience: here are the slides. For those who want to see the session, a recording from a previous event can be found on Channel 9.

“Fun with ASP.NET MVC3, MEF and NuGet”
Community Day 2011, Mechelen, Belgium, 23/06/2011

Abstract: “So you have a team of developers… And a nice architecture to build on… How about making that architecture easy for everyone and getting developers up to speed quickly? Learn all about integrating the Managed Extensibility Framework (MEF) and ASP.NET MVC with some NuGet sauce for creating loosely coupled, easy to use architectures that anyone can grasp.”

Here are the slides:

Here’s the example code: Fun with ASP.NET MVC 3 MEF - CommunityDay 2011.zip (6.79 mb)

Some resources:

Advanced scenarios with Windows Azure Queues

For DeveloperFusion, I wrote an article on Windows Azure queues. Interested in working with queues and want to use some advanced techniques? Head over to the article:

Last week, in Brian Prince’s article, Using the Queuing Service in Windows Azure, you saw how to create, add messages into, retrieve and consume those messages from Windows Azure Queues. While being a simple, easy-to-use mechanism, a lot of scenarios are possible using this near-FIFO queuing mechanism. In this article we are going to focus on three scenarios which show how queues can be an important and extremely scalable component in any application architecture:

  • Back off polling, a method to lessen the number of transactions in your queue and therefore reduce the bandwidth used.
  • Patterns for working with large queue messages, a method to overcome the maximum size of a queue message and support a greater amount of data.
  • Using a queue as a state machine.

The techniques used in every scenario can be re-used in many applications and often be combined into an approach that is both scalable and reliable.

To get started, you will need to install the Windows Azure Tools for Microsoft Visual Studio. The current version is 1.4, and that is the version we will be using. You can download it from http://www.microsoft.com/windowsazure/sdk/.

N.B. The code that accompanies this article comes as a single Visual Studio solution with three console application projects in it, one for each scenario covered. To keep code samples short and to the point, the article covers only the code that polls the queue and not the additional code that populates it. Readers are encouraged to discover this themselves.
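
To give a feeling for the first scenario described above, here is a rough back-off polling loop against a Windows Azure queue using the StorageClient library that ships with the 1.4 tools. The queue name and the back-off bounds are arbitrary choices for this sketch, not values taken from the article’s sample solution:

using System;
using System.Threading;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

class BackOffPollingSketch
{
    static void Main()
    {
        CloudStorageAccount account = CloudStorageAccount.DevelopmentStorageAccount;
        CloudQueueClient queueClient = account.CreateCloudQueueClient();
        CloudQueue queue = queueClient.GetQueueReference("workitems"); // arbitrary queue name
        queue.CreateIfNotExist();

        TimeSpan delay = TimeSpan.FromSeconds(1);     // initial polling interval
        TimeSpan maxDelay = TimeSpan.FromSeconds(30); // upper bound for backing off

        while (true)
        {
            CloudQueueMessage message = queue.GetMessage();
            if (message != null)
            {
                Console.WriteLine("Processing: " + message.AsString);
                queue.DeleteMessage(message);

                // Work was found: reset to the shortest polling interval.
                delay = TimeSpan.FromSeconds(1);
            }
            else
            {
                // Queue is empty: wait, then double the delay (capped) to reduce transactions.
                Thread.Sleep(delay);
                delay = TimeSpan.FromTicks(Math.Min(delay.Ticks * 2, maxDelay.Ticks));
            }
        }
    }
}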

Want more content? Check Advanced scenarios with Windows Azure Queues. Enjoy!

MyGet now supports pushing from the command line

One of the work items we had opened for MyGet was the ability to push packages to a private feed from the command line. Only a few hours after our initial launch, David Fowler provided us with example code on how to implement NuGet command line pushes on the server side. An evening of coding later, I quickly hacked this into MyGet, which means that we now support pushing packages from the command line!

For those that did not catch up with my blog post overload of the past week: MyGet offers you the possibility to create your own, private, filtered NuGet feed for use in the Visual Studio Package Manager.  It can contain packages from the official NuGet feed as well as your private packages, hosted on MyGet. Want a sample? Add this feed to your Visual Studio package manager: http://www.myget.org/F/chucknorris

Pushing a package from the command line to MyGet

The first thing you’ll be needing is an API key for your private feed. This can be obtained through the “Edit Feed” link, where you’ll see an API key listed as well as a button to regenerate the API key, just in case someone steals it from you while giving a demo of MyGet :-)


Once you have the API key, it can be stored into the NuGet executable’s settings by running the following command, including your private API key and your private feed URL:

NuGet setApiKey c18673a2-7b57-4207-8b29-7bb57c04f070 -Source http://www.myget.org/F/testfeed

After that, you can easily push a package to your private feed. The package will automatically show up on the website and your private feed. Do note that this can take a few minutes to propagate.

NuGet push RouteMagic.0.2.2.2.nupkg -Source http://www.myget.org/F/testfeed

More on the command line can be found on the NuGet documentation wiki.

Other change: authentication to the website

Someone on Twitter (@corydeppen) complained he had to log in using Windows Live ID. Because we’re using the Windows Azure AppFabric Access Control Service (which I’ll abbreviate to ACS from now on), this was in fact a no-brainer. We now support Windows Live ID, Google, Yahoo! and Facebook as authentication mechanisms for MyGet. Enjoy!

Creating your own private NuGet feed: MyGet

Ever since NuGet came out, I’ve been thinking about leveraging it in a corporate environment. I’ve seen two NuGet server implementations appear on the Internet: the official NuGet gallery server and Phil Haack’s NuGet.Server package. While both are good, there’s one thing wrong with them: you can’t be lazy! You have to do some stuff you don’t always want to do, namely: configure and deploy.

After discussing some ideas with my colleague Xavier Decoster, we decided it’s time to turn our heads into the cloud: we’re providing you NuGet-as-a-Service (NaaS)! Say hello to MyGet.

MyGet offers you the possibility to create your own, private, filtered NuGet feed for use in the Visual Studio Package Manager.
It can contain packages from the official NuGet feed as well as your private packages, hosted on MyGet. Want a sample? Add this feed to your Visual Studio package manager: http://www.myget.org/F/chucknorris

But wait, there’s more: we’re open sourcing this thing! Feel free to fork over at CodePlex and extend our "product". We’ve already covered some feature requests we would love to see, and Xavier has posted some more on his blog. In short: feel free to add your own most-wanted features and provide us with bugfixes (pretty sure there will be a lot, since we hacked this together in a very short time). We’re hosting on Windows Azure, which means you should get the Windows Azure SDK installed prior to contributing. Unless you feel that you can write code without locally debugging :-)

Chuck Norris Feed

Feel free to go ahead and create your private feed. Some ideas (more at Xavier's site):

  • A feed containing only the packages you or your company often use
  • A feed containing only your (open-source?) project and its dependencies
  • A feed containing just a few packages that you want to use for a certain project: tell your developers to just install them all

Bugs and feature requests? Feel free to post them as a comment below. Once we release the sources, I’ll kick your mailbox with a request to implement the stuff you proposed. Seems fair to me :-)

Enjoy http://myget.org!

Scaffolding and packaging a Windows Azure project in PHP

With the fresh release of the Windows Azure SDK for PHP v3.0, it’s time to have a look at the future. One of the features we’re playing with is creating a full-fledged replacement for the current Windows Azure command-line tools. These tools are sometimes a life saver and sometimes a big PITA due to baked-in defaults and a lack of customization options. To overcome that last one, here’s what we’re thinking of: scaffolders.

Basically what we’ll be doing is splitting the packaging process into two steps:

  • Scaffolding
  • Packaging

To get a feeling for all this, I strongly suggest you download the current preview version of this new concept and play along.

By the way: feedback is very welcome! Just comment on this post and I’ll get in touch.

Scaffolding a Windows Azure project

Scaffolding a Windows Azure project consists of creating a “template” for your Windows Azure project. The idea is that we can provide one or more default scaffolders that can generate a template for you, but there’s no limitation on creating your own scaffolders (or using third party scaffolders).

The default scaffolder currently included is based on a blog post I did earlier about having a lightweight deployment. Creating a template for a Windows Azure project is pretty simple:

Package Scaffold -p:"C:\temp\Sample" --DiagnosticsConnectionString:"UseDevelopmentStorage=true"

This command will generate a folder structure in C:\Temp\Sample using the default scaffolder (which requires the “DiagnosticsConnectionString” parameter to be specified). Nothing however prevents you from creating your own (more on that later in this post).


Once you have the folder structure in place, the only thing left is to copy your application contents into the “PhpOnAzure.Web” folder. In the case of this default scaffolder, that is all that is required to create your Windows Azure project structure. Nothing complicated until now, and I promise you things will never get complicated. However, if you are a brave soul, you can at this point customize the folder structure, add your own custom configuration settings, …

Packaging a Windows Azure project

After the scaffolding comes the packaging. Again, a very simple command:

Package Create -p:"C:\temp\Sample" -dev:false

The above will create a Sample.cspkg file which you can immediately deploy to Windows Azure. Either through the portal or using the Windows Azure command line tools that are included in the current version of the Windows Azure SDK for PHP.

Building your own scaffolder

Scaffolders are in fact Phar archives, a PHP packaging standard which is in essence a file containing executable PHP code as well as resources like configuration files, images, …

A scaffolder is typically a structure containing a resources folder containing configuration files or a complete PHP deployment or something like that, and a file named index.php, containing the scaffolding logic. Let’s have a look at index.php.

<?php
class Scaffolder
    extends Microsoft_WindowsAzure_CommandLine_PackageScaffolder_PackageScaffolderAbstract
{
    /**
     * Invokes the scaffolder.
     *
     * @param Phar   $phar     Phar archive containing the current scaffolder.
     * @param string $rootPath Root path.
     * @param array  $options  Options array (key/value).
     */
    public function invoke(Phar $phar, $rootPath, $options = array())
    {
        // ...
    }
}

Looks simple, right? It is. The invoke method is the only thing you have to implement: it can extract some content to $rootPath, update some files in there, or… anything! If you can imagine yourself doing it in PHP, it’s possible in a scaffolder.

Packaging a scaffolder is the last step in creating one: copying all files into a .phar file. And wouldn’t it be fun if that task were easy as well? Check this command:

Package CreateScaffolder -p:"/path/to/scaffolder" -out:"/path/to/MyScaffolder.phar"

There you go.

Ideas for scaffolders

I’m not going to provide all of the following scaffolders out of the box, but here are some scaffolders that I think would be interesting:

  • A scaffolder including a fully tweaked and configured PHP runtime (with the SQL Server Driver for PHP, Wincache, …)
  • A scaffolder which enables remote desktop
  • A scaffolder which contains an autoscaling mechanism
  • A scaffolder that cannot exist on its own but provides additional functionality to an existing Windows Azure project

Enjoy! And as I said: feedback is very welcome!