Maarten Balliauw {blog}

ASP.NET MVC, Microsoft Azure, PHP, web development ...


Use NuGet Package Restore to avoid pushing assemblies to Windows Azure Websites

Windows Azure Websites allows you to publish a web site in ASP.NET, PHP, Node, … to Windows Azure by simply pushing your source code to a TFS or Git repository. But how does Windows Azure Websites manage dependencies? Do you have to check your assemblies and NuGet packages into source control? How about no…

NuGet 1.6 shipped with a great feature called NuGet Package Restore. This feature lets you use NuGet packages without adding them to your source code repository. When your solution is built by Visual Studio (or MSBuild, which is used in Windows Azure Websites), a build target calls nuget.exe to make sure any missing packages are automatically fetched and installed before the code is compiled. This helps you keep your source repo small by keeping large packages out of version control.

Enabling NuGet Package Restore

Enabling NuGet package restore can be done from within Visual Studio. Simply right-click your solution and click the “Enable NuGet Package Restore” menu item.


Visual Studio will now do the following with the projects in your solution:

  • Create a .nuget folder at the root of your solution, containing a NuGet.exe and a NuGet build target
  • Import this NuGet target into all your projects so that MSBuild can find, download and install NuGet packages on-the-fly when creating a build
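Concretely, the import Visual Studio adds to each project file looks roughly like this (a sketch; the exact markup can differ between NuGet versions and solution layouts):

```xml
<PropertyGroup>
  <!-- Tells NuGet.targets to restore missing packages before building -->
  <RestorePackages>true</RestorePackages>
</PropertyGroup>
<!-- Pulls the restore build target from the solution's .nuget folder into the project -->
<Import Project="$(SolutionDir)\.nuget\NuGet.targets" />
```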

Be sure to push the files in the .nuget folder to your source control system. The packages folder does not have to be committed, except for the repositories.config file that sits in there.
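For Git users, a quick way to apply this rule is a .gitignore entry that excludes the packages folder while keeping repositories.config (a sketch; a scratch folder is used here for illustration, run it at your solution root instead):

```shell
# Scratch solution folder for illustration; use your real solution root
mkdir -p /tmp/mysolution && cd /tmp/mysolution

cat > .gitignore <<'EOF'
# Exclude restored packages from version control...
packages/*
# ...but keep the repositories.config file NuGet maintains in there
!packages/repositories.config
EOF
```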

But what about my non-public assembly references? What if I don't trust auto-updating from NuGet.org?

Good question. What about them? A simple answer would be to create NuGet packages for them. And if you already have NuGet packages for them, things get even easier. Make sure you host these packages on an online feed other than the public NuGet repository at www.nuget.org, unless you want your custom assemblies out there in public. A good choice would be to check out www.myget.org and host your packages there.

But then a new question surfaces: how do I link a custom feed to my projects? The answer is pretty simple: in the .nuget folder, edit the NuGet.targets file. In the PackageSources element, you can supply a semicolon (;) separated list of feeds to check for packages:

<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <!-- ... -->

    <!-- Package sources used to restore packages. By default, the registered sources under %APPDATA%\NuGet\NuGet.Config are used. -->
    <PackageSources>"http://www.myget.org/F/chucknorris;http://www.nuget.org/api/v2"</PackageSources>

    <!-- ... -->
  </PropertyGroup>

  <!-- ... -->
</Project>

By doing this and pushing the targets file to your Windows Azure Websites Git or TFS repo, the build system backing Windows Azure Websites will go ahead and download your packages from an external location, not cluttering your sources. Which makes for one, happy cloud.


GitHub for Windows Azure Websites

With the new release of Windows Azure and Windows Azure Websites, a lot of new scenarios just became possible. One I like a lot, especially since AppHarbor and Heroku have similar offers, is the possibility to push source code (ASP.NET or PHP) to Windows Azure instead of binaries, using Windows Azure Websites.

Not everyone out there is a command-line hero, though: if you want to use Git as a mechanism for pushing sources to Windows Azure Websites, chances are you may go crazy if you are unfamiliar with command-line Git. Luckily, a couple of weeks ago, GitHub released GitHub for Windows. It features an easy-to-use GUI on top of GitHub repositories, and with a small trick, also on top of Windows Azure Websites.

Setting up a Windows Azure Website

Since you’re probably still unfamiliar with Windows Azure Websites, let me guide you through the setup. It’s a simple process. First of all, navigate to the new Windows Azure portal. It looks different than the one you’re used to but it’s way easier to use. In the toolbar at the bottom, click New, select Web site, Quick Create and enter a hostname of choice. I chose “websiteswithgit”:

Creating a Windows Azure Website

After a couple of seconds, you’ll be presented with the dashboard of your newly created Windows Azure Website. This dashboard features a lot of interesting metrics about your website, such as data traffic, CPU usage, errors, … It also displays the available means of publishing a site to Windows Azure Websites: TFS deploy, Git deploy, Web Deploy and FTP publishing. That’s it: your website has been set up, and if you navigate to the newly created URL, you’ll be greeted with the default Windows Azure Websites landing page.

Setting up Git publishing

Since we’ll be using Git, click the Set up Git Publishing option.

Windows Azure Websites Dashboard

If you haven’t noticed already: Windows Azure Websites makes Windows Azure a lot easier. After a couple of seconds, Git publishing is configured, and all it takes to deploy your website is committing your source code, whether ASP.NET, ASP.NET Web Pages or PHP, to the newly created Git repository. Windows Azure Websites takes care of the build process (cool!) and deploys the result to Windows Azure in just a couple of seconds. Whoever told you deploying to Windows Azure takes ages lied to you!

Connecting GitHub for Windows to Windows Azure Websites

After setting up Git publishing, you probably have noticed that there’s a Git repository URL being displayed. Copy this one to your clipboard as we’ll be needing it in a minute. Open GitHub for Windows, right-click the UI and choose to “open a shell here”. Make sure you’re in the folder of choice. Next, issue a “git clone <url>” command, where <url> of course is the Git repository URL you’ve just copied.

Windows Azure Git Repository Build

The (currently empty) Windows Azure Website Git repository will be cloned onto your system. Now close this command-line (I promised we would use GitHub for Windows instead).

Git folder

Open the folder in which you cloned the Git repo and drag it onto GitHub for Windows. It will look kind of empty, still:

A Windows Azure Websites repository in GitHub for Windows

Next, add any file you want. A PHP file, a plain HTML file or a complete ASP.NET or ASP.NET MVC Web Application. GitHub for Windows will detect these changes and you can commit them to your local repository:

GitHub commit Windows Azure

All that’s left to do after a commit is clicking the Publish button. GitHub for Windows will now copy all changesets to the Windows Azure Websites Git repository, which will in turn trigger a build process for your web site. The result? A happy Windows Azure Websites dashboard and a site up and running. Rinse, repeat, commit. Happy deployments to Windows Azure Websites using GitHub for Windows!
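Under the hood, GitHub for Windows is doing plain Git operations. The whole flow can be sketched from the shell as follows (a local bare repository stands in here for the real Git URL you copied from the portal; the URL, file name and commit identity are all illustrative):

```shell
# Clean up from any previous run, then create a stand-in for the remote
# repository Windows Azure Websites gives you. In reality you would clone
# the URL shown in the portal instead.
rm -rf /tmp/azure-site.git /tmp/websiteswithgit
git init --bare /tmp/azure-site.git

# Clone it, add content, commit, and push -- each push triggers a deployment
git clone /tmp/azure-site.git /tmp/websiteswithgit
cd /tmp/websiteswithgit
echo '<?php echo "Hello, Windows Azure Websites!"; ?>' > index.php
git add index.php
git -c user.name=Demo -c user.email=demo@example.com commit -m "First deployment"

# Explicit refspec so this works regardless of the local default branch name
git push origin HEAD:master
```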


AZUG Windows Azure Saturday overview

As one of the board members of the Windows Azure User Group (AZUG) in Belgium, I wanted to write a post on an event we organized last weekend. We do more events (one each month); this one, however, was way out of our comfort zone. Typically, we host an evening event in which a speaker delivers one session to around 40 attendees. Last Saturday, we organized our first Windows Azure Saturday: a hackathon followed by a barbecue.

Here’s what I will remember about our event…

The event

The idea for a “hackathon plus barbecue” emerged a couple of months ago. The idea was simple: get a group of Windows Azure enthusiasts together, code for a couple of hours and have a fun barbecue afterwards.

At that time, the idea was small. We’ve done some hands-on events under our “Code d’Azure” brand before, so we expected some 10 attendees bringing their laptops, especially since we were planning to host it on a Saturday. We were wrong… Almost 50 people registered for the hackathon and barbecue!

“10 EUR entrance but you get it back”

At regular events, there’s always a number of no-shows. Not a problem, although it’s always difficult to know how many sandwiches to order. This time we had a barbecue, so to make sure we did not order too much meat, we asked attendees to pay for the event. If they showed up, they would get their money back. Otherwise the money would be used to cover the cost of the barbecue.

This system worked out well! People wouldn’t have had a problem with the user group keeping their 10 EUR either; in fact, some attendees refused to take it back and offered to sponsor their 10 EUR as well.

Applications developed during the hackathon

Here’s a list of the applications developed during the hackathon:

  • shellR - PowerShell over the web – A really cool application providing real-time PowerShell to any machine over the Internet.
  • Autoscaling document translation service – An autoscaling service which translates PDF and Word documents.
  • Xbox game score comparer – Compare Xbox game scores with your friends.
  • TwiBo – A Twitter bot which analyzes a specific hashtag and can retweet based on specific characteristics of the Tweet.
  • Positive/negative review analyzer – Analyzing text and giving a positive / negative sentiment score. Great to analyze what people are saying about a specific subject.
  • Worker role as a service – Hosting assemblies in a worker role, as a service.
  • PDF creator service – Creating PDF files from plain text files.
  • Service bus webcam picture transmission – A Netduino streaming webcam pictures over the Windows Azure Service Bus.
  • BBQ as a Service – Order BBQ meat via a Windows Phone 7 application.

To be honest, I did not expect too many good applications. We only had 3.5 hours to code, and as a developer I know that’s almost nothing. Again: wrong. The apps that emerged were really, really cool! shellR, for example (the guys who won the grand prize), offered a PowerShell console in the browser which could connect to PowerShell agents around the globe, enabling you to manage a computer through your browser. A startup idea, in my opinion!

The barbecue

RealDolmen provided us with the location for our event. We were looking for a location that could house 50 attendees and their laptops, had a room to eat (in case of bad weather) and a garden in case of nice weather. Their main office had it all! The weather was nice so we had a great barbecue in their garden.

On a side note: when organizing a party, the guys from “Lekker Beest” are absolute masters in the art of cooking meat and fish! Yummy in the tummy!

Sponsors

Thanks to our sponsors, we were able to provide the following prizes to the apps being developed:

  • 2 ReSharper licenses sponsored by JetBrains
  • 2 Cloud Storage Studio licenses sponsored by Cerebrata
  • 3 one-year MyGet Small subscriptions + a free Pro NuGet book sponsored by MyGet
  • 25 Pluralsight 1-month subscriptions

The first prize was a unique thing: a two-person tour of the Windows Azure North Europe region datacenter (flights, dinner, hotel and a tour through the datacenter where you've deployed your app), sponsored by Microsoft.

Thanks to all of our sponsors!

Wifi

Every conference has wifi problems. We had none. Which does not mean we didn’t have networking issues. The Internet uplink we arranged was an ADSL connection, an “asymmetric digital subscriber line”. All blah-blah, except for the asymmetric aspect: whenever someone launched an upload to Windows Azure, others experienced a really slow connection. And as uploads were happening all the time, well, guess what: the connection was a bit flaky. This is something we would do differently at a future event.

Conclusion

It was a blast! Have a look at the Tweets around our event. We will definitely do this again. Thank you sponsors, thank you attendees and thank you fellow AZUG board members for making this a splendid event!

Ready for a Windows Azure roadtrip?


A while ago, Michelangelo van Dam (a hardcore PHP guy), François Hertay (a hardcore Java guru) and myself (a .NET guy) were asked if we wanted to write about some of our experiences with the Windows Azure platform.

Starting next week, you’ll find about three blog posts and/or videos on Windows Azure per week (from getting-started level to some fun advanced posts) on a freshly released website: www.azure-roadtrip.be

I have the honour of kicking off a series of blog posts on .NET, starting with a level 100 one: the what & why of Windows Azure for .NET developers.

Let’s state the obvious: Windows Azure is a cloud platform. It’s a group of services offered by Microsoft that enables you to build your applications with the typical characteristics of a cloud platform: everything is self-service, on-demand and pay-per-use. You are the one demanding resources like virtual machines or storage, and you pay for what you use, depending on how much you use and for how long. No contract periods – well, there is one: you always pay for at least one hour when you use these resources.

Read more on www.azure-roadtrip.be! And keep an eye on that site, as I have some fun posts coming there :-)

Social meet up on Twitter for MEET Windows Azure on June 7th

Here’s a perhaps rather redundant event for you, but it should be kind of fun: MEET Windows Azure on Twitter (+ Beer). The idea is to list people who have a Twitter account and intend to follow the MEET Windows Azure event via live streaming on June 7th (1pm PDT).

So see you online for the event on the 7th! My Twitter handle is @maartenballiauw

MEET Windows Azure Blog Relay:

+ Beer? Since you are watching the event from your couch (or the toilet, or your bed, or actually, from wherever you want), feel free to open up a beer.

Call to action: Link to this blog post on your blog and I will update this post to link to you!

The world is changing: the future of IT

I’ve had my say on cloud and the new world of IT already in an earlier post, Predictions for the future. Today, I’m seeing signs that the world is in fact starting to change. Sites like Instagram started small and grew big in no time. Were the founders IT wonders? No. And you don’t have to be.

Not so long ago, it would have taken you a lot of time and resources to get your idea up and running on the Internet. Especially if it required multiple datacenters and scalability. You would have to deploy a bunch of servers and make sure you had an agile IT environment in place in order to get things running and keep things flexible, a key requirement for many startups but also for large organizations.

Today, cloud platforms like Windows Azure change the rules. Anyone can now build an advanced application architecture backed by an advanced infrastructure. Platform-as-a-service offerings like Windows Azure offer you the possibility to distribute users between different geographical regions. They offer you storage in multiple datacenters. They enable you to continuously deploy new versions of your software and easily rollback should things go wrong.

The cloud is not new technology: virtualization is still used, and system administrators still run the datacenter. It’s about new ideas and possibilities. The datacenter we knew before is just the fabric in which your ideas come to life. A thin software layer on top of the giant hardware pool that is available makes sure that anyone can quickly combine a large number of easy-to-use building blocks to empower an idea. It makes advanced, global-scale projects easy, cheap and yet more reliable.

Everyone on the globe, whether a small startup or a large organization, can now take advantage of the same IT possibilities that were previously only available to businesses running their own datacenter. Today, I can set up a scalable, global application in a few hours, at very low risk and cost.

Of course, you need some supporting services for your business as well. For the development part, source control and issue tracking may be useful. GitHub, TFS Online and many others offer that as-a-Service, up and running in no time. For local teams, for distributed teams. The same story with e-mail, customer relation management, or even billing your customers. You can easily set up a new company or a new team based on the capabilities the new world of IT has to offer.

All of this has an impact on several areas. As small, agile startups and teams start working on their ideas with a short time-to-market, they gain an edge over slow, unadaptive large organizations. They can make higher profits because of the commodity services available in the cloud, while organizations not making use of these technologies fall behind, probably sooner than we all think at this point in time. Large organizations will have to adapt: silos will have to be broken down into small, lean teams that know both the datacenter fabric they are working on and the software running on it, ready to make use of all that’s offered at the platform level and to be fast-to-market or even first-to-market, much like the startups that already make use of these new techniques.

Make your idea come to life in this changing new world.

Using the Windows Azure Content Delivery Network

As you know, Windows Azure is a very rich platform. Next to compute and storage, it offers a series of building blocks that simplify your life as a cloud developer. One of these building blocks is the content delivery network (CDN), which can be used for offloading content to a globally distributed network of servers, ensuring faster throughput to your end users.

I’ve been asked to write an article on this matter, which I did, and which is live at ACloudyPlace.com since today. As a small teaser, here’s the first section of it:

Reasons for using a CDN
There are a number of reasons to use a CDN. One of the obvious reasons lies in the nature of the CDN itself: a CDN is globally distributed and caches static content on edge nodes, closer to the end user. If a user accesses your web application and some of the files are cached on the CDN, the end user will download those files directly from the CDN, experiencing less latency in their request.

Another reason for using the CDN is throughput. If you look at a typical webpage, about 20% of it is HTML that was dynamically rendered based on the user’s request. The other 80% goes to static files like images, CSS, JavaScript, and so forth. Your server has to read those static files from disk and write them on the response stream, both actions that take away some of the resources available on your virtual machine. By moving static content to the CDN, your virtual machine will have more capacity available for generating dynamic content.
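In practice, offloading means pointing static references at the CDN endpoint instead of at your own domain. A sketch of what that looks like in a page (az12345 is a made-up endpoint name and the paths are illustrative):

```html
<!-- The dynamic HTML still comes from your web role; the static assets
     below are served from the CDN edge node closest to the user -->
<link rel="stylesheet" href="http://az12345.vo.msecnd.net/cdn/styles.css" />
<img src="http://az12345.vo.msecnd.net/cdn/logo.png" alt="logo" />
```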

Here’s the full article: Using the Windows Azure Content Delivery Network

Protecting Windows Azure Web and Worker roles from malware

Most IT administrators will install some sort of virus scanner on their precious servers. Since the cloud, from a technical perspective, is just a server, why not follow that security best practice on Windows Azure too? It has gone by almost unnoticed, but last week Microsoft released the Microsoft Endpoint Protection for Windows Azure Customer Technology Preview. For the sake of bandwidth, I’ll be referring to it as EP.

EP offers real-time protection, scheduled scanning, malware remediation (a fancy word for quarantining), active protection and automatic signature updates. Sounds a lot like Microsoft Endpoint Protection or Microsoft Security Essentials? That’s no coincidence: EP is a Windows Azurified version of it.

Enabling anti-malware on Windows Azure

After installing the Microsoft Endpoint Protection for Windows Azure Customer Technology Preview, sorry, EP, a new Windows Azure import will be available. As with remote desktop or diagnostics, EP can be enabled by a simple XML one-liner:

<Import moduleName="Antimalware" />

Here’s a sample web role ServiceDefinition.csdef file containing this new import:

<?xml version="1.0" encoding="utf-8"?>
<ServiceDefinition name="ChuckProject"
                   xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
  <WebRole name="ChuckNorris" vmsize="Small">
    <Sites>
      <Site name="Web">
        <Bindings>
          <Binding name="Endpoint1" endpointName="Endpoint1" />
        </Bindings>
      </Site>
    </Sites>
    <Endpoints>
      <InputEndpoint name="Endpoint1" protocol="http" port="80" />
    </Endpoints>
    <Imports>
      <Import moduleName="Antimalware" />
      <Import moduleName="Diagnostics" />
    </Imports>
  </WebRole>
</ServiceDefinition>

That’s it! When you now deploy your Windows Azure solution, Microsoft Endpoint Protection will be installed, enabled and configured on your Windows Azure virtual machines.

Now since I started this blog post with “IT administrators”, chances are you want to fine-tune this plugin a little. No problem! The ServiceConfiguration.cscfg file has some options waiting to be eh, touched. And since these are in the service configuration, you can also modify them through the management portal, the management API, or sysadmin-style using PowerShell. Anyway, the following options are available:

  • Microsoft.WindowsAzure.Plugins.Antimalware.ServiceLocation – Specify the datacenter region where your application is deployed, for example “West Europe” or “East Asia”. This will speed up deployment time.
  • Microsoft.WindowsAzure.Plugins.Antimalware.EnableAntimalware – Should EP be enabled or not?
  • Microsoft.WindowsAzure.Plugins.Antimalware.EnableRealtimeProtection – Should real-time protection be enabled?
  • Microsoft.WindowsAzure.Plugins.Antimalware.EnableWeeklyScheduledScans – Weekly scheduled scans enabled?
  • Microsoft.WindowsAzure.Plugins.Antimalware.DayForWeeklyScheduledScans – Which day of the week should the scan run? (0–7, where 0 means daily)
  • Microsoft.WindowsAzure.Plugins.Antimalware.TimeForWeeklyScheduledScans – What time should the scheduled scan run?
  • Microsoft.WindowsAzure.Plugins.Antimalware.ExcludedExtensions – Specify file extensions to exclude from scanning (pipe-delimited)
  • Microsoft.WindowsAzure.Plugins.Antimalware.ExcludedPaths – Specify paths to exclude from scanning (pipe-delimited)
  • Microsoft.WindowsAzure.Plugins.Antimalware.ExcludedProcesses – Specify processes to exclude from scanning (pipe-delimited)
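Put together in ServiceConfiguration.cscfg, a configuration using these settings could look like the sketch below. All values are illustrative; in particular, check the plugin's documentation for the exact format it expects for the scheduled-scan time.

```xml
<ConfigurationSettings>
  <Setting name="Microsoft.WindowsAzure.Plugins.Antimalware.ServiceLocation" value="West Europe" />
  <Setting name="Microsoft.WindowsAzure.Plugins.Antimalware.EnableAntimalware" value="true" />
  <Setting name="Microsoft.WindowsAzure.Plugins.Antimalware.EnableRealtimeProtection" value="true" />
  <Setting name="Microsoft.WindowsAzure.Plugins.Antimalware.EnableWeeklyScheduledScans" value="true" />
  <Setting name="Microsoft.WindowsAzure.Plugins.Antimalware.DayForWeeklyScheduledScans" value="1" />
  <Setting name="Microsoft.WindowsAzure.Plugins.Antimalware.TimeForWeeklyScheduledScans" value="120" />
  <Setting name="Microsoft.WindowsAzure.Plugins.Antimalware.ExcludedExtensions" value="log|bak" />
  <Setting name="Microsoft.WindowsAzure.Plugins.Antimalware.ExcludedPaths" value="" />
  <Setting name="Microsoft.WindowsAzure.Plugins.Antimalware.ExcludedProcesses" value="" />
</ConfigurationSettings>
```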

Monitoring anti-malware on Windows Azure

How will you know if a threat has been detected? Well, luckily for us, Microsoft Endpoint Protection writes its logs to the System event log, which means you can simply add a specific data source to your diagnostics monitor and you’re done:

var configuration = DiagnosticMonitor.GetDefaultInitialConfiguration();

// Note: if you need informational / verbose, also subscribe to levels 4 and 5
configuration.WindowsEventLog.DataSources.Add(
    "System!*[System[Provider[@Name='Microsoft Antimalware'] and (Level=1 or Level=2 or Level=3)]]");

configuration.WindowsEventLog.ScheduledTransferPeriod
    = System.TimeSpan.FromMinutes(1);

DiagnosticMonitor.Start(
    "Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString",
    configuration);

In addition, EP also logs its inner workings to its installation folders. You can also include these in your diagnostics configuration:

var configuration = DiagnosticMonitor.GetDefaultInitialConfiguration();

// ...add the event logs like in the previous code sample...

// Verbatim strings (@"...") so the backslashes are not treated as escape sequences
var mep1 = new DirectoryConfiguration();
mep1.Container = "wad-endpointprotection-container";
mep1.DirectoryQuotaInMB = 5;
mep1.Path = @"%programdata%\Microsoft Endpoint Protection";

var mep2 = new DirectoryConfiguration();
mep2.Container = "wad-endpointprotection-container";
mep2.DirectoryQuotaInMB = 5;
mep2.Path = @"%programdata%\Microsoft\Microsoft Security Client";

configuration.Directories.ScheduledTransferPeriod = TimeSpan.FromMinutes(1.0);
configuration.Directories.DataSources.Add(mep1);
configuration.Directories.DataSources.Add(mep2);

DiagnosticMonitor.Start(
    "Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString",
    configuration);

From this moment on, you can use a tool like Cerebrata’s Diagnostics Monitor to check the event logs of all your Windows Azure instances that have anti-malware enabled.

Pro NuGet is finally there!

Short version: Install-Package ProNuget or http://amzn.to/pronuget

It’s been a while since I wrote my first book. After telling everyone that writing a book is horrendous (try writing a chapter per week after your office hours…) and that I would never write one again, my partner-in-crime Xavier Decoster and I had the same idea at the same time: what about a book on NuGet? So here it is: Pro NuGet is fresh off the presses (or on Kindle).

Special thanks go out to Scott Hanselman and Phil Haack for writing our foreword. Also big kudos to all who’ve helped us out now and then and did some small reviews. Yes Rob, Paul, David, Phil, Hadi: that’s you guys.

Why a book on NuGet?

Why not? At the time we decided to start writing a book (September 2011), NuGet had been out there for a while already. Yet most users then (and still today) were using NuGet only as a means of installing packages, with some creating packages. But NuGet is much more! And that’s what we wanted to write about. We did not want to create a reference guide on which NuGet commands are available. We wanted to focus on best practices we’ve learned over the past few months of using NuGet.

Some scenarios covered in our book:

  • What’s the big picture on package management?
  • Flashback last week: NuGet.org was down. How do you keep your team working if you depend on that external resource?
  • Is it a good idea to auto-update NuGet packages in a continuous integration process?
  • Use the PowerShell console in VS2010/11. How do I write my own NuGet PowerShell Cmdlets? What can I do in there?
  • Why would you host your own NuGet repository?
  • Using NuGet for continuous delivery
  • More!

I feel we’ve managed to cover a lot of concepts that go beyond “how to use NuGet vX” and instead have given as much guidance as possible. Questions, suggestions, remarks, … are all welcome. And a click on “Add to cart” is also a good idea ;-)

I’m an ASP Insider

Cool! I’ve just learned that I’m invited to join the ASPInsiders. I’m really excited and honored to be part of this group of great ASP.NET experts. I'm very much looking forward to learning the secret handshake and to providing feedback that helps move the ASP.NET team forward.

If you don’t know who the ASP Insiders are, here’s their elevator pitch:

“The ASPInsiders is a select group of international professionals who have demonstrated expertise in ASP.NET technologies and who provide valuable, early feedback on related developing technologies and publications to their peers, the Microsoft ASP.NET team and others.”

Some more info is available in the Who are the ASPInsiders? post by one of the insiders.