Maarten Balliauw {blog}

ASP.NET, ASP.NET MVC, Windows Azure, PHP, ...


The world is changing: the future of IT

I've had my say on the cloud and the new world of IT in an earlier post, Predictions for the future. Today, I'm seeing signs that the world is in fact starting to change. Sites like Instagram started small and grew big in no time. Were the founders IT wonders? No. And you don't have to be.

Not so long ago, it would have taken you a lot of time and resources to get your idea up and running on the Internet, especially if it required multiple datacenters and scalability. You would have to deploy a bunch of servers and make sure you had an agile IT environment in place to get things running and keep things flexible, a key requirement for many startups and for large organizations alike.

Today, cloud platforms like Windows Azure change the rules. Anyone can now build an advanced application architecture backed by advanced infrastructure. Platform-as-a-Service offerings like Windows Azure let you distribute users across different geographic regions. They offer you storage in multiple datacenters. They enable you to continuously deploy new versions of your software and to easily roll back should things go wrong.

The cloud is not new technology: it still builds on virtualization, and system administrators still run the datacenter. What is new are the ideas and possibilities. The datacenter we knew before is just the fabric in which your ideas come to life. A thin software layer on top of a giant pool of hardware makes sure that anyone can quickly combine a large number of easy-to-use building blocks to power an idea. It makes advanced, global-scale projects easy, cheap and, at the same time, more reliable.

Everyone on the globe, whether a small startup or a large organization, can now take advantage of the same IT possibilities that were previously only available to businesses running their own datacenter. Today, I can set up a global application that scales, in a few hours, at very low risk and cost.

Of course, you need some supporting services for your business as well. For the development side, source control and issue tracking are useful: GitHub, TFS Online and many others offer these as-a-Service, up and running in no time, for local teams and distributed teams alike. The same goes for e-mail, customer relationship management, or even billing your customers. You can easily set up a new company or a new team based on the capabilities the new world of IT has to offer.

All of this has an impact on several areas. Small, agile startups and teams that build on these services have a short time-to-market, which gives them an edge over slow, unadapted large organizations. They can make higher profits thanks to the commodity services available in the cloud, while organizations that do not make use of these technologies will fall behind, probably sooner than we all think at this point in time. Large organizations will have to adapt: silos will have to be broken down into small, lean teams that know both the datacenter fabric they are working on and the software that runs on it, teams ready to make use of all that's offered at the platform level and to be fast-to-market or even first-to-market, much like the startups that already use these techniques today.

Make your idea come to life in this changing new world.

Using the Windows Azure Content Delivery Network

As you know, Windows Azure is a very rich platform. Besides compute and storage, it offers a series of building blocks that simplify your life as a cloud developer. One of these building blocks is the content delivery network (CDN), which can be used to offload content to a globally distributed network of servers, ensuring faster throughput to your end users.

I was asked to write an article on this topic, which I did, and it went live at ACloudyPlace.com today. As a small teaser, here's the first section of it:

Reasons for using a CDN
There are a number of reasons to use a CDN. One of the obvious reasons lies in the nature of the CDN itself: a CDN is globally distributed and caches static content on edge nodes, closer to the end user. If a user accesses your web application and some of the files are cached on the CDN, the end user will download those files directly from the CDN, experiencing less latency in their request.

Another reason for using the CDN is throughput. If you look at a typical webpage, about 20% of it is HTML that is dynamically rendered based on the user's request. The other 80% consists of static files like images, CSS, JavaScript, and so forth. Your server has to read those static files from disk and write them to the response stream, and both actions consume resources on your virtual machine. By moving static content to the CDN, your virtual machine has more capacity available for generating dynamic content.
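To make that concrete, here's a minimal sketch of mine (not from the article) of what offloading static content can look like in an ASP.NET application. The endpoint az12345.vo.msecnd.net and the Cdn helper class are hypothetical; substitute the CDN endpoint provisioned for your own storage account or hosted service.

// Hypothetical helper that rewrites site-relative asset paths to CDN URLs.
// "az12345.vo.msecnd.net" is a placeholder endpoint, not a real one.
public static class Cdn
{
    private const string BaseUrl = "http://az12345.vo.msecnd.net";

    public static string Url(string relativePath)
    {
        // "/content/site.css" becomes "http://az12345.vo.msecnd.net/content/site.css"
        return BaseUrl + "/" + relativePath.TrimStart('/');
    }
}

Referencing Cdn.Url("content/site.css") in your views instead of a site-relative path makes the browser fetch the file from an edge node, leaving your role instance free to render the dynamic 20%.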

Here’s the full article: Using the Windows Azure Content Delivery Network

Protecting Windows Azure Web and Worker roles from malware

Most IT administrators will install some sort of virus scanner on their precious servers. Since the cloud, from a technical perspective, is just a bunch of servers, why not follow that security best practice on Windows Azure too? It has gone by almost unnoticed, but last week Microsoft released the Microsoft Endpoint Protection for Windows Azure Customer Technology Preview. For the sake of bandwidth, I'll be referring to it as EP.

EP offers real-time protection, scheduled scanning, malware remediation (a fancy word for quarantining), active protection and automatic signature updates. Sounds a lot like Microsoft Endpoint Protection or Microsoft Security Essentials? That's no coincidence: EP is a Windows Azurified version of them.

Enabling anti-malware on Windows Azure

After installing the Microsoft Endpoint Protection for Windows Azure Customer Technology Preview (sorry, EP), a new Windows Azure import becomes available. As with remote desktop or diagnostics, EP can be enabled with a simple XML one-liner:

<Import moduleName="Antimalware" />

Here’s a sample web role ServiceDefinition.csdef file containing this new import:

<?xml version="1.0" encoding="utf-8"?>
<ServiceDefinition name="ChuckProject"
                   xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
  <WebRole name="ChuckNorris" vmsize="Small">
    <Sites>
      <Site name="Web">
        <Bindings>
          <Binding name="Endpoint1" endpointName="Endpoint1" />
        </Bindings>
      </Site>
    </Sites>
    <Endpoints>
      <InputEndpoint name="Endpoint1" protocol="http" port="80" />
    </Endpoints>
    <Imports>
      <Import moduleName="Antimalware" />
      <Import moduleName="Diagnostics" />
    </Imports>
  </WebRole>
</ServiceDefinition>

That’s it! When you now deploy your Windows Azure solution, Microsoft Endpoint Protection will be installed, enabled and configured on your Windows Azure virtual machines.

Now, since I started this blog post with "IT administrators", chances are you want to fine-tune this plugin a little. No problem! The ServiceConfiguration.cscfg file has some options waiting to be, eh, touched. And since these live in the service configuration, you can also modify them through the management portal, the management API, or sysadmin-style using PowerShell. The following options are available (a sample configuration follows the list):

  • Microsoft.WindowsAzure.Plugins.Antimalware.ServiceLocation – Specify the datacenter region where your application is deployed, for example “West Europe” or “East Asia”. This will speed up deployment time.
  • Microsoft.WindowsAzure.Plugins.Antimalware.EnableAntimalware – Should EP be enabled or not?
  • Microsoft.WindowsAzure.Plugins.Antimalware.EnableRealtimeProtection – Should real-time protection be enabled?
  • Microsoft.WindowsAzure.Plugins.Antimalware.EnableWeeklyScheduledScans – Weekly scheduled scans enabled?
  • Microsoft.WindowsAzure.Plugins.Antimalware.DayForWeeklyScheduledScans – Which day of the week should the scan run (0 – 7, where 0 means daily)?
  • Microsoft.WindowsAzure.Plugins.Antimalware.TimeForWeeklyScheduledScans – What time should the scheduled scan run?
  • Microsoft.WindowsAzure.Plugins.Antimalware.ExcludedExtensions – Specify file extensions to exclude from scanning (pipe-delimited)
  • Microsoft.WindowsAzure.Plugins.Antimalware.ExcludedPaths – Specify paths to exclude from scanning (pipe-delimited)
  • Microsoft.WindowsAzure.Plugins.Antimalware.ExcludedProcesses – Specify processes to exclude from scanning (pipe-delimited)
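To illustrate, here's a minimal ServiceConfiguration.cscfg fragment wiring some of these settings up. The setting names come from the list above; all values shown (region, day, time, exclusions) are assumptions for the example, and the exact format of the scan-time value should be checked against the CTP documentation:

<ConfigurationSettings>
  <Setting name="Microsoft.WindowsAzure.Plugins.Antimalware.ServiceLocation" value="West Europe" />
  <Setting name="Microsoft.WindowsAzure.Plugins.Antimalware.EnableAntimalware" value="true" />
  <Setting name="Microsoft.WindowsAzure.Plugins.Antimalware.EnableRealtimeProtection" value="true" />
  <Setting name="Microsoft.WindowsAzure.Plugins.Antimalware.EnableWeeklyScheduledScans" value="true" />
  <!-- 0 means daily; 1-7 map to a day of the week -->
  <Setting name="Microsoft.WindowsAzure.Plugins.Antimalware.DayForWeeklyScheduledScans" value="1" />
  <!-- assumed value; verify the expected time format in the CTP docs -->
  <Setting name="Microsoft.WindowsAzure.Plugins.Antimalware.TimeForWeeklyScheduledScans" value="120" />
  <!-- pipe-delimited exclusion lists -->
  <Setting name="Microsoft.WindowsAzure.Plugins.Antimalware.ExcludedExtensions" value="log|tmp" />
  <Setting name="Microsoft.WindowsAzure.Plugins.Antimalware.ExcludedPaths" value="E:\approot\logs" />
  <Setting name="Microsoft.WindowsAzure.Plugins.Antimalware.ExcludedProcesses" value="myworker.exe" />
</ConfigurationSettings>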

Monitoring anti-malware on Windows Azure

How will you know when a threat has been detected? Luckily for us, Microsoft Endpoint Protection writes its logs to the System event log, which means you can simply add a specific data source to your diagnostics monitor and you're done:

var configuration = DiagnosticMonitor.GetDefaultInitialConfiguration();

// Note: if you need informational / verbose, also subscribe to levels 4 and 5
configuration.WindowsEventLog.DataSources.Add(
    "System!*[System[Provider[@Name='Microsoft Antimalware'] and (Level=1 or Level=2 or Level=3)]]");

configuration.WindowsEventLog.ScheduledTransferPeriod
    = System.TimeSpan.FromMinutes(1);

DiagnosticMonitor.Start(
    "Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString",
    configuration);

In addition, EP logs its inner workings to its installation folders. You can include these in your diagnostics configuration as well:

var configuration = DiagnosticMonitor.GetDefaultInitialConfiguration();

// ...add the event logs like in the previous code sample...

// Use verbatim strings so the backslashes in the paths are not treated as escapes
var mep1 = new DirectoryConfiguration();
mep1.Container = "wad-endpointprotection-container";
mep1.DirectoryQuotaInMB = 5;
mep1.Path = @"%programdata%\Microsoft Endpoint Protection";

var mep2 = new DirectoryConfiguration();
mep2.Container = "wad-endpointprotection-container";
mep2.DirectoryQuotaInMB = 5;
mep2.Path = @"%programdata%\Microsoft\Microsoft Security Client";

configuration.Directories.ScheduledTransferPeriod = TimeSpan.FromMinutes(1.0);
configuration.Directories.DataSources.Add(mep1);
configuration.Directories.DataSources.Add(mep2);

DiagnosticMonitor.Start(
    "Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString",
    configuration);

From this moment on, you can use a tool like Cerebrata's Diagnostics Monitor to check the event logs of all your Windows Azure instances that have anti-malware enabled.

Pro NuGet is finally there!

Short version: Install-Package ProNuget or http://amzn.to/pronuget

It's been a while since I wrote my first book. After telling everyone that writing a book is horrendous (try writing a chapter per week after your office hours…) and that I would never write one again, my partner-in-crime Xavier Decoster and I had the same idea at the same time: what about a book on NuGet? So here it is: Pro NuGet is fresh off the presses (or on Kindle).

Special thanks go out to Scott Hanselman and Phil Haack for writing our foreword. Also big kudos to all who’ve helped us out now and then and did some small reviews. Yes Rob, Paul, David, Phil, Hadi: that’s you guys.

Why a book on NuGet?

Why not? At the time we decided to start writing a book (September 2011), NuGet had already been out there for a while. Yet most users then (and still today) were using NuGet only as a means of installing packages, with some creating packages. But NuGet is much more! And that's what we wanted to write about. We did not want to create a reference guide on which NuGet commands are available; we wanted to focus on the best practices we've learned over the past few months of using NuGet.

Some scenarios covered in our book:

  • What’s the big picture on package management?
  • Flashback last week: NuGet.org was down. How do you keep your team working if you depend on that external resource?
  • Is it a good idea to auto-update NuGet packages in a continuous integration process?
  • Use the PowerShell console in VS2010/11. How do I write my own NuGet PowerShell Cmdlets? What can I do in there?
  • Why would you host your own NuGet repository?
  • Using NuGet for continuous delivery
  • More!

I feel we’ve managed to cover a lot of concepts that go beyond “how to use NuGet vX” and instead have given as much guidance as possible. Questions, suggestions, remarks, … are all welcome. And a click on “Add to cart” is also a good idea ;-)

I’m an ASP Insider

Cool! I've just learned that I've been invited to join the ASPInsiders. I'm really excited and honored to be part of this group of great ASP.NET experts, and I'm very much looking forward to learning the secret handshake and to providing feedback that helps move the ASP.NET team forward.

If you don't know who the ASPInsiders are, here's their elevator pitch:

“The ASPInsiders is a select group of international professionals who have demonstrated expertise in ASP.NET technologies and who provide valuable, early feedback on related developing technologies and publications to their peers, the Microsoft ASP.NET team and others.”

Some more info is available in the Who are the ASPInsiders? post by one of the insiders.

TechDays Finland - Architectural Patterns for the Cloud - NuGet

As promised, here are the slide decks for the two sessions delivered at TechDays Finland last week.

Architectural Patterns for the Cloud

The promise of all cloud vendors out there is that they can run your applications without changes. While that claim is true, it's better to optimize existing software, or design specifically for the cloud, when moving or building an application. Architectural optimization will speed up your application, make it more scalable and even make it cheaper to run on Windows Azure. This session takes you through some common patterns that are easy to implement and will make your cloud more sunny.

Organize your Chickens - NuGet for the Enterprise

Managing software dependencies, whether created in-house or obtained from third parties, can be a pain in the behind. Whether dependencies feel like wild chickens or people run around like chickens dealing with dependencies, the NuGet package manager can be a cure. Let us guide you through creating enterprise (chicken) NuGets and dealing with them in a structured, easy-to-maintain manner. From developer workstation to build server, NuGet tastes great! We'll provide you the dip sauce.

Enjoy! And if there’s any feedback or questions, I would love to hear it.

Introducing MyGet package source proxy (beta)

My blog already has quite a number of posts about MyGet, the NuGet-as-a-Service solution my colleague Xavier and I are running. There are a lot of reasons to host your own personal NuGet feed (such as protecting your intellectual property or only adding approved packages to the feed; there are many more, as you can <plug>read in our book</plug>). We've now added support for another scenario: MyGet supports proxying remote feeds.

Up until now, MyGet required you to upload your own NuGet packages and to include packages from the official NuGet feed. The problem with this was that you either had your team register multiple NuGet feeds in Visual Studio (which still is a good option) or registered just your MyGet feed and added every package your team uses to it (which, again, is also a good option).

With our package source proxy in place, there is now a third option: MyGet can proxy upstream NuGet feeds. Let's start with a quick diagram and then walk through a scenario that elaborates on this:

[Diagram: MyGet feed proxying and aggregating upstream package sources]

You are seeing this correctly: you can now register just your MyGet feed in Visual Studio and we’ll add upstream packages to your feed automatically, optionally filtered as well.

Enabling MyGet package source proxy

Enabling the MyGet package source proxy is very straightforward. Navigate to your feed of choice (or create a new one) and click the Package Sources item. This will present you with a screen similar to this:

[Screenshot: MyGet hosted package source settings]

From there, you can add external (or other MyGet) feeds to your personal feed and add packages directly from them using the Add package dialog; more on that in Xavier's blog post. What's more, with the tick of a checkbox these external feeds can also be aggregated with your feed in Visual Studio's search results. Here's the magical add dialog and the proxy checkbox:

[Screenshot: Add package source dialog with the proxy checkbox]

As you can see, we also offer the option to filter upstream packages. For example, the filter string substringof('wp7', Tags) eq true that we used here keeps only the upstream packages whose tags contain "wp7".
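The filter is a standard OData expression over the feed's package metadata, so similar expressions should work against other fields as well. The following are illustrative sketches of mine, not examples from the post:

substringof('azure', Tags) eq true     (tags contain "azure")
substringof('jquery', Id) eq true      (package Id contains "jquery")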

What will Visual Studio show us? Well, just the Windows Phone 7 packages from NuGet, served through our single-endpoint MyGet feed.

Conclusion

Instead of working with a number of NuGet feeds, your development team will just work with one feed that aggregates packages from both MyGet and other package sources out there (NuGet, Orchard Gallery, Chocolatey, …). This centralizes the management of external packages and makes it easier for your team members to find the packages they can use in your projects.

Do let us know what you think of this feature! Our UserVoice is there for you, and in fact, that's where we got the idea for this feature in the first place. Your voice is heard!

Slides for TechDays Belgium 2012: SignalR

It was the last session on the last day of TechDays 2012, so I was expecting almost nobody to show up. Still, a packed room came to have a look at how to make the web real-time using SignalR. Thanks for joining and for being very cooperative during the demos!

As promised, here are the slides. You can also find the demo code here: SignalR. Code, not toothpaste - TechDays Belgium 2012.zip (2.74 mb)

A recording on Channel9 is available as well.

PS: The book on NuGet (Pro NuGet) which I mentioned can be (pre)ordered on Amazon.

Slides for UGIALT.NET - SignalR

Except for a bad, bad experience with EasyJet on the way to Milan, UGIALT.NET was a blast! Here's the slide deck and sample code for my session on SignalR.

Abstract: "SignalR. Code, not toothpaste. (or: Using SignalR for realtime client/server communication) Today’s users are interested in a rich experience where the terms client and server don’t mean a thing. They expect real-time action between both, no matter if the technology used is HTML5 websockets or something else. This session will cover SignalR and show you how it can be used to communicate in real time between the client and server, using HTML5 or not. Combine SignalR with ASP.NET MVC, jQuery and perhaps a sprinkle of Windows Azure and you’ll have an interesting, reliable and fast stack to build your real-time client-server and server-client communications. Join me on this journey between web, cloud and user. No toothpaste. Just code."

Demo code is available for download as well: Demos.zip (1.68 mb)

PS: EasyJet was kind enough to provide us with a flight attendant that looked and talked like Borat on the flight back, which made me laugh numerous times during the flight. Is nice!

Tracking API usage with Google Analytics

So you have an API. Congratulations! You should have one. But how do you track who uses it, what client software they use, and so on? You may be logging API calls yourself, or relying on services like Apigee.com which make you pay (for a great service, though!). Being cheap, we thought of another approach for MyGet: we're already using Google Analytics to track pageviews and so on, so why not use Google Analytics to track API calls as well?

Meet GoogleAnalyticsTracker. It is a three-class assembly which allows you to track requests from within C# to Google Analytics.

Go and fork this thing and add out-of-the-box support for WCF Web API, Nancy or even "plain old" WCF or ASMX!

Using GoogleAnalyticsTracker

Using GoogleAnalyticsTracker in your projects is simple: just Install-Package GoogleAnalyticsTracker and be an API-tracking bad-ass! Two things are required: a Google Analytics tracking ID (something of the form UA-XXXXXXX-X) and the domain you wish to track, preferably the same domain as the one registered with Google Analytics.

After installing GoogleAnalyticsTracker into your project, you currently have two options to track your API calls: use the Tracker class or use the included ASP.NET MVC Action Filter.

Here’s a quick demo of using the Tracker class:

Tracker tracker = new Tracker("UA-XXXXXX-XX", "www.example.org");
tracker.TrackPageView("My API - Create", "api/create");

Unfortunately, this class has no notion of a web request. This means that if you want to track user agents and user languages, you’ll have to add some more code:

Tracker tracker = new Tracker("UA-XXXXXX-XX", "www.example.org");

var request = HttpContext.Request;
tracker.Hostname = request.UserHostName;
tracker.UserAgent = request.UserAgent;
tracker.Language = request.UserLanguages != null ? string.Join(";", request.UserLanguages) : "";

tracker.TrackPageView("My API - Create", "api/create");

Whaah! No worries though: there’s an extension method which does just that:

Tracker tracker = new Tracker("UA-XXXXXX-XX", "www.example.org");
tracker.TrackPageView(HttpContext, "My API - Create", "api/create");

The sad part is: this code quickly clutters all your action methods. No worries! There’s an ActionFilter for that!

[ActionTracking("UA-XXXXXX-XX", "www.example.org")]
public class ApiController
    : Controller
{
    public JsonResult Create()
    {
        return Json(true);
    }
}

And what’s better: you can register it globally and optionally filter it to only track specific controllers and actions!

public class MvcApplication : System.Web.HttpApplication
{
    public static void RegisterGlobalFilters(GlobalFilterCollection filters)
    {
        filters.Add(new HandleErrorAttribute());
        filters.Add(new ActionTrackingAttribute(
            "UA-XXXXXX-XX", "www.example.org",
            action => action.ControllerDescriptor.ControllerName == "Api")
        );
    }
}

And here’s what it could look like (we’re only tracking for the second day now…):

[Screenshot: Google Analytics showing tracked WCF Web API calls]

We even have stats about the versions of the NuGet Command Line used to access our API!

[Screenshot: Google Analytics showing NuGet Command Line versions hitting the API]

Enjoy! And fork this thing and add out-of-the-box support for WCF Web API, Nancy or even “plain old” WCF or ASMX!