Maarten Balliauw {blog}

ASP.NET, ASP.NET MVC, Windows Azure, PHP, ...

PHP Managed Extensibility Framework – PHPMEF

While sitting in the airplane to the Microsoft Web Developer Summit in Seattle, I was watching some PDC09 sessions on my laptop. During the MEF session, an idea popped up: there is no MEF for PHP! 3500 kilometers after that moment, PHP got its own MEF…

What is MEF about?

MEF is a .NET library targeting extensibility of projects. It allows you to declaratively extend your application instead of requiring you to do a lot of plumbing. All this is done with three concepts in mind: export, import and compose. (Glenn, I stole the previous sentence from your blog.) “PHPMEF” uses the same concepts to provide the same extensibility features.

Let’s start with a story… Imagine you are building a Calculator. Yes, shoot me, this is not a sexy sample. Remember I wrote this one on a plane with snoring people next to me… The Calculator is built from zero or more ICalculationFunction instances. Think command pattern. Here’s what such an interface can look like:

[code:c#]

interface ICalculationFunction
{
    public function execute($a, $b);
}

[/code]

Nothing special yet. Now let’s implement an instance which does sums:

[code:c#]

class Sum implements ICalculationFunction
{
    public function execute($a, $b)
    {
        return $a + $b;
    }
}

[/code]

Now how would you go about using this in the following Calculator class:

[code:c#]

class Calculator
{
    public $CalculationFunctions;
}

[/code]

Yes, you would do plumbing. Either instantiating the Sum object and adding it into the Calculator constructor, or something similar. Imagine you also have a Division object. And other calculation functions. How would you go about building this in a maintainable and extensible way? Easy: use exports…

Export

Exports are one of the three fundamentals of PHPMEF. Basically, you can specify that you want class X to be “exported” for extensibility. Let’s export Sum:

[code:c#]

/**
  * @export ICalculationFunction
  */
class Sum implements ICalculationFunction
{
    public function execute($a, $b)
    {
        return $a + $b;
    }
}

[/code]

Sum is exported as Sum by default, but in this case I want PHPMEF to know that it is also exported as ICalculationFunction. Let’s see why this matters in the import part…

Import

Import is a concept required for PHPMEF to know where to instantiate specific objects. Here’s an example:

[code:c#]

class Calculator
{
    /**
      * @import ICalculationFunction
      */
    public $SomeFunction;
}

[/code]

In this case, PHPMEF will simply instantiate the first ICalculationFunction instance it can find and assign it to the Calculator::SomeFunction variable. Now think of our first example: we want different calculation functions in our calculator! Here’s how:

[code:c#]

class Calculator
{
    /**
      *  @import-many ICalculationFunction
      */
    public $CalculationFunctions;
}

[/code]

Easy, no? PHPMEF will ensure that all possible ICalculationFunction instances are added to the Calculator::CalculationFunctions array. Now how is all this being plumbed together? It’s not plumbed! It’s composed!

Compose

Composing matches all exports and imports in a specific application path. How? Easy! Use the PartInitializer!

[code:c#]

// Create new Calculator instance
$calculator = new Calculator();

// Satisfy dynamic imports
$partInitializer = new Microsoft_MEF_PartInitializer();
$partInitializer->satisfyImports($calculator);

[/code]

Easy, no? Ask the PartInitializer to satisfy all imports and you are done!

Advanced usage scenarios

The above sample was used to demonstrate what PHPMEF is all about. I’m sure you can imagine more complex scenarios. Here are some other possibilities…

Single instance exports

By default, PHPMEF instantiates a new object every time an import has to be satisfied. However, imagine you want our Sum class to be re-used. You want PHPMEF to assign the same instance over and over again, no matter where and how often it is imported. Again, no plumbing. Just add a declarative comment:

[code:c#]

/**
  * @export ICalculationFunction
  * @export-metadata singleinstance
  */
class Sum implements ICalculationFunction
{
    public function execute($a, $b)
    {
        return $a + $b;
    }
}

[/code]

Export/import metadata

Imagine you want to work with interfaces like mentioned above, but want to use a specific implementation that has certain metadata defined. Again: easy and no plumbing!

My calculator might look like the following:

[code:c#]

class Calculator
{
    /**
      *  @import-many ICalculationFunction
      */
    public $CalculationFunctions;

    /**
      *  @import ICalculationFunction
      *  @import-metadata CanDoSums
      */
    public $SomethingThatCanDoSums;
}

[/code]

Calculator::SomethingThatCanDoSums is now constrained: I only want to import something that has the metadata “CanDoSums” attached. Here’s how to create such an export:

[code:c#]

/**
  * @export ICalculationFunction
  * @export-metadata CanDoSums
  */
class Sum implements ICalculationFunction
{
    public function execute($a, $b)
    {
        return $a + $b;
    }
}

[/code]

Here’s an answer to a question you may have: yes, multiple metadata definitions are possible and will be used to determine if an export matches an import.

One small note left: you can also ask the PartInitializer for the metadata defined on a class.

[code:c#]

// Create new Calculator instance
$calculator = new Calculator();

// Satisfy dynamic imports
$partInitializer = new Microsoft_MEF_PartInitializer();
$partInitializer->satisfyImports($calculator);

// Get metadata
$metadata = $partInitializer->getMetadataForClass('Sum');

[/code]

Can I get the source?

No, not yet, for a number of reasons. I first want to make this thing a bit more stable and decide whether all MEF features should be ported. I’m also looking for an appropriate name/library to put this in. You may have noticed the Microsoft_* naming, a small hint to the Interop team to incorporate this as another Microsoft library in the PHP world. Yes Vijay, talking to you :-)

Simple API for Cloud Application Services

Zend, in co-operation with IBM, Microsoft, Rackspace, GoGrid and other cloud leaders, has today released its Simple API for Cloud Application Services project. The Simple Cloud API project empowers developers to use one interface to interact with the cloud services offered by different vendors. These vendors are all contributing to this open source project, making sure the Simple Cloud API “fits like a glove” on top of their services.

Zend Cloud adapters will be available for services such as:

  • File storage services, including Windows Azure blobs, Rackspace Cloud Files, Nirvanix Storage Delivery Network and Amazon S3
  • Document Storage services, including Windows Azure tables and Amazon SimpleDB
  • Simple queue services, including Amazon SQS and Windows Azure queues

Note that the Simple Cloud API is focused on providing a simple and re-usable interface across different cloud services. This implies that service-specific features will not be available through the Simple Cloud API.

Here’s a quick code sample for the Simple Cloud API. Let’s upload some data and list the items in a Windows Azure Blob Storage container using the Simple Cloud API:

[code:c#]

require_once('Zend/Cloud/Storage/WindowsAzure.php');

// Create an instance
$storage = new Zend_Cloud_Storage_WindowsAzure(
    'zendtest',
    array(
        'host' => 'blob.core.windows.net',
        'accountname' => 'xxxxxx',
        'accountkey' => 'yyyyyy'
    )
);

// Create some data and upload it
$item1 = new Zend_Cloud_Storage_Item('Hello World!', array('creator' => 'Maarten'));
$storage->storeItem($item1, 'data/item.txt');

// Now download it!
$item2 = $storage->fetchItem('data/item.txt', array('returntype' => 2));
var_dump($item2);

// List items
var_dump(
$storage->listItems()
);

[/code]

It’s quite fun to be a part of this kind of thing: I started working for Microsoft on the Windows Azure SDK for PHP, we contributed the same codebase to Zend Framework, and now I’m building the Windows Azure implementations for the Simple Cloud API.

The full press release can be found at the Simple Cloud API website.

ASP.NET MVC MvcSiteMapProvider 1.0 released

Back in March, I blogged about an experimental MvcSiteMap provider I was building. Today, I am proud to announce that it is stable enough to call it version 1.0! Download MvcSiteMapProvider 1.0 over at CodePlex.

Ever since the source code release I did back in March, a lot of new features have been added, such as HtmlHelper extension methods, attributes, dynamic parameters, … I’ll leave most of them up to you to discover, but there are some I want to quickly highlight.

ACL module extensibility

By default, MvcSiteMap will make nodes visible or invisible based on [Authorize] attributes that are placed on controllers or action methods. If you have implemented your own authentication mechanism, this may no longer be the best way to show or hide sitemap nodes. By implementing and registering the IMvcSiteMapAclModule interface, you can now plug in your own visibility logic.

[code:c#]

public interface IMvcSiteMapAclModule
{
    /// <summary>
    /// Determine if a node is accessible for a user
    /// </summary>
    /// <param name="provider">The MvcSiteMapProvider requesting the current method</param>
    /// <param name="context">Current HttpContext</param>
    /// <param name="node">SiteMap node</param>
    /// <returns>True/false if the node is accessible</returns>
    bool IsAccessibleToUser(MvcSiteMapProvider provider, HttpContext context, SiteMapNode node);
}

[/code]
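
To give an idea of what an implementation could look like, here is a minimal sketch. The class name and the visibility rule are made up for this example; how the module is registered depends on the provider configuration and is not shown here:

[code:c#]

public class RoleBasedAclModule : IMvcSiteMapAclModule
{
    /// <summary>
    /// Hypothetical rule: anonymous users only see nodes that have no
    /// roles defined; authenticated users see everything.
    /// </summary>
    public bool IsAccessibleToUser(MvcSiteMapProvider provider, HttpContext context, SiteMapNode node)
    {
        if (context.User == null || !context.User.Identity.IsAuthenticated)
        {
            return node.Roles == null || node.Roles.Count == 0;
        }

        return true;
    }
}

[/code]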

Dynamic parameters

Quite often, action methods have parameters that are not really bound to a sitemap node. For instance, take a paging parameter. You may ignore this one safely when determining the active sitemap node: /Products/List?page=1 and /Products/List?page=2 should both have the same menu item highlighted. This is where dynamic parameters come in handy: MvcSiteMap will completely ignore the specified parameters when determining the current node.

[code:c#]

<mvcSiteMapNode title="Products" controller="Products" action="List" isDynamic="true" dynamicParameters="page" />

[/code]

The above sitemap node will always be highlighted, whatever the value of “page” is.
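
To make the scenario concrete, here is a hypothetical ProductsController matching the node above; the page parameter only drives paging and has no influence on which sitemap node is considered current:

[code:c#]

public class ProductsController : Controller
{
    // Both /Products/List?page=1 and /Products/List?page=2 resolve to the
    // same "Products" sitemap node, since "page" is declared as a dynamic
    // parameter in the sitemap above.
    public ActionResult List(int? page)
    {
        int currentPage = page ?? 1;

        // ... fetch the requested page of products here ...

        return View();
    }
}

[/code]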

SiteMapTitle action filter attribute

In some situations, you may want to dynamically change the SiteMap.CurrentNode.Title in an action method. This can be done manually by setting SiteMap.CurrentNode.Title, or by adding the SiteMapTitle action filter attribute.

Imagine you are building a blog and want to use the Blog’s Headline property as the site map node title. You can use the following snippet:

[code:c#]

[SiteMapTitle("Headline")]
public ViewResult Show(int blogId) {
   var blog = _repository.Find(blogIdId);
   return blog;
}

[/code]

You can also use a non-strong typed ViewData value as the site map node title:

[code:c#]

[SiteMapTitle("SomeKey")]
public ViewResult Show(int blogId) {
   ViewData["SomeKey"] = "This will be the title";

   var blog = _repository.Find(blogIdId);
   return blog;
}

[/code]

HtmlHelper extension methods

MvcSiteMap provides several HtmlHelper extension methods which you can use to generate SiteMap-specific HTML code in your ASP.NET MVC views. Here's a list of the available HtmlHelper extension methods, followed by a short usage example:

  • HtmlHelper.Menu() - Can be used to generate a menu
  • HtmlHelper.SiteMap() - Can be used to generate a list of all pages in your sitemap
  • HtmlHelper.SiteMapPath() - Can be used to generate a so-called "breadcrumb trail"
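
As a quick example, the following lines in a view or master page render a breadcrumb trail and a menu using the helpers' default markup (a minimal sketch; the parameterless overloads are assumed, and the generated HTML can be customized):

[code:c#]

<!-- Breadcrumb trail for the current sitemap node -->
<%= Html.SiteMapPath() %>

<!-- Main navigation menu generated from the sitemap -->
<%= Html.Menu() %>

[/code]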

The MvcSiteMap release can be found on CodePlex.

SQL Azure Manager

A few days ago, the SQL Server Team announced the availability of three major CTPs and several new upcoming projects in the SQL-related family tree: SQL Server 2008 R2, SQL Server StreamInsight and SQL Azure. Now that last one is interesting: Microsoft will offer a 1GB or 10GB database “in the cloud” for a good price.

Currently, SQL Azure is in CTP and will undergo some more development. Of course, I wanted to play with this, but… connecting to the thing using SQL Server Management Studio is not the most intuitive and straightforward task. It’s more of a workaround. Juliën Hanssens, a colleague of mine, was going crazy over this. Being a good colleague, I poured some tea into the guy and he came up with the SQL Azure Manager: a community effort to quickly enable connecting to your SQL Azure database(s) and performing basic tasks.

SQL Azure Manager

And it does just that. Note that it is a first conceptual release. And that it is still a bit unstable. But it does the trick. At least at a bare minimum. And for the time being that is enough. Want to play with it? Check Juliën’s ClickOnce page!

Note that this thing will become open-source in the future, after he finds a good WF designer to do the main UI. Want to help him? Use the submit button!

Shared Access Signatures and PHP SDK for Windows Azure

The latest Windows Azure storage release featured a new concept: “Shared Access Signatures”. The idea is that you can create signatures for specific resources in blob storage, providing more granular access than the default “all-or-nothing” approach taken by Azure blob storage. Steve Marx posted a sample on this, demonstrating how you can provide read access to a blob for a specified number of minutes, after which the access is revoked.

The PHP SDK for Windows Azure is now equipped with a credentials mechanism based on Shared Access Signatures. Let’s see if we can demonstrate how this would work…

A quick start…

If we take Steve’s Wazdrop sample and upload a few files, we get back a set of signed URLs:

https://wazdrop.blob.core.windows.net/files/7bf9417f-c405-4042-8f99-801acb1ea494?st=2009-08-17T08%3A52%3A48Z&se=2009-08-17T09%3A52%3A48Z&sr=b&sp=r&sig=Zcngfaq60OXtLxcsTjmPXUL9Q4Rj3zTPmW40eARVYxU%3D

https://wazdrop.blob.core.windows.net/files/d30769f6-35b9-4337-8c34-014ff590b18f?st=2009-08-17T08%3A54%3A19Z&se=2009-08-17T09%3A54%3A19Z&sr=b&sp=r&sig=Mm8CnmI3XXVbJ6y0FN9WfAOknVySfsF5jIA55drZ6MQ%3D

If we take a detailed look, the Azure account name used is “wazdrop”, and we have access to 2 files in Steve’s storage account, namely “7bf9417f-c405-4042-8f99-801acb1ea494” and “d30769f6-35b9-4337-8c34-014ff590b18f” in the “files” container.

Great! But if I want to use the PHP SDK for Windows Azure to access the resources above, how would I do that? Well, that should not be difficult. Instantiate a new Microsoft_Azure_Storage_Blob client, and pass it a new Microsoft_Azure_SharedAccessSignatureCredentials instance:

[code:c#]

$storageClient = new Microsoft_Azure_Storage_Blob('blob.core.windows.net', 'wazdrop', '');
$storageClient->setCredentials(
    new Microsoft_Azure_SharedAccessSignatureCredentials('wazdrop', '')
);

[/code]

One thing to notice here: we do know the storage account (“wazdrop”), but not Steve’s shared key to his storage account. Which is good for him, otherwise I would be able to manage all containers and blobs in his account.

With the above code sample, every action I invoke will now fail. Every getBlob(), putBlob(), createContainer(), … call fails because I cannot authenticate! Fortunately, Steve’s application provided me with two URLs that I can use to read two blobs. Let’s set these as permissions on our storage client:

[code:c#]

$storageClient->getCredentials()->setPermissionSet(array(
    'https://wazdrop.blob.core.windows.net/files/7bf9417f-c405-4042-8f99-801acb1ea494?st=2009-08-17T08%3A52%3A48Z&se=2009-08-17T09%3A52%3A48Z&sr=b&sp=r&sig=Zcngfaq60OXtLxcsTjmPXUL9Q4Rj3zTPmW40eARVYxU%3D',
    'https://wazdrop.blob.core.windows.net/files/d30769f6-35b9-4337-8c34-014ff590b18f?st=2009-08-17T08%3A54%3A19Z&se=2009-08-17T09%3A54%3A19Z&sr=b&sp=r&sig=Mm8CnmI3XXVbJ6y0FN9WfAOknVySfsF5jIA55drZ6MQ%3D'
));

[/code]

We now have instructed the PHP SDK for Windows Azure that we have read permissions on two blobs, and can now use regular API calls to retrieve these blobs:

[code:c#]

$storageClient->getBlob('files', '7bf9417f-c405-4042-8f99-801acb1ea494', 'C:\downloadedfile1.txt');
$storageClient->getBlob('files', 'd30769f6-35b9-4337-8c34-014ff590b18f', 'C:\downloadedfile2.txt');

[/code]

The PHP SDK for Windows Azure will now take care of checking if a permission URL matches the call that is being made, and inject the signatures automatically.

A bit more advanced…

The above sample demonstrated how Shared Access Signatures are implemented in the PHP SDK for Windows Azure, but it did not yet demonstrate all the “coolness”. Let’s say the owner of a storage account named “phpstorage” has a private container named “phpazuretestshared1”, and that this owner wants to allow you to put some blobs in this container. Since the owner does not want to give you full access, nor wants to make the container public, he issues a Shared Access Signature:

http://phpstorage.blob.core.windows.net/phpazuretestshared1?st=2009-08-17T09%3A06%3A17Z&se=2009-08-17T09%3A56%3A17Z&sr=c&sp=w&sig=hscQ7Su1nqd91OfMTwTkxabhJSaspx%2BD%2Fz8UqZAgn9s%3D

This one allows us to write in the container “phpazuretestshared1” on account “phpstorage”. Now let’s see if we can put some blobs in there!

[code:c#]

$storageClient = new Microsoft_Azure_Storage_Blob('blob.core.windows.net', 'phpstorage', '');
$storageClient->setCredentials(
    new Microsoft_Azure_SharedAccessSignatureCredentials('phpstorage', '')
);

$storageClient->getCredentials()->setPermissionSet(array(
    'http://phpstorage.blob.core.windows.net/phpazuretestshared1?st=2009-08-17T09%3A06%3A17Z&se=2009-08-17T09%3A56%3A17Z&sr=c&sp=w&sig=hscQ7Su1nqd91OfMTwTkxabhJSaspx%2BD%2Fz8UqZAgn9s%3D'
));

$storageClient->putBlob('phpazuretestshared1', 'NewBlob.txt', 'C:\Files\dataforazure.txt');

[/code]

Did you see what happened? We did not specify an explicit permission to write to a specific blob. Instead, the PHP SDK for Windows Azure determined that a permission was required to either write to that specific blob, or to write to its container. Since we only had a signature for the latter, it chose those credentials to perform the request on Windows Azure blob storage.

Query the cloud with PHP (PHPLinq and Windows Azure)

I’m pleased to announce that PHPLinq now supports basic querying of Windows Azure Table Storage. PHPLinq is a class library for PHP, based on the idea of Microsoft’s LINQ technology. LINQ is short for language-integrated query, a component in the .NET framework which enables you to perform queries on a variety of data sources like arrays, XML, SQL Server, ... These queries are defined using a syntax which is very similar to SQL.

In addition to querying arrays, XML and objects, which was already supported, PHPLinq now enables you to query Windows Azure Table Storage in the same manner as you would query a list of employees: simply pass PHPLinq a Table Storage client and a table name as a storage hint in the in() method:

[code:c#]

$result = from('$employee')->in( array($storageClient, 'employees', 'AzureEmployee') )
            ->where('$employee => $employee->Name == "Maarten"')
            ->select('$employee');

[/code]

The Windows Azure Table Storage layer is provided by Microsoft’s PHP SDK for Windows Azure and leveraged by PHPLinq to enable querying “the cloud”.

How we built TwitterMatic.net - Part 5: the front-end

“After having found a god-like guardian for his application, Knight Maarten The Brave Coffeedrinker found out that his application still had no functional front-end. It’s OK to have a guardian and a barn in the cloud, but if there’s nothing to guard, this is a bit useless. Having asked the carpenter and the smith of the village, our knight decided that the so-called “ASP.NET MVC” framework might help in his quest.”

This post is part of a series on how we built TwitterMatic.net. Other parts:

The front-end

In part 2 of this series, we already created the basic ASP.NET MVC structure in the web role project. There are a few action methods and views to create: we need one for displaying our scheduled tweets and one for scheduling a tweet. We’ll concentrate on the latter in this post.

Action methods

The Create action method will look like this:

[code:c#]

// GET: /Tweet/Create

public ActionResult Create()
{
    TimedTweet tweet = new TimedTweet();

    ViewData["SendOnDate"] = tweet.SendOn.ToShortDateString();
    ViewData["SendOnTime"] = tweet.SendOn.ToShortTimeString();

    return View(tweet);
}

// POST: /Tweet/Create

[AcceptVerbs(HttpVerbs.Post)]
public ActionResult Create(int UtcOffset, string SendOnDate, string SendOnTime, FormCollection collection)
{
    TimedTweet tweet = new TimedTweet(this.User.Identity.Name);
    try
    {
        tweet.SendOn = DateTime.Parse(SendOnDate + " " + SendOnTime).AddMinutes(UtcOffset);

        // Ensure we have a valid SendOn date
        if (!TimedTweetValidation.ValidateFutureDate(tweet.SendOn))
        {
            ModelState.AddModelError("SendOn", "The scheduled time should be in the future.");
        }

        if (this.TryUpdateModel(tweet, new string[] { "Status" }, collection.ToValueProvider()) && ModelState.IsValid)
        {
            // ...
            Repository.Insert(this.User.Identity.Name, tweet);

            return RedirectToAction("Index");
        }
        else
        {
            // ...
            return View(tweet);
        }
    }
    catch
    {
        // ...
        return View(tweet);
    }
}

[/code]

As you can see, we’re doing the regular GET/POST differentiation here: GET to show the Create view, POST to actually do something with user-entered data. Nothing too fancy in the code, just passing some data to the repository we created in an earlier post.

Views

The view for our Create action is slightly more work. I’ve noticed a cool date picker at http://ui.jquery.com, and a cool time picker at http://haineault.com/media/jquery/ui-timepickr/page/. Why not use them both?

Here’s the plain, simple view, no jQuery used:

[code:c#]

<%@ Page Title="" Language="C#" MasterPageFile="~/Views/Shared/Site.Master" Inherits="System.Web.Mvc.ViewPage<TwitterMatic.Shared.Domain.TimedTweet>" %>

<asp:Content ID="title" ContentPlaceHolderID="TitleContent" runat="server">
    Schedule a tweet for <%=User.Identity.Name%>
</asp:Content>

<asp:Content ID="content" ContentPlaceHolderID="MainContent" runat="server">
    <h3>Schedule a tweet for <%=User.Identity.Name%></h3>

    <% if (!ViewData.ModelState.IsValid) { %>
        <%= Html.ValidationSummary("Could not schedule tweet. Please correct the errors and try again.") %>
    <% } %>

    <% using (Html.BeginForm()) {%>
        <%=Html.Hidden("UtcOffset", 0)%>

        <fieldset>
            <legend>Schedule a tweet</legend>
            <p>
                <label for="Status">Message:</label>
                <%= Html.TextArea("Status") %>
                <%= Html.ValidationMessage("Status", "*") %>
                <span id="status-chars-left">140</span>
            </p>
            <p>
                <label for="SendOn">Send on:</label>
                <%= Html.TextBox("SendOnDate", ViewData["SendOnDate"]) %>
                <%= Html.TextBox("SendOnTime", ViewData["SendOnTime"]) %>
            </p>
            <p>
                <button type="submit" value="Schedule">
                    Schedule tweet!
                </button>
            </p>
        </fieldset>

    <% } %>

    <p style="clear: both;">
        <%= Html.ActionLink("Back to list of scheduled tweets", "Index", "Tweet", null, new { @class = "more" })%>
    </p>
    <p style="clear: both;">&nbsp;</p>
</asp:Content>

[/code]

Nothing fancy in there: just a boring data-entry form. Now let’s spice that one up: we’ll add the datepicker and timepicker:

[code:c#]

<script type="text/javascript">
    $(function() {
        $("#SendOnDate").datepicker({ minDate: 0, showAnim: 'slideDown', dateFormat: 'mm/dd/yy' });
        $("#SendOnTime").timepickr({ convention: 12, rangeMin: ['00', '05', '10', '15', '20', '25', '30', '35', '40', '45', '50', '55'] });
    });
</script>

[/code]

We’re telling jQuery to make a datepicker of the DOM element with id #SendOnDate, and to make a timepickr of the element #SendOnTime. Now let’s add some more useful things:

[code:c#]

<script type="text/javascript">
    $(function() {
        calcCharsLeft();
        $("#Status").keyup(function() {
            calcCharsLeft();
        });
    });

    var calcCharsLeft = function() {
        var charsLeft = (140 - $("#Status").val().length);

        $("#status-chars-left").html(charsLeft);
        if (charsLeft < 0) {
            $("#status-chars-left").css('color', 'red');
            $("#status-chars-left").css('font-weight', 'bold');
        } else {
            $("#status-chars-left").css('color', 'white');
            $("#status-chars-left").css('font-weight', 'normal');
        }
    }
</script>

[/code]

jQuery will now do some more things when the page has loaded: we’re telling the browser to call calcCharsLeft every time a key is pressed in the message text area. This way, we can add a fancy character counter next to the text box, which receives different colors when a certain amount of text is entered.

Validation: using DataAnnotations

In the action methods listed earlier in this post, you may have noticed that we are not doing a lot of validation checks. Except for the “Time in the future” check, we’re actually not doing any validation at all!

The reason for not having any validation calls in my controller’s action method is that I’m using a different model binder than the default one: the ASP.NET MVC team’s DataAnnotationsModelBinder. This model binder makes use of the System.ComponentModel.DataAnnotations namespace to perform validation at the moment of binding data to the model. This concept was used for ASP.NET Dynamic Data, recently picked up by the RIA Services team and now also available for ASP.NET MVC.

Basically, what we have to do is decorate our TimedTweet class’s properties with some DataAnnotations:

[code:c#]

public class TimedTweet : TableStorageEntity, IComparable
{
    public string Token { get; set; }
    public string TokenSecret { get; set; }

    [Required(ErrorMessage = "Twitter screen name is required.")]
    public string ScreenName { get; set; }

    [Required(ErrorMessage = "Message is required.")]
    [StringLength(140, ErrorMessage = "Message length must not exceed 140 characters.")]
    public string Status { get; set; }

    [Required(ErrorMessage = "A scheduled time is required.")]
    [CustomValidation(typeof(TimedTweetValidation), "ValidateFutureDate", ErrorMessage = "The scheduled time should be in the future.")]
    public DateTime SendOn { get; set; }
    public DateTime SentOn { get; set; }
    public string SendStatus { get; set; }
    public int RetriesLeft { get; set; }

    public bool Archived { get; set; }

    // ...
}

[/code]

See how easy this is? Add a [Required] attribute to make a property required. Add a [StringLength] attribute to make sure a certain length is not exceeded, … The DataAnnotationsModelBinder will use these hints as a guide to perform validation on your model.
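
For reference, here is roughly how such a model binder gets wired up in Global.asax, assuming the DataAnnotationsModelBinder sample release (the exact namespace may differ depending on the drop you are using):

[code:c#]

protected void Application_Start()
{
    // Replace the default model binder so DataAnnotations attributes
    // are validated automatically during model binding.
    // (The namespace is an assumption based on the sample release in use.)
    ModelBinders.Binders.DefaultBinder =
        new Microsoft.Web.Mvc.DataAnnotations.DataAnnotationsModelBinder();

    RegisterRoutes(RouteTable.Routes);
}

[/code]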

Conclusion

We now know how to work with ASP.NET MVC’s future DataAnnotations validation and have implemented this in TwitterMatic.

In the next part of this series, we’ll have a look at the worker role for TwitterMatic.

How we built TwitterMatic.net - Part 4: Authentication and membership

“Knight Maarten The Brave Coffeedrinker just returned from his quest to a barn in the clouds, when he discovered that he forgot to lock the door to his workplace. He immediately asked the digital village’s smith to create a lock and provide him with a key. Our knight returned to his workplace and concluded that using the smith’s lock would be OK, but having the great god of social networking, Twitter, as a guardian seemed like a better idea. “O, Auth!”, he said. And the god provided him with a set of prayers, an API, which our knight could use.”

This post is part of a series on how we built TwitterMatic.net. Other parts:

Authentication and membership

Why reinvent the wheel when we already have so many wheel manufacturers? I’m really convinced that from now on, nobody should EVER provide subscribe/login/password-retrieval/… functionality to their users again! Use OpenID, or Live ID, or Google Accounts, or JanRain’s RPX, which bundles all types of existing authentication mechanisms. Did you hear me? NEVER provide your own authentication mechanism again, unless you have a solid reason for it!

Since we’re building an application for Twitter, and Twitter provides an OAuth API for delegated authentication, why not use OAuth? As a start, here’s the flow that has to be respected when working with OAuth.

OAuth request flow diagram

Now let’s build this in to our application…

Implementing OAuth in TwitterMatic

First of all: if you are developing something and it involves a third-party product or service, chances are there’s something useful for you on CodePlex. In TwitterMatic’s case, that useful tool is LINQ to Twitter, providing an OAuth implementation as well as a full API for the Twitter REST services. Thank you, JoeMayo!

The only thing we still have to do in order for TwitterMatic OAuth authentication to work is to create an AccountController in the web role project. Let’s start with a Login action method:

[code:c#]

IOAuthTwitter oAuthTwitter = new OAuthTwitter();
oAuthTwitter.OAuthConsumerKey = configuration.ReadSetting("OAuthConsumerKey");
oAuthTwitter.OAuthConsumerSecret = configuration.ReadSetting("OAuthConsumerSecret");

if (string.IsNullOrEmpty(oauth_token)) {
    // Not authorized. Redirect to Twitter!
    string loginUrl = oAuthTwitter.AuthorizationLinkGet(
        configuration.ReadSetting("OAuthRequestTokenUrl"),
        configuration.ReadSetting("OAuthAuthorizeTokenUrl"),
        false,
        true
    );
    return Redirect(loginUrl);
}

[/code]

Our users will now be redirected to Twitter in order to authenticate if the method receives an empty or invalid oauth_token. If, however, we do retrieve a valid token, we’ll use FormsAuthentication cookies to keep the user logged in on TwitterMatic as well. Note that we are also saving the authentication token as the user’s password; we’ll need this same token to post updates afterwards.

[code:c#]

// Should be authorized. Get the access token and secret.
string userId = "";
string screenName = "";

oAuthTwitter.AccessTokenGet(oauth_token, configuration.ReadSetting("OAuthAccessTokenUrl"),
    out screenName, out userId);
if (oAuthTwitter.OAuthTokenSecret.Length > 0)
{
    // Store the user in membership
    MembershipUser user = Membership.GetUser(screenName);
    if (user == null)
    {
        MembershipCreateStatus status = MembershipCreateStatus.Success;
        user = Membership.CreateUser(
            screenName,
            oAuthTwitter.OAuthToken + ";" + oAuthTwitter.OAuthTokenSecret,
            screenName, 
            "twitter", 
            "matic",
            true,
            out status);
    }

    // Change user's password
    user.ChangePassword(
        user.GetPassword("matic"),
        oAuthTwitter.OAuthToken + ";" + oAuthTwitter.OAuthTokenSecret
    );
    Membership.UpdateUser(user);

    // All is well!
    FormsAuthentication.SetAuthCookie(screenName, true);
    return RedirectToAction("Index", "Home");
}
else
{
    // Not OK...
    return RedirectToAction("Login");
}

[/code]

Here’s the full code to AccountController:

[code:c#]

[HandleError]
public class AccountController : Controller
{
    protected IConfigurationProvider configuration;

    public ActionResult Login(string oauth_token)
    {
        IOAuthTwitter oAuthTwitter = new OAuthTwitter();
        oAuthTwitter.OAuthConsumerKey = configuration.ReadSetting("OAuthConsumerKey");
        oAuthTwitter.OAuthConsumerSecret = configuration.ReadSetting("OAuthConsumerSecret");

        if (string.IsNullOrEmpty(oauth_token)) {
            // Not authorized. Redirect to Twitter!
            string loginUrl = oAuthTwitter.AuthorizationLinkGet(
                configuration.ReadSetting("OAuthRequestTokenUrl"),
                configuration.ReadSetting("OAuthAuthorizeTokenUrl"),
                false,
                true
            );
            return Redirect(loginUrl);
        } else {
            // Should be authorized. Get the access token and secret.
            string userId = "";
            string screenName = "";

            oAuthTwitter.AccessTokenGet(oauth_token, configuration.ReadSetting("OAuthAccessTokenUrl"),
                out screenName, out userId);
            if (oAuthTwitter.OAuthTokenSecret.Length > 0)
            {
                // Store the user in membership
                MembershipUser user = Membership.GetUser(screenName);
                if (user == null)
                {
                    MembershipCreateStatus status = MembershipCreateStatus.Success;
                    user = Membership.CreateUser(
                        screenName,
                        oAuthTwitter.OAuthToken + ";" + oAuthTwitter.OAuthTokenSecret,
                        screenName,
                        configuration.ReadSetting("OAuthConsumerKey"),
                        configuration.ReadSetting("OAuthConsumerSecret"),
                        true,
                        out status);
                }

                // Change user's password
                user.ChangePassword(
                    user.GetPassword(configuration.ReadSetting("OAuthConsumerSecret")),
                    oAuthTwitter.OAuthToken + ";" + oAuthTwitter.OAuthTokenSecret
                );
                Membership.UpdateUser(user);

                // All is well!
                FormsAuthentication.SetAuthCookie(screenName, true);
                return RedirectToAction("Index", "Home");
            }
            else
            {
                // Not OK...
                return RedirectToAction("Login");
            }
        }
    }

    public ActionResult Logout()
    {
        FormsAuthentication.SignOut();
        return RedirectToAction("Index", "Home");
    }
}

[/code]

Using ASP.NET provider model

The ASP.NET provider model provides abstractions for features like membership, roles, sessions, … Since we’ll be using membership to store authenticated users, we’ll need a provider that works with Windows Azure Table Storage. The Windows Azure SDK samples contain a project “AspProviders”. Reference it, and add the following to your web.config:

[code:xml]

<?xml version="1.0"?>
<configuration>
  <!-- ... -->

  <appSettings>
    <add key="DefaultMembershipTableName" value="Membership" />
    <add key="DefaultRoleTableName" value="Roles" />
    <add key="DefaultSessionTableName" value="Sessions" />
    <add key="DefaultProviderApplicationName" value="TwitterMatic" />
    <add key="DefaultProfileContainerName" />
    <add key="DefaultSessionContainerName" />
  </appSettings>

  <connectionStrings />

  <system.web>
    <!-- ... -->

    <authentication mode="Forms">
      <forms loginUrl="~/Account/Login" />
    </authentication>

    <membership defaultProvider="TableStorageMembershipProvider">
      <providers>
        <clear/>
        <add name="TableStorageMembershipProvider"
             type="Microsoft.Samples.ServiceHosting.AspProviders.TableStorageMembershipProvider"
             description="Membership provider using table storage"
             applicationName="TwitterMatic"
             enablePasswordRetrieval="true"
             enablePasswordReset="true"
             requiresQuestionAndAnswer="true"
             minRequiredPasswordLength="1"
             minRequiredNonalphanumericCharacters="0"
             requiresUniqueEmail="false"
             passwordFormat="Clear" />
      </providers>
    </membership>

    <profile enabled="false" />

    <roleManager enabled="true" defaultProvider="TableStorageRoleProvider" cacheRolesInCookie="false">
      <providers>
        <clear/>
        <add name="TableStorageRoleProvider"
             type="Microsoft.Samples.ServiceHosting.AspProviders.TableStorageRoleProvider"
             description="Role provider using table storage"
             applicationName="TwitterMatic" />
      </providers>
    </roleManager>

    <sessionState mode="Custom" customProvider="TableStorageSessionStateProvider">
      <providers>
        <clear />
        <add name="TableStorageSessionStateProvider"
             type="Microsoft.Samples.ServiceHosting.AspProviders.TableStorageSessionStateProvider"
             applicationName="TwitterMatic" />
      </providers>
    </sessionState>

    <!-- ... -->
  </system.web>

  <!-- ... -->
</configuration>

[/code]

TwitterMatic should now be storing sessions (if we were to use them), membership and roles in the cloud, by just doing some configuration magic. I love ASP.NET for this!

Conclusion

We now know how to leverage third-party authentication (OAuth in our case) and have implemented this in TwitterMatic.

In the next part of this series, we’ll have a look at the ASP.NET MVC front end and how we can validate user input before storing it in our database.

How we built TwitterMatic.net - Part 3: Store data in the cloud

“After setting up his workplace, knight Maarten The Brave Coffeedrinker thought of something else: if a farmer wants to keep a lot of hay, he needs a barn, right? Since the cloudy application would also need to keep things that can be used by the digital villagers, our knight needs a barn in the clouds. Looking at the azure sky, an idea popped into the knight’s head: why not use the Windows Azure storage service? It’s a barn that’s always there, a barn that can catch fire and will still have its stored items located in a second barn (and a third). Knight Maarten The Brave Coffeedrinker jumped on his horse and went on a quest, a quest in the clouds.”

This post is part of a series on how we built TwitterMatic.net. Other parts:

Store data in the cloud

Windows Azure offers 3 types of cloud storage: blobs, tables and queues. Blob Storage stores sets of binary data, organized in containers of your storage account. Table Storage offers structured storage in the form of tables. The Queue service stores an unlimited number of messages, each of which can be up to 8 KB in size.

Let’s look back at the TwitterMatic architecture:

“The worker role will monitor the table storage for scheduled Tweets. If it’s time to send them, the Tweet will be added to a queue. This queue is then processed by another thread in the worker role, which will publish the Tweet to Twitter. ”

This means we’ll be using two out of three storage types: Table Storage and Queue Storage. Problem: these services are offered as RESTful services, somewhere in the cloud. Solution: use the StorageClient project located in the Windows Azure SDK’s samples directory!

The StorageClient project contains .NET APIs that work with the blob, table and queue storage services provided by Windows Azure. For Table Storage, StorageClient provides an extension on top of Astoria (the ADO.NET Data Services Framework, but I still say Astoria because it’s easier to type and say…). This makes it easier for you as a developer to use existing knowledge, from LINQ to SQL or Entity Framework or Astoria, to develop Windows Azure applications.

Setting up StorageClient

Add a reference to the StorageClient project to your WebRole project. Next, add some settings to the ServiceConfiguration.cscfg file:

[code:xml]

<?xml version="1.0"?>
<ServiceConfiguration serviceName="TwitterMatic" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
  <Role name="WebRole">
    <Instances count="1"/>
    <ConfigurationSettings>
      <Setting name="AccountName" value="twittermatic"/>
      <Setting name="AccountSharedKey" value=”..."/>
      <Setting name="BlobStorageEndpoint" value="http://blob.core.windows.net"/>
      <Setting name="QueueStorageEndpoint" value = "http://queue.core.windows.net"/>
      <Setting name="TableStorageEndpoint" value="http://table.core.windows.net"/>
      <Setting name="allowInsecureRemoteEndpoints" value="true"/>
    </ConfigurationSettings>
  </Role>
  <Role name="WorkerRole">
    <Instances count="1"/>
    <ConfigurationSettings>
      <Setting name="AccountName" value="twittermatic"/>
      <Setting name="AccountSharedKey" value=”..."/>
      <Setting name="BlobStorageEndpoint" value="http://blob.core.windows.net"/>
      <Setting name="QueueStorageEndpoint" value = "http://queue.core.windows.net"/>
      <Setting name="TableStorageEndpoint" value="http://table.core.windows.net"/>
      <Setting name="allowInsecureRemoteEndpoints" value="true"/>
    </ConfigurationSettings>
  </Role>
</ServiceConfiguration>

[/code]

This way, both the web and worker role know where to find their data (URI) and how to authenticate (account name and shared key).

Working with tables

We’ll only be using one domain class in our entire project: TimedTweet. This class represents a scheduled Twitter update, containing information required to schedule the update. Here’s a list of properties:

  • Token: A token used to authenticate against Twitter
  • TokenSecret: A second token used to authenticate against Twitter
  • ScreenName: Twitter screen name of the user.
  • Status: The message to publish on Twitter.
  • SendOn: Time to send the message.
  • SentOn: Time the message was sent.
  • SendStatus: A status message (pending, in progress, published, …)
  • RetriesLeft: How many retries left before giving up on the Twitter update.
  • Archived: Yes/no if the message is archived.

Here’s the code:

[code:c#]

public class TimedTweet : TableStorageEntity, IComparable
{
    public string Token { get; set; }
    public string TokenSecret { get; set; }
    public string ScreenName { get; set; }
    public string Status { get; set; }
    public DateTime SendOn { get; set; }
    public DateTime SentOn { get; set; }
    public string SendStatus { get; set; }
    public int RetriesLeft { get; set; }
    public bool Archived { get; set; }

    public TimedTweet()
        : base()
    {
        SendOn = DateTime.Now.ToUniversalTime();
        Timestamp = DateTime.Now;
        RowKey = Guid.NewGuid().ToString();
        SendStatus = "Scheduled";
        RetriesLeft = 3;
    }

    public TimedTweet(string partitionKey, string rowKey)
        : base(partitionKey, rowKey)
    {
        SendOn = DateTime.Now.ToUniversalTime();
        SendStatus = "Scheduled";
        RetriesLeft = 3;
    }

    public int CompareTo(object obj)
    {
        TimedTweet target = obj as TimedTweet;
        if (target != null) {
            return this.SendOn.CompareTo(target.SendOn);
        }
        return 0;
    }
}

[/code]

Note that our TimedTweet is inheriting TableStorageEntity. This class provides some base functionality for Windows Azure Table Storage.

We’ll also need to work with this class against Table Storage. For that, we can use the TableStorage and TableStorageDataServiceContext classes, like this:

[code:c#]

public List<TimedTweet> RetrieveAllForUser(string screenName) {
    StorageAccountInfo info = StorageAccountInfo.GetDefaultTableStorageAccountFromConfiguration(true);
    TableStorage storage = TableStorage.Create(info);

    storage.TryCreateTable("TimedTweet");

    TableStorageDataServiceContext svc = storage.GetDataServiceContext();
    svc.IgnoreMissingProperties = true;

    List<TimedTweet> result = svc.CreateQuery<TimedTweet>("TimedTweet").Where(t => t.ScreenName == screenName).ToList();
    foreach (var item in result)
    {
        svc.Detach(item);
    }
    return result;
}

[/code]

Using this, we can build a repository class based on ITimedTweetRepository and implemented against Table Storage:

[code:c#]

public interface ITimedTweetRepository
{
    void Delete(string screenName, TimedTweet tweet);
    void Archive(string screenName, TimedTweet tweet);
    void Insert(string screenName, TimedTweet tweet);
    void Update(TimedTweet tweet);
    List<TimedTweet> RetrieveAll(string screenName);
    List<TimedTweet> RetrieveDue(DateTime dueDate);
    TimedTweet RetrieveById(string screenName, string id);
}

[/code]
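
To give an idea of how an implementation of this interface can use the TableStorageDataServiceContext shown above, here is a sketch of a possible Insert method (not the actual TwitterMatic code; the choice of partition key is an assumption):

[code:c#]

public void Insert(string screenName, TimedTweet tweet)
{
    StorageAccountInfo info = StorageAccountInfo.GetDefaultTableStorageAccountFromConfiguration(true);
    TableStorage storage = TableStorage.Create(info);

    storage.TryCreateTable("TimedTweet");

    TableStorageDataServiceContext svc = storage.GetDataServiceContext();

    // Partition per user so all tweets for one screen name live together
    tweet.PartitionKey = screenName;

    svc.AddObject("TimedTweet", tweet);
    svc.SaveChanges();
}

[/code]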

Working with queues

Queues are slightly easier to work with: you can enqueue messages and dequeue messages, and that’s about it. Here’s an example snippet:

[code:c#]

public void EnqueueMessage(string message) {
    StorageAccountInfo info = StorageAccountInfo.GetDefaultTableStorageAccountFromConfiguration(true);

    QueueStorage queueStorage = QueueStorage.Create(info);
    MessageQueue updateQueue = queueStorage.GetQueue("updatequeue");
    if (!updateQueue.DoesQueueExist())
        updateQueue.CreateQueue();

    updateQueue.PutMessage(new Message(message));
}
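
// For completeness, reading a message back off the queue looks similar.
// This is only a sketch: the method names are taken from the StorageClient
// sample library and may differ slightly between SDK releases.
public Message DequeueMessage()
{
    StorageAccountInfo info = StorageAccountInfo.GetDefaultTableStorageAccountFromConfiguration(true);

    QueueStorage queueStorage = QueueStorage.Create(info);
    MessageQueue updateQueue = queueStorage.GetQueue("updatequeue");
    if (!updateQueue.DoesQueueExist())
        updateQueue.CreateQueue();

    // Returns null when no message is available
    return updateQueue.GetMessage();
}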

[/code]

Conclusion

We now know how to use StorageClient and have a location in the cloud to store our data.

In the next part of this series, we’ll have a look at how we can leverage Twitter’s OAuth authentication mechanism in our own TwitterMatic application and make use of some other utilities packed in the Windows Azure SDK.

How we built TwitterMatic.net - Part 2: Creating an Azure project

“Knight Maarten The Brave Coffeedrinker was about to start working on his TwitterMatic application, named after the great god of social networking, Twitter. Before he could start working, he first needed the right tools. He downloaded the Windows Azure SDK, a set of tools recommended by the smith (or was it the carpenter?) of the digital village. Our knight’s work shack was soon ready to start working. The table on which the application would be crafted was still empty. Time for action, the knight thought. And he started working.”

This post is part of a series on how we built TwitterMatic.net. Other parts:

Creating an Azure project

Here we go. After installing the Windows Azure SDK, start a new project: File > New > Project... Next, add a “Web And Worker Cloud Service”, located under Visual C# (or VB...) > Cloud Service > Web And Worker Cloud Service.

The solution now contains three projects:

TwitterMatic Visual Studio Solution

The web role project should be able to run ASP.NET, but not yet ASP.NET MVC. Add /Views, /Controllers, Global.asax.cs, references to System.Web.Mvc, System.Web.Routing, … to make the web role an ASP.NET MVC project. This should work out fine, except for the tooling in Visual Studio. To enable ASP.NET MVC tooling, open the TwitterMatic_WebRole.csproj file using Notepad and add the following: (copy-paste: {603c0e0b-db56-11dc-be95-000d561079b0})

Enable ASP.NET MVC tooling in Windows Azure project
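
In case the screenshot above is hard to read: the GUID goes into the <ProjectTypeGuids> element of the project file, in front of the GUIDs that are already there. Roughly like this (the two other GUIDs shown are the common web application and C# project type GUIDs; keep whatever your own project file already lists):

[code:xml]

<ProjectTypeGuids>{603c0e0b-db56-11dc-be95-000d561079b0};{349c5851-65df-11da-9384-00065b846f21};{fae04ec0-301f-11d3-bf4b-00c04f79efbc}</ProjectTypeGuids>

[/code]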

Visual Studio will prompt to reload the project, allow this by clicking the "Reload" button.

(Note: we could have also created a web role from an ASP.NET MVC project, check my previous series on Azure to see how to do this.)

Sightseeing

True, I went over the previous steps a bit fast, as it is just plumbing ASP.NET MVC into a Windows Azure web role. I have written about this in more detail before; check my previous series on Azure. There are some other things I would like to show you though.

The startup project in a Windows Azure solution contains some strange stuff:

TwitterMatic startup project

This project contains three types of things: the roles in the application, a configuration file and a file describing the required configuration entries. The “Roles” folder contains references to the projects that implement the web role on one side and the worker role on the other. ServiceConfiguration.cscfg contains all configuration entries for the web and worker roles, such as the number of instances (yes, you can add more servers simply by putting a higher integer in there!) and other application settings such as storage account info.

ServiceDefinition.csdef is basically a copy of the other config file, but without the values. If you are wondering why: if someone screws with ServiceConfiguration.cscfg when deploying the application to Windows Azure, the deployment interface will know that there are settings missing or wrong. Perfect if someone else will be doing deployment to production!
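
To make that concrete, here is a trimmed-down sketch of what the corresponding ServiceDefinition.csdef could look like for the settings used in this series (based on the SDK schema of that time; the file generated by Visual Studio is the authoritative version):

[code:xml]

<?xml version="1.0"?>
<ServiceDefinition name="TwitterMatic" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
  <WebRole name="WebRole">
    <!-- input endpoints etc. omitted for brevity -->
    <ConfigurationSettings>
      <Setting name="AccountName" />
      <Setting name="AccountSharedKey" />
      <!-- remaining settings follow the same pattern -->
    </ConfigurationSettings>
  </WebRole>
  <WorkerRole name="WorkerRole">
    <ConfigurationSettings>
      <Setting name="AccountName" />
      <Setting name="AccountSharedKey" />
      <!-- remaining settings follow the same pattern -->
    </ConfigurationSettings>
  </WorkerRole>
</ServiceDefinition>

[/code]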

Conclusion

We now have a full Windows Azure solution, with ASP.NET MVC enabled! Time to grab a prefabricated CSS theme, add a logo, … Drop-in ASP.NET MVC themes can always be found at http://www.asp.net/mvc/gallery. The theme we used should be there as well (thanks for that, theme author!).

In the next part of this series, we’ll have a look at where and how we can store our data.
