HOWTO Install Python3, pip3 & Tornado on Mac

I recently needed to install Python3 on my Mac.  While the bearded Linux masses just seem to know this stuff, or it’s already part of their distro, in Mac-Land by default we’re stuck on Python 2.7.2 and guidance is lacking.

So to save people doing the digging I had to do, here’s a quick HOWTO on installing Python3 on your Mac.  I like understanding how things work, so this post also details where things are installed.

Install Latest Python 3

Download the latest Python3 installer (v3.3.0 at time of writing); take care to get the version appropriate for your OSX version:

Download Python3 here

Follow the on-screen instructions.  Python3 should be successfully installed into /Library/Frameworks/Python.framework/Versions/3.3, and the installer will also create symlinks for python3 in /usr/local/bin – which makes python3 available from the command line.

Check it works

From the command line type:


python3

You should see the Python 3.3 interactive interpreter start up.

Install pip3

Like all good languages Python has a package manager.  Python’s is called pip.  Pip can be used to install packages into the Python framework so they can be used in your programs.

Something that is not at all obvious to the uninitiated is that to use pip with Python3, you need to run the various install scripts against Python3, otherwise everything just installs into the Python 2.7 directory (this is the voice of bitter experience speaking).

To work with pip3 we first need to install distribute_setup.py.  As far as I understand it, distribute_setup.py parses the setup.py script in a Python package and ensures everything is compatible with Python3 (correct me if I’m wrong, Python community 🙂)

Download and run the script as follows.  You’ll need to use sudo on the Mac, as the script needs privileges to write into the /Library directory:


curl -O http://python-distribute.org/distribute_setup.py
sudo python3 distribute_setup.py

Now you should be able to install pip.  Again you’ll need to run the script with python3 (sudo may not be strictly required this time):


curl -O https://raw.github.com/pypa/pip/master/contrib/get-pip.py
sudo python3 get-pip.py

pip should now be successfully installed.  You might be thinking you’re home and dry; however, if you type pip on the command line you’ll probably get Command not found.  You’re going to have to alter your PATH variable to make pip3 available, so add the following line to your .bash_profile:
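Assuming the default framework install location, the line to add is:

```shell
# Put Python 3.3's bin dir (where pip was installed) at the front of the PATH
export PATH=/Library/Frameworks/Python.framework/Versions/3.3/bin:$PATH
```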

Restart your terminal and check it’s working by running the following, checking that (Python 3.3) is appended at the end:


pip --version

If you have an older version of pip installed you should still be able to use it by entering pip-2.7

Install Tornado webserver using pip3

So now we can give our new pip a test drive by installing the Tornado web server.  Again on the Mac we appear to need to use sudo, otherwise strange errors occur:


sudo pip install tornado

You should see tornado being installed successfully into Python3’s site-packages dir /Library/Frameworks/Python.framework/Versions/3.3/lib/python3.3/site-packages

So now we can use Python3 to run our Tornado hello world app, and see it running in the browser
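Our hello-world.py here is just the canonical hello world from the Tornado documentation:

```python
# hello-world.py – the canonical Tornado hello world
import tornado.ioloop
import tornado.web

class MainHandler(tornado.web.RequestHandler):
    def get(self):
        self.write("Hello, world")

application = tornado.web.Application([
    (r"/", MainHandler),
])

if __name__ == "__main__":
    application.listen(8888)  # serve on http://localhost:8888
    tornado.ioloop.IOLoop.instance().start()
```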

So we can run our hello-world.py using python3 and we should see it running successfully in the browser at http://localhost:8888

Good luck

5 lessons from 3 years at a start-up

Some thoughts in no particular order after 3 years at a start-up

Have a plan – sounds obvious, but a weakness of agile is that it can give rise to the illusion that there’s a plan, when in reality the planning is emergent as the iterations and stories float by.  Emergent planning means the team can drift or become distracted, and it’s hard to turn down non-core projects because you can’t point to a strategy or planned delivery.  Plans can be flexible, tested MVP-style, and changed when they’re proved not to be working – but there’s no excuse not to have one.

Then ensure everyone is signed up to the plan.  Even in a small team it’s easy for factions and agendas to emerge; getting everyone pulling in the same direction is non-trivial.

Sales and marketing are waaay more important than devs admit/realise – Make time to support sales and marketing efforts.  Devs love to scoff at sales people with their suits, lines of BS and vague promises.  But the hard fact is that very few successful products have gained market share on technical superiority alone, and the chances are your team is not producing one of them.  You need to think long and hard about your sales and marketing approach.

Only today did I read in the Sunday Times that the publishers of Grand Theft Auto hired Max Clifford to create a media shit-storm regarding the moral failings of the game.  Resulting, of course, in millions of additional sales.

Avoid non-core projects at all costs – Pressure for sales may mean you’re tempted to take on side projects, or do free work in exchange for some kind of marketing exposure. DON’T!!  DON’T EVEN THINK ABOUT IT!!

My experience was that this was a huge distraction, a money-pit and a time-waster – just a generally bad idea that should be pushed back against at all costs.  If you’re tempted and think you can manage it – trust me, it will still be a distraction.  If you’re still tempted, time-box the work hard and ensure all stakeholders understand there’s a maximum amount of time you can afford.

Don’t white-label and abstract features until at least 2 customers ask for them – This is basically a rewording of YAGNI – it’s tempting to assume all customers will want feature X or Y.  However, until you have hard evidence that multiple customers want the same feature, avoid wasting time abstracting it.  This sounds simple but is very difficult to police and make hard/fast decisions about without getting devs’ backs up – kanban boards etc can help here to demonstrate to the team how these tasks add time and cost to the project.

Invest in your team – This doesn’t just mean salaries, it means listening to your employees.  If you notice the team doing a lot of overtime, do something about it.  Encourage R&D, make sure they have some “slack” time, pay for them to attend conferences, encourage them to blog, take them out for dinner.  Encourage experimentation with new technologies.  Allow flexi-time and home-working.

Things like this make a job enjoyable, and mean your team aren’t scouring the job ads.

So in conclusion, as usual, we can say the golden rule is that there are no golden rules, no doubt success can be achieved by ignoring all of the above, but these stuck out to me over the last few years.

See also:

The SDK business is dead – It’s a commodity market now.

SQL Azure – Disaster Recovery

In this post I look at how to set up some tooling to help implement a Disaster Recovery plan for your SQL Azure database.

Fundamentals

The key to any successful DR plan is that it has to be a fire and forget process.  If your DR process involves any manual components – ie Bob from infrastructure needs to push a button at 3pm on Wednesdays, you can guarantee that when disaster strikes you’ll discover Bob hasn’t pushed the button since February.

Thus you want to make sure everything is automated, and you want to hear about it if anything goes wrong.

It’s worth pointing out that every SQL Azure instance is mirrored twice, so it’s highly unlikely you’ll suffer an actual outage or data loss from unexpected downtime.  What we’re doing here is creating a backup in case someone inadvertently deletes the Customers table.  Of course it never hurts to have a backup under your pillow (so to speak) if it helps you sleep at night.

Tooling

Tools you will need:

  • Azure Storage Explorer – for creating containers and browsing your storage account
  • The Azure Import Export Service client – the command line tool we’ll use to export and import the DB

Exporting your SQL Azure DB

The first thing we’re going to do is to export your SQL Azure DB to a blob file.  The blob file can be used to import your backup into a new DB in the event of disaster.

  • If you haven’t already got one, create a new Azure Storage account.  It’s a good idea to create this in a different location from your SQL Azure DB, so in the event of a catastrophic data-centre melt-down your backup will be located far away.  E.g. if your database is in North-Europe set up your Storage Account in East-Asia.
  • Now fire-up Azure Storage Explorer and connect to your new storage account.  Create a new private container for sticking the backups in.  If you don’t create a container you can’t actually save anything into your storage account.

  • Now we can configure Azure Import Export Client to download your DB into your newly created storage account.  This is a command line util which is ideal for automating but for now we’ll just run manually.  Run the following, editing for your specific account details:
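A sketch of an export invocation follows – the flag names are from memory of the DacImportExportCli tool, and the server, credentials and keys are placeholders, so check the tool’s own help output:

```shell
DacImportExportCli.exe -S yourserver.database.windows.net -D MyDb ^
    -U yourlogin -P yourpassword -X ^
    -BLOBURL http://iainsbackups.blob.core.windows.net/dbbackups/MyDb_120820.bacpac ^
    -BLOBACCESSKEY yourStorageAccountKey -ACCESSKEYTYPE storage
```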

  • Important – Make sure the BLOBURL argument correctly specifies your container name, ie -BLOBURL http://iainsbackups.blob.core.windows.net/dbbackups/MyDb_120820.bacpac
  • If all has gone well you should see something like below.  Note – this command simply kicks off the backup process; it may take some time before your backup file is complete.  You can monitor the backup jobs on the portal if you want.

Importing your SQL Azure DB

A DR plan is of little use if you don’t test your backup, so we want to ensure our backup file can actually be used to create a rescue DB.  So let’s import our .bacpac file to see if we can recreate our DB and connect our app to it.

  • We basically reverse the process.  This time create a new empty SQL Azure DB
  • Now we can configure Azure Import Export Service to import our .bacpac file as follows:

  • If it works as expected we should see

  • Now you want to connect your app to your DB to ensure it works as expected.

Automating your backups

Now we’ve proven we can export and import our DB, we want to make the process happen automatically so we can forget about it.  The easiest way of doing that is to create a simple powershell script that runs the above commands for us, and then schedule it with the Windows Task Scheduler.

Here’s a basic script that will run the Import/Export service for us; you can tailor it as you see fit.  Note that I’m creating a timestamped backup file, so we should get a new file every day.
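A minimal sketch of such a script – the tool path, server details and keys are all placeholders you’ll need to adapt:

```powershell
# BackupSqlAzure.ps1 – export the DB to a datestamped .bacpac in blob storage
$stamp   = Get-Date -Format "yyMMdd"
$blobUrl = "http://iainsbackups.blob.core.windows.net/dbbackups/MyDb_$stamp.bacpac"

& "C:\dev\tools\DacImportExportCli\DacImportExportCli.exe" `
    -S yourserver.database.windows.net -D MyDb -U yourlogin -P yourpassword `
    -X -BLOBURL $blobUrl -BLOBACCESSKEY "yourStorageAccountKey" -ACCESSKEYTYPE storage
```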

Now we have the script we can call it from the Task Scheduler.  I created a Basic Task to run every night at 23:30; to call our script we can just run powershell from the scheduler, as so:
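The scheduled task’s action is simply “Start a program”, pointing at powershell.exe with the script path as the argument (the path here is a placeholder):

```powershell
powershell.exe -File "C:\dev\scripts\BackupSqlAzure.ps1"
```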

Important – You will have to set your powershell ExecutionPolicy to RemoteSigned or the script won’t run when called.

Next Steps

So that’s it – we’re backing up our Azure DB and storing it in Blob storage, all for the cost of a few pennies.  Next we might want to create a more sophisticated script/program that emails us in the event of failure, or tidies up old backups – I’ll leave that up to you 🙂

Useful Links

http://msdn.microsoft.com/en-us/library/windowsazure/383f0cb9-0647-4e67-985d-e88369ef0508

Easy database migrations with C# and FluentMigrator

Database migrations are an increasingly common pattern for managing and automating the creation of your project’s database schema. Typically each migration has 3 elements:

  • A Unique Id – each new migration is given a numeric identifier higher than the previous one.
  • An UP component – describing the table / row / column / key you want to create.
  • A DOWN component – which exactly reverses the change you’re making in the UP component.

Thus by running each of the migrations in order you can go from an empty database, to the schema required by your project, or if an error occurs you can simply roll back any number of migrations to get to a known stable state.

But why?

The advantages of this approach may not be immediately obvious, but I’d say the main advantages are:

  • No separate SQL scripts, db exports etc.  All database migrations are contained within the project and can be reviewed and managed by the team – no DBA required.
  • Easy deployment onto as many servers as required – most medium to large projects will have a number of environments, minimally dev, test and live.  By creating DB migrations it’s simple to keep each of these environments in sync.
  • As the project grows, database migrations can be created as required, meaning you can easily rollout new changes to live projects.
  • Easy rollback – if you rollout a patch containing a migration, you can instantly roll it back without discovering you don’t have rollback scripts.
  • Roll-outs to the live db are usually possible with no downtime.

Introducing FluentMigrator

FluentMigrator is a package that allows you to create “fluent” migrations within Visual Studio.  The easiest thing to do is just to demo a migration:
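A representative migration, along the lines of the one demoed in the post (the table and key names here are illustrative), looks like this with FluentMigrator’s fluent syntax:

```csharp
[Migration(3)]
public class Mig003_CreateUserRolesTable : Migration
{
    public override void Up()
    {
        // Create the UserRoles join table
        Create.Table("UserRoles")
            .WithColumn("UserId").AsInt32().NotNullable()
            .WithColumn("RoleId").AsInt32().NotNullable();

        // And the foreign keys out to Users and Roles
        Create.ForeignKey("FK_UserRoles_Users")
            .FromTable("UserRoles").ForeignColumn("UserId")
            .ToTable("Users").PrimaryColumn("UserId");

        Create.ForeignKey("FK_UserRoles_Roles")
            .FromTable("UserRoles").ForeignColumn("RoleId")
            .ToTable("Roles").PrimaryColumn("RoleId");
    }

    public override void Down()
    {
        // Exactly reverse the Up: delete the keys first, then the table
        Delete.ForeignKey("FK_UserRoles_Users").OnTable("UserRoles");
        Delete.ForeignKey("FK_UserRoles_Roles").OnTable("UserRoles");
        Delete.Table("UserRoles");
    }
}
```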

You can see that the migration is decorated with a number, in this case 3.  The UP migration creates the UserRoles table and a number of foreign keys, and the DOWN migration exactly reverses this change, deleting the keys then deleting the table itself.

Thus if a problem occurs during the creation of this table, FluentMigrator can rollback the migration, as described in your DOWN migration.

Running the migrations

Migrations on their own are of little value if you can’t run them against a target database.  FluentMigrator integrates with both MSBuild and Nant, meaning you can run migrations as part of your build process. What’s nice about this is that you can run in DB changes only if all your unit tests etc pass, and with a more sophisticated build script you can control which db migrations are rolled out onto different environments.

Readers of my blog will be familiar with my love of Nant and my Nuget package Nant.Builder.  So it won’t come as a great surprise to find out I’ve added a FluentMigrator target into my Nant.Builder scripts.

You can set up the appropriate values in Nant.xml; there’s a working sample in the Nant.Builder.Sample which uses Sqlite (note it will only run on 64-bit Windows).

Best Practice

As far as best practice goes, as I demonstrate in the sample project I’d advise:

  • Keep all the migration classes in a separate project
  • Number the migration classes in line with their migration number, ie Mig012_CreateTableX – this makes them easier to manage once you have a few migrations.

Here’s a snap of our migrations on one of our large projects.  As you can see, after iteration 18 it occurred to us to start creating folders per iteration, allowing us to keep things a bit more ordered:

Conclusions

FluentMigrator takes a lot of the pain out of managing your database schema over multiple environments.  If there’s a negative it’s that the project is a little bit flakey on environments other than Sqlserver – I’ve tried it on SqlserverCe and Sqlite and it either didn’t work or only worked after poking at the code for a while.

On the positive side the project is being actively maintained and updated, I had a minor update accepted and committed within a few days of posting it.  So download it and get involved.

Azure CDN – Cache busting CSS image references and minification

In my previous post I discussed my AzureCdn.Me nuget package which can be used to add a cache busting query string to your CSS file references.  In this post we look at cache busting image references contained in your CSS files, and minifying the result.

Cache Busters

Inexplicably the Azure CDN doesn’t ship with a big red reset button to allow you to clear the cache for the CDN.  Meaning if you upload a new image or file that’s currently cached it may take hours or even days before the CDN refreshes the cache.

As I outlined in my previous post the easiest way round this is to add a query string to the resource reference, which differs on each deploy.  Meaning the CDN will see the resource as changed and pull down a fresh copy.

All well and good, but as often as not your CSS file will contain image references within, eg

.lolcat1
{
    background-image: url("./images/lolcat1.jpg");
    background-size : 100% 100%;
}

Now if you modify that image it won’t be updated on the CDN, which is bad news.

Introducing AzureCdn.Me.Nant

Any regular reader of my blog will know I’m a big fan of Nant.  So I thought I’d write a build task to append a cache-busting query string onto the end of the image refs.  I spent a bit of time investigating whether anyone else had addressed this problem and, to my surprise, couldn’t find anything.

In the course of that investigation I took a look at YUI Compressor.  This open source tool minifies your CSS and Javascript.  I downloaded the code, and realised I could enhance it to add a cache buster prior to doing the minification.  Anyone interested can check out my fork on Github here.

Usage

I packaged up my changes as a nuget package AzureCdn.Me.Nant.  You can install it into your build project.

Step 1 – You need to ensure all image references within your css are quoted, eg:

  • good – url(“image.png”)
  • good – url(‘image.png’)
  • bad – url(image.png) – no quotes

If image refs aren’t quoted the YUI Compressor code won’t pick them up.

Step 2 – Add a task similar to this into your build file, change the params to suit:

<loadtasks assembly=".\lib\Yahoo.Yui.Compressor.Build.Nant.dll" verbose="true" />
<target name="CompressCss">
    <echo message="Compressing files"/>
    <cssCompressor
        deleteSourceFiles="false"
        outputFile="${buildspace.src.dir}/AzureCdnMe.Sample.Web/cdn/content/minified.css"
        compressionType="Standard"
        loggingType="Info"
        preserveComments="false"
        lineBreakPosition="-1"
        cdnQueryString="1.01"
    >
    <sourceFiles>
        <include name="..\AzureCdnMe.Sample.Web\cdn\content\azurecdnme.css" />
        <include name="..\AzureCdnMe.Sample.Web\cdn\content\bootstrap.css" />
        <include name="..\AzureCdnMe.Sample.Web\cdn\content\bootstrap-responsive.css" />
     </sourceFiles>
     </cssCompressor>
</target>

The new property I’ve added is cdnQueryString.  When it runs it will both minify your CSS and cache bust your image references, by appending the supplied version number.

Referencing your code

Once you’ve minified the CSS you need to ensure your solution uses the new minified version.  If you install my AzureCdn.Me package you can find a helper method that will allow you to determine if you are in debug mode or release, eg:

if (Html.IsInDebugMode())
{
    <link href="@Url.AzureCdnContent("~/Content/bootstrap.css")" rel="stylesheet"  />
    <link href="@Url.AzureCdnContent("~/Content/bootstrap-responsive.css")" rel="stylesheet"  />
    <link href="@Url.AzureCdnContent("~/Content/azurecdnme.css")" rel="stylesheet"  />
}
else
{
    <link href="@Url.AzureCdnContent("~/Content/minified.css")" rel="stylesheet" type="text/css" />
}

Show me the code

You can find a working sample of the above here, where it’s hopefully obvious what is going on.

Conclusions

If anyone is interested it would be easy to enhance the YUI Compressor code to add the cdnQueryString parameter to its MSBuild task.  The cdnQueryString param will work the same way if you also want to minify any javascript.

Azure CDN – with AzureCdn.Me and MVC3

I’ve been doing a fair bit of work with the Azure CDN recently.  I’ve put together this blog post outlining how to get started and to give an overview of some tooling I’ve written to help you get up and running.

Azure CDN Quickstart

Currently CDNs need to be created in the original Silverlight portal and need to be attached to an existing Hosted Service or Storage Account.  We’ll attach ours to a storage account.  When creating it you should click Enable CDN and Query String (this option invalidates the CDN cache on a resource if you vary the query string in the resource address – more on this later).  You should now have a CDN in the cloud; now to populate it.

Populating the CDN with static content

Assuming you haven’t altered the standard MVC layout, your static content is probably in the Content folder.  However, wherever your static content resides you’ll need to create a new folder titled CDN and move your content into it – the Azure CDN expects to find your content in the CDN folder.  The easiest thing to do is to cut and paste your Content folder into the CDN folder.  You should now be ready to update the image references.

Introducing AzureCdn.Me

To make the process of referencing images on the Azure CDN a bit more straight-forward I created the AzureCdn.Me nuget package, which includes a couple of extension methods to ease the pain.  So install-package AzureCdn.Me into your web project.  AzureCdn.Me will create a CDN folder for you and add the extension methods Url.AzureCdnContent and Html.IsInDebugMode, plus a couple of parameters in your web.config.  If we open web.config we can see the new params:

<add key="AzureCDNEndpoint" value="CDN" />
<add key="AzureCDNDebug" value="true" />

The parameters have been defaulted with the values appropriate for debug.  We can now alter our _layout file to use the extension methods.  First off you’ll need to reference the package by adding a using statement at the top of the file, eg:

@using AzureCdn.Me.Code.Extensions
<!DOCTYPE html>
<html>
<head>
    <title>AzureCdn.Me Sample</title>
    <meta charset="utf-8" />
    <link href="@Url.AzureCdnContent("~/Content/bootstrap.css")" rel="stylesheet" />
    <link href="@Url.AzureCdnContent("~/Content/bootstrap-responsive.css")" rel="stylesheet" />
    <link href="@Url.AzureCdnContent("~/Content/azurecdnme.css")" rel="stylesheet" />
    ...

Note I haven’t altered the address of the static files.  Now if we run our project everything should still be fine; however, if we open Firebug we can see that the extension method has appended the CDN folder to the front of the path (as read from web.config), and, importantly, has also added a querystring containing a cache-busting random number.  As it’s stored in a static class this number stays the same until you redeploy.  This is very useful: each deploy of a new version of our website will force the CDN to fetch a fresh version of the stylesheet.
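So in debug, with the default settings above, the rendered link ends up looking something like the following (the querystring name and number here are illustrative):

```html
<link href="/CDN/Content/bootstrap.css?v=635404320" rel="stylesheet" />
```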

Overloaded

There is also an overload where you can pass in a value of your choice.  For example, you could pass in today’s date, which would mean the cache refreshes itself every 24 hours.  Or you could pass in the version number of the executing assembly, etc.  Hopefully you get the idea.

<link href="@Url.AzureCdnContent("~/Content/bootstrap.css", DateTime.Now.ToString("ddMMyy"))" rel="stylesheet"  />

Doing it Live

So once you’re happy with your site and want to push it live, you’re going to need to replace the debug AzureCDNEndpoint value in web.config with the actual value of your CDN endpoint, ie http://az123456.vo.msecnd.net.  The easiest way of doing that is with web.config transformations.  Once deployed, your Css files, images etc will be served up from the Azure CDN, and because of the querystring component any changes will be picked up as soon as you push them live.

In my next post I’ll outline how to minify your css and how you can invalidate the cache for any images referenced in your CSS and Javascript.

Visual Studio Turbo – DIY AppHarbor with Nant.Builder

In the final part of this series I look at automating uploading your app into the Windows Azure cloud – or, as I like to think of it, a Do It Yourself AppHarbor, hopefully with no leftover screws ;-).  The series so far:

  1. Visual Studio 2010 Workflow
  2. Automating Your Builds with Nant.Builder
  3. DIY AppHarbor – Deploying Your Builds onto Windows Azure

Update 08/08/12 – Updated Nant.Builder and links to reflect changes for Azure 1.7 and Azure Powershell Commandlets

Prerequisites

1.  You’ll hopefully not be surprised to learn you’re going to need a Windows Azure account (there’s a rather stingy 90 day free trial, if you haven’t signed up already).  Within your account you’re going to need to set up one Hosted Service, where we’ll deploy the app to, and one Storage Account, where the package gets uploaded to prior to deployment.  If you’re struggling, just Google for help on configuring and setting up Windows Azure – there’s plenty of good guides out there.

2. You’ll also need to install the .net Windows Azure SDK v1.7.  Again I’ll assume you know how to add and configure an Azure project to your solution.

3.  Finally, you need to download the Windows Azure Powershell Cmdlets.  This will be installed automatically using Web Platform Installer.  Follow the Getting Started instructions here to ensure it was successfully installed.  You can get a list of available commands, here.

Getting Started – Importing your Azure Credentials

  • You’re going to need to download your Azure credentials, so Nant.Builder can contact Azure on your behalf.  We can do this by clicking here:
  • You should now have a file called <your-sub>-<date>-credentials.publishsettings
    • Unhelpfully you can’t seem to rename the file on the portal to make it more meaningful
  • If you open the file you’ll see it’s an XML file containing your subscription details.
    • IMPORTANT– if you have multiple azure subscriptions you’ll need to edit the file so that it only includes the one subscription that you want to deploy your app into.
  • With the file downloaded open powershell and run the following commands, note you’ll need to change the path and filename to your .publishsettings file:
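With the Azure PowerShell cmdlets of this vintage the commands look something like the following – the file paths are placeholders and the parameter set is from memory, so check Get-Help Import-AzurePublishSettingsFile:

```powershell
Import-AzurePublishSettingsFile "C:\dev\your-sub-date-credentials.publishsettings" `
    -SubscriptionDataFile "C:\dev\tools\WindowsAzure\Subscriptions\yourSubscription.xml"
```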

  • If the above commands ran successfully you should have an XML file containing your subscriptionId and thumbprint in c:\dev\tools\windowsazure\subscriptions
  • **REALLY IMPORTANT** – The subscription xml file is basically the keys to your Azure account, so you DO NOT want to be casually emailing it around, taking it to the pub, etc.  Ensure you keep it safely behind a firewall.
  • OK, that’s our Azure credentials organised; next we can configure Nant.Builder

Configure Nant.Builder for Azure Deployment

Packaging your solution for Azure

  • Install and configure Nant.Builder as described in Part 2 of this series.
  • Open the Nant.build file and navigate to the Azure Settings section.
  • Set the create.azure.package parameter to true, this will call CSPack to package your solution in a format suitable for deployment to Windows Azure.  If you’re interested in what’s happening here I’ve talked about CSPack in depth here and here
  • Set the azure.project.name parameter to the name of the Azure project in your solution.
  • Set the azure.role.project.name parameter to the name of the project which contains the entrypoint to your app.  This will most likely be the Web project containing your MVC views etc.
  • Finally set the azure.service.config.file parameter to the name of the *.cscfg file containing the Azure config you want to deploy.  The default is *.cloud.cscfg but may be different if you have a test config, live config etc.
  • You can run Nant.Builder now and your solution should be packaged and output in C:\dev\releases\<your-solution-name>

Deploying your solution to Azure

  • If packaging has succeeded, you can now finally automate deployment to Azure.  Navigate to the Azure deployment section within Nant.Build
  • Set the deploy.azure.package parameter to true
  • Set the azure.subscription.credentials.file parameter to the name of the file you created in the Import your Azure Credentials section above, ie C:\dev\tools\WindowsAzure\Subscriptions\yourSubscription.xml
  • Set the azure.hosted.service.name parameter to the name of the hosted service you want to deploy your app into.  IMPORTANT – be aware that this is the name listed as the DNS Prefix not the actual service name

  • Set the azure.deployment.environment parameter to the environment type you wish to deploy your app into.  Valid values are either staging or production
  • Finally set the azure.storage.account.name parameter to the name of the storage account you set up earlier, this is where the app will be uploaded to temporarily when it’s being deployed.
  • That’s it – we should now be ready to test our DIY AppHarbor.  Your Azure Config section should look similar to this, obviously with your app details replaced:
<!--Azure Settings-->

<!-- Packaging -->
<property name="create.azure.package" value="true" />

<!-- The name of the project containing the Azure csdef, cscfg files-->
<property name="azure.project.name" value="YourSolution.Azure" />

<!-- This is the name of the project containing your app entry point, probably the Web project, but may be a library if using a worker role-->
<property name="azure.role.project.name" value="YourSolution.Web" />

<!-- The name of the file containing the azure config for your app, default is .Cloud but may be custom if you have multiple configs, eg test, live etc -->
<property name="azure.service.config.file" value="ServiceConfiguration.Cloud.cscfg" />

<!-- Deployment -->
<property name="deploy.azure.package" value="true" />

<!-- The name of the file containing your exported subscription details - IMPORTANT keep this file safe as it contains very sensitive credentials about your Azure sub -->
<property name="azure.subscription.credentials.file" value="C:\dev\tools\WindowsAzure\Subscriptions\yourSubscription.xml" />

<!-- The name of an azure hosted service where you want to deploy your app-->
<property name="azure.hosted.service.name" value="yourdnsprefix" />

<!-- The environment type, either Staging or Production-->
<property name="azure.deployment.environment" value="staging" />

<!-- The name of a storage account that exists on your subscription, this will be used to temporarily load your app into while it's being deployed-->
<property name="azure.storage.account.name" value="yourstorageaccount" />

One Click Deployment

So we have hopefully achieved the dream of all modern developers: being able to deploy our app into the cloud with one click.  If it’s successful you should see something similar to:

DeployAzurePackage:

     [exec] 27/05/2012 22:54 - Azure Cloud App deploy script started.
     [exec] 27/05/2012 22:54 - Preparing deployment of ContinuousDeploy to your service
     [exec] or inception with Subscription ID your subid
     [exec] 27/05/2012 22:54 - Creating New Deployment: In progress
     [exec] 27/05/2012 22:56 - Creating New Deployment: Succeeded, Deployment ID
     [exec] 27/05/2012 22:56 - Starting Instances: In progress
     [exec] 27/05/2012 22:56 - Starting Instance 'Nant.Builder.Sample.Web_IN_0': Stopped
     [exec] 27/05/2012 22:57 - Starting Instance 'Nant.Builder.Sample.Web_IN_0': Initializing
     [exec] 27/05/2012 23:00 - Starting Instance 'Nant.Builder.Sample.Web_IN_0': Busy
     [exec] 27/05/2012 23:01 - Starting Instance 'Nant.Builder.Sample.Web_IN_0': Ready
     [exec] 27/05/2012 23:01 - Starting Instances: Succeeded
     [exec] 27/05/2012 23:01 - Created Cloud App with URL http://xxx
     [exec] 27/05/2012 23:01 - Azure Cloud App deploy script finished.

BUILD SUCCEEDED

Note – You are better off running Nant from the command line to see the above output, as the powershell script that deploys your build echoes progress to the command line, but not to Visual Studio if you are running Nant as an external tool.

Nant.Builder.Sample

I’ve created a sample project on GitHub that shows Nant.Builder integrated into it, so it should be more obvious how it all wires up.  Download Nant.Builder.Sample here

Conclusions

I hope you’ve found the series useful, and that you benefit from turbo-charging your workflow.  Over the next month I’m going to refactor Nant.Builder to be a bit more modular, so it will be easy for others to extend the platform with different targets.  Stay tuned for further exciting announcements 🙂