Startups – The Importance of Momentum

I’ve been working in my present job in a mobile games studio for about 6 months.  It’s been pretty intense, but we just shipped our first game.  I produced it, I’ve got my name in the credits, and it got 500 downloads within 24 hours of being launched, so I’m very pleased, and I expect things to only get better 🙂  Not bad for a bunch of guys with limited experience in the gaming industry.

I wanted to blog some thoughts about lessons I’ve drawn as we’ve gone along, before the nervous breakdown hits.  First up – The Importance of Momentum.

(Image: Shark dive, UnderWater World – what it felt like on my first day)

The technology industry attracts a lot of smart people, which can be pretty intimidating when you’re starting out.  How can we compete with EA?  How can we compete with Microsoft?  But you can – as long as you keep moving quickly and get your products into the marketplace.

Here are my top 10 ways – and the rationale – for acting like a shark and keeping on swimming:

1.  Stick to the Plan – At least once a day one of the team will say “Stick to the Plan”.  You can only move fast if you have a good idea of where you’re heading.  Plans need enough detail so you can spot when things are starting to wobble.  When things are moving at 90 miles per hour, as they do in a startup, there are all sorts of variables that will be thrown at you.  If you have a plan you can easily say No – we’re sticking to the plan.

2.  React – The corollary to Stick to the Plan is – don’t always stick to the plan.  Over time you’ll inevitably be presented with evidence that your plan isn’t working.  Feature X is bogged down, Team Member Y is struggling with task Z, etc.  React to this evidence: make a new, better plan and hit the accelerator again.

3.  There’s a fine line between order and chaos – The corollary to the corollary is being sensible enough to hit the brakes and drive within your limits (to push the metaphor).  Don’t change everything at once, prioritise problems and learn to live with uncertainty.

4.  You ain’t going to get it right first time – Most startups will have 3 years, or less, to demonstrate they can make money, so the earlier you can get some traction in the marketplace the better.  Spending 2 years in development is a big risk; much better to ship 4-6 products or significant updates in that time (see the MVP).

5.  If in doubt, keep it simple – When faced with a choice, the safest bet is to go for the simplest solution.  If you’re wrong at least you’ll find out sooner, and in my experience simplest is best 9 times out of 10.  Large estimates and complex solutions have red flags all over them. (see YAGNI)

6.  Shipping teaches tough lessons – A feature we spent a number of weeks developing was rejected by Apple.  If we hadn’t shipped early we’d have wasted additional man-hours on a feature that we had to remove.

7.  Perfect is the enemy of good-enough – It’s comforting to gold-plate features, nail another bug, spend another few days in QA, optimise a bit more.  But if your product is good-enough, ship it, then react to real-world data rather than second-guessing.  We went live with 20 known issues, but they were all issues we could live with.  If no-one downloads it, you’re much better knowing after 3 months rather than 6, as you could have spent the other 3 months doing something different.

8.  Play to your strengths – It’s important to recognise where the strengths and weaknesses of your team lie.  Over time what you thought was a strength may prove to be a weakness, so change your plan.  Where you have gaps, either hire or, better, partner and use freelance resources until you can demonstrate the need/resource for a full-time employee.

9.  Don’t wish for what you don’t have – Don’t waste time toying with the latest fads.  If you have a team of PHP programmers, write PHP, however much you might wish for a team of hipster Rubyists (see Play to your strengths).

10.  My rushed site got 30,000 views – One of the first things we did was launch a website which we’d be the first to admit isn’t going to win any design awards, and we’re working on an updated design that addresses some of the original’s shortcomings.  But in the time version 1 has been live we’ve had 30,000 visitors and 3,000 sign-ups – a 10% conversion rate that gave us some confidence we had a market.  If we’d waited until everything was addressed we wouldn’t have the confidence or the numbers (see Perfect is the enemy of good-enough).

10.1 Investors like momentum – If you can show your investors something tangible that they can see and play with, their confidence and happiness will increase (and you WANT happy/confident investors).  They can’t take a burndown chart to the bank.

So in summary, ship early, be agile in the truest sense, and keep moving.


HOWTO Install Python3, pip3 & Tornado on Mac

I recently needed to install Python3 on my Mac.  While the bearded Linux masses just seem to know this stuff, or it’s already part of their distro, in Mac-Land by default we’re stuck on Python 2.7.2 and guidance is lacking.

So to save people doing the digging I had to do, here’s a quick HOWTO on installing Python 3 on your Mac.  I like understanding how things work, so this post also details where things get installed.

Install Latest Python 3

Download the latest Python 3 installer (v3.3.0 at time of writing), taking care to get the version appropriate to your OS X version:

Download Python3 here

Follow the on-screen instructions and Python 3 should be installed into /Library/Frameworks/Python.framework/Versions/3.3.  The installer will also create symlinks for python3 in /usr/local/bin – which makes python3 available from the command line.

Check it works

From the command line type:


python3

You should see the following:


Python 3.3.0 (v3.3.0:bd8afb90ebf2, Sep 29 2012, 01:25:11)
[GCC 4.2.1 (Apple Inc. build 5666) (dot 3)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>>

Install pip3

Like all good languages Python has a package manager.  Python’s is called pip.  Pip can be used to install packages into the Python framework so they can be used in your programs.

Something that is not at all obvious to the uninitiated is that to use pip with Python 3, you need to run the various install scripts against Python 3, otherwise everything just installs into the Python 2.7 directory (this is the voice of bitter experience speaking).

To work with pip3 we need to first install distribute_setup.py.  As far as I understand it, distribute_setup.py parses the setup.py script in the Python package and ensures everything is compatible with Python 3 (correct me if I’m wrong, Python community 🙂 ).

Download and run the script as follows.  You’ll need to use sudo on the Mac, as the script needs privileges to write into the /Library directory:


curl -O http://python-distribute.org/distribute_setup.py
sudo python3 distribute_setup.py

Now you should be able to install pip.  Again you’ll need to run the script with python3 (I’m not sure if sudo is definitely required this time):


curl -O https://raw.github.com/pypa/pip/master/contrib/get-pip.py
sudo python3 get-pip.py

pip should now be successfully installed.  You might be thinking you’re home and dry; however, if you type pip on the command line you’ll probably get Command not found – you’re going to have to alter your PATH variable to make pip3 available.  So add the following line to your .bash_profile:


export PATH=/Library/Frameworks/Python.framework/Versions/3.3/bin:$PATH

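To pick up the change you can either restart your terminal or reload the profile in the current session, like so:


source ~/.bash_profile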

Restart your terminal (or reload the profile as above) and check it’s working by running the following, checking that (python 3.3) is appended at the end of the output:


pip --version
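You should see something along these lines (your pip version and install path may differ):


pip 1.2.1 from /Library/Frameworks/Python.framework/Versions/3.3/lib/python3.3/site-packages (python 3.3)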

If you have an older version of pip installed you should still be able to use it by entering pip-2.7

Install Tornado webserver using pip3

So now we can give our new pip a test drive by installing the Tornado web server.  Again on the Mac we appear to need sudo, otherwise strange errors occur:


sudo pip install tornado

You should see Tornado being installed into Python 3’s site-packages dir: /Library/Frameworks/Python.framework/Versions/3.3/lib/python3.3/site-packages

So now we can use Python 3 to run our Tornado hello-world app, and see it running in the browser:


import tornado.ioloop
import tornado.web


class MainHandler(tornado.web.RequestHandler):
    def get(self):
        self.write("Hello, Python3")


application = tornado.web.Application([
    (r"/", MainHandler),
])

if __name__ == "__main__":
    application.listen(8888)
    tornado.ioloop.IOLoop.instance().start()

So we can run our hello-world.py using Python3 and we should see it running successfully in the browser at http://localhost:8888
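From the command line that looks like the following, assuming you saved the app as hello-world.py (run the curl from a second terminal – it should print Hello, Python3):


python3 hello-world.py
curl http://localhost:8888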

Good luck

5 lessons from 3 years at a start-up

Some thoughts in no particular order after 3 years at a start-up

Have a plan – sounds obvious, but a weakness of agile is that it can give rise to the illusion that there’s a plan, when in reality planning is emergent as the iterations and stories float by.  Emergent planning means the team can drift or become distracted, and it’s hard to turn down non-core projects because you can’t point to a strategy or planned delivery.  Plans can be flexible, tested MVP-style, and changed when they’re proven not to be working – but there’s no excuse not to have one.

Then ensure everyone is signed up to the plan.  Even in a small team it’s easy for factions and agendas to emerge.  Getting everyone pulling in the same direction is non-trivial.

Sales and marketing are waaay more important than devs admit/realise – Make time to support sales and marketing efforts.  Devs love to scoff at sales people with their suits, lines of BS and vague promises.  But the hard fact is that very few successful products have gained market share on technical superiority alone, and the chances are your team is not producing one of them.  You need to think long and hard about your sales and marketing approach.

Only today did I read in the Sunday Times that the publishers of Grand Theft Auto hired Max Clifford to create a media shit-storm regarding the moral failings of the game – resulting, of course, in millions of additional sales.

Avoid non-core projects at all costs – Pressure for sales may mean you’re tempted to take on side projects, or do free work in exchange for some kind of marketing exposure. DON’T!!  DON’T EVEN THINK ABOUT IT!!

My experience was that this was a huge distraction, a money-pit and a time waster – just a generally bad idea that should be pushed back against at all costs.  If you’re tempted and think you can manage it – trust me, it will still be a distraction.  If you’re still tempted, time-box the work hard and ensure all stakeholders understand that there’s a maximum amount of time you can afford.

Don’t white-label and abstract features until at least 2 customers ask for them – This is basically a rewording of YAGNI – it’s tempting to assume all customers will want feature X or Y.  However, until you have hard evidence that multiple customers want the same feature, avoid wasting time abstracting it.  This sounds simple but is very difficult to police and make hard/fast decisions about without getting devs’ backs up – kanban boards etc. can help here, by demonstrating to the team how these tasks add time and cost to the project.

Invest in your team – This doesn’t just mean salaries; it means listening to your employees.  If you notice the team doing a lot of overtime, do something about it.  Encourage R&D, make sure they have some “slack” time, pay for them to attend conferences, encourage them to blog, take them out for dinner.  Encourage experimentation with new technologies.  Allow flexi-time and home-working.

Things like this make a job enjoyable, and mean your team aren’t scouring the job ads.

So in conclusion, as usual, we can say the golden rule is that there are no golden rules, no doubt success can be achieved by ignoring all of the above, but these stuck out to me over the last few years.

See also:

The SDK business is dead – It’s a commodity market now.

SQL Azure – Disaster Recovery

In this post I look at how to set up some tooling to help implement a Disaster Recovery plan for your SQL Azure database.

Fundamentals

The key to any successful DR plan is that it has to be a fire-and-forget process.  If your DR process involves any manual components – ie Bob from infrastructure needs to push a button at 3pm on Wednesdays – you can guarantee that when disaster strikes you’ll discover Bob hasn’t pushed the button since February.

Thus you want to make sure everything is automated, and you want to hear about it if anything goes wrong.

It’s worth pointing out that every SQL Azure instance is mirrored twice, so it is highly unlikely you’re going to suffer an actual outage or data loss from unexpected downtime.  What we’re doing here is creating a backup in case someone inadvertently deletes the Customers table.  Of course, it never hurts to have a backup under your pillow (so to speak) if it’s going to help you sleep at night.

Tooling

Tools you will need:

  • Azure Storage Explorer – for creating and inspecting blob containers
  • The Azure SQL Import/Export client (DacIESvcCli) – the command line util used below
  • PowerShell – for automating the backups

Exporting your SQL Azure DB

The first thing we’re going to do is to export your SQL Azure DB to a blob file.  The blob file can be used to import your backup into a new DB in the event of disaster.

  • If you haven’t already got one, create a new Azure Storage account.  It’s a good idea to create this in a different location from your SQL Azure DB, so in the event of a catastrophic data-centre melt-down your backup is located far away – e.g. if your database is in North Europe, set up your Storage Account in East Asia.
  • Now fire up Azure Storage Explorer and connect to your new storage account.  Create a new private container for sticking the backups in – if you don’t create a container you can’t actually save anything into your storage account.

  • Now we can configure the Azure Import Export Client to export your DB into your newly created storage account.  This is a command line util which is ideal for automation, but for now we’ll just run it manually.  Run the following, editing for your specific account details:


.\DacIESvcCli -S YOUR-SQL-AZURE-SERVERNAME.database.windows.net -U YOUR-SQL-AZURE-USERNAME -P YOUR-SQL-AZURE-DB-PASSWORD -D YOUR-SQL-AZURE-DB-NAME -X -BLOBACCESSKEY YOUR-BLOB-STORAGE-ACCOUNT-KEY -BLOBURL YOUR-BLOB-STORAGE-ADDRESS/CONTAINER-NAME/BackupName.bacpac -ACCESSKEYTYPE storage


  • Important – Make sure the BLOBURL argument correctly specifies your container name, ie -BLOBURL http://iainsbackups.blob.core.windows.net/dbbackups/MyDb_120820.bacpac
  • If all has gone well you should see the job reported as submitted.  Note – this command simply kicks off the backup process; it may take some time before your backup file is complete.  You can monitor the backup jobs on the portal if you want.

Importing your SQL Azure DB

A DR plan is of little use if you don’t test your backup, so we want to ensure that our backup file can actually be used to create a rescue DB.  So let’s import our .bacpac file to see if we can recreate our DB and connect our app to it.

  • We basically reverse the process.  This time create a new, empty SQL Azure DB.
  • Now we can configure Azure Import Export Service to import our .bacpac file as follows:


.\DacIESvcCli -S YOUR-SQL-AZURE-SERVERNAME.database.windows.net -U YOUR-SQL-AZURE-USERNAME -P YOUR-SQL-AZURE-DB-PASSWORD -D YOUR-SQL-AZURE-RESCUE-DB-NAME -I -BLOBACCESSKEY YOUR-BLOB-STORAGE-ACCOUNT-KEY -BLOBURL YOUR-BLOB-STORAGE-ADDRESS/CONTAINER-NAME/BackupName.bacpac -ACCESSKEYTYPE storage

  • If it works as expected we should see the import job submitted successfully.

  • Now you want to connect your app to your DB to ensure it works as expected.

Automating your backups

Now we’ve proven we can export and import our DB, we want to make sure the process happens automatically so we can forget about it.  The easiest way of doing that is to create a simple PowerShell script that runs the above commands for us, and then schedule it with the Task Scheduler.

Here’s a basic script that will run the Import/Export service for us – you can tailor it as you see fit.  Note that I’m creating a timestamped backup file, so we should get a new file every day:


###############################################################################
# Description: Backup Script for Sql Azure DB
# Author: Iain Hunter
# Date: 21/08/12
###############################################################################
$today = Get-Date
$todayStr = $today.ToString("yyyyMMdd")
$backupFile = "your-db" + $todayStr + ".bacpac"
echo "Exporting backup to: $backupFile"
# Export DB to blob storage with datestamp
C:\dev\tools\DACImportExport1.6\DacIESvcCli -S YOUR-SQL-AZURE-SERVERNAME.database.windows.net -U YOUR-SQL-AZURE-USERNAME -P YOUR-SQL-AZURE-DB-PASSWORD -D YOUR-SQL-AZURE-DB-NAME -X -BLOBACCESSKEY YOUR-BLOB-STORAGE-ACCOUNT-KEY -BLOBURL YOUR-BLOB-STORAGE-ADDRESS/CONTAINER-NAME/$backupFile -ACCESSKEYTYPE storage
exit

Now we have the script we can call it from the Task Scheduler.  I created a Basic Task to run every night at 23:30; to call our script we can just run powershell from the scheduler, like so:
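The action configured in the scheduled task is just a call to powershell.exe with the script path – something along these lines (the path is an example; point it at wherever you saved your script):


powershell.exe -File C:\dev\scripts\backup-sqlazure.ps1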

Important – You will have to set your PowerShell execution policy to RemoteSigned or the script won’t run when called.

Next Steps

So that’s it – we’re backing up our Azure DB and storing it in blob storage, all for the cost of a few pennies.  Next we might want to create a more sophisticated script/program that emails us in the event of failure, or tidies up old backups – I’ll leave that up to you 🙂

Useful Links

http://msdn.microsoft.com/en-us/library/windowsazure/383f0cb9-0647-4e67-985d-e88369ef0508

Easy database migrations with C# and FluentMigrator

Database migrations are an increasingly common pattern for managing and automating the creation of your project’s database schema.  Typically each migration has 3 elements:

  • A Unique Id – each new migration is given a numeric identifier higher than the previous one.
  • An UP component – describing the table / row / column / key change you want to make
  • A DOWN component – which exactly reverses the change you’re making in the UP component.

Thus by running each of the migrations in order you can go from an empty database, to the schema required by your project, or if an error occurs you can simply roll back any number of migrations to get to a known stable state.

But why?

The advantages of this approach may not be immediately obvious, but I’d say the main advantages are:

  • No separate SQL scripts, db exports etc.  All database migrations are contained within the project and can be reviewed and managed by the team – no DBA required.
  • Easy deployment onto as many servers as required – most medium to large projects will have a number of environments, minimally dev, test and live.  By creating DB migrations it’s simple to keep each of these environments in sync.
  • As the project grows, database migrations can be created as required, meaning you can easily rollout new changes to live projects.
  • Easy rollback – if you roll out a patch containing a migration, you can instantly roll it back, without discovering you don’t have rollback scripts.
  • Roll outs to the live db are usually possible with no downtime

Introducing FluentMigrator

FluentMigrator is a package that allows you to create “fluent” migrations within Visual Studio.  The easiest thing to do is just to demo a migration:


using FluentMigrator;

namespace Nant.Builder.Sample.Migrations
{
    [Migration(3)]
    public class Mig003_CreateUserRoleTable : Migration
    {
        private const string TableName = "UserRole";

        public override void Up()
        {
            Create.Table(TableName)
                .WithColumn("UserId").AsInt32().NotNullable()
                .WithColumn("RoleId").AsInt16().NotNullable();

            var compKey = new[] { "UserId", "RoleId" };
            Create.PrimaryKey("PK_UserRole").OnTable("UserRole").Columns(compKey);

            Create.ForeignKey("FK_UserRole_User").FromTable("UserRole").ForeignColumn("UserId").ToTable("User").PrimaryColumn("UserId");
            Create.ForeignKey("FK_UserRole_Role").FromTable("UserRole").ForeignColumn("RoleId").ToTable("Role").PrimaryColumn("RoleId");
        }

        public override void Down()
        {
            Delete.ForeignKey("FK_UserRole_User").OnTable(TableName);
            Delete.ForeignKey("FK_UserRole_Role").OnTable(TableName);
            Delete.Table(TableName);
        }
    }
}

You can see that the migration is decorated with a number, in this case 3.  The UP migration creates the UserRole table and a number of foreign keys.  The DOWN migration reverses this change, deleting the keys and then deleting the table itself.

Thus if a problem occurs during the creation of this table, FluentMigrator can rollback the migration, as described in your DOWN migration.

Running the migrations

Migrations on their own are of little value if you can’t run them against a target database.  FluentMigrator integrates with both MSBuild and Nant, meaning you can run migrations as part of your build process.  What’s nice about this is that you can run in DB changes only if all your unit tests etc. pass, and if you have a more sophisticated build script you can control which DB migrations are rolled out onto different environments.

Readers of my blog will be familiar with my love of Nant and my Nuget package Nant.Builder.  So it won’t come as a great surprise to find out I’ve added a FluentMigrator target into my Nant.Builder scripts.

You can set up the appropriate values in Nant.xml – you can see a working sample in the Nant.Builder.Sample, which uses SQLite (note it will only run on 64-bit Windows).


<!-- Database Migration Settings - expects FluentMigrator -->
<property name="run.db.migrations" value="true" />
<!-- The name of the project containing your FluentMigrator migrations -->
<property name="migrations.project.name" value="Nant.Builder.Sample.Migrations" />
<!-- Database type, eg sqlserver2008, sqlite -->
<property name="database.type" value="sqlite" />
<!-- The connection string for the db you want to migrate -->
<property name="connection.string" value="Data Source=C:\dev\tools\sqlite\nant-builder-sample.sqlite;Version=3;" />
<!-- Set flag to true if you wish to rollback a db migration, you need to specify a migration number also -->
<property name="rollback.db.migrations" value="false" />
<!-- The migration number you wish to rollback to - WARNING be careful you don't delete data by setting this value incorrectly -->
<property name="rollback.to.version" value="0" />
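As a sketch of how a rollback would then be kicked off without editing the file, you can override those properties on the NAnt command line (the -D: property-override syntax is standard NAnt; the property names are the ones defined above, and the version number is an example):


nant -D:rollback.db.migrations=true -D:rollback.to.version=2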

Best Practice

As far as best practice goes, as I demonstrate in the sample project I’d advise:

  • Keep all the migration classes in a separate project
  • Number the migration classes in line with their migration number, ie Mig012_CreateTableX – it makes them easier to manage once you have a few migrations.

Here’s a snap of our migrations on one of our large projects – as you can see, after iteration 18 it occurred to us to start creating folders per iteration, to keep things a bit more ordered.

Conclusions

FluentMigrator takes a lot of the pain out of managing your database schema over multiple environments.  If there’s a negative it’s that the project is a little bit flaky on environments other than SQL Server – I’ve tried it on SQL Server CE and SQLite and it’s either not worked, or only worked after poking at the code for a while.

On the positive side the project is being actively maintained and updated, I had a minor update accepted and committed within a few days of posting it.  So download it and get involved.

Azure CDN – Cache busting CSS image references and minification

In my previous post I discussed my AzureCdn.Me nuget package which can be used to add a cache busting query string to your CSS file references.  In this post we look at cache busting image references contained in your CSS files, and minifying the result.

Cache Busters

Inexplicably, the Azure CDN doesn’t ship with a big red reset button to let you clear the CDN’s cache.  This means that if you upload a new version of an image or file that’s currently cached, it may take hours or even days before the CDN refreshes the cache.

As I outlined in my previous post the easiest way round this is to add a query string to the resource reference, which differs on each deploy.  Meaning the CDN will see the resource as changed and pull down a fresh copy.

All well and good, but as often as not your CSS file will itself contain image references, ie:

.lolcat1
{
    background-image: url("./images/lolcat1.jpg");
    background-size : 100% 100%;
}

Now if you modify that image it won’t be updated on the CDN, which is bad news.

Introducing AzureCdn.Me.Nant

Any regular reader of my blog will know I’m a big fan of Nant.  So I thought I’d write a build task to append a cache-busting query string onto the end of the image refs.  I spent a bit of time investigating whether anyone else had addressed this problem, and to my surprise I couldn’t find anything.

In the course of that investigation I took a look at YUI Compressor.  This open source tool minifies your CSS and Javascript.  I downloaded the code, and realised I could enhance it to add a cache buster prior to doing the minification.  Anyone interested can check out my fork on Github here.

Usage

I packaged up my changes as a nuget package AzureCdn.Me.Nant.  You can install it into your build project.

Step 1 – You need to ensure all image references within your css are quoted, eg:

  • good – url(“image.png”)
  • good – url(‘image.png’)
  • bad – url(image.png) – no quotes

If image refs aren’t quoted the YUI Compressor code won’t pick them up.

Step 2 – Add a task similar to this into your build file, change the params to suit:

<loadtasks assembly=".\lib\Yahoo.Yui.Compressor.Build.Nant.dll" verbose="true" />
<target name="CompressCss">
    <echo message="Compressing files"/>
    <cssCompressor
        deleteSourceFiles="false"
        outputFile="${buildspace.src.dir}/AzureCdnMe.Sample.Web/cdn/content/minified.css"
        compressionType="Standard"
        loggingType="Info"
        preserveComments="false"
        lineBreakPosition="-1"
        cdnQueryString="1.01"
    >
    <sourceFiles>
        <include name="..\AzureCdnMe.Sample.Web\cdn\content\azurecdnme.css" />
        <include name="..\AzureCdnMe.Sample.Web\cdn\content\bootstrap.css" />
        <include name="..\AzureCdnMe.Sample.Web\cdn\content\bootstrap-responsive.css" />
     </sourceFiles>
     </cssCompressor>
</target>

The new property I’ve added is cdnQueryString.  When the task runs it will both minify your CSS and cache bust the image references within it, by appending the supplied version number.
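To illustrate, the lolcat rule from earlier would come out of the task minified, with the version number appended to the image URL – roughly like this (the exact query-string format here is my paraphrase, not lifted from the tool’s output):


.lolcat1{background-image:url("./images/lolcat1.jpg?v=1.01");background-size:100% 100%}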

Referencing your code

Once you’ve minified the CSS you need to ensure your solution uses the new minified version.  If you install my AzureCdn.Me package you’ll find a helper method that allows you to determine whether you are running in debug or release mode, eg:

if (Html.IsInDebugMode())
{
    <link href="@Url.AzureCdnContent("~/Content/bootstrap.css")" rel="stylesheet"  />
    <link href="@Url.AzureCdnContent("~/Content/bootstrap-responsive.css")" rel="stylesheet"  />
    <link href="@Url.AzureCdnContent("~/Content/azurecdnme.css")" rel="stylesheet"  />
}
else
{
    <link href="@Url.AzureCdnContent("~/Content/minified.css")" rel="stylesheet" type="text/css" />
}

Show me the code

You can find a working sample of the above here, where it’s hopefully obvious what is going on.

Conclusions

If anyone is interested it would be easy to enhance the YUI Compressor code to add the cdnQueryString parameter to an MSBuild task as well.  The cdnQueryString param works the same way if you also want to minify any Javascript.

Azure CDN – with AzureCdn.Me and MVC3

I’ve been doing a fair bit of work with the Azure CDN recently.  I’ve put together this blog post outlining how to get started and to give an overview of some tooling I’ve written to help you get up and running.

Azure CDN Quickstart

Currently CDNs need to be created in the original Silverlight portal, attached to an existing Hosted Service or Storage Account.  We’ll attach ours to a storage account.  When creating it you should click Enable CDN and Query String (this option invalidates the CDN cache on a resource if you vary the query string in the resource address – more on this later).  You should now have a CDN in the cloud; now to populate it.

Populating the CDN with static content

Assuming you haven’t altered the standard MVC layout, your static content is probably in the Content folder.  Wherever your static content resides, you’ll need to create a new folder titled CDN and move your content into it – the Azure CDN expects to find your content in the CDN folder.  The easiest thing to do is cut and paste your Content folder into the CDN folder.  You should now be ready to update the image references.

Introducing AzureCdn.Me

To make the process of referencing images on the Azure CDN a bit more straight-forward I created the AzureCdn.Me nuget package, which includes a couple of extension methods to ease the pain.  So install-package AzureCdn.Me into your web project.  AzureCdn.Me will create a CDN folder for you, add the extensions Url.AzureCdnContent and Html.IsInDebugMode, and add a couple of parameters to your web.config.  If we open web.config we can see the new params:

<add key="AzureCDNEndpoint" value="CDN" />
<add key="AzureCDNDebug" value="true" />

The parameters have been defaulted with the values appropriate for debug.  We can now alter our _layout file to use the extension methods.  First off you’ll need to reference the package by adding a using statement at the top of the file, eg:

@using AzureCdn.Me.Code.Extensions
<!DOCTYPE html>
<html>
<head>
    <title>AzureCdn.Me Sample</title>
    <meta charset="utf-8" />
    <link href="@Url.AzureCdnContent("~/Content/bootstrap.css")" rel="stylesheet" />
    <link href="@Url.AzureCdnContent("~/Content/bootstrap-responsive.css")" rel="stylesheet" />
    <link href="@Url.AzureCdnContent("~/Content/azurecdnme.css")" rel="stylesheet" />
    ...

Note I haven’t altered the address of the static files.  If we run our project everything should still be fine; however, if we open Firebug we can see that the extension method has prepended the CDN folder to the path, as read from web.config.  Additionally, and importantly, it’s also added a querystring containing a cache-busting random number.  As it’s stored in a static class this number stays the same until you redeploy.  This is very useful: as we release subsequent versions of our website, each deploy forces the CDN to fetch a fresh version of the stylesheet.

Overloaded

There is also an overload where you can pass in a value of your choice.  For example, you could pass in today’s date, which would mean that the cache would refresh itself every 24 hours.  Or you could pass in the version number of the executing assembly, etc.  Hopefully you get the idea.

<link href="@Url.AzureCdnContent("~/Content/bootstrap.css", DateTime.Now.ToString("ddMMyy"))" rel="stylesheet"  />
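For instance, to tie the cache-buster to the version of the executing assembly you could do something like this (MvcApplication here is just the default Global.asax class name in an MVC project – substitute your own type):


<link href="@Url.AzureCdnContent("~/Content/bootstrap.css", typeof(MvcApplication).Assembly.GetName().Version.ToString())" rel="stylesheet" />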

Doing it Live

So once you’re happy with your site and you want to push it live, you’re going to need to replace the debug AzureCDNEndpoint value in web.config with the actual value of your CDN endpoint, ie http://az123456.vo.msecnd.net.  The easiest way of doing that is with web.config transformations.  Once deployed, your CSS files, images etc. will be served up from the Azure CDN, and because of the querystring component any changes will be picked up as soon as you push them live.
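As a sketch, a Web.Release.config transform for the two AzureCdn.Me keys might look like this (swap in your real CDN endpoint):


<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <appSettings>
    <!-- Point at the real CDN endpoint and switch off debug mode for release builds -->
    <add key="AzureCDNEndpoint" value="http://az123456.vo.msecnd.net" xdt:Transform="SetAttributes" xdt:Locator="Match(key)" />
    <add key="AzureCDNDebug" value="false" xdt:Transform="SetAttributes" xdt:Locator="Match(key)" />
  </appSettings>
</configuration>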

In my next post I’ll outline how to minify your CSS, and how you can invalidate the cache for any images that are referenced in your CSS and Javascript.

Visual Studio Turbo – DIY AppHarbor with Nant.Builder

In the final part of this series I look at automating uploading your app into the Windows Azure cloud – or, as I like to think of it, a Do It Yourself AppHarbor, hopefully with no leftover screws ;-).  The series so far:

  1. Visual Studio 2010 Workflow
  2. Automating Your Builds with Nant.Builder
  3. DIY AppHarbor – Deploying Your Builds onto Windows Azure

Update 08/08/12 – Updated Nant.Builder and links to reflect changes for Azure 1.7 and Azure Powershell Commandlets

Prerequisites

1.  You’ll hopefully not be surprised to learn you’re going to need a Windows Azure account (there’s a rather stingy 90 day free trial, if you haven’t signed up already).  Within your account you’re going to need to set up one Hosted Service, which we’ll deploy the app to, and one Storage Account, which the package gets uploaded to prior to deployment.  If you’re struggling, just Google for help on configuring and setting up Windows Azure – there’s plenty of good guides out there.

2. You’ll also need to install the .net Windows Azure SDK v1.7.  Again I’ll assume you know how to add and configure an Azure project to your solution.

3.  Finally, you need to download the Windows Azure Powershell Cmdlets.  These will be installed automatically using the Web Platform Installer.  Follow the Getting Started instructions here to ensure they were successfully installed.  You can get a list of available commands here.

Getting Started – Importing your Azure Credentials

  • You’re going to need to download your Azure credentials, so Nant.Builder can contact Azure on your behalf.  We can do this by clicking here:
  • You should now have file called <your-sub>-<date>-credentials.publishsettings
    • Unhelpfully you can’t seem to rename the file on the portal to make it more meaningful
  • If you open the file you’ll see it’s an XML file containing your subscription details.
    • IMPORTANT– if you have multiple azure subscriptions you’ll need to edit the file so that it only includes the one subscription that you want to deploy your app into.
  • With the file downloaded open powershell and run the following commands, note you’ll need to change the path and filename to your .publishsettings file:


Import-AzurePublishSettingsFile -PublishSettingsFile 'c:\users\<username>\downloads\your-credentials.publishsettings' -SubscriptionDataFile 'c:\dev\tools\windowsazure\subscriptions\your-sub.xml'

  • If the above command ran successfully you should have an xml file containing your subscriptionId and thumbprint in c:\dev\tools\windowsazure\subscriptions
  • ** REALLY IMPORTANT** – The subscription xml file is basically the keys to your Azure account, so you DO NOT want to be casually emailing it around, taking it to the pub, etc.  Ensure you save it behind a firewall.
  • OK that’s us got our Azure credentials organised, next we can configure Nant.Builder

Configure Nant.Builder for Azure Deployment

Packaging your solution for Azure

  • Install and configure Nant.Builder as described in Part 2 of this series.
  • Open the Nant.build file and navigate to the Azure Settings section.
  • Set the create.azure.package parameter to true; this will call CSPack to package your solution in a format suitable for deployment to Windows Azure.  If you’re interested in what’s happening here, I’ve talked about CSPack in depth here and here
  • Set the azure.project.name parameter to the name of the Azure project in your solution.
  • Set the azure.role.project.name parameter to the name of the project which contains the entrypoint to your app.  This will most likely be the Web project containing your MVC views etc.
  • Finally set the azure.service.config.file parameter to the name of the *.cscfg file containing the Azure config you want to deploy.  The default is *.cloud.cscfg but may be different if you have a test config, live config etc.
  • You can run Nant.Builder now and your solution should be packaged and output in C:\dev\releases\<your-solution-name>

Deploying your solution to Azure

  • If packaging has succeeded, you can now finally automate deployment to Azure.  Navigate to the Azure deployment section within Nant.Build
  • Set the deploy.azure.package parameter to true
  • Set the azure.subscription.credentials.file parameter to the name of the file you created in the Importing your Azure Credentials section above, ie C:\dev\tools\WindowsAzure\Subscriptions\yourSubscription.xml
  • Set the azure.hosted.service.name parameter to the name of the hosted service you want to deploy your app into.  IMPORTANT – be aware that this is the name listed as the DNS Prefix not the actual service name

  • Set the azure.deployment.environment parameter to the environment type you wish to deploy your app into.  Valid values are either staging or production
  • Finally set the azure.storage.account.name parameter to the name of the storage account you set up earlier, this is where the app will be uploaded to temporarily when it’s being deployed.
  • That’s it – we should now be ready to test our DIY AppHarbor.  Your Azure Settings section should look similar to this, obviously with your app details replaced:

<!-- Azure Settings -->

<!-- Packaging -->
<property name="create.azure.package" value="true" />

<!-- The name of the project containing the Azure csdef, cscfg files -->
<property name="azure.project.name" value="Nant.Builder.Sample.Azure" />

<!-- This is the name of the project containing your app entry point, probably the Web project, but may be a library if using a worker role -->
<property name="azure.role.project.name" value="Nant.Builder.Sample.Web" />

<!-- The name of the file containing the azure config for your app, default is .Cloud but may be custom if you have multiple configs, eg test, live etc -->
<property name="azure.service.config.file" value="ServiceConfiguration.Cloud.cscfg" />

<!-- Deployment -->
<property name="deploy.azure.package" value="true" />

<!-- The name of the file containing your exported subscription details - IMPORTANT keep this file safe as it contains very sensitive credentials about your Azure sub -->
<property name="azure.subscription.credentials.file" value="C:\dev\tools\WindowsAzure\Subscriptions\yourSubscription.xml" />

<!-- The name of the azure hosted service where you want to deploy your app -->
<property name="azure.hosted.service.name" value="your-hosted-service" />

<!-- The environment type, either staging or production -->
<property name="azure.deployment.environment" value="staging" />

<!-- The name of a storage account that exists on your subscription, this will be used to temporarily load your app into while it's being deployed -->
<property name="azure.storage.account.name" value="yourstorageaccount" />

One Click Deployment

So we have hopefully achieved the dream of all modern developers – being able to deploy our app into the cloud with one click.  If it’s successful you should see something similar to:

DeployAzurePackage:

     [exec] 27/05/2012 22:54 - Azure Cloud App deploy script started.
     [exec] 27/05/2012 22:54 - Preparing deployment of ContinuousDeploy to your-service with Subscription ID your-sub-id
     [exec] 27/05/2012 22:54 - Creating New Deployment: In progress
     [exec] 27/05/2012 22:56 - Creating New Deployment: Succeeded, Deployment ID
     [exec] 27/05/2012 22:56 - Starting Instances: In progress
     [exec] 27/05/2012 22:56 - Starting Instance 'Nant.Builder.Sample.Web_IN_0': Stopped
     [exec] 27/05/2012 22:57 - Starting Instance 'Nant.Builder.Sample.Web_IN_0': Initializing
     [exec] 27/05/2012 23:00 - Starting Instance 'Nant.Builder.Sample.Web_IN_0': Busy
     [exec] 27/05/2012 23:01 - Starting Instance 'Nant.Builder.Sample.Web_IN_0': Ready
     [exec] 27/05/2012 23:01 - Starting Instances: Succeeded
     [exec] 27/05/2012 23:01 - Created Cloud App with URL http://xxx
     [exec] 27/05/2012 23:01 - Azure Cloud App deploy script finished.

BUILD SUCCEEDED

Note – You are better running Nant from the command line to see the above output; the powershell script that deploys your build echoes progress to the command line, but not to Visual Studio, if you are running Nant as an external tool.

Nant.Builder.Sample

I’ve created a sample project on GitHub that shows Nant.Builder integrated into a solution, so it should be more obvious how it all wires up.  Download Nant.Builder.Sample here.

Conclusions

I hope you’ve found the series useful, and that you benefit from turbo-charging your workflow.  Over the next month I’m going to refactor Nant.Builder to be a bit more modular, so it will be easy for others to extend the platform with different targets.  Stay tuned for further exciting announcements 🙂

Run Nunit 2.6 from within Visual Studio 2010

If you don’t have Resharper it’s often handy to be able to run NUnit from within Visual Studio.  This is easily achieved from the Tools | External Tools menu.  Just point the Command at wherever you have installed the NUnit executable, and complete the rest of the parameters as follows:
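The settings amount to something like this – the install path and macro choices are examples, so adjust for your machine:


Title:              NUnit 2.6
Command:            C:\Program Files (x86)\NUnit 2.6\bin\nunit.exe
Arguments:          $(BinDir)\$(TargetName)$(TargetExt)
Initial directory:  $(BinDir)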

After completing this you should find that you can launch NUnit from the Tools menu using your newly created NUnit 2.6 option 🙂

Automating Visual Studio 2010 builds and deployments with Nant.Builder

Part 2 in my Visual Studio 2010 Turbo series

  1. Visual Studio 2010 Workflow
  2. Automating Your Builds with Nant.Builder
  3. DIY AppHarbor – Deploying Your Builds onto Windows Azure

In this post I look at using Nant and my Nant.Builder nuget package to quickly get your builds automated; from there it should be simple to integrate with a CI tool of your choice.

Update (27/07/12) – Anoop Shetty has put together an awesome post on using Nant.Builder here.  Thanks Anoop 🙂

Nant and Nant.Builder

I’ve been using Nant for years now, it’s a great tool for scripting and automating tedious build and deployment tasks.  Some might say it’s getting a bit long in the tooth, and it’s not as hip as rake etc.  However, I find it perfectly usable, with a learning curve that’s not too steep, it’s very well documented and it’s updated usually once or twice a year.

I’ve recently been doing more and more with Nuget, and I’m increasingly finding it a very powerful way of quickly setting up new projects.  One task that always takes a bit of time is setting up a build script for the new project.  Usually I’d cut and paste an existing script and hack out the bits that needed changing.  This was painful, and I wanted to get rid of this boring step – so Nant.Builder was born.

Installing and Integrating Nant

Hopefully you’ve followed part 1 of this enthralling series; if you haven’t got Nant installed, download the latest stable build and extract it to c:\dev\tools\nant-0.91.  If like me you’re trying to do more from the command line, add the bin directory to your Path environment var, ie C:\dev\tools\nant-0.91\bin

Open powershell and type nant – you should see something like this (don’t worry about the Failure message for now):

NAnt 0.91 (Build 0.91.4312.0; release; 22/10/2011)
Copyright (C) 2001-2011 Gerry Shaw
http://nant.sourceforge.net

Nant can also be launched from Visual Studio.  Go to the Tools | External Tools menu option, click Add and complete it as per the screenshot, ensuring you tick the Use Output window option.  You can now launch Nant from VS.
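For reference, the external tool settings are along these lines – the command path comes from the install step above, and the initial directory macro is just an example (point it at wherever your Nant.build lives):


Title:              Nant
Command:            C:\dev\tools\nant-0.91\bin\nant.exe
Initial directory:  $(ProjectDir)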

Install Nant.Builder

If you followed the first installment of this series you should have your new Solution in your workspace.  Now lets setup Nant.Builder:

  • Add a new empty project and name it  <yoursolutionname>.Build, ensure you save it in the src directory.  We’ll use this project to hold our build scripts.

  • We don’t want the compiler to build this project, so click Build | Configuration Manager and untick Build for all configurations.

  • Now we can install Nant.Builder from Nuget – run the following command from the package manager command line:
install-package nant.builder -projectname <yoursolutionname>.Build
  • We now have Nant.Builder installed into your .Build project 🙂

Configure Nant.Builder for your solution

I’ve tried to keep configuration to the bare minimum, as the whole point is to keep things fast.

  • Open the Nant.Build file.
  • Set the solution.name property to the name of your solution, in our example SampleSolution
  • If you’ve set up your workspace as described in the Workspace blog, you won’t need to edit solution.src.dir.  If you don’t save your projects in a source dir, and save them in the same directory as the .sln file, edit this property to blank, ie “”
  • Set the solution.projects property to a comma separated list (no spaces) of all the projects contained in your solution, in our example SampleSolution.Services,SampleSolution.Tests,SampleSolution.Web
  • Set the release.configuration property to the configuration you want the solution to be compiled under, default is Release
  • If you’re not using CI, you can manually set the version number.  Nant.Builder will then version all your dlls with the version number you specify.  If you are using CCNet, Nant.Builder will pick up the version number from CCNet
  • Set the company.name property to the name of your company; this will also be added to the AssemblyInfo, so users can see who created the dll
  • So in our sample we have this:
<!--The name of your solution, please overwrite the default -->
<property name="solution.name" value="SampleSolution"/>

<!-- If your projects reside in a different directory from the .sln file specify here, or leave empty if not -->
<property name="solution.src.dir" value="src" />

<!-- Comma separated list of projects contained in your solution -->
<property name="solution.projects" value="SampleSolution.Services,SampleSolution.Tests,SampleSolution.Web" />

<!-- Set the configuration for compilation, typically release, but may be custom -->
<property name="release.configuration" value="Release" />

<!-- Manually set version, if using CCNet this will be overwritten later -->
<property name="version.tag" value="1.0.0.1"/>
<property name="company.name" value="iainhunter.wordpress.com" />

If you’ve followed the first tutorial you shouldn’t need to change anything in GlobalBuildSettings.xml.  However, if you have a different workspace, buildspace, or have msbuild4 located in a non-standard location, set the values appropriately or you’ll get errors.

Running Nant

We can now run Nant from the command line by opening powershell and navigating to your Build directory, eg C:\dev\work\SampleSolution\src\SampleSolution.Build, then typing nant.  Your solution should build, or throw errors if you have warnings etc.

Alternatively, in Visual Studio open the Nant.build file, then from the Tools menu run the new Nant tool you created above.

Now if you navigate to your builds directory C:\dev\builds\SampleSolution you should see your build, and if you look at one of the dlls you should see it has been versioned according to your instructions.

Next steps

Nant.Builder is available on github here, so feel free to fork it or send me a patch if you think it can be improved.  I’m planning to add a few enhancements like an msdeploy task etc. – we’ll see how time allows.

Next time

We alter Nant.Builder to automatically deploy your solution onto Windows Azure