SQL Azure – Disaster Recovery

In this post I look at how to set up some tooling to help implement a Disaster Recovery plan for your SQL Azure database.

Fundamentals

The key to any successful DR plan is that it has to be a fire-and-forget process.  If your DR process involves any manual steps – ie Bob from infrastructure needs to push a button at 3pm on Wednesdays – you can guarantee that when disaster strikes you’ll discover Bob hasn’t pushed the button since February.

Thus you want to make sure everything is automated, and you want to hear about it if anything goes wrong.

It’s worth pointing out that every SQL Azure database is replicated twice within the data centre (giving three copies in total), so it is highly unlikely you’re going to suffer an actual outage or data loss from unexpected downtime.  What we’re doing here is creating a backup in case someone inadvertently deletes the Customers table.  Of course it never hurts to have a backup under your pillow (so to speak) if it’s going to help you get to sleep at night.

Tooling

Tools you will need:

  • Azure Storage Explorer – a free tool for managing your Azure blob storage
  • The SQL DAC Import/Export Service client (DacIESvcCli) – the command-line tool that performs the actual export and import

Exporting your SQL Azure DB

The first thing we’re going to do is to export your SQL Azure DB to a blob file.  The blob file can be used to import your backup into a new DB in the event of disaster.

  • If you haven’t already got one, create a new Azure Storage account.  It’s a good idea to create this in a different location from your SQL Azure DB, so in the event of a catastrophic data-centre melt-down your backup will be located far away.  Eg if your database is in North Europe, set up your Storage Account in East Asia.
  • Now fire up Azure Storage Explorer and connect to your new storage account.  Create a new private container for sticking the backups in.  If you don’t create a container you can’t actually save anything into your storage account.

  • Now we can configure the Azure Import Export client to download your DB into your newly created storage account.  This is a command-line utility, which makes it ideal for automation, but for now we’ll just run it manually.  Run the following, editing for your specific account details:


.\DacIESvcCli -S YOUR-SQL-AZURE-SERVERNAME.database.windows.net -U YOUR-SQL-AZURE-USERNAME -P YOUR-SQL-AZURE-DB-PASSWORD -D YOUR-SQL-AZURE-DB-NAME -X -BLOBACCESSKEY YOUR-BLOB-STORAGE-ACCOUNT-KEY -BLOBURL YOUR-BLOB-STORAGE-ADDRESS/CONTAINER-NAME/BackupName.bacpac -ACCESSKEYTYPE storage


  • Important – Make sure the -BLOBURL argument correctly specifies your container name, ie -BLOBURL http://iainsbackups.blob.core.windows.net/dbbackups/MyDb_120820.bacpac
  • If all has gone well the command should report that the export request was submitted successfully.  Note – this command simply kicks off the backup process; it may take some time before your backup file is complete.  You can monitor the backup jobs on the portal if you want.

Importing your SQL Azure DB

A DR plan is of little use if you don’t test your backups, so we want to ensure that our backup file can actually be used to create a rescue DB.  So let’s import our .bacpac file to see if we can recreate our DB and connect our app to it.

  • We basically reverse the process.  This time create a new, empty SQL Azure DB.
  • Now we can configure the Azure Import Export client to import our .bacpac file as follows:


.\DacIESvcCli -S YOUR-SQL-AZURE-SERVERNAME.database.windows.net -U YOUR-SQL-AZURE-USERNAME -P YOUR-SQL-AZURE-DB-PASSWORD -D YOUR-SQL-AZURE-RESCUE-DB-NAME -I -BLOBACCESSKEY YOUR-BLOB-STORAGE-ACCOUNT-KEY -BLOBURL YOUR-BLOB-STORAGE-ADDRESS/CONTAINER-NAME/BackupName.bacpac -ACCESSKEYTYPE storage

  • If it works as expected we should see confirmation that the import request was submitted, just as with the export.

  • Now you want to connect your app to your DB to ensure it works as expected.

Automating your backups

Now we’ve proven we can export and import our DB, we want to make sure the process happens automatically so we can forget about it.  The easiest way of doing that is to create a simple PowerShell script that runs the above commands for us, and then schedule it with the Windows Task Scheduler.

Here’s a basic script that will run the Import/Export service for us; you can tailor it as you see fit.  Note that I’m creating a timestamped backup file, so we should get a new file every day.


###############################################################################
# Description: Backup Script for Sql Azure DB
# Author: Iain Hunter
# Date: 21/08/12
###############################################################################
$today = Get-Date
$todayStr = $today.ToString("yyyyMMdd")
$backupFile = "your-db" + $todayStr + ".bacpac"
echo "Exporting backup to: $backupFile"
# Export DB to blob storage with datestamp
C:\dev\tools\DACImportExport1.6\DacIESvcCli -S YOUR-SQL-AZURE-SERVERNAME.database.windows.net -U YOUR-SQL-AZURE-USERNAME -P YOUR-SQL-AZURE-DB-PASSWORD -D YOUR-SQL-AZURE-DB-NAME -X -BLOBACCESSKEY YOUR-BLOB-STORAGE-ACCOUNT-KEY -BLOBURL YOUR-BLOB-STORAGE-ADDRESS/CONTAINER-NAME/$backupFile -ACCESSKEYTYPE storage
exit

Now we have the script we can call it from the Task Scheduler.  I created a Basic Task to run every night at 23:30; to call our script we can just run powershell from the scheduler, as so:
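Something along these lines works as the task’s action (the script path here is an assumption – point it at wherever you saved the script above):

powershell.exe -File "C:\dev\scripts\BackupSqlAzureDb.ps1"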

Important – You will have to set your PowerShell execution policy to RemoteSigned or the script won’t run when called.
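You can do that once from an elevated PowerShell prompt:

Set-ExecutionPolicy RemoteSigned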

Next Steps

So that’s it – we’re backing up our Azure DB and storing it in blob storage, all for the cost of a few pennies.  Next we might want to create a more sophisticated script/program that emails us in the event of failure, or tidies up old backups – I’ll leave that up to you 🙂

Useful Links

http://msdn.microsoft.com/en-us/library/windowsazure/383f0cb9-0647-4e67-985d-e88369ef0508


Azure CDN – Cache busting CSS image references and minification

In my previous post I discussed my AzureCdn.Me nuget package which can be used to add a cache busting query string to your CSS file references.  In this post we look at cache busting image references contained in your CSS files, and minifying the result.

Cache Busters

Inexplicably the Azure CDN doesn’t ship with a big red reset button to let you clear the CDN’s cache.  Meaning if you upload a new version of an image or file that’s currently cached, it may take hours or even days before the CDN refreshes the cache.

As I outlined in my previous post, the easiest way round this is to add a query string to the resource reference which differs on each deploy, meaning the CDN will see the resource as changed and pull down a fresh copy.
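For example, something like this, where the value of v changes on each deploy (the file name is just a stand-in):

http://az123456.vo.msecnd.net/cdn/content/site.css?v=20120821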

All well and good, but as often as not your CSS file may actually contain image references within it, ie:

.lolcat1
{
    background-image: url("./images/lolcat1.jpg");
    background-size : 100% 100%;
}

Now if you modify that image it won’t be updated on the CDN, which is bad news.

Introducing AzureCdn.Me.Nant

Any regular reader of my blog will know I’m a big fan of NAnt.  So I thought I’d write a build task to append a cache-busting query string onto the end of the image refs.  I spent a bit of time investigating whether anyone else had addressed this problem; to my surprise I couldn’t find anything.

In the course of that investigation I took a look at YUI Compressor, an open-source tool that minifies your CSS and Javascript.  I downloaded the code and realised I could enhance it to add a cache buster prior to doing the minification.  Anyone interested can check out my fork on GitHub here.

Usage

I packaged up my changes as a nuget package AzureCdn.Me.Nant.  You can install it into your build project.

Step 1 – You need to ensure all image references within your css are quoted, eg:

  • good – url(“image.png”)
  • good – url(‘image.png’)
  • bad – url(image.png) – no quotes

If image refs aren’t quoted the YUI Compressor code won’t pick them up.

Step 2 – Add a task similar to this into your build file, change the params to suit:

<loadtasks assembly=".\lib\Yahoo.Yui.Compressor.Build.Nant.dll" verbose="true" />
<target name="CompressCss">
    <echo message="Compressing files"/>
    <cssCompressor
        deleteSourceFiles="false"
        outputFile="${buildspace.src.dir}/AzureCdnMe.Sample.Web/cdn/content/minified.css"
        compressionType="Standard"
        loggingType="Info"
        preserveComments="false"
        lineBreakPosition="-1"
        cdnQueryString="1.01"
    >
    <sourceFiles>
        <include name="..\AzureCdnMe.Sample.Web\cdn\content\azurecdnme.css" />
        <include name="..\AzureCdnMe.Sample.Web\cdn\content\bootstrap.css" />
        <include name="..\AzureCdnMe.Sample.Web\cdn\content\bootstrap-responsive.css" />
    </sourceFiles>
    </cssCompressor>
</target>

The new property I’ve added is cdnQueryString.  When the task runs it will both minify your CSS and cache-bust the image references within it, by appending the supplied version number.
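To illustrate with the lolcat rule from earlier, the minified output should end up looking something like this (the exact query-string format below is my assumption – inspect the generated CSS for the precise form):

.lolcat1{background-image:url("./images/lolcat1.jpg?1.01");background-size:100% 100%}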

Referencing your code

Once you’ve minified the CSS you need to ensure your solution uses the new minified version.  If you install my AzureCdn.Me package you’ll find a helper method that lets you determine whether you’re running in debug or release mode, eg:

if (Html.IsInDebugMode())
{
    <link href="@Url.AzureCdnContent("~/Content/bootstrap.css")" rel="stylesheet"  />
    <link href="@Url.AzureCdnContent("~/Content/bootstrap-responsive.css")" rel="stylesheet"  />
    <link href="@Url.AzureCdnContent("~/Content/azurecdnme.css")" rel="stylesheet"  />
}
else
{
    <link href="@Url.AzureCdnContent("~/Content/minified.css")" rel="stylesheet" type="text/css" />
}

Show me the code

You can find a working sample of the above here, where it’s hopefully obvious what is going on.

Conclusions

If anyone is interested, it would be easy to enhance the YUI Compressor code with an MSBuild task that supports the cdnQueryString parameter.  Also, the cdnQueryString param works in the same way if you also want to minify any Javascript.

Azure CDN – with AzureCdn.Me and MVC3

I’ve been doing a fair bit of work with the Azure CDN recently.  I’ve put together this blog post outlining how to get started and to give an overview of some tooling I’ve written to help you get up and running.

Azure CDN Quickstart

Currently CDNs need to be created in the original Silverlight portal and attached to an existing Hosted Service or Storage Account.  We’ll attach ours to a storage account.  When creating it you should tick Enable CDN and Query String (this option invalidates the CDN cache on a resource if you vary the query string in the resource’s address – more on this later).  You should now have a CDN in the cloud; now to populate it.

Populating the CDN with static content

Assuming you haven’t altered the standard MVC layout, your static content is probably in the Content folder.  Wherever your static content resides, you’ll need to create a new folder titled CDN and move your content into it – the Azure CDN expects to find your content in the CDN folder.  The easiest thing to do is cut and paste your Content folder into the CDN folder.  You should now be ready to update the image references.

Introducing AzureCdn.Me

To make the process of referencing images on the Azure CDN a bit more straightforward I created the AzureCdn.Me nuget package, which includes a couple of extension methods to ease the pain.  So run install-package AzureCdn.Me in your web project.  AzureCdn.Me will create a CDN folder for you, add the extension methods Url.AzureCdnContent and Html.IsDebugMode, and add a couple of parameters to your web.config.  If we open web.config we can see the new params:

<add key="AzureCDNEndpoint" value="CDN" />
<add key="AzureCDNDebug" value="true" />

The parameters default to values appropriate for debug.  We can now alter our _layout file to use the extension methods.  First off you’ll need to reference the package by adding a using statement at the top of the file, eg:

@using AzureCdn.Me.Code.Extensions
<!DOCTYPE html>
<html>
<head>
    <title>AzureCdn.Me Sample</title>
    <meta charset="utf-8" />
    <link href="@Url.AzureCdnContent("~/Content/bootstrap.css")" rel="stylesheet" />
    <link href="@Url.AzureCdnContent("~/Content/bootstrap-responsive.css")" rel="stylesheet" />
    <link href="@Url.AzureCdnContent("~/Content/azurecdnme.css")" rel="stylesheet" />
    ...

Note I haven’t altered the addresses of the static files.  Now if we run our project everything should still be fine; however, if we open Firebug we can see that the extension method has prepended the CDN folder (as read from web.config) to the front of the path, and, importantly, it has also appended a query string containing a cache-busting random number.  As it’s stored in a static class, this number stays the same until you redeploy.  This is very useful: as we create subsequent versions of our website, each deploy forces the CDN to fetch a fresh version of the stylesheet.
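For the curious, here’s a minimal sketch of how such an extension method could work.  This is not the package’s actual source, just an illustration assuming the config keys shown above and a once-per-app-domain random number:

using System;
using System.Configuration;
using System.Web.Mvc;

public static class CdnUrlExtensions
{
    // Generated once per app domain, so it only changes on redeploy/restart
    private static readonly string CacheBuster = new Random().Next(100000, 999999).ToString();

    public static string AzureCdnContent(this UrlHelper url, string contentPath)
    {
        // "CDN" in debug, your real CDN endpoint in release
        var endpoint = ConfigurationManager.AppSettings["AzureCDNEndpoint"].TrimEnd('/');

        // "~/Content/site.css" -> "CDN/Content/site.css?v=123456"
        var path = contentPath.TrimStart('~', '/');
        return string.Format("{0}/{1}?v={2}", endpoint, path, CacheBuster);
    }
}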

Overloaded

There is also an overload where you can pass in a value of your choice.  For example, you could pass in today’s date, which would mean the cache refreshes itself every 24 hours.  Or you could pass in the version number of the executing assembly, etc.  Hopefully you get the idea.

<link href="@Url.AzureCdnContent("~/Content/bootstrap.css", DateTime.Now.ToString("ddMMyy"))" rel="stylesheet"  />

Doing it Live

So once you’re happy with your site and want to push it live, you’re going to need to replace the debug AzureCDNEndpoint value in web.config with the actual value of your CDN endpoint, ie http://az123456.vo.msecnd.net.  The easiest way of doing that is with web.config transformations.  Once deployed, your CSS files, images etc will be served up from the Azure CDN, and because of the query-string component any changes will be picked up as soon as you push them live.
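A Web.Release.config transform along these lines does the job (the endpoint is the placeholder from above; flipping AzureCDNDebug to false in release is my assumption):

<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <appSettings>
    <add key="AzureCDNEndpoint" value="http://az123456.vo.msecnd.net"
         xdt:Transform="SetAttributes" xdt:Locator="Match(key)" />
    <add key="AzureCDNDebug" value="false"
         xdt:Transform="SetAttributes" xdt:Locator="Match(key)" />
  </appSettings>
</configuration>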

In my next post I’ll outline how to minify your CSS, and how to invalidate the cache for any images referenced in your CSS and Javascript.

Visual Studio Turbo – DIY AppHarbor with Nant.Builder

In the final part of this series I look at automating uploading your app into the Windows Azure cloud – or, as I like to think of it, a Do It Yourself AppHarbor, hopefully with no leftover screws ;-).  The series so far:

  1. Visual Studio 2010 Workflow
  2. Automating Your Builds with Nant.Builder
  3. DIY AppHarbor – Deploying Your Builds onto Windows Azure

Update 08/08/12 – Updated Nant.Builder and links to reflect changes for Azure 1.7 and Azure Powershell Commandlets

Prerequisites

1. You’ll hopefully not be surprised to learn you’re going to need a Windows Azure account (there’s a rather stingy 90-day free trial, if you haven’t signed up already).  Within your account you’re going to need to set up one Hosted Service, which we’ll deploy the app to, and one Storage Account, which the package gets uploaded to prior to deployment.  If you’re struggling, just Google for help on setting up and configuring Windows Azure – there are plenty of good guides out there.

2. You’ll also need to install the .net Windows Azure SDK v1.7.  Again I’ll assume you know how to add and configure an Azure project to your solution.

3. Finally, you need to download the Windows Azure Powershell Cmdlets.  These will be installed automatically using the Web Platform Installer.  Follow the Getting Started instructions here to ensure they installed successfully.  You can get a list of available commands here.

Getting Started – Importing your Azure Credentials

  • You’re going to need to download your Azure credentials, so Nant.Builder can contact Azure on your behalf.  We can do this by clicking here:
  • You should now have a file called <your-sub>-<date>-credentials.publishsettings
    • Unhelpfully you can’t seem to rename the file on the portal to make it more meaningful
  • If you open the file you’ll see it’s an XML file containing your subscription details.
    • IMPORTANT– if you have multiple azure subscriptions you’ll need to edit the file so that it only includes the one subscription that you want to deploy your app into.
  • With the file downloaded, open powershell and run the following command; note you’ll need to change the path and filename to match your .publishsettings file:


Import-AzurePublishSettingsFile -PublishSettingsFile 'c:\users\<username>\downloads\your-credentials.publishsettings' -SubscriptionDataFile 'c:\dev\tools\windowsazure\subscriptions\your-sub.xml'

  • If the above command runs successfully you should have an XML file containing your subscription ID and thumbprint in c:\dev\tools\windowsazure\subscriptions
  • ** REALLY IMPORTANT** – The subscription xml file is basically the keys to your Azure account, so you DO NOT want to be casually emailing it around, taking it to the pub, etc.  Ensure you keep it somewhere safe behind a firewall.
  • OK that’s us got our Azure credentials organised, next we can configure Nant.Builder

Configure Nant.Builder for Azure Deployment

Packaging your solution for Azure

  • Install and configure Nant.Builder as described in Part 2 of this series.
  • Open the Nant.build file and navigate to the Azure Settings section.
  • Set the create.azure.package parameter to true; this will call CSPack to package your solution in a format suitable for deployment to Windows Azure.  If you’re interested in what’s happening here, I’ve talked about CSPack in depth here and here
  • Set the azure.project.name parameter to the name of the Azure project in your solution.
  • Set the azure.role.project.name parameter to the name of the project which contains the entrypoint to your app.  This will most likely be the Web project containing your MVC views etc.
  • Finally set the azure.service.config.file parameter to the name of the *.cscfg file containing the Azure config you want to deploy.  The default is *.cloud.cscfg but may be different if you have a test config, live config etc.
  • You can run Nant.Builder now, as shown below, and your solution should be packaged and output to C:\dev\releases\<your-solution-name>
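Running it from the command line is just a case of pointing NAnt at the build file (assuming nant.exe is on your path and you’re in the directory containing the build file):

nant -buildfile:Nant.build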

Deploying your solution to Azure

  • If packaging succeeded, you can now finally automate deployment to Azure.  Navigate to the Azure deployment section within Nant.build
  • Set the deploy.azure.package parameter to true
  • Set the azure.subscription.credentials.file parameter to the name of the file you created in the Importing your Azure Credentials section above, ie C:\dev\tools\WindowsAzure\Subscriptions\yourSubscription.xml
  • Set the azure.hosted.service.name parameter to the name of the hosted service you want to deploy your app into.  IMPORTANT – be aware that this is the name listed as the DNS Prefix not the actual service name

  • Set the azure.deployment.environment parameter to the environment type you wish to deploy your app into.  Valid values are either staging or production
  • Finally set the azure.storage.account.name parameter to the name of the storage account you set up earlier, this is where the app will be uploaded to temporarily when it’s being deployed.
  • That’s it – we should now be ready to test our DIY AppHarbor.  Your Azure config section should look similar to this, obviously with your app details replaced:
<!--Azure Settings-->

<!-- Packaging -->
<!-- Set to true to call CSPack and package your solution for Azure -->
<property name="create.azure.package" value="true"/>

<!-- The name of the project containing the Azure csdef, cscfg files -->
<property name="azure.project.name" value="YourSolution.Azure"/>

<!-- This is the name of the project containing your app entry point, probably the Web project, but may be a library if using a worker role -->
<property name="azure.role.project.name" value="YourSolution.Web"/>

<!-- The name of the file containing the azure config for your app, default is *.Cloud.cscfg but may be custom if you have multiple configs, eg test, live etc -->
<property name="azure.service.config.file" value="ServiceConfiguration.Cloud.cscfg"/>

<!-- Deployment -->
<!-- Set to true to deploy the package to Azure -->
<property name="deploy.azure.package" value="true"/>

<!-- The name of the file containing your exported subscription details - IMPORTANT keep this file safe as it contains very sensitive credentials about your Azure sub -->
<property name="azure.subscription.credentials.file" value="C:\dev\tools\WindowsAzure\Subscriptions\yourSubscription.xml"/>

<!-- The name of the azure hosted service where you want to deploy your app (the DNS prefix) -->
<property name="azure.hosted.service.name" value="your-dns-prefix"/>

<!-- The environment type, either Staging or Production -->
<property name="azure.deployment.environment" value="staging"/>

<!-- The name of a storage account that exists on your subscription, this will be used to temporarily load your app into while it's being deployed -->
<property name="azure.storage.account.name" value="yourstorageaccount"/>

One Click Deployment

So we have hopefully achieved the dream of all modern developers: being able to deploy our app into the cloud with one click.  If it’s successful you should see something similar to:

DeployAzurePackage:

     [exec] 27/05/2012 22:54 - Azure Cloud App deploy script started.
     [exec] 27/05/2012 22:54 - Preparing deployment of ContinuousDeploy to your-service with Subscription ID your-subid
     [exec] 27/05/2012 22:54 - Creating New Deployment: In progress
     [exec] 27/05/2012 22:56 - Creating New Deployment: Succeeded, Deployment ID
     [exec] 27/05/2012 22:56 - Starting Instances: In progress
     [exec] 27/05/2012 22:56 - Starting Instance 'Nant.Builder.Sample.Web_IN_0': Stopped
     [exec] 27/05/2012 22:57 - Starting Instance 'Nant.Builder.Sample.Web_IN_0': Initializing
     [exec] 27/05/2012 23:00 - Starting Instance 'Nant.Builder.Sample.Web_IN_0': Busy
     [exec] 27/05/2012 23:01 - Starting Instance 'Nant.Builder.Sample.Web_IN_0': Ready
     [exec] 27/05/2012 23:01 - Starting Instances: Succeeded
     [exec] 27/05/2012 23:01 - Created Cloud App with URL http://xxx
     [exec] 27/05/2012 23:01 - Azure Cloud App deploy script finished.

BUILD SUCCEEDED

Note – You are better running Nant from the command line to see the above output, as the powershell script that deploys your build echoes progress to the command line but not to Visual Studio, if you are running Nant as an external tool.

Nant.Builder.Sample

I’ve created a sample project on GitHub that shows Nant.Builder integrated into it, so it should be more obvious how it all wires up.  Download Nant.Builder.Sample here

Conclusions

I hope you’ve found the series useful, and that you benefit from turbo-charging your workflow.  Over the next month I’m going to refactor Nant.Builder to be a bit more modular, so it will be easy for others to extend the platform with different targets.  Stay tuned for further exciting announcements 🙂

Working with SSL certificates on Azure and IIS7

Recently I’ve had cause to set up an SSL-secured site on Windows Azure.  Never having worked with SSL before, I thought I’d quickly blog about a few of the gotchas I experienced along the way.

You are here

Before I started, naturally, and as usual, I hit Google to find all Azure SSL wisdom.  I found a great post by SondreB – Windows Azure: Secure Site with SSL certificate.  This is a great guide and I urge you to follow it; however, as an SSL amateur I found it had a couple of gaps, which I’ll expand on below.

.crt != .pfx

First off, Azure expects to work with SSL certificates saved in the .pfx format.  However, when you request an SSL certificate from an authority like Verisign, it comes in .crt format.  These are both valid X.509 formats, but you need to convert the cert you receive from Verisign into a format readily understood by Azure.

Gotcha #1 – you have to create the Certificate Request on the same PC you plan to generate the .pfx from.  So if you were working with on-premises IIS you’d generate the Certificate Request on the IIS server you’re planning to host the app on.  However, when working with Azure, do all SSL work on your dev PC, which has both Visual Studio and IIS7 installed.

Creating a Certificate Request

Open IIS7 locally and double-click Server Certificates.  In the Actions pane you should have the option to Create Certificate Request.  Complete the wizard as SondreB advises and save the .txt file to your local PC.  If you open the .txt file you’ll see it contains a Certificate Request cryptographic key.

Creating a SSL Certificate

Verisign have a great free service that lets you create multiple temporary SSL certificates, each of which lasts for 30 days.  This is great because, like me, you’ll probably create one or two before you get the process down.  You can purchase a real one once you’re happy you’ve got everything worked out.

Complete the contact form, then you’ll be invited to input your Certificate Request.  Ensure you select Microsoft and IIS7 in the Platform details section, then paste in your Certificate Request key.  Complete the process and you’ll be emailed your certificate shortly afterwards.

Importing your SSL Certificate

The email from Verisign will contain another cryptographic key.  Copy it out of the email and paste it into a new file, ensuring you give it a .crt extension – ie myVerisignCert.crt.

Gotcha #2 – SondreB states in the blog that you can right-click the .crt and import it into your PC.  This didn’t work for me.  Luckily, however, there’s a free tool that did.  Download DigiCertUtil.exe and run it; you should then be able to import the .crt into the certificate store on your local PC (nb – ensure you run it as admin or you won’t be able to import the cert).

Export the .pfx

With your .crt successfully imported, you can export it in .pfx format again as SondreB outlines.  You can then simply upload the .pfx to the appropriate host in Azure.  The .crt should now also be available to your Azure cloud project within Visual Studio and again can be imported, as outlined by SondreB.

You looking at my endpoints?

The final gotcha to be aware of is that you can’t VIP Swap a deployment that has more endpoints than the one you’re attempting to replace.  If the solution you’re running only has the standard http endpoint, you’ll have to delete the running live instance to swap in the package with the additional 443 endpoint.

This could be a major pain, as you’ll be assigned a new IP address, and in my experiments it took around 30 minutes for the site to appear online again, as the CName record seems to take a while to catch up with the new IP address.

So if you know you’re working on a site that will be SSL-enabled, it’s a good idea to create the 443 endpoint from the start, not at the end of the project, as I discovered.
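For reference, the 443 endpoint ends up looking something like this in ServiceDefinition.csdef (the certificate name here is a placeholder, and must match a certificate you’ve uploaded to the hosted service):

<Endpoints>
  <InputEndpoint name="Endpoint1" protocol="http" port="80" />
  <InputEndpoint name="HttpsEndpoint" protocol="https" port="443" certificate="MySslCert" />
</Endpoints>
<Certificates>
  <Certificate name="MySslCert" storeLocation="LocalMachine" storeName="My" />
</Certificates>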

Otherwise, that should be it – SSL happiness awaits.

Update – Auto Packaging using CSPack and Azure SDK 1.6

This post is related to two of my previous posts:

Azure 1.5 ate my diagnostics

I had diagnostics working quite happily until SDK 1.5 came out.  Then all of a sudden data was no longer being transferred to Azure storage.  Even more mysteriously, diagnostics would happily transfer data to Azure storage when emulated locally, but not when on the Azure cloud (in other words, a nightmare problem).

I didn’t get around to investigating why until this week.  I saw that several people had the same problem, and assumed that I wasn’t configuring the diagnostics correctly in the OnStart method.

Finally I saw this forum thread.  The thread described that if you upload your solution from Visual Studio, diagnostics work correctly, but not when it’s deployed from the build process.  I tried it for myself, and yep, diagnostics would magically work when the solution was deployed from Visual Studio.  This finally clued me into the fact that the problem had nothing to do with the code, and everything to do with packaging.  Which leads us to this update on auto-packaging your Azure solution.

Configuring Your Azure Continuous Integration process with CSPack and SDK 1.6

My previous post on using CSPack to automatically build your deployment packages is largely still correct, but as of (I assume) SDK 1.5 there’s a new EntryPoint property.

So you need to specify the name of the DLL that is the entry point to your solution – in my case HuzuSocial.App.dll.  My AzureProperties.txt file now looks like this:

TargetFrameWorkVersion=v4.0
EntryPoint=HuzuSocial.App.dll

Now that it’s configured correctly, diagnostics work as expected from our Continuous Integration process.

Windows Azure Diagnostics with SDK 1.6 for WebRoles

There appears to be a lot of conflicting and confused advice about configuring Diagnostics on Windows Azure.  The situation is not at all helped by Microsoft’s own site which, to paraphrase Morecambe and Wise, has all the right pieces of information, just not necessarily in the right order.

It doesn’t help that what used to work with earlier versions of the Azure SDK, no longer works with later versions.  So here I outline:

  • The steps to get Diagnostics outputting correctly to Windows Azure Storage with SDK 1.6 for WebRoles (although I’d imagine it’s largely the same for WorkerRoles)
  • Azure 1.5 ate my diagnostics – Another post where I update my Auto Packaging post to be compatible with SDK 1.6

Setting up Windows Azure Diagnostics for your WebRole with SDK 1.6

1. Configure Web.Config – required if you are using Trace statements

I use Log4Net for my general logging/tracing needs so I don’t use Trace statements; thus the example shown in step 3 below does not require you to complete this step.

However, if you are using Trace statements,  ie:

System.Diagnostics.Trace.TraceError("Error has occurred");

You’ll need to configure Web.config as described here

<system.diagnostics>
    <trace>
        <listeners>
            <add type="Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener,
                Microsoft.WindowsAzure.Diagnostics,
                Version=1.0.0.0,
                Culture=neutral,
                PublicKeyToken=31bf3856ad364e35"
                name="AzureDiagnostics">
                <filter type="" />
            </add>
        </listeners>
    </trace>
</system.diagnostics>

2. Initialise Diagnostics

As outlined here, you’ll need to ensure you add the Import element for the Diagnostics module in your ServiceDefinition.csdef file.  Here’s what mine looks like:

<?xml version="1.0" encoding="utf-8"?>
<ServiceDefinition name="HuzuSocial.Azure" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
    <WebRole name="HuzuSocial.App" vmsize="Small" >
        <Sites>
            <Site name="Web">
                <Bindings>
                    <Binding name="Endpoint1" endpointName="Endpoint1" />
                </Bindings>
            </Site>
        </Sites>
        <Endpoints>
            <InputEndpoint name="Endpoint1" protocol="http" port="80" />
        </Endpoints>
        <Imports>
            <Import moduleName="Diagnostics" />
        </Imports>
    </WebRole>
</ServiceDefinition>

Secondly you’ll need to add your Azure Storage Account details into your ServiceConfiguration.cscfg, mine looks like this (obviously replace with your account name and key):

<?xml version="1.0" encoding="utf-8"?>
<ServiceConfiguration serviceName="HuzuSocial.Azure" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration" osFamily="1" osVersion="*">
        <Role name="HuzuSocial.App">
        <Instances count="2" />
        <ConfigurationSettings>
            <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString" value="DefaultEndpointsProtocol=https;AccountName=[youraccountnamehere];AccountKey=[youraccountkeyhere]" />
        </ConfigurationSettings>
        <Certificates>
        </Certificates>
    </Role>
</ServiceConfiguration>

3. Override the OnStart method in WebRole.cs

In the root of your web project you should have a WebRole class.  You’ll need to override the OnStart method to correctly initialise the diagnostics.  There is loads of different sample code out there, some of it highly dubious.  This is my configuration, and it works well for me.  (I lifted it from a post out there somewhere; unfortunately I forgot to bookmark it and can no longer find it, so thank you, whoever you are.)

public override bool OnStart()
{
    DiagnosticMonitorConfiguration diagConfig = DiagnosticMonitor.GetDefaultInitialConfiguration();

    var perfCounters = new List<string>
    {
        @"\Processor(_Total)\% Processor Time",
        @"\Memory\Available Mbytes",
        @"\TCPv4\Connections Established",
        @"\ASP.NET Applications(__Total__)\Requests/Sec",
        @"\Network Interface(*)\Bytes Received/sec",
        @"\Network Interface(*)\Bytes Sent/sec"
    };

    // Add perf counters to configuration
    foreach (var counter in perfCounters)
    {
        var counterConfig = new PerformanceCounterConfiguration
                            {
                                CounterSpecifier = counter,
                                SampleRate = TimeSpan.FromSeconds(5)
                            };

        diagConfig.PerformanceCounters.DataSources.Add(counterConfig);
    }

    diagConfig.PerformanceCounters.ScheduledTransferPeriod = TimeSpan.FromMinutes(1.0);

    //Windows Event Logs
    diagConfig.WindowsEventLog.DataSources.Add("System!*");
    diagConfig.WindowsEventLog.DataSources.Add("Application!*");
    diagConfig.WindowsEventLog.ScheduledTransferPeriod = TimeSpan.FromMinutes(1.0);
    diagConfig.WindowsEventLog.ScheduledTransferLogLevelFilter = LogLevel.Warning;

    //Azure Trace Logs
    diagConfig.Logs.ScheduledTransferPeriod = TimeSpan.FromMinutes(1.0);
    diagConfig.Logs.ScheduledTransferLogLevelFilter = LogLevel.Warning;

    //Crash Dumps
    CrashDumps.EnableCollection(true);

    //IIS Logs
    diagConfig.Directories.ScheduledTransferPeriod = TimeSpan.FromMinutes(1.0);

    DiagnosticMonitor.Start("Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString", diagConfig);

    return base.OnStart();
}

4. That’s it

When deployed to Azure your diagnostics should be successfully transferred to Azure Storage.  To analyse them in any meaningful way, I’d recommend Cerebrata’s Azure Diagnostics Manager, which gives you a nice dashboard.

Automating Azure 1.4 Packaging using MSBuild and CSPack

Update 20/06/12 – A working sample of the below can be found in my Nant.Builder nuget package.

Our team recently upgraded to the August 2011 Azure tools update, which provided some nice new features but completely broke our automated build process.  This was because the way Azure builds are created was modified (although in fairness it now seems to be more in line with MSDeploy builds).  Fixing this was pretty painful, so I thought I’d blog it.

There are 2 steps to creating an Azure package:

  1. Use MSBuild to create a MSDeploy package with the appropriate configuration
  2. Use CSPack to turn the MSDeploy package files into the Azure package you’ll deploy to the cloud.

Compiling your solution for Azure using MSBuild on the command line

First we need to build an MSDeploy package with the correct configuration.  If you’re using web.config transforms (and you really should be) this also ensures that the web.config file has the correct transforms applied.  So we want to execute MSBuild with the following parameters:

msbuild.exe YourSolution.sln
/p:Configuration=Release
/p:DeployOnBuild=true
/p:DeployTarget=Package
/p:AutoParameterizationWebConfigConnectionStrings=false

This will create an MSDeploy package.  The AutoParameterizationWebConfigConnectionStrings flag is required if you’re using web.config transforms – because we’re not using MSDeploy to actually deploy our package, we can’t rely on it to populate the connection string.  If the value is set to true your connection string will look like this:

connectionString="$(ReplacableToken_HuzuSocialDB-Web.config Connection String_0)" />

Clearly this would fail once running on Azure.  With the flag set to false, the connection string is transformed along with any other settings.
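For completeness, the transform that produces the real connection string lives in your Web.Release.config and looks something like this (the connection string value is obviously a placeholder):

<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <connectionStrings>
    <add name="HuzuSocialDB"
         connectionString="Server=tcp:yourserver.database.windows.net;Database=YourDb;User ID=user@yourserver;Password=yourpassword;"
         xdt:Transform="SetAttributes" xdt:Locator="Match(name)" />
  </connectionStrings>
</configuration>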

Using CSPack on the command line to create your Azure package

We now want to take the MSDeploy package files and turn them into an Azure package.  First we need to find the MSDeploy package files – these should be located in the obj directory of your web app.  The location is set in the Package/Publish Web settings in the properties of your web project; however, I wouldn’t recommend changing the defaults.

If you navigate to the obj directory, you’ll want to drill into the directory named after the configuration you supplied to MSBuild, eg obj\Release\Package.

The directory we are actually interested in is PackageTmp, as this is what MSDeploy uses to create the package; most importantly, this is where you will find the transformed web.config file.

We can now point CSPack at the PackageTmp directory.  The CSPack executable lives in the bin directory of the Azure SDK:

C:\Program Files\Windows Azure SDK\v1.4\bin

So to build our package we run CSPack with the following params:

cspack.exe C:\HuzuSocial\HuzuSocial.Azure\ServiceDefinition.csdef
/role:HuzuSocial.App;C:\HuzuSocial\HuzuSocial.App\obj\Release\Package\PackageTmp
/rolePropertiesFile:HuzuSocial.App;.\AzureRoleProperties.txt
/sitePhysicalDirectories:HuzuSocial.App;Web;C:\HuzuSocial\HuzuSocial.App\obj\Release\Package\PackageTmp
/out:C:\HuzuSocial\HuzuSocial.Azure\HuzuSocial-Azure-Release.cspkg

There’s a lot going on here, so I’ll break it down line by line

cspack.exe C:\HuzuSocial\HuzuSocial.Azure\ServiceDefinition.csdef

This tells CSPack where your ServiceDefinition file is located.  Next:

/role:HuzuSocial.App;C:\HuzuSocial\HuzuSocial.App\obj\Release\Package\PackageTmp

This specifies where the files to create the role are located, in our case, this the PackageTmp directory that MSBuild created for us, as described above.  Note that the role name, in this example HuzuSocial.App must match the WebRole name as specified in the ServiceDefinition file:

<WebRole name="HuzuSocial.App" vmsize="Small" >

Next:

/rolePropertiesFile:HuzuSocial.App;.\AzureRoleProperties.txt

This is optional if you’re working on .Net 3.5, but if you’re working with .Net 4 you need to tell CSPack that you’re targeting framework v4.  This is done by creating a txt file that you can then point CSPack at; in our example we have the file in the same directory as CSPack, but you can store it anywhere (I’ve actually added it to our solution).  The entry in the file is as follows:

TargetFrameWorkVersion=v4.0

Update 08/12/11 – As of SDK 1.6 you also need to specify the EntryPoint parameter, especially if you’re working with Diagnostics.  See this post for more details.

Next:

/sitePhysicalDirectories:HuzuSocial.App;Web;C:\HuzuSocial\HuzuSocial.App\obj\Release\Package\PackageTmp

Here we are telling CSPack how the physical files are to be packaged for deployment to Azure.  We specify the name of our role, our site name as defined in the ServiceDefinition file, and finally where the files are located.  (It is here, I believe, that we could define multiple sites to be hosted in one role.)  Finally we have:

/out:C:\HuzuSocial\HuzuSocial.Azure\HuzuSocial-Azure-Release.cspkg

This is the most straightforward param, simply telling CSPack where the package file should be output and what it should be called.

Conclusion

You should now be able to automate the above in your build tool of choice (I use NAnt), meaning every time someone on the team checks in we have a new Azure package ready for deployment.  The next step would be to deploy the package into a staging role automatically using CSRun; if I get round to this I’ll blog it as well.

We can also conclude that all of this is very non-obvious.  Microsoft do not make it easy to do Continuous Integration (CI), especially with Azure packages.  They have improved the tooling for deploying to Azure from your desktop; however, the whole point of CI is to ensure all your tests have passed, there are no build errors and so on.  Deploying from your desktop circumvents all of these checks, not to mention introducing a human element into deployment that could break something.  I feel MS could really help us out by making all the above a lot more straightforward.

Useful Posts

I couldn’t have created this post without the following invaluable info:

http://tomkrueger.wordpress.com/2010/07/27/azure-deployment-issue-after-upgrading-to-visual-studio-2010-and-net-4-0/
http://zvolkov.com/blog/post/2010/05/18/How-to-Publish-Web-Site-project-using-VS2010-and-MsBuild.aspx
http://blog.jayway.com/2011/03/20/configuring-automatic-deployment-of-a-windows-azure-application-using-teamcity/

The case for Microsoft Azure

Currently we’re in the process of moving our MVC3 application onto Microsoft’s Azure cloud platform.  Azure offers a number of compelling features; however, note the asides below, which dispel some of the marketing smog.

  1. Rapid deployment – A new client website can be deployed more or less instantly, without the need to provision costly equipment from your hoster – or, more likely, shoe-horn the new site onto your creaking live environment (or is that just us?)
  2. Pay for what you use – small sites can be installed in Small or Extra Small instances and you can pay as little as $0.05 per hour for hosting and $10 per month for a database.  Pricing info is here
  3. Guaranteed 99.95% uptime*
    1. There are a couple of things to think about here.  To get the 99.95% guarantee you have to run a minimum of 2 instances, which obviously doubles your cost.  So you need to be aware that costs are not as cheap as they might first appear.
    2. Like most hosters, you don’t have much come-back if this figure isn’t achieved.  According to the SLA (as I read it) you’ll get your money back for the time your site was down, plus 10-25%.  Which is small fry compared to the potential loss of reputation.
  4. Scaling up is easy* – as your site grows from one with a readership like my blog (not even my Mum is interested) to, hopefully, the next Facebook, it is simple to add extra instances and have your site automatically load-balanced across them.  This also lets you deal with traffic spikes; for instance, if your site was popular in America you could add instances during peak time and scale back down at night-time – something that would be a lot harder to achieve on traditional hardware.
    1. Note the * – there is a certain amount of marketing hype in the ease of scaling out a site.  To achieve the above shangri-la you need to be very aware of session management, and if you’re currently relying on in-memory session state, or sticky sessions on your loadbalancer, your app (like ours) cannot immediately take advantage of cloud scaling.  There are solutions to this problem; I’ll hopefully get a chance to blog about them in the future.
    2. Currently you’d need to write your own tooling to automatically spin up new instances in response to demand.  This is not a trivial task, and at the time of writing documentation on it is reasonably scarce.
  5. No OS to worry about – This takes a little getting used to, but your app is installed into a Hosted Service or Role, so you don’t need to worry about the platform your app is running on.  Microsoft do that for you, so you don’t need to worry about security patches, access rights and all the other headaches that go along with managing your own infrastructure.  This is what sets Azure apart from services like Amazon EC2, which deals in OS images, so patching etc is still a major concern.
    1. It is possible to RDP onto a particular instance if you need to do a little debugging.  But bear in mind that the whole point of cloud computing is the ability to easily scale your site, so you should never have to manually configure anything on an instance, as that defeats the point of automatically scaling to meet demand.
  6. Easy integration with Visual Studio – If you’re already on the Microsoft stack, Azure fits easily into your workflow.  However, you will need to be running Windows Vista or later to work with the toolset.
  7. Suite of tools for working in the cloud – In addition to the web hosting, Azure offers a CDN, Queue Service for messaging, Table Storage.  Essentially Microsoft are matching most of Amazon’s services, so I’d expect to see more tools arriving soon.

As we work with Azure I’ll post updates on how we’re getting on.
