SQL Azure – Disaster Recovery

In this post I look at how to set up some tooling to help implement a Disaster Recovery plan for your SQL Azure database.

Fundamentals

The key to any successful DR plan is that it has to be a fire-and-forget process.  If your DR process involves any manual steps – e.g. Bob from infrastructure needs to push a button at 3pm on Wednesdays – you can guarantee that when disaster strikes you’ll discover Bob hasn’t pushed the button since February.

Thus you want to make sure everything is automated, and you want to hear about it if anything goes wrong.

It’s worth pointing out that every SQL Azure database is replicated twice, so it is highly unlikely you’ll suffer an actual outage or data loss from unexpected downtime.  What we’re doing here is creating a backup in case someone inadvertently deletes the Customers table.  Of course, it never hurts to have a backup under your pillow (so to speak) if it’s going to help you get to sleep at night.

Tooling

Tools you will need:

  • Azure Storage Explorer – for creating and inspecting your storage containers
  • The SQL DAC Import Export Client (DacIESvcCli) – for exporting and importing your database
  • PowerShell and the Windows Task Scheduler – for automating the whole thing

Exporting your SQL Azure DB

The first thing we’re going to do is to export your SQL Azure DB to a blob file.  The blob file can be used to import your backup into a new DB in the event of disaster.

  • If you haven’t already got one, create a new Azure Storage account.  It’s a good idea to create this in a different location from your SQL Azure DB, so that in the event of a catastrophic data-centre meltdown your backup will be located far away.  E.g. if your database is in North Europe, set up your Storage account in East Asia.
  • Now fire up Azure Storage Explorer and connect to your new storage account.  Create a new private container to stick the backups in – if you don’t create a container you can’t actually save anything into your storage account.

  • Now we can configure the Azure Import Export Client to download your DB into your newly created storage account.  This is a command-line util, which makes it ideal for automation, but for now we’ll just run it manually.  Run the following, editing for your specific account details:


.\DacIESvcCli -S YOUR-SQL-AZURE-SERVERNAME.database.windows.net -U YOUR-SQL-AZURE-USERNAME -P YOUR-SQL-AZURE-DB-PASSWORD -D YOUR-SQL-AZURE-DB-NAME -X -BLOBACCESSKEY YOUR-BLOB-STORAGE-ACCOUNT-KEY -BLOBURL YOUR-BLOB-STORAGE-ADDRESS/CONTAINER-NAME/BackupName.bacpac -ACCESSKEYTYPE storage


  • Important – Make sure the BLOBURL argument correctly specifies your container name, i.e. -BLOBURL http://iainsbackups.blob.core.windows.net/dbbackups/MyDb_120820.bacpac
  • If all has gone well you should see confirmation that the export job has been submitted.  Note – this command simply kicks off the backup process; it may take some time before your backup file is complete.  You can monitor the backup jobs on the portal if you want.

Importing your SQL Azure DB

A DR plan is of little use if you don’t test your backup, so we want to ensure that our backup file can actually be used to create a rescue DB.  So let’s import our .bacpac file to see if we can recreate our DB and connect our app to it.

  • We basically reverse the process.  This time, create a new, empty SQL Azure DB.
  • Now we can configure the Azure Import Export Client to import our .bacpac file, as follows:


.\DacIESvcCli -S YOUR-SQL-AZURE-SERVERNAME.database.windows.net -U YOUR-SQL-AZURE-USERNAME -P YOUR-SQL-AZURE-DB-PASSWORD -D YOUR-SQL-AZURE-RESCUE-DB-NAME -I -BLOBACCESSKEY YOUR-BLOB-STORAGE-ACCOUNT-KEY -BLOBURL YOUR-BLOB-STORAGE-ADDRESS/CONTAINER-NAME/BackupName.bacpac -ACCESSKEYTYPE storage

  • If it works as expected you should see confirmation that the import job has been submitted.

  • Now you want to connect your app to your DB to ensure it works as expected.

Automating your backups

Now we’ve proven we can export and import our DB, we want to make sure the process happens automatically so we can forget about it.  The easiest way of doing that is to create a simple PowerShell script that runs the above commands for us, and then schedule it with the Task Scheduler.

Here’s a basic script that will run the Import/Export service for us; you can tailor it as you see fit.  Note that I’m creating a timestamped backup file, so we should get a new file every day.


###############################################################################
# Description: Backup Script for Sql Azure DB
# Author: Iain Hunter
# Date: 21/08/12
###############################################################################
$today = Get-Date
$todayStr = $today.ToString("yyyyMMdd")
$backupFile = "your-db" + $todayStr + ".bacpac"
echo "Exporting backup to: $backupFile"
# Export DB to blob storage with datestamp
C:\dev\tools\DACImportExport1.6\DacIESvcCli -S YOUR-SQL-AZURE-SERVERNAME.database.windows.net -U YOUR-SQL-AZURE-USERNAME -P YOUR-SQL-AZURE-DB-PASSWORD -D YOUR-SQL-AZURE-DB-NAME -X -BLOBACCESSKEY YOUR-BLOB-STORAGE-ACCOUNT-KEY -BLOBURL YOUR-BLOB-STORAGE-ADDRESS/CONTAINER-NAME/$backupFile -ACCESSKEYTYPE storage
exit

Now we have the script we can call it from the Task Scheduler.  I created a Basic Task to run every night at 23:30; to call our script we can just run powershell from the scheduler, as so:
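Something like the following as the task’s Action will do the trick (the script path here is just an example – point it at wherever you saved yours):

powershell.exe -File "C:\dev\scripts\BackupSqlAzureDb.ps1"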

Important – You will have to set your PowerShell execution policy to RemoteSigned or the script won’t run when called.

Next Steps

So that’s it – we’re backing up our Azure DB and storing it in blob storage, all for the cost of a few pennies.  Next we might want to create a more sophisticated script/program that would email us in the event of failure, or tidy up old backups – I’ll leave that up to you 🙂
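As a starting point, here’s a rough sketch of a tidy-up script using the Windows Azure PowerShell cmdlets (an assumption – you’ll need the module installed, and the account, key and container names below are placeholders):

# Delete backup blobs older than 30 days (sketch – assumes the Azure PowerShell cmdlets are installed)
$context = New-AzureStorageContext -StorageAccountName "YOUR-STORAGE-ACCOUNT" -StorageAccountKey "YOUR-BLOB-STORAGE-ACCOUNT-KEY"
$cutoff = (Get-Date).AddDays(-30)
Get-AzureStorageBlob -Container "dbbackups" -Context $context |
    Where-Object { $_.LastModified -lt $cutoff } |
    ForEach-Object { Remove-AzureStorageBlob -Blob $_.Name -Container "dbbackups" -Context $context }

For failure emails, PowerShell’s built-in Send-MailMessage cmdlet is a good place to start.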

Useful Links

http://msdn.microsoft.com/en-us/library/windowsazure/383f0cb9-0647-4e67-985d-e88369ef0508


Azure CDN – with AzureCdn.Me and MVC3

I’ve been doing a fair bit of work with the Azure CDN recently.  I’ve put together this blog post outlining how to get started, and giving an overview of some tooling I’ve written to help you get up and running.

Azure CDN Quickstart

Currently CDNs need to be created in the original Silverlight portal, and must be attached to an existing Hosted Service or Storage Account.  We’ll attach ours to a storage account.  When creating it you should tick Enable CDN and Query String (the Query String option invalidates the CDN cache on a resource if you vary the query string in the resource’s address – more on this later).  You should now have a CDN in the cloud; now to populate it.

Populating the CDN with static content

Assuming you haven’t altered the standard MVC layout, your static content is probably in the Content folder.  However, wherever your static content resides, you’ll need to create a new folder titled CDN and move your content into it – the Azure CDN expects to find your content in the CDN folder.  The easiest thing to do is to cut and paste your Content folder into the CDN folder.  You should now be ready to update the image references.

Introducing AzureCdn.Me

To make the process of referencing images on the Azure CDN a bit more straightforward I created the AzureCdn.Me nuget package, which includes a couple of extension methods to ease the pain.  So run install-package AzureCdn.Me against your web project.  AzureCdn.Me will create a CDN folder for you, add the extension methods Url.AzureCdnContent and Html.IsDebugMode, and add a couple of parameters to your web.config.  If we open web.config we can see the new params:

<add key="AzureCDNEndpoint" value="CDN" />
<add key="AzureCDNDebug" value="true" />

The parameters are defaulted with values appropriate for debug.  We can now alter our _Layout file to use the extension methods.  First off you’ll need to reference the package by adding a using statement at the top of the file, e.g.:

@using AzureCdn.Me.Code.Extensions
<!DOCTYPE html>
<html>
<head>
    <title>AzureCdn.Me Sample</title>
    <meta charset="utf-8" />
    <link href="@Url.AzureCdnContent("~/Content/bootstrap.css")" rel="stylesheet" />
    <link href="@Url.AzureCdnContent("~/Content/bootstrap-responsive.css")" rel="stylesheet" />
    <link href="@Url.AzureCdnContent("~/Content/azurecdnme.css")" rel="stylesheet" />
    ...

Note I haven’t altered the address of the static files.  Now if we run our project everything should still be fine; however, if we open Firebug we can see that the extension method has appended the CDN folder to the front of the string, as read from web.config, and – importantly – it has also added a querystring containing a cache-busting random number.  As it’s stored in a static class, this number should stay the same until you redeploy.  This is very useful: as we create subsequent versions of our website, each deploy will force the CDN to fetch a fresh version of the stylesheet.

Overloaded

There is also an overload where you can pass in a cache-busting value of your choice.  For example, you could pass in today’s date, which would mean that the cache would refresh itself every 24 hours.  Or you could pass in the version number of the executing assembly, etc.  Hopefully you get the idea.

<link href="@Url.AzureCdnContent("~/Content/bootstrap.css", DateTime.Now.ToString("ddMMyy"))" rel="stylesheet"  />

Doing it Live

So once you’re happy with your site and you want to push it live, you’re going to need to replace the debug AzureCDNEndpoint value in web.config with the actual value of your CDN endpoint, i.e. http://az123456.vo.msecnd.net.  The easiest way of doing that is with a web.config transformation (see the example below).  Once deployed, your CSS files, images etc will be served up from the Azure CDN, and because of the querystring component any changes will be picked up as soon as you push them live.
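For example, a transform along these lines in Web.Release.config would swap the values in at package time (the endpoint address here is made up – use your own):

<add key="AzureCDNEndpoint" value="http://az123456.vo.msecnd.net" xdt:Transform="SetAttributes" xdt:Locator="Match(key)" />
<add key="AzureCDNDebug" value="false" xdt:Transform="SetAttributes" xdt:Locator="Match(key)" />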

In my next post I’ll outline how to minify your CSS and how you can invalidate the cache for any images that are referenced in your CSS and JavaScript.

Working with SSL certificates on Azure and IIS7

Recently I’ve had cause to set up an SSL-secured site on Windows Azure.  Never having worked with SSL before, I thought I’d quickly blog about a few of the gotchas I experienced along the way.

You are here

Before I started, naturally, and as usual, I hit Google to find all Azure SSL wisdom.  I found a great post by SondreB – Windows Azure: Secure Site with SSL certificate.  It’s a great guide and I urge you to follow it; however, as an SSL amateur I found it had a couple of gaps, which I’ll expand on below.

.crt != .pfx

First off, Azure expects to work with SSL certificates saved in the .pfx format.  However, when you request an SSL certificate from an authority like Verisign, it comes in .crt format.  These are both valid X.509 formats, but you need to convert the cert you receive from Verisign into a format readily understood by Azure.

Gotcha #1 – you have to create the Certificate Request on the same PC that you plan to generate the .pfx from.  So if you were working with on-prem IIS you’d generate the Certificate Request on the IIS server that you’re planning to host the app on.  When working with Azure, however, do all the SSL work on your dev PC, which has both Visual Studio and IIS7 installed.

Creating a Certificate Request

Open IIS7 locally and double-click Server Certificates.  In the Actions pane on the right-hand side you should have the option to Create Certificate Request.  Complete the wizard as SondreB advises and save the .txt file to your local PC.  If you open the .txt file you’ll see it contains the Certificate Request’s cryptographic key.

Creating a SSL Certificate

Verisign have a great free service that lets you create multiple temporary SSL certificates, each lasting 30 days.  This is great as, like me, you’ll probably create one or two before you get the process down.  You can purchase a real one once you’re happy you’ve got everything worked out.

Complete the contact form, then you’ll be invited to input your Certificate Request.  Ensure you select Microsoft and IIS7 in the Platform details section, then paste in your Certificate Request key.  Complete the process and you’ll be emailed your certificate shortly afterwards.

Importing your SSL Certificate

The email from Verisign will contain another cryptographic key.  Copy it out of the email and paste it into a new file, ensuring you give it a .crt extension – e.g. myVerisignCert.crt.

Gotcha #2 – SondreB states in his blog that you can right-click the .crt and import it into your PC.  This didn’t work for me.  Luckily, however, there’s a free tool that did.  Download DigiCertUtil.exe and run it.  You should now be able to import the .crt into the certificate store on your local PC (nb – ensure you run it as admin or you won’t be able to import the cert).

Export the .pfx

With your .crt successfully imported, you can export it in .pfx format, again as SondreB outlines.  You can then simply upload the .pfx to the appropriate host in Azure.  The .crt should now also be available to your Azure cloud project within Visual Studio, and again it can be imported, as outlined by SondreB.
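(If you prefer the command line, certutil should be able to do the export too – a sketch, assuming the private key was marked exportable; the password and serial number are placeholders:)

certutil -p YOUR-PFX-PASSWORD -exportPFX My YOUR-CERT-SERIAL-NUMBER myVerisignCert.pfx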

You looking at my endpoints?

The final gotcha to be aware of is that you can’t VIP Swap a project that has more endpoints than the one you’re attempting to replace.  If the solution you’re running only has the standard HTTP endpoint, you’ll have to delete the running live instance to swap in the package with the additional 443 endpoint.

This could be a major pain, as you’ll be assigned a new IP address etc, and in my experiments it took around 30 minutes for the site to appear online again, as the CNAME record seems to take a while to catch up with the new IP address.

So if you know you’re working on a site that will be SSL-enabled, it would be a good idea to create the 443 endpoint from the start, not at the end of the project, as I discovered.
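For reference, declaring the https endpoint in your ServiceDefinition.csdef looks something like this (a sketch – the role and certificate names below are made up):

<WebRole name="MyApp.Web" vmsize="Small">
  <Endpoints>
    <InputEndpoint name="HttpIn" protocol="http" port="80" />
    <InputEndpoint name="HttpsIn" protocol="https" port="443" certificate="MySslCert" />
  </Endpoints>
  <Certificates>
    <Certificate name="MySslCert" storeLocation="LocalMachine" storeName="My" />
  </Certificates>
</WebRole>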

Otherwise, that should be it – SSL happiness awaits.

Automating Azure 1.4 Packaging using MSBuild and CSPack

Update 20/06/12 – A working sample of the below can be found in my Nant.Builder nuget package.

Our team recently upgraded to the August 2011 Azure tools update, which provided some nice new features but completely broke our automated build process.  This was because the way Azure builds are created was modified (although in fairness it now seems to be more in line with MSDeploy builds).  Fixing this was pretty painful, so I thought I’d blog it.

There are 2 steps to creating an Azure package:

  1. Use MSBuild to create a MSDeploy package with the appropriate configuration
  2. Use CSPack to turn the MSDeploy package files into the Azure package you’ll deploy to the cloud.

Compiling your solution for Azure using MSBuild on the command line

First we need to build an MSDeploy package with the correct configuration.  If you’re using web.config transforms (and you really should be) this also ensures that the web.config file has the correct transforms applied.  So we want to execute MSBuild with the following parameters:

msbuild.exe YourSolution.sln
/p:Configuration=Release
/p:DeployOnBuild=true
/p:DeployTarget=Package
/p:AutoParameterizationWebConfigConnectionStrings=false

This will create an MSDeploy package.  The AutoParameterizationWebConfigConnectionStrings flag is required if you’re using web.config transforms – because we’re not using MSDeploy to actually deploy our package, we can’t rely on it to populate the connection string.  If the value is set to true, your connection string will look like this:

connectionString="$(ReplacableToken_HuzuSocialDB-Web.config Connection String_0)" />

Clearly this would fail once running on Azure.  With the flag set to false, the connection string is transformed along with any other settings.
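For reference, the transform in Web.Release.config that produces the real string looks something like this (the names here are made up):

<connectionStrings>
  <add name="HuzuSocialDB" connectionString="Server=tcp:YOUR-SERVER.database.windows.net;Database=YOUR-DB;User ID=YOUR-USER;Password=YOUR-PASSWORD;" xdt:Transform="SetAttributes" xdt:Locator="Match(name)" />
</connectionStrings>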

Using CSPack on the command line to create your Azure package

We now want to take the MSDeploy package files and turn them into an Azure package.  First we need to find the MSDeploy package files – these should be located in the obj directory of your web app.  The location is set in the Package/Publish Web settings in the properties of your web project; however, I wouldn’t recommend changing the defaults.

If you navigate to the obj directory, you’ll want to drill into the directory named after the Release configuration you supplied to MSBuild, e.g. obj\Release.

The directory we’re actually interested in is PackageTmp, as this is what MSDeploy uses to create the package; most importantly, this is where you’ll find the transformed web.config file.

We can now point CSPack at the PackageTmp directory.  The CSPack executable lives in the bin directory of the Azure SDK:

C:\Program Files\Windows Azure SDK\v1.4\bin

So to build our package we run CSPack with the following params:

cspack.exe C:\HuzuSocial\HuzuSocial.Azure\ServiceDefinition.csdef
/role:HuzuSocial.App;C:\HuzuSocial\HuzuSocial.App\obj\Release\Package\PackageTmp
/rolePropertiesFile:HuzuSocial.App;.\AzureRoleProperties.txt
/sitePhysicalDirectories:HuzuSocial.App;Web;C:\HuzuSocial\HuzuSocial.App\obj\Release\Package\PackageTmp
/out:C:\HuzuSocial\HuzuSocial.Azure\HuzuSocial-Azure-Release.cspkg

There’s a lot going on here, so I’ll break it down line by line

cspack.exe C:\HuzuSocial\HuzuSocial.Azure\ServiceDefinition.csdef

This tells CSPack where your ServiceDefinition file is located.  Next:

/role:HuzuSocial.App;C:\HuzuSocial\HuzuSocial.App\obj\Release\Package\PackageTmp

This specifies where the files to create the role are located; in our case, this is the PackageTmp directory that MSBuild created for us, as described above.  Note that the role name, in this example HuzuSocial.App, must match the WebRole name as specified in the ServiceDefinition file:

<WebRole name="HuzuSocial.App" vmsize="Small" >

Next:

/rolePropertiesFile:HuzuSocial.App;.\AzureRoleProperties.txt

This is optional if you’re working with .NET 3.5, but if you’re working with .NET 4 you need to tell CSPack that you’re targeting framework v4.  This is done by creating a txt file that you then point CSPack at; in our example we have the file in the same directory as CSPack, but you can store it anywhere (I’ve actually added it to our solution).  The entry in the file is as follows:

TargetFrameWorkVersion=v4.0

Update 08/12/11 – As of SDK 1.6 you also need to specify the EntryPoint parameter, especially if you’re working with Diagnostics.  See this post for more details.

Next:

/sitePhysicalDirectories:HuzuSocial.App;Web;C:\HuzuSocial\HuzuSocial.App\obj\Release\Package\PackageTmp

Here we are telling CSPack how the physical files are to be packaged for deployment to Azure.  We specify the name of our role, our site name as defined in the ServiceDefinition file, and finally where the files are located.  (It is here, I believe, that we could define multiple sites to be hosted in one role.)  Finally we have:

/out:C:\HuzuSocial\HuzuSocial.Azure\HuzuSocial-Azure-Release.cspkg

This is the most straightforward param, simply telling CSPack where the package file should be output and what it should be called.
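Putting it all together, here’s a rough PowerShell sketch of the two steps chained (paths and names are lifted from the examples above – the solution path is a guess, so adjust for your own setup):

# Sketch: build the MSDeploy package, then wrap it with CSPack
$msbuild = "C:\Windows\Microsoft.NET\Framework\v4.0.30319\MSBuild.exe"
$cspack  = "C:\Program Files\Windows Azure SDK\v1.4\bin\cspack.exe"
$packageTmp = "C:\HuzuSocial\HuzuSocial.App\obj\Release\Package\PackageTmp"

# Step 1 – compile and create the MSDeploy package with the transformed web.config
& $msbuild "C:\HuzuSocial\HuzuSocial.sln" /p:Configuration=Release /p:DeployOnBuild=true /p:DeployTarget=Package /p:AutoParameterizationWebConfigConnectionStrings=false
if ($LASTEXITCODE -ne 0) { throw "MSBuild failed" }

# Step 2 – package the PackageTmp output for Azure
& $cspack "C:\HuzuSocial\HuzuSocial.Azure\ServiceDefinition.csdef" `
    "/role:HuzuSocial.App;$packageTmp" `
    "/rolePropertiesFile:HuzuSocial.App;.\AzureRoleProperties.txt" `
    "/sitePhysicalDirectories:HuzuSocial.App;Web;$packageTmp" `
    "/out:C:\HuzuSocial\HuzuSocial.Azure\HuzuSocial-Azure-Release.cspkg"
if ($LASTEXITCODE -ne 0) { throw "CSPack failed" }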

Conclusion

You should now be able to automate the above in your build tool of choice – I use NAnt – meaning every time someone on the team checks in we have a new Azure package ready for deployment.  Next step would be for the package to be deployed into a staging role automatically using CSRun; if I get round to this I’ll blog it as well.

We can also conclude that all of this is very non-obvious.  Microsoft do not make it easy to do Continuous Integration (CI), especially with Azure packages.  They have improved the tooling for deploying to Azure from your desktop; however, the whole point of CI is to ensure all your tests have passed, there are no build errors, etc.  Deploying from your desktop circumvents all of this, not to mention introducing a human element into deployment that could break something.  I feel MS could really help us out by making all of the above a lot more straightforward.

Useful Posts

I couldn’t have created this post without the following invaluable info:

http://tomkrueger.wordpress.com/2010/07/27/azure-deployment-issue-after-upgrading-to-visual-studio-2010-and-net-4-0/
http://zvolkov.com/blog/post/2010/05/18/How-to-Publish-Web-Site-project-using-VS2010-and-MsBuild.aspx
http://blog.jayway.com/2011/03/20/configuring-automatic-deployment-of-a-windows-azure-application-using-teamcity/

The case for Microsoft Azure

Currently we’re in the process of moving our MVC3 application onto Microsoft’s Azure cloud platform.  Azure offers a number of compelling features; however, note the asides below, which dispel some of the marketing smog.

  1. Rapid deployment – A new client website can be deployed more or less instantly, without the need to provision costly equipment from your hoster – or, more likely, shoe-horn the new site onto your creaking live environment (or is that just us?)
  2. Pay for what you use – small sites can be installed in Small or Extra Small instances and you can pay as little as $0.05 per hour for hosting and $10 per month for a database.  Pricing info is here
  3. Guaranteed 99.95% uptime*
    1. There are a couple of things to think about here.  To get the 99.95% guarantee you have to run a minimum of 2 instances, which obviously doubles your cost.  So be aware that costs are not as cheap as they might appear.
    2. Like most hosters, you don’t have a huge comeback if this figure isn’t achieved.  According to the SLA (as I read it) you’ll get your money back for the time your site was down, plus 10-25%.  Which is small fry compared to the potential loss of reputation.
  4. Scaling up is easy* – as your site grows from a site with a readership like my blog (not even my Mum is interested) into, hopefully, the next Facebook, it is simple to add extra instances and have your site automatically load-balanced across them.  This also lets you deal with traffic spikes; for instance, if your site was popular in America you could add instances during peak time and scale back down at night-time – this would be a lot harder to achieve on traditional hardware.
    1. Note the * – there is a certain amount of marketing hype in the ease of scaling out a site.  To achieve the above shangri-la you need to be very aware of session management, and if you’re currently relying on in-memory session state, or sticky sessions on your load balancer, your app (like ours) cannot immediately take advantage of cloud scaling.  There are solutions to this problem; I’ll hopefully get a chance to blog about them in the future.
    2. Currently you’d need to write your own tooling to automatically spin up new instances in response to demand.  This is not a trivial task, and at the time of writing documentation on this is reasonably scarce.
  5. No OS to worry about – this takes a little getting used to, but your app is installed into a Hosted Service or Role, so you don’t need to worry about the platform your app is running on.  Microsoft do that for you, so you don’t need to worry about security patches, access rights and all the headaches that go along with managing your own infrastructure.  This is what sets Azure apart from services like Amazon EC2, which deals in OS images, so patching etc is still a major concern.
    1. It is possible to RDP onto a particular instance if you need to do a little debugging.  But bear in mind that the whole point of cloud computing is the ability to easily scale your site, so you should never have to manually configure anything on an instance of your site, as that defeats the point of automatically scaling up to meet demand.
  6. Easy integration with Visual Studio – If you’re already on the Microsoft stack, Azure fits easily into your workflow.  However, you will need to be running Windows Vista or later to work with the toolset.
  7. Suite of tools for working in the cloud – In addition to web hosting, Azure offers a CDN, a Queue Service for messaging, and Table Storage.  Essentially Microsoft are matching most of Amazon’s services, so I’d expect to see more tools arriving soon.

As we work with Azure I’ll post updates on how we’re getting on.

Useful links