SQL Azure – Disaster Recovery

In this post I look at how to set up some tooling to help implement a Disaster Recovery plan for your SQL Azure database.

Fundamentals

The key to any successful DR plan is that it has to be a fire-and-forget process.  If your DR process involves any manual components – i.e. Bob from infrastructure needs to push a button at 3pm on Wednesdays – you can guarantee that when disaster strikes you’ll discover Bob hasn’t pushed the button since February.

Thus you want to make sure everything is automated, and you want to hear about it if anything goes wrong.

It’s worth pointing out that every SQL Azure instance is mirrored twice, so it is highly unlikely you’re going to suffer an actual outage or data loss from unexpected downtime.  What we’re doing here is creating a backup in case someone inadvertently deletes the Customers table.  Of course, it never hurts to have a backup under your pillow (so to speak) if it’s going to help you get to sleep at night.

Tooling

Tools you will need:

  • Azure Storage Explorer – for creating and browsing containers in your storage account
  • The SQL DAC Import Export Client (DacIESvcCli) – the command line util we’ll use to export and import the database

Exporting your SQL Azure DB

The first thing we’re going to do is to export your SQL Azure DB to a blob file.  The blob file can be used to import your backup into a new DB in the event of disaster.

  • If you haven’t already got one, create a new Azure Storage account.  It’s a good idea to create this in a different location from your SQL Azure DB, so in the event of a catastrophic data-centre melt-down your backup will be located far away.  E.g. if your database is in North Europe, set up your Storage Account in East Asia.
  • Now fire up Azure Storage Explorer and connect to your new storage account.  Create a new private container for sticking the backups in.  If you don’t create a container you can’t actually save anything into your storage account.

  • Now we can configure the Azure Import Export Client to export your DB into your newly created storage account.  This is a command line util, which is ideal for automating, but for now we’ll just run it manually.  Run the following, editing for your specific account details:


.\DacIESvcCli -S YOUR-SQL-AZURE-SERVERNAME.database.windows.net -U YOUR-SQL-AZURE-USERNAME -P YOUR-SQL-AZURE-DB-PASSWORD -D YOUR-SQL-AZURE-DB-NAME -X -BLOBACCESSKEY YOUR-BLOB-STORAGE-ACCOUNT-KEY -BLOBURL YOUR-BLOB-STORAGE-ADDRESS/CONTAINER-NAME/BackupName.bacpac -ACCESSKEYTYPE storage


  • Important – Make sure the BLOBURL argument correctly specifies your container name, e.g. -BLOBURL http://iainsbackups.blob.core.windows.net/dbbackups/MyDb_120820.bacpac
  • If all has gone well you’ll see confirmation that the export request was submitted.  Note – this command simply kicks off the backup process; it may take some time before your backup file is complete.  You can actually monitor the backup jobs on the portal if you want.

Importing your SQL Azure DB

A DR plan is of little use if you don’t test your backup, so we want to ensure that our backup file can actually be used to create a rescue DB.  So let’s import our .bacpac file to see if we can recreate our DB and connect our app to it.

  • We basically reverse the process.  This time, create a new empty SQL Azure DB.
  • Now we can configure the Azure Import Export Client to import our .bacpac file as follows:


.\DacIESvcCli -S YOUR-SQL-AZURE-SERVERNAME.database.windows.net -U YOUR-SQL-AZURE-USERNAME -P YOUR-SQL-AZURE-DB-PASSWORD -D YOUR-SQL-AZURE-RESCUE-DB-NAME -I -BLOBACCESSKEY YOUR-BLOB-STORAGE-ACCOUNT-KEY -BLOBURL YOUR-BLOB-STORAGE-ADDRESS/CONTAINER-NAME/BackupName.bacpac -ACCESSKEYTYPE storage

  • If it works as expected we should see output similar to the export step.

  • Now you want to connect your app to your DB to ensure it works as expected.
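If your app is a typical .NET web app, connecting it is just a case of repointing the Web.config connection string at the rescue DB – a sketch, where every value is a placeholder:

<connectionStrings>
    <!-- Point the existing connection string at the rescue DB on your SQL Azure server -->
    <add name="DefaultConnection"
         connectionString="Server=tcp:YOUR-SQL-AZURE-SERVERNAME.database.windows.net,1433;Database=YOUR-SQL-AZURE-RESCUE-DB-NAME;User ID=YOUR-SQL-AZURE-USERNAME@YOUR-SQL-AZURE-SERVERNAME;Password=YOUR-SQL-AZURE-DB-PASSWORD;Encrypt=True;"
         providerName="System.Data.SqlClient" />
</connectionStrings>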

Automating your backups

Now we’ve proven we can export and import our DB, we want to make sure the process happens automatically so we can forget about it.  The easiest way of doing that is to create a simple PowerShell script that runs the above commands for us, and then schedule it with the Task Scheduler.

Here’s a basic script that will run the Import/Export service for us; you can tailor it as you see fit.  Note that I’m creating a timestamped backup file, so we should get a new file every day:


###############################################################################
# Description: Backup Script for Sql Azure DB
# Author: Iain Hunter
# Date: 21/08/12
###############################################################################
$today = Get-Date
$todayStr = $today.ToString("yyyyMMdd")
$backupFile = "your-db" + $todayStr + ".bacpac"
echo "Exporting backup to: $backupFile"
# Export DB to blob storage with datestamp
C:\dev\tools\DACImportExport1.6\DacIESvcCli -S YOUR-SQL-AZURE-SERVERNAME.database.windows.net -U YOUR-SQL-AZURE-USERNAME -P YOUR-SQL-AZURE-DB-PASSWORD -D YOUR-SQL-AZURE-DB-NAME -X -BLOBACCESSKEY YOUR-BLOB-STORAGE-ACCOUNT-KEY -BLOBURL YOUR-BLOB-STORAGE-ADDRESS/CONTAINER-NAME/$backupFile -ACCESSKEYTYPE storage
exit

Now we have the script we can call it from the Task Scheduler.  I created a Basic Task to run every night at 23:30; to call our script we can just run PowerShell from the scheduler, like so:
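Something like this as the task’s action – the script path is illustrative, so point it at wherever you saved yours:

powershell.exe -File "C:\dev\scripts\BackupSqlAzureDb.ps1"

Alternatively, if you’d rather script the task creation too, schtasks can do it in one line:

schtasks /create /tn "SqlAzureBackup" /tr "powershell.exe -File C:\dev\scripts\BackupSqlAzureDb.ps1" /sc daily /st 23:30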

Important – You will have to set your PowerShell execution policy to RemoteSigned or the script won’t run when called.
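You can do that once from an elevated PowerShell prompt:

Set-ExecutionPolicy RemoteSigned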

Next Steps

So that’s it – we’re backing up our Azure DB and storing it in blob storage, all for the cost of a few pennies.  Next we might want to create a more sophisticated script/program that would email us in the event of failure (there’s a rough sketch of that below), or tidy up old backups – I’ll leave the rest up to you 🙂
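Here’s a minimal sketch of the email-on-failure idea, reusing the export command from the script above and the stock Send-MailMessage cmdlet – the addresses and SMTP server are placeholders, and it assumes the client sets a non-zero exit code on failure:

# Run the export, then email if the client exits with a failure code
$todayStr = (Get-Date).ToString("yyyyMMdd")
$backupFile = "your-db" + $todayStr + ".bacpac"
& C:\dev\tools\DACImportExport1.6\DacIESvcCli -S YOUR-SQL-AZURE-SERVERNAME.database.windows.net -U YOUR-SQL-AZURE-USERNAME -P YOUR-SQL-AZURE-DB-PASSWORD -D YOUR-SQL-AZURE-DB-NAME -X -BLOBACCESSKEY YOUR-BLOB-STORAGE-ACCOUNT-KEY -BLOBURL YOUR-BLOB-STORAGE-ADDRESS/CONTAINER-NAME/$backupFile -ACCESSKEYTYPE storage
if ($LASTEXITCODE -ne 0)
{
    # Send-MailMessage ships with PowerShell 2.0 and later
    Send-MailMessage -From "backups@example.com" -To "you@example.com" -Subject "SQL Azure backup FAILED on $todayStr" -SmtpServer "smtp.example.com"
}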

Useful Links

http://msdn.microsoft.com/en-us/library/windowsazure/383f0cb9-0647-4e67-985d-e88369ef0508


Visual Studio Turbo – DIY AppHarbor with Nant.Builder

In the final part of this series I look at automating uploading your app into the Windows Azure Cloud – or, as I like to think of it, a Do It Yourself AppHarbor, hopefully with no leftover screws ;-).  The series so far:

  1. Visual Studio 2010 Workflow
  2. Automating Your Builds with Nant.Builder
  3. DIY AppHarbor – Deploying Your Builds onto Windows Azure

Update 08/08/12 – Updated Nant.Builder and links to reflect changes for Azure 1.7 and Azure Powershell Commandlets

Prerequisites

1.  You’ll hopefully not be surprised to learn you’re going to need a Windows Azure account (there’s a rather stingy 90 day free trial, if you haven’t signed up already).  Within your account you’re going to need to set up one Hosted Service, where we’ll deploy the app to, and one Storage Account, where the package gets uploaded to prior to deployment.  If you’re struggling, just Google for help on configuring and setting up Windows Azure; there are plenty of good guides out there.

2. You’ll also need to install the .NET Windows Azure SDK v1.7.  Again I’ll assume you know how to add and configure an Azure project in your solution.

3.  Finally, you need to download the Windows Azure PowerShell Cmdlets.  These will be installed automatically using the Web Platform Installer.  Follow the Getting Started instructions here to ensure they were successfully installed.  You can get a list of available commands here.

Getting Started – Importing your Azure Credentials

  • You’re going to need to download your Azure credentials, so Nant.Builder can contact Azure on your behalf.  We can do this by downloading the .publishsettings file from the Azure portal:
  • You should now have a file called <your-sub>-<date>-credentials.publishsettings
    • Unhelpfully you can’t seem to rename the file on the portal to make it more meaningful
  • If you open the file you’ll see it’s an XML file containing your subscription details.
    • IMPORTANT – if you have multiple Azure subscriptions you’ll need to edit the file so that it only includes the one subscription that you want to deploy your app into.
  • With the file downloaded, open PowerShell and run the following command; note you’ll need to change the path and filename to match your .publishsettings file:


Import-AzurePublishSettingsFile -PublishSettingsFile 'c:\users\<username>\downloads\your-credentials.publishsettings' -SubscriptionDataFile 'c:\dev\tools\windowsazure\subscriptions\your-sub.xml'

  • If the above command ran successfully you should have an XML file containing your subscriptionId and thumbprint in c:\dev\tools\windowsazure\subscriptions (there’s a quick sanity check after this list)
  • **REALLY IMPORTANT** – The subscription XML file is basically the keys to your Azure account, so you DO NOT want to be casually emailing it around, taking it to the pub etc.  Ensure you keep it somewhere safe, behind a firewall.
  • OK that’s us got our Azure credentials organised, next we can configure Nant.Builder
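Before moving on, it’s worth a quick sanity check that the import worked – something like this should echo your subscription details back (assuming the 1.7-era cmdlets, which expose Get-AzureSubscription):

Get-AzureSubscription -SubscriptionDataFile 'c:\dev\tools\windowsazure\subscriptions\your-sub.xml'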

Configure Nant.Builder for Azure Deployment

Packaging your solution for Azure

  • Install and configure Nant.Builder as described in Part 2 of this series.
  • Open the Nant.build file and navigate to the Azure Settings section.
  • Set the create.azure.package parameter to true; this will call CSPack to package your solution in a format suitable for deployment to Windows Azure.  If you’re interested in what’s happening here, I’ve talked about CSPack in depth here and here.
  • Set the azure.project.name parameter to the name of the Azure project in your solution.
  • Set the azure.role.project.name parameter to the name of the project which contains the entrypoint to your app.  This will most likely be the Web project containing your MVC views etc.
  • Finally set the azure.service.config.file parameter to the name of the *.cscfg file containing the Azure config you want to deploy.  The default is *.cloud.cscfg but may be different if you have a test config, live config etc.
  • You can run Nant.Builder now (command below) and your solution should be packaged and output in C:\dev\releases\<your-solution-name>
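Assuming NAnt is on your path and you run it from the directory containing the build file, kicking off the packaging is just:

nant -buildfile:Nant.build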

Deploying your solution to Azure

  • If packaging has succeeded, you can now finally automate deployment to Azure.  Navigate to the Azure deployment section within Nant.Build
  • Set the deploy.azure.package parameter to true
  • Set the azure.subscription.credentials.file parameter to the name of the file you created in the Import your Azure Credentials section above, i.e. C:\dev\tools\WindowsAzure\Subscriptions\yourSubscription.xml
  • Set the azure.hosted.service.name parameter to the name of the hosted service you want to deploy your app into.  IMPORTANT – be aware that this is the name listed as the DNS Prefix, not the actual service name

  • Set the azure.deployment.environment parameter to the environment type you wish to deploy your app into.  Valid values are either staging or production
  • Finally set the azure.storage.account.name parameter to the name of the storage account you set up earlier, this is where the app will be uploaded to temporarily when it’s being deployed.
  • That’s it – we should now be ready to test our DIY AppHarbor.  Your Azure Config section should look similar to this, obviously with your app details replaced:
<!--Azure Settings-->

<!-- Packaging -->
<property name="create.azure.package" value="true"/>

<!--The name of the project containing the Azure csdef, cscfg files-->
<property name="azure.project.name" value="YourSolution.Azure"/>

<!-- This is the name of the project containing your app entry point, probably the Web project, but may be a library if using a worker role-->
<property name="azure.role.project.name" value="YourSolution.Web"/>

<!-- The name of the file containing the azure config for your app, default is .Cloud but may be custom if you have multiple configs, eg test, live etc -->
<property name="azure.service.config.file" value="ServiceConfiguration.Cloud.cscfg"/>

<!-- Deployment -->
<property name="deploy.azure.package" value="true"/>

<!-- The name of the file containing your exported subscription details - IMPORTANT keep this file safe as it contains very sensitive credentials about your Azure sub -->
<property name="azure.subscription.credentials.file" value="C:\dev\tools\WindowsAzure\Subscriptions\yourSubscription.xml"/>

<!-- The name of the azure hosted service where you want to deploy your app-->
<property name="azure.hosted.service.name" value="your-dns-prefix"/>

<!-- The environment type, either staging or production-->
<property name="azure.deployment.environment" value="staging"/>

<!-- The name of a storage account that exists on your subscription, this will be used to temporarily load your app into while it's being deployed-->
<property name="azure.storage.account.name" value="yourstorageaccount"/>
 

One Click Deployment

So we have hopefully achieved the dream of all modern developers – being able to deploy our app into the cloud with one click.  If it’s successful you should see something similar to:

DeployAzurePackage:

     [exec] 27/05/2012 22:54 - Azure Cloud App deploy script started.
     [exec] 27/05/2012 22:54 - Preparing deployment of ContinuousDeploy to your service for inception with Subscription ID your subid
     [exec] 27/05/2012 22:54 - Creating New Deployment: In progress
     [exec] 27/05/2012 22:56 - Creating New Deployment: Succeeded, Deployment ID
     [exec] 27/05/2012 22:56 - Starting Instances: In progress
     [exec] 27/05/2012 22:56 - Starting Instance 'Nant.Builder.Sample.Web_IN_0': Stopped
     [exec] 27/05/2012 22:57 - Starting Instance 'Nant.Builder.Sample.Web_IN_0': Initializing
     [exec] 27/05/2012 23:00 - Starting Instance 'Nant.Builder.Sample.Web_IN_0': Busy
     [exec] 27/05/2012 23:01 - Starting Instance 'Nant.Builder.Sample.Web_IN_0': Ready
     [exec] 27/05/2012 23:01 - Starting Instances: Succeeded
     [exec] 27/05/2012 23:01 - Created Cloud App with URL http://xxx
     [exec] 27/05/2012 23:01 - Azure Cloud App deploy script finished.

BUILD SUCCEEDED

Note – You are better to run Nant from the command line to see the above output, as the PowerShell script that deploys your build echoes progress to the command line, but not to Visual Studio, if you are running Nant as an external tool.

Nant.Builder.Sample

I’ve created a sample project on GitHub that shows Nant.Builder integrated into it, so it should be more obvious how it all wires up.  Download Nant.Builder.Sample here

Conclusions

I hope you’ve found the series useful, and that you benefit from turbo-charging your workflow.  Over the next month I’m going to refactor Nant.Builder to be a bit more modular, so it will be easy for others to extend the platform with different targets.  Stay tuned for further exciting announcements 🙂

Working with SSL certificates on Azure and IIS7

Recently I’ve had cause to set up an SSL-secured site on Windows Azure.  Never having worked with SSL before, I thought I’d quickly blog about a few of the gotchas I experienced along the way.

You are here

Before I started, naturally, and as usual, I hit Google to find all Azure SSL wisdom.  I found a great post by SondreB – Windows Azure: Secure Site with SSL certificate.  It’s an excellent guide and I urge you to follow it; however, as an SSL amateur I found it had a couple of gaps, which I’ll expand on below.

.crt != .pfx

First off, Azure expects to work with SSL certificates saved in the .pfx format.  However, when you request an SSL certificate from an authority like Verisign, it comes in .crt format.  These are both valid X.509 formats, but you need to convert the cert you receive from Verisign into a format readily understood by Azure.

Gotcha #1 – you have to create the Certificate Request on the same PC that you plan to generate the .pfx from.  So if you were working with on-prem IIS you’d generate the Certificate Request on the IIS server that you’re planning to host the app on.  However, when working with Azure, do all SSL work on your dev PC that has both Visual Studio and IIS7 installed.

Creating a Certificate Request

Open IIS7 locally and double-click Server Certificates.  In the Actions pane you should have the option to Create Certificate Request.  Complete the wizard as SondreB advises and save the .txt file to your local PC.  If you open the .txt file you’ll see it contains a Certificate Request cryptographic key.

Creating a SSL Certificate

Verisign have a great free service that allows you to create multiple temporary SSL certificates, each lasting 30 days.  This is handy because, if you’re like me, you’ll probably create one or two before you get the process down.  You can purchase a real one once you’re happy you’ve got everything worked out.

Complete the contact form, then you’ll be invited to input your Certificate Request.  Ensure you select Microsoft and IIS7 in the Platform details section, then paste in your Certificate Request key.  Complete the process and you’ll be emailed your certificate shortly afterwards.

Importing your SSL Certificate

The email from Verisign will contain another cryptographic key.  Copy it out of the email and paste it into a new file, ensuring you give it a .crt extension – e.g. myVerisignCert.crt.

Gotcha #2 – SondreB states in the blog that you can right-click the .crt and import it into your PC.  This didn’t work for me.  Luckily, however, there’s a free tool that did.  Download DigiCertUtil.exe and run it.  You should now be able to import the .crt into your certificate store on your local PC (NB – ensure you run it as admin or you won’t be able to import the cert).

Export the .pfx

With your .crt successfully imported, you can export it in .pfx format again as SondreB outlines.  You can then simply upload the .pfx to the appropriate host in Azure.  The .crt should now also be available to your Azure cloud project within Visual Studio and again can be imported, as outlined by SondreB.

You looking at my endpoints?

The final gotcha to be aware of is that you can’t VIP Swap a project that has more endpoints than the one you’re attempting to replace.  If the solution you’re running only has the standard HTTP endpoint, you’ll have to delete the running live instance to swap in the package with the additional 443 endpoint.

This could be a major pain, as you’ll be assigned a new IP address etc., and in my experiments it took around 30 minutes for the site to appear online again, as the CNAME record seems to take a while to catch up with the new IP address.

So if you know you’re working on a site that will be SSL enabled, it would be a good idea to create the 443 endpoint from the start, not at the end of the project, as I discovered – there’s a sketch of the endpoint config below.
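For reference, here’s a minimal sketch of what that looks like in ServiceDefinition.csdef – the endpoint and certificate names are placeholders, and the certificate name must match a cert you’ve uploaded to the hosted service:

<Endpoints>
    <InputEndpoint name="Endpoint1" protocol="http" port="80" />
    <!-- The extra 443 endpoint, bound to your uploaded SSL cert -->
    <InputEndpoint name="HttpsEndpoint" protocol="https" port="443" certificate="MySslCert" />
</Endpoints>
<Certificates>
    <Certificate name="MySslCert" storeLocation="LocalMachine" storeName="My" />
</Certificates>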

Otherwise that should be it – SSL happiness awaits.

Update – Auto Packaging using CSPack and Azure SDK 1.6

This post is related to two of my previous posts:

Azure 1.5 ate my diagnostics

I had diagnostics working quite happily until SDK 1.5 came out.  Then all of a sudden data was no longer being transferred to Azure storage.  Even more mysteriously, diagnostics would happily transfer data to Azure storage when being emulated locally, but not when on the Azure cloud (in other words, a nightmare problem).

I didn’t get around to investigating why until this week.  I saw that several people had the same problem, and assumed that I wasn’t configuring the diagnostics correctly in the OnStart method.

Finally I saw this forum thread.  The thread described how, if you upload your solution from Visual Studio, diagnostics work correctly, but not when deployed from the build process.  I tried it for myself, and yep, diagnostics would magically work when the solution was deployed from Visual Studio.  This finally clued me into the fact that the problem had nothing to do with the code, but everything to do with packaging.  Which leads us to this update on auto packaging your Azure solution.

Configuring Your Azure Continuous Integration process with CSPack and SDK 1.6

My previous post on using CSPack to automatically build your deployment packages is largely still correct.  But as of (I assume) SDK 1.5 there’s a new EntryPoint property.

So you need to specify the name of the DLL that is the entry point to your solution – in my case HuzuSocial.App.dll.  So my AzureProperties.txt file now looks like this:

TargetFrameWorkVersion=v4.0
EntryPoint=HuzuSocial.App.dll

Now configured correctly, Diagnostics works as expected from our Continuous Integration process.
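For reference, the properties file gets handed to CSPack via its rolePropertiesFile switch – a sketch only; the csdef path, role name and output name below are from my solution, so adjust them to yours:

cspack ServiceDefinition.csdef /role:HuzuSocial.App;.\HuzuSocial.App /rolePropertiesFile:HuzuSocial.App;AzureProperties.txt /out:HuzuSocial.Azure.cspkg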

Windows Azure Diagnostics with SDK 1.6 for WebRoles

There appears to be a lot of conflicting and confused advice about configuring Diagnostics on Windows Azure.  The situation is not at all helped by Microsoft’s own site which, to paraphrase Morecambe and Wise, has all the right pieces of information, just not necessarily in the right order.

It doesn’t help that what used to work with earlier versions of the Azure SDK no longer works with later versions.  So here I outline:

  • The steps to get Diagnostics outputting correctly to Windows Azure Storage with SDK 1.6 for WebRoles (although I’d imagine it’s largely the same for WorkerRoles)
  • Azure 1.5 ate my diagnostics – Another post where I update my Auto Packaging post to be compatible with SDK 1.6

Setting up Windows Azure Diagnostics for your WebRole with SDK 1.6

1. Configure Web.Config – required if you are using Trace statements

I use Log4Net for my general logging/tracing needs so don’t use Trace statements; thus the example shown in step 3, below, does not require you to complete this step.

However, if you are using Trace statements, i.e.:

System.Diagnostics.Trace.TraceError("Error has occurred");

You’ll need to configure Web.config as described here

<system.diagnostics>
    <trace>
        <listeners>
            <add type="Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener,
                Microsoft.WindowsAzure.Diagnostics,
                Version=1.0.0.0,
                Culture=neutral,
                PublicKeyToken=31bf3856ad364e35"
                name="AzureDiagnostics">
                <filter type="" />
            </add>
        </listeners>
    </trace>
</system.diagnostics>

2. Initialise Diagnostics

As outlined here, you’ll need to ensure you add the Import element for the Diagnostics module in your ServiceDefinition.csdef file.  Here’s what mine looks like:

<?xml version="1.0" encoding="utf-8"?>
<ServiceDefinition name="HuzuSocial.Azure" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
    <WebRole name="HuzuSocial.App" vmsize="Small" >
        <Sites>
            <Site name="Web">
                <Bindings>
                    <Binding name="Endpoint1" endpointName="Endpoint1" />
                </Bindings>
            </Site>
        </Sites>
        <Endpoints>
            <InputEndpoint name="Endpoint1" protocol="http" port="80" />
        </Endpoints>
        <Imports>
            <Import moduleName="Diagnostics" />
        </Imports>
    </WebRole>
</ServiceDefinition>

Secondly you’ll need to add your Azure Storage Account details into your ServiceConfiguration.cscfg, mine looks like this (obviously replace with your account name and key):

<?xml version="1.0" encoding="utf-8"?>
<ServiceConfiguration serviceName="HuzuSocial.Azure" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration" osFamily="1" osVersion="*">
    <Role name="HuzuSocial.App">
        <Instances count="2" />
        <ConfigurationSettings>
            <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString" value="DefaultEndpointsProtocol=https;AccountName=[youraccountnamehere];AccountKey=[youraccountkeyhere]" />
        </ConfigurationSettings>
        <Certificates>
        </Certificates>
    </Role>
</ServiceConfiguration>

3. Override the OnStart method in WebRole.cs

In the root of your web project you should have a WebRole class.  You’ll need to override the OnStart method to correctly initialise the diagnostics.  There’s loads of different sample code out there, some of it highly dubious.  This is my configuration, and it works well for me (I lifted this from a post out there somewhere; unfortunately I forgot to bookmark it and can no longer find it, so thank you, whoever you are).

public override bool OnStart()
{
    DiagnosticMonitorConfiguration diagConfig = DiagnosticMonitor.GetDefaultInitialConfiguration();

    var perfCounters = new List<string>
    {
        @"\Processor(_Total)\% Processor Time",
        @"\Memory\Available Mbytes",
        @"\TCPv4\Connections Established",
        @"\ASP.NET Applications(__Total__)\Requests/Sec",
        @"\Network Interface(*)\Bytes Received/sec",
        @"\Network Interface(*)\Bytes Sent/sec"
    };

    // Add perf counters to configuration
    foreach (var counter in perfCounters)
    {
        var counterConfig = new PerformanceCounterConfiguration
                            {
                                CounterSpecifier = counter,
                                SampleRate = TimeSpan.FromSeconds(5)
                            };

        diagConfig.PerformanceCounters.DataSources.Add(counterConfig);
    }

    diagConfig.PerformanceCounters.ScheduledTransferPeriod = TimeSpan.FromMinutes(1.0);

    //Windows Event Logs
    diagConfig.WindowsEventLog.DataSources.Add("System!*");
    diagConfig.WindowsEventLog.DataSources.Add("Application!*");
    diagConfig.WindowsEventLog.ScheduledTransferPeriod = TimeSpan.FromMinutes(1.0);
    diagConfig.WindowsEventLog.ScheduledTransferLogLevelFilter = LogLevel.Warning;

    //Azure Trace Logs
    diagConfig.Logs.ScheduledTransferPeriod = TimeSpan.FromMinutes(1.0);
    diagConfig.Logs.ScheduledTransferLogLevelFilter = LogLevel.Warning;

    //Crash Dumps
    CrashDumps.EnableCollection(true);

    //IIS Logs
    diagConfig.Directories.ScheduledTransferPeriod = TimeSpan.FromMinutes(1.0);

    DiagnosticMonitor.Start("Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString", diagConfig);

    return base.OnStart();
}

4. That’s it

When deployed to Azure, your diagnostics should be successfully transferred to Azure Storage.  To analyse them in any meaningful way, I’d recommend Cerebrata’s Azure Diagnostics Manager, which gives you a nice dashboard.

The case for Microsoft Azure

Currently we’re in the process of moving our MVC3 application onto Microsoft’s Azure cloud platform.  Azure offers a number of compelling features; however, note the asides, which dispel some of the marketing smog:

  1. Rapid deployment – A new client website can be deployed more or less instantly, without the need to provision costly equipment from your hoster – or, more likely, shoe-horn the new site onto your creaking live environment (or is that just us?)
  2. Pay for what you use – small sites can be installed in Small or Extra Small instances and you can pay as little as $0.05 per hour for hosting and $10 per month for a database.  Pricing info is here
  3. Guaranteed 99.95% uptime*
    1. There are a couple of things to think about here.  To get the 99.95% guarantee you have to run a minimum of 2 instances, which obviously doubles your cost.  So you need to be aware that costs are not as cheap as they might appear.
    2. Like most hosters you don’t have much comeback if this figure isn’t achieved.  According to the SLA (as I read it) you’ll get your money back for the time your site was down, plus 10-25% – which is small fry compared to the potential loss of reputation.
  4. Scaling up is easy* – as your site grows from a site with a readership like my blog (not even my Mum is interested) to, hopefully, the next Facebook, it is simple to add extra instances and have your site automatically load-balanced across them.  This also allows you to deal with traffic spikes; for instance, if your site was popular in America you could add instances during peak time and scale back down at night-time – this would be a lot harder to achieve on traditional hardware.
    1. Note the * – there is a certain amount of marketing hype in the ease of scaling out a site.  To achieve the above Shangri-La you need to be very aware of session management, and if you’re currently relying on in-memory session management, or sticky sessions on your load balancer, your app (like ours) cannot immediately take advantage of cloud scaling.  There are solutions to this problem; I’ll hopefully get a chance to blog about them in the future.
    2. Currently you’d need to write your own tooling to automatically spin up new instances in response to demand.  This is not a trivial task, and at the time of writing documentation on this is reasonably scarce.
  5. No OS to worry about – This takes a little getting used to, but your app is installed into a Hosted Service or Role, so you don’t need to worry about the platform your app is running on.  Microsoft do that for you, relieving you of security patches, access rights and all the other headaches that go along with managing your infrastructure.  This is what sets Azure apart from services like Amazon EC2, which deals with OS images, so patching etc. is still a major concern.
    1. It is possible to RDP onto a particular instance if you need to do a little bit of debugging.  But you need to bear in mind, that the whole point of cloud computing is the ability to easily scale up your site, so you should never have to manually configure anything on an instance of your site, as that defeats the point of automatically scaling up to meet demand.
  6. Easy integration with Visual Studio – If you’re already on the Microsoft stack, Azure fits easily into your workflow.  However, you will need to be running Windows Vista or later to work with the toolset.
  7. Suite of tools for working in the cloud – In addition to web hosting, Azure offers a CDN, a Queue Service for messaging, and Table Storage.  Essentially Microsoft are matching most of Amazon’s services, so I’d expect to see more tools arriving soon.

As we work with Azure I’ll post updates on how we’re getting on.
