Monday, August 30, 2010

Presenting A Night Of Azure! RockNUG Sept. 8

On Wednesday, September 8, I’ll be taking over both the n00b presentation and the Featured presentation at the Rockville .NET User Group in Rockville, MD. I’ll be filling your heads with lots and lots of Azure, Microsoft’s most-excellent cloud computing platform!

If you’re new to Azure, the n00b session is the place to be! We’ll have our very first Azure application up and running in a few minutes, and we’ll learn about the basic moving parts.

For the Featured presentation, we’ll build on our basic application, mixing in features from the three different Azure “portals”:

  • Windows Azure – this deals with the virtual machines and highly scalable storage
  • SQL Azure – this is SQL Server, cloud-style!
  • AppFabric – this provides connectivity and access control

For those who would like to follow along: We’ll be taking things pretty slowly in the n00b session. If you’d like to try building your first Azure app along with me, you’ll need to do a few things first – see details here. A quick summary to get up and running:

  • Windows Vista, Windows Server 2008, or Windows 7 (sorry, XP folks…)
  • Visual Studio 2010 Web Developer Express or any version from your MSDN subscription (Professional, Ultimate, etc.)
  • SQL Server 2005 Express or above
  • Azure SDK 1.2 + Visual Studio Tools

Friday, August 20, 2010

Azure Tip of the Day: Determine if running in Dev Fabric or Azure Fabric

One potential goal when writing a new Azure application is to support running in both Azure-hosted and non-Azure environments. The SDK gives us an easy way to check this:
            if (RoleEnvironment.IsAvailable)
            {
                // Azure-specific actions
            }

With this check, you could, say, build a class library that has specific behaviors depending on the environment in which it’s running.
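
For instance, here’s a minimal sketch of what such a class library helper might look like – my own hypothetical example, not something from the SDK – preferring the role configuration when running under Azure, and falling back to app.config/web.config otherwise:

    using System.Configuration;
    using Microsoft.WindowsAzure.ServiceRuntime;

    public static class SettingsHelper
    {
        // Prefer the Azure role configuration when the role environment
        // is available; otherwise fall back to app.config / web.config.
        public static string GetSetting(string name)
        {
            if (RoleEnvironment.IsAvailable)
                return RoleEnvironment.GetConfigurationSettingValue(name);

            return ConfigurationManager.AppSettings[name];
        }
    }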

Now, let’s assume the Azure environment is available, and you need to take a specific action depending on whether you’re running in the Dev Fabric or the Azure Fabric. Unfortunately, there’s no specific method or property in RoleEnvironment that helps us out.

This brings me to today’s tip: Determining whether an app is running in Dev Fabric or Azure Fabric.


The Deployment ID

While there’s no direct call such as RoleEnvironment.InDevFabric, there’s a neat trick I use that makes it trivial to figure out. A disclaimer first: this trick depends on the specific way the Dev Fabric and the Azure Fabric generate their Deployment IDs, and that could change with a future SDK update.

Whenever you deploy an Azure application, the deployment gets a unique Deployment ID. This value is available in RoleEnvironment.DeploymentId.

As it turns out, the deployment ID has two different formats, depending on the runtime environment:

  • Dev Fabric: The deployment ID takes the form of deployment(n), where n is sequentially incremented with each Dev Fabric deployment.
  • Azure Fabric: The deployment ID is a Guid.

With this little detail, you can now write a very simple method to determine whether you’re in the Dev Fabric or the Azure Fabric:
        private bool IsRunningInDevFabric()
        {
            // easiest check: try to parse the deployment ID as a Guid
            Guid guidId;
            if (Guid.TryParse(RoleEnvironment.DeploymentId, out guidId))
                return false;   // valid Guid? We're in the Azure Fabric
            return true;        // can't parse as a Guid? We're in the Dev Fabric
        }
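
As a quick illustration of where this check might come in handy – a hypothetical sketch of mine, with an assumed “DataConnectionString” setting name – you could point at local development storage in the Dev Fabric, and at a real storage account otherwise:

        // Hypothetical usage: pick a storage account based on the fabric.
        // "DataConnectionString" is an assumed role configuration setting.
        CloudStorageAccount account = IsRunningInDevFabric()
            ? CloudStorageAccount.DevelopmentStorageAccount
            : CloudStorageAccount.Parse(
                RoleEnvironment.GetConfigurationSettingValue("DataConnectionString"));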

Taking this one step further, I wrote a small ASP.NET demo app that prints out the deployment ID, along with the current instance ID. For example, here’s the output when running locally in my Dev Fabric:

[Screenshot: BrowserCap demo running in the Dev Fabric]

Here’s the same app, published to, and running in, the Azure Fabric:

[Screenshot: BrowserCap demo running in the Azure Fabric]
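
The code behind this output is just a couple of lines. Here’s a minimal sketch of the page code-behind – the Label control names are my own invention:

    using System;
    using Microsoft.WindowsAzure.ServiceRuntime;

    public partial class _Default : System.Web.UI.Page
    {
        // Hypothetical code-behind: DeploymentIdLabel and InstanceIdLabel
        // are assumed to be Label controls on the page.
        protected void Page_Load(object sender, EventArgs e)
        {
            // deployment(n) in the Dev Fabric, a Guid in the Azure Fabric
            DeploymentIdLabel.Text = RoleEnvironment.DeploymentId;

            // the ID of the specific role instance serving this request
            InstanceIdLabel.Text = RoleEnvironment.CurrentRoleInstance.Id;
        }
    }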

Try it yourself

I uploaded my demo code so you can try it out yourself. You’ll need to change the diagnostic storage account information in the ASP.NET role configuration prior to deploying it to Azure.

Sunday, August 15, 2010

Azure Tip of the Day: Separate Diagnostic Storage Account

Recently I was helping someone debug a bizarre Azure Table storage issue. For some reason, the role’s state went into a busy/running loop as soon as the OnStart() event handler attempted to set up some Azure tables. To make matters worse, once the startup code attempted to connect to the storage account and create a table, we no longer received Trace log output. This doesn’t help much when the only log message is “In OnStart()…”.

To get our diagnostics back, we created a separate storage account exclusively for diagnostics. Once we did this, we had an uninterrupted flow of trace statements, even though the table-access code was still having issues with the table storage account.

This leads me to my tip of the day: Set up a separate storage account for diagnostics.

Aside from isolating storage connectivity issues, there are other benefits to having a separate storage account for diagnostics:

  • You can have a separate access key for diagnostics, and grant it to a broader audience. For instance, you could give out the diagnostics access key for people to use inside a diagnostics tool such as Cerebrata’s Diagnostics Manager, without having to give out the access key to your production data storage account.
  • Storage accounts have a transactional limit of approximately 500 transactions per second; beyond that, the Azure fabric throttles your access. If your app writes even a single trace statement to the diagnostic tables for every real data transaction, you’re doubling your transaction rate – at that rate you’d hit the cap at roughly 250 data transactions per second – so you could experience throttling much sooner than expected.
  • An additional storage account does not necessarily equate to additional cost. You’re simply billed for the storage you consume. If the total amount of storage across two accounts remains the same as with a single account, your cost will remain the same.

Setting things up

First, head to the Azure portal and set up two storage accounts. I advise putting them in the same affinity group, alongside your Azure services.

[Screenshot: two storage accounts set up in the Azure portal]

Each account will have its own access keys. Simply configure both in your role.

[Screenshot: both storage connection strings configured in the role’s settings]
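
If you’d rather edit the XML directly than use the settings designer, the equivalent ServiceConfiguration entries would look something like this – the role name, account names, and keys below are placeholders, while the setting names match the code that follows:

    <ServiceConfiguration serviceName="MyApp" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
      <Role name="WebRole1">
        <Instances count="1" />
        <ConfigurationSettings>
          <!-- dedicated storage account, used only for diagnostics -->
          <Setting name="MyAppDiagnosticStorage"
                   value="DefaultEndpointsProtocol=https;AccountName=myappdiag;AccountKey=..." />
          <!-- storage account for the application's real data -->
          <Setting name="MyAppDataStorage"
                   value="DefaultEndpointsProtocol=https;AccountName=myappdata;AccountKey=..." />
        </ConfigurationSettings>
      </Role>
    </ServiceConfiguration>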

Now, all that’s left is specifying the diagnostic storage account for the DiagnosticMonitor, and the data storage account for “real” data access. For instance, this example starts the DiagnosticMonitor with MyAppDiagnosticStorage, while the table client uses MyAppDataStorage:

        public override bool OnStart()
        {
            // send diagnostics to the dedicated diagnostics account
            DiagnosticMonitor.Start("MyAppDiagnosticStorage");
            Trace.TraceInformation("Writing to diagnostic storage");

            // required before FromConfigurationSetting() can be used:
            // tell the storage client how to resolve setting names
            CloudStorageAccount.SetConfigurationSettingPublisher(
                (configName, configSetter) =>
                    configSetter(RoleEnvironment.GetConfigurationSettingValue(configName)));

            // "real" data access goes against the data storage account
            var dataStorageAccount = CloudStorageAccount
                .FromConfigurationSetting("MyAppDataStorage");
            var tableClient = dataStorageAccount.CreateCloudTableClient();
            tableClient.CreateTableIfNotExist("MyTable");

            RoleEnvironment.Changing += RoleEnvironmentChanging;

            return base.OnStart();
        }


That’s it!