Tuesday, December 28, 2010

Engaging the hand break

I was enjoying a peaceful Christmas Day with my family. Santa hooked us up with an Xbox and Kinect, which took over the TV room. My son scored a guitar+amp starter kit, which he promptly took to his bedroom and started working through the lessons.

Eventually (and miraculously), the kids grew tired and finally went to bed. Then, nearing midnight, my son awoke and called me to his room. I entered on autopilot, not thinking of potential roadblocks such as an elongated cardboard guitar box lying across the doorway. And that's when my foot made first contact and sent me flying through the air, hands outstretched in front of me.

I might have made it all the way to the far wall, had it not been for the foot of his bed being directly in my flight path. My outstretched right hand connected solidly with his Cargo bed. The clearest memory I have, while engaging the hand brake, is the sound my bone made upon contact. I hoped it was simply a cracked knuckle, but the X-Ray proved otherwise.

[Image: X-ray of the fracture]

I have a metacarpal shaft fracture (also known as a Boxer's Fracture, a common injury to boxers who connect with the last two knuckles of the hand during a punch). My right hand will be immobilized for 2-3 weeks while the bone fuses itself back together.

Luckily I’m left-handed, so I’m still relatively functional. I reprogrammed my mouse for left-handed operation, and I'm taking advantage of speech-to-text to reduce the need to type.

After this experience I no longer doubt the validity of ER incompetence stories:

  • The check-in nurse at the front desk asked for my ID and if it still had a current address (“Yes,” I replied). Upon checkout, she once again asked for my ID and if the address had changed. I informed her that during my ER stay, I moved to a new house.  She said, “Really?” It took a few moments before she returned to reality and realized the absurdity of her question (and my answer).
  • My nurse had no idea how to cut gauze with normal scissors, spending almost a half-minute on a 4-inch cut (with a quizzical expression on her face the entire time).
  • The ER doctor refused to believe my story, convinced that I was in a brawl (contrary to Twitter rumors, I did not beat up someone who was dissing Azure).

So there it is: the story of my broken hand.

Saturday, November 6, 2010

Presentation Materials: CMAP Code Camp, Fall 2010

For those who attended my Azure Quick-Start or Azure Tips-n-Tricks sessions at CMAP Code Camp on Nov. 6, here are the slide decks and demo projects I went through.

Each of the slide decks also includes a list of new Azure features announced at PDC 2010.

Thanks for all the great questions! Feel free to post additional questions or comments here.

Sunday, October 24, 2010

New employer, new Azure role

For over five years, I’ve been fortunate to work for RDA, a consultancy headquartered in Baltimore, MD. The company is a class act, with great people.  I’ve worked on nearly 20 engagements, with technology all over the .NET map. My last day with RDA was Wednesday. Let me elaborate a bit…

About two years ago, I started working with Azure, Microsoft’s cloud computing platform. My first project was with Coca Cola Enterprises. Then, in 2010, I spent almost 6 months “on loan” to Microsoft, as an Azure Virtual Technology Specialist. In my V-TS role, I worked with over a dozen customers, helping them with Azure migration solutions.

Over the past year, I’ve been speaking about Azure all over the Mid-Atlantic, at user groups, code camps, and even an Azure Bootcamp. If you couldn’t tell by now, let me spell things out for you: I really, really enjoy working with, and teaching, Azure.

On October 1, only a few short weeks ago, I was honored with an Azure MVP Award from Microsoft (I blogged about this earlier). I couldn’t be happier! Through the MVP program, I’ve met some seriously-talented Azure folks that share my enthusiasm and passion for the platform.

Ironically, at the same time the MVP announcement came out, I had been looking into a new role at another company. A perfect-fit role, one that I simply could not say no to. A role that would be dedicated to Azure.

The role? Azure Architect Evangelist, Mid-Atlantic.

The company? Microsoft.

I'll be a member of the Developer and Platform Evangelism (DPE) team. My primary responsibility will be working with ISVs, helping them migrate their applications to Azure. As this position specifically covers the mid-Atlantic area, I won't have to relocate.

And that brings me to today. I’m sitting on a plane, en route to Redmond. I officially become a Microsoftie tomorrow morning, only three days before the Azure-heavy Professional Developers Conference, being held on the Microsoft campus. The PDC will be a great way to kick off my Microsoft career.

With Microsoft as my new employer, I’ll have to step down as an active MVP, effective Monday morning. However, that little technicality has no bearing on my developer community participation. In fact, I have three talks scheduled in November: Two Azure talks Nov. 6 at CMAP Code Camp in Columbia, MD, and an Azure+MongoDB talk at the Mongo DC conference, Nov. 18.

I’ll close this post out now, as I have lots to do (including another Azure post). I’m totally stoked about this career move!!!

Thursday, October 14, 2010

Presentation Materials: Richmond Code Camp, Fall 2010

For those who attended either my Azure Quick-Start or my Azure Tips-n-Tricks session, I posted the slide decks and a few demo projects that served as the basis for those talks.

The tips-n-tricks slide deck has more details than what we covered during the session, along with links to various reference articles, blog posts, and company websites.

Sunday, October 3, 2010

Richmond Code Camp Oct 9, 2010: 2 Azure Talks

Code Camp season is underway, and for me, the season kicks off in Richmond, where I’ll be presenting two back-to-back Azure talks:

  • Azure Quick-Start. In this intro-level session, I’ll show how to quickly get up and running with Azure, including a tour of the Azure portal, Visual Studio integration, and what it takes to build and launch your first app.
  • Azure Tips, Tricks, and How-To’s. This session goes a bit deeper, and assumes you know the basics of Azure (if you attend my first session, you should be all set). I have a bunch of real-world items I’ll be running through, complete with code samples, that you can use in your own Azure projects.

There are about 40 sessions overall this year at Richmond Code Camp. If you’re in the Richmond, VA area on the 9th, please come by, say hi, and enjoy the day!

Learn more about Richmond Code Camp and sign up here.

Saturday, October 2, 2010

I’ve been awarded Azure MVP!

October 1, 2010: A day I’ll remember for a long time. I was on an early flight home from Seattle after 3 amazing days with the Azure team, participating in a Software Design Review (SDR). I was headed home to my family, and to support my wife during this difficult time (she lost her father a week ago).

Being October 1, I knew that it was MVP award day. Naturally, my plan was to stare at my email client, hoping my magic powers would help deliver some good news to me.

My flight took off at 7:15am. I distracted myself with Vittorio Bertocci’s new Windows Identity Foundation book until WiFi was available. The book caught the attention of the gentleman next to me, who develops against Amazon’s AWS, and we spent a bit of time discussing Azure and its benefits.

Shortly after crossing into Montana (courtesy of FlightAware), I received “The Email,” welcoming me to the inaugural group of Azure MVPs! I found it only fitting that we were passing through cloud cover at the time, and ironic that the first person I told was an AWS guy (and yes, I now have his business card and plan to visit his group in Atlanta).

I’m truly honored to be the recipient of this award, and thankful for all the support I’ve received from the developer community. I first got involved in the community back in October 2008, when Microsoft’s Dr. Z invited me to speak at techdays ‘08 in Washington, DC. Since then, I’ve had the opportunity to speak at many user groups, code camps, webcasts, podcasts such as Community Megaphone, and even an Azure Bootcamp. I’ve met some great people at these events, and it has greatly enriched my life (I can only hope I’ve had some positive effect on theirs as well).

My family has been very understanding and supportive of my speaking and blogging activities (although my wife no longer believes that I’ll be off the computer “in just a minute…”).  RDA, my employer, supports my efforts as well, and co-sponsored the Azure Bootcamp. And, last but not least, I owe a debt of gratitude to Microsoft for presenting me with an MVP!

Sheesh – I feel like one of those long-winded Academy Award winners who rambles on until the get-off-the-stage music kicks in. Time to get back to techy goodness, especially Windows Azure goodness! I have some cool things to blog about, including my adventures with MongoDB running in Azure.

Monday, September 20, 2010

MongoDB on Azure: Boston, Sept. 20

A few months ago, I had the pleasure of working with the folks at 10gen to explore the feasibility of running their MongoDB document-centric database on the Azure platform.

I had planned on presenting this in Boston today at the MongoDB Conference. In my place, Microsoft's Mark Eisenberg will be presenting MongoDB on Azure.

I will be posting a short series on this as well, including:

  1. The challenges (and solutions) of hosting MongoDB in Azure, including scalability, management, and client-side accessibility
  2. How to run a standalone MongoDB database instance in an Azure worker role
  3. How to configure and run a MongoDB replica set spanning several worker roles

For more information about 10gen, visit their company website, or the MongoDB website.

Follow Mark Eisenberg on Twitter.

Sunday, September 19, 2010

Mid-Atlantic Developer Community Leadership Summit–Sept. 18

Yesterday I had the honor of attending an all-day Developer Community leadership summit, where user group leaders, code camp organizers, and speakers gathered to discuss how to manage the Mid-Atlantic developer community events in the coming year.

Andrew Duthie, our local Developer Evangelist, was our fearless leader and provided a great environment for us, complete with food, drink, and Youtube-worthy dance moves (if someone has video, please send me the link!). I’d be remiss without a tip o’ the hat to Andrew’s wife and kids who helped out considerably.

We had representation from INETA as well: Rob Zelt (INETA President), Frank La Vigne (membership mentor, DC, DE, MD, NJ and NY), and Kevin Griffin (membership mentor, VA).

On to the event and its purpose. We spent the morning introducing ourselves, as many of us had only known each other through online conversations (Twitter is great for this – maybe we should call ourselves the Under-140 crowd?). We also talked about what we hoped to discuss and achieve during the summit. Personally, I hoped to discuss ways of encouraging more community participation. As a group, we settled on a few key topics. I’ll mention four in particular: Community Growth Management, Finding New Speakers, Planning in the Large, and The Big Event.

Community Growth Management

We have an interesting problem in the Mid-Atlantic: An ever-growing developer community, with an equally-growing number of community events. Covering Pennsylvania, West Virginia, Maryland, DC, and Virginia, we have user groups and special interest groups covering basic programming, databases, content management systems, cloud computing, C64, and just about anything else to do with software development. These groups meet in the evenings, and we’re starting to see some overlap.

If that wasn’t enough, we have the larger day-long events: SharePoint Saturdays, SQL Saturdays, and Code Camps. With maybe a dozen such events typically taking place in the spring and fall, it’s becoming a challenge to find speakers. Then there’s the issue of attendance: how do you choose among all of these great events?

One thing was clear: we needed to up our game when it came to scheduling. We talked about using resources like Community Megaphone to publish upcoming events, along with incorporating its event widget in our community websites and blogs. We also talked about taking advantage of mailing lists, with newsletters published to the lists with upcoming events.

User group scheduling is certainly important. Some areas might have only one or two main user groups, but in other areas, such as DC, there is a very large number of groups (CMAP and related SIGs, BaltoMSDN, FredNUG, RockNUG, DCDNUG, Capital Area Cloud Computing User Group, CapArea.NET and related SIGs, and more). That’s quite a few user groups in one concentrated area for attendees to choose from.

Large-event scheduling seemed to be a bigger challenge, though. This fall, a few of the code camps, such as Philly and Richmond, overlapped, which makes it impractical for speakers to present at both events (though it might be possible to attend a kickoff keynote and morning presentation in Philly, sprint to Richmond, and arrive in time for some awesome SWAG…).

Moving forward, I think the group will push forward with Community Megaphone, the Mid Atlantic Devs mailing list, and the new list server being set up by Steve Presley.

Finding New Speakers

How do you encourage people to speak at events? Many people attend local user groups on a regular basis, and it would be so beneficial to these individuals and to the community if they would give a talk at an upcoming event. We talked about some of the benefits that may or may not be obvious: personal growth and learning; exposure (who knows who might be in the audience, waiting to snatch you away from your over-cautious CTO’s company?); and networking were key benefits we could point out when talking to user community participants.

A few of the user groups, such as RockNUG, have occasional lightning-round style meetings, where several small (5-10 minute) presentations are given. These are much easier to prepare for, and offer a great way for someone to get introduced to speaking. It’s perfectly acceptable for these topics to be beginner-level, and the presenter doesn’t need to be an expert in their presentation topic – it’s all about sharing information with other people in the community.

Planning in the Large

Dux Raymond, who runs some crazy-big SharePoint Saturday events, gave his insight on running such large events, covering planning, costs, sponsors, venues, advertising and more.  There’s quite a bit that goes into pulling off such an event, and it was great hearing this information first-hand.

This talk helped inspire us when discussing The Big Event…

The Big Event

We have some seriously forward-thinking people in this community (I guess I should mention pride too?). With events such as Codestock and Devlink, it was only a matter of time before someone said – Hey, what about a Mega Dev Event in the Mid Atlantic??? There was no shortage of enthusiasm and discussion around this, from the likes of The Kevins (Hazzard and Griffin), Rich Dudley, Brian Lanham, Joel Cochran, Pete Brown, Dane Morgridge, Frank La Vigne, and, oh, a dozen or two others whose names I only wish I wrote down!

Based on the short but energetic talk about The Big Event, I can definitely see this happening, with a core group of volunteers stepping up to coordinate things. Someone suggested an Olympics-style plan over several years, where the venue would move to a different area annually. There was some jockeying for being the host city. I wonder if we’ll have a bidding war? Rock-Paper-Scissors battle?

Final Thoughts

From a personal perspective, I got quite a bit out of this event, especially around the topic of community involvement. I’m looking forward to some upcoming conversations with a few people I know…

As a community speaker, I have a renewed appreciation for what it takes to organize and run user groups, code camps, and large events. There are some seriously talented and dedicated people who make this happen. Some of these behind-the-scenes details tend to get lost in the noise, especially when you arrive at an event and everything is working smoothly, as planned.

Wednesday, September 15, 2010

Presenting an intro to Azure, September 16

I'll be heading down to Alexandria, VA Thursday, September 16 to talk about the Azure cloud computing platform. This talk will be at the Capital Area Cloud Computing User Group.

I'll be talking about the core features of Azure, such as Compute, Storage, and SQL Azure. Of course, no talk would be complete without demos - including the coolest little database management tool for SQL Azure, Code-Name Houston.

For more details and directions, head on over here.

Monday, August 30, 2010

Presenting A Night Of Azure! RockNUG Sept. 8

On Wednesday, September 8, I’ll be taking over both the n00b presentation and the Featured presentation at the Rockville .NET User Group in Rockville, MD. I’ll be filling your heads with lots and lots of Azure, Microsoft’s most-excellent cloud computing platform!

If you’re new to Azure, the n00b session is the place to be! We’ll have our very first Azure application up and running in a few minutes, and we’ll learn about the basic moving parts.

For the Feature Presentation, we’ll build on our basic application, mixing in features from the three different Azure “portals”:

  • Windows Azure – this deals with the virtual machines and highly-scalable storage
  • SQL Azure – this is SQL Server, cloud-style!
  • AppFabric – this provides connectivity and access control

For those who would like to follow along: we’ll be taking things pretty slow in the n00b session. If you’d like to try building your first Azure app along with me, you’ll need to do a few things first – see details here. A quick summary to get up and running:

  • Windows Vista, Windows Server 2008, or Windows 7 (sorry, XP folks…)
  • Visual Studio 2010 Web Developer Express or any version from your MSDN subscription (Professional, Ultimate, etc.)
  • SQL Server 2005 Express or above
  • Azure SDK 1.2 + Visual Studio Tools

Friday, August 20, 2010

Azure Tip of the Day: Determine if running in Dev Fabric or Azure Fabric

One potential goal, when writing a new Azure application, is to support running in both Azure-hosted and non-Azure environments. The SDK gives us an easy way to check this:
            if (RoleEnvironment.IsAvailable)
            {
                // Azure-specific actions
            }

With this check, you could, say, build a class library that has specific behaviors depending on the environment in which it’s running.
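For example, here’s a minimal sketch of that idea (the class and setting names are my own, hypothetical ones): a helper that reads settings from the role environment when the app is hosted under Azure, and falls back to standard .NET configuration everywhere else:

```csharp
// Hypothetical example: a settings helper whose behavior depends on
// whether the Azure role environment is available.
using System.Configuration;
using Microsoft.WindowsAzure.ServiceRuntime;

public static class SettingsHelper
{
    public static string Get(string settingName)
    {
        if (RoleEnvironment.IsAvailable)
        {
            // Running in the Dev Fabric or Azure Fabric:
            // read the value from ServiceConfiguration.cscfg
            return RoleEnvironment.GetConfigurationSettingValue(settingName);
        }

        // Not hosted in Azure: fall back to app.config / web.config
        return ConfigurationManager.AppSettings[settingName];
    }
}
```

The same class library then works unchanged in a console test harness, a unit test, or an Azure role.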

Now, let’s assume the Azure environment is available, and you need to take a specific action depending on whether you’re running in the Dev Fabric vs. the Azure Fabric. Unfortunately, there’s no specific method or property in RoleEnvironment that helps us out.

This brings me to today’s tip: Determining whether an app is running in Dev Fabric or Azure Fabric.


The Deployment ID

While there’s no direct call such as RoleEnvironment.InDevFabric, there’s a neat trick I use, making it trivial to figure out. A disclaimer first: This trick is dependent on the specific way the Dev Fabric and Azure Fabric generate their Deployment IDs. This could possibly change with a future SDK update.


Whenever you deploy an Azure application, the deployment gets a unique Deployment ID. This value is available in:
RoleEnvironment.DeploymentId.

As it turns out, the deployment ID has two different formats, depending on runtime environment:


  • Dev Fabric: The deployment ID takes the form of deployment(n), where n is sequentially incremented with each Dev Fabric deployment.
  • Azure Fabric: The deployment ID is a Guid.

With this little detail, you can now write a very simple method to determine whether you’re in Dev Fabric or Azure Fabric:
        private bool IsRunningInDevFabric()
        {
            // easiest check: try to translate the deployment ID into a Guid
            Guid guidId;
            if (Guid.TryParse(RoleEnvironment.DeploymentId, out guidId))
                return false;   // valid guid? We're in Azure Fabric
            return true;        // can't parse into guid? We're in Dev Fabric
        }

Taking this one step further, I wrote a small ASP.NET demo app that prints out the deployment ID, along with the current instance ID. For example, here’s the output when running locally in my Dev Fabric:


[Image: BrowserCap demo running in the Dev Fabric]

Here’s the same app, published to, and running in, the Azure Fabric:

[Image: BrowserCap demo running in the Azure Fabric]



Try it yourself

I uploaded my demo code so you can try it out yourself. You’ll need to change the diagnostic storage account information in the ASP.NET role’s configuration prior to deploying it to Azure.

Sunday, August 15, 2010

Azure Tip of the Day: Separate Diagnostic Storage Account

Recently I was helping someone debug a bizarre Azure Table storage issue. For some reason, the role’s state went into a busy/running loop as soon as the OnStart() event handler attempted to set up some Azure tables. To make matters worse, once the startup code attempted to connect to the storage account and create a table, we no longer received Trace log output. This doesn’t help much when the only log message is “In OnStart()…”.

To get our diagnostics back, we created a separate storage account exclusively for diagnostics. Once we did this, we had an uninterrupted flow of trace statements, even though the table-access code was still having issues with the table storage account.

This leads me to my tip of the day: Set up a separate storage account for diagnostics.

Aside from isolating storage connectivity issues, there are other benefits to having a separate storage account for diagnostics:

  • You can have a separate access key for diagnostics, granting this to a broader audience. For instance, you could give out the access key for people to use inside a diagnostics tool such as Cerebrata’s Diagnostics Manager, without having to give out the access key to your production data storage account.
  • Storage accounts have a transactional limit of approximately 500 transactions per second. Beyond that, the Azure fabric throttles your access. If your app is writing even a single trace statement to diagnostic tables for every real data transaction, you’re doubling your transaction rate, and you could experience throttling much sooner than expected.
  • An additional storage account does not necessarily equate to additional cost. You’re simply billed for the storage you consume. If the total amount of storage across two accounts remains the same as with a single account, your cost will remain the same.

Setting things up

First, head to the Azure portal and set up two accounts. I advise putting them in the same affinity group, alongside your Azure services.

[Image: two storage accounts in the Azure portal]

Each account will have its own access keys. Simply configure both in your role.

[Image: storage connection strings in the role configuration]

Now, all that’s left is specifying the diagnostic storage account for the DiagnosticMonitor, and the data storage account for “real” data access. For instance, this example enables the DiagnosticMonitor using MyAppDiagnosticStorage, while the table service uses MyAppDataStorage:

        public override bool OnStart()
        {
            // Needed in SDK 1.x before FromConfigurationSetting can be used:
            // tell the storage client how to resolve configuration settings.
            CloudStorageAccount.SetConfigurationSettingPublisher(
                (configName, configSetter) =>
                    configSetter(RoleEnvironment.GetConfigurationSettingValue(configName)));

            DiagnosticMonitor.Start("MyAppDiagnosticStorage");
            Trace.TraceInformation("Writing to diagnostic storage");

            var dataStorageAccount = CloudStorageAccount
                .FromConfigurationSetting("MyAppDataStorage");
            var tableClient = dataStorageAccount.CreateCloudTableClient();
            tableClient.CreateTableIfNotExist("MyTable");

            RoleEnvironment.Changing += RoleEnvironmentChanging;

            return base.OnStart();
        }


That’s it!

Tuesday, July 20, 2010

Come learn Azure in DC, July 28!

I’ll be heading out to the Washington DC DotNet User Group July 28 to present an introduction to Azure. I’ll give a quick overview of the platform, then dive into code samples showing how to build and deploy an Azure application to the cloud. I’ll also show SQL Azure, the cloud-equivalent of SQL Server.

The meeting starts at 6:30. If you plan on attending, please register here so the group can plan accordingly.

For more information about DCDotNetUG, including directions, please visit their homepage at www.dcdnug.org. Directions are on their Contact Us page.

Friday, July 2, 2010

Words speak louder than reactions

I’ve heard the old expression so many times: actions speak louder than words. That’s true, especially when the words sound like “I’m going to make the world a better place!” Or maybe, “Wait until you see this new program I’m going to build for you!” Truly, actions speak much, much louder than these words, especially if there’s no follow-through and the words simply turn into empty promises.

But… let’s look at another type of action: the knee-jerk reaction. How often have we taken a vengeful, spiteful, or anger-induced action before having a discussion about it? How often are those reactions a result of our emotions running amok before we have an opportunity to think about the ramifications of our reactions? If I do a bit of reflection, I can easily identify more than one occasion where I would have likely been better off talking something through than taking the action (the reaction) that I ultimately took.

Today’s Knee-Jerk

I have a lake house that I rent out, in Lake Anna, Virginia. We take pride in our property, and most of our guests return each year, knowing we strive to provide the best vacation environment possible. Today, our cleaning company emailed me, letting me know that I’ve been fired as a customer, simply because I emailed a punchlist of not-so-favorable feedback from last week’s renters, and suggested that her cleaning crew absorb the cost of the cleaning.

I certainly expected an email exchange, or possibly a phone call, with a settlement negotiation, perhaps. Maybe we split the cost? Maybe we educate this new crew on how to live up to expectations in the future? Instead, I received a terse email, stating that this situation is beyond acceptable, and that the cleaning company shouldn’t be financially responsible for such actions. So… we quit.

Do-the-right-thing attempt

Let’s face it: the damage was done, and it was unlikely that I’d be able to preserve a working relationship with this financially-struggling cleaning company. However, I took the high road and placed a well-crafted phone call to the business owner. I expressed the idea of open communication, and how, as a service provider, they had every right to call and discuss the cleaning terms with me, and even negotiate a compromise.

Alas, I was correct in my assumption: There was no way they were ever going to provide cleaning services to me again. They are running on extremely thin margins, they said. And since they already paid the errant cleaning crew, the company itself took the hit (less than $125), which now put them at financial risk. The bitterness and anger flowed through the phone line (it was on Skype, but you get the picture), and there was to be no compromise, no resolution, no happy-path.

Knee-Jerks in the software development world

Most of my readers are in the software development field, and probably don’t care too much about my cleaning crew woes. However, there’s a lesson in here for all of us: communication is king, and should trump knee-jerk, emotional decisions!

Knee-jerk reactions are easy: Just sit back in your chair, fume for a while, and then act out your emotions with glee. It’s much harder, in a fit of emotion, to engage rational thought and communicate directly with the person or entity causing you stress and grief.

Think about the net result you’re looking for: Do you really want to pick a fight? Do you really want to sever ties and damage a potentially-lucrative future business relationship? Being in the software field, I am amazed at just how many people I keep running into over the years, from previous employment. As large as the industry is, sometimes that circle seems pretty small. I certainly want to keep these relationships alive and healthy!

Words over reaction

Will communication always lead to a satisfactory settlement? No way! However, I’m pretty sure that knee-jerk reactions will almost always lead to someone getting angry, hurt, disparaged, frustrated, and unwilling to work in a professional manner in the future.

Is a polite, professional phone call or email easy? Not always, and certainly not as easy as a knee-jerk email filled with venom and animosity! This is no reason to avoid doing the right thing, and exercising diplomacy wherever possible.

Today, I lost my cleaning crew. And this loss will have untold impact on their bottom line, as they can no longer showcase my business on their website or even use me as a reference for future work. Was the reaction really worth it?

As you run into your next on-the-job challenge, whether it’s with a teammate, your boss, your employee, or your customer: I urge you to consider words over reaction.

Related links

Our lake house website is www.LakeAnnaDream.com.

Tuesday, June 29, 2010

Azure Bootcamp Prerequisites

Updated 7/5/2010 with download link to SQL Server 2008 R2 Management Studio Express.

For those attending the Virginia Beach Azure Bootcamp, July 7 is only a week away! Kevin Griffin and I are going to hit the ground running, so you’ll want to spend some time getting all needed software installed. Here’s the list of what you’ll need, along with download links for each product.

For more information about the bootcamp, please leave a comment.

Kevin and I are both available via twitter. Kevin is @1kevgriff. I’m @dmakogon.

Visual Studio

First, there’s the development environment. You’ll need one of the following:

  • Visual Web Developer 2010 Express. This is a free edition that will let you build all of the samples we’ll be working with.
  • Visual Studio 2010. If you have an MSDN subscription, you can download and install any version of VS2010 from MSDN.

Download Visual Web Developer 2010 Express here.

Download Visual Studio 2010 from MSDN.

Azure SDK + Tools

You’ll need the latest SDK + Tools, version 1.2, released in June 2010. This includes Visual Studio extensions for the various cloud projects.

Download the Azure SDK, version 1.2, here.

AppFabric SDK

The AppFabric SDK version 1.0 was released April, 2010. The AppFabric SDK lets you build applications that take advantage of the Service Bus and Access Control services of Azure AppFabric.

Download the AppFabric SDK, version 1.0, here.

Table / Blob Storage Viewer

Visual Studio 2010, in conjunction with the Azure Tools v1.2, provides a built-in table storage viewer. However, this is a read-only set of tools. To modify storage data, you’ll need a more advanced tool such as the freely-available Azure Storage Explorer or the commercial Cerebrata Cloud Storage Studio.

Download the Azure Storage Explorer here.

Download the Cerebrata Cloud Storage Studio trial here.

Azure Training Kit

You’ll need the Azure Platform Training Kit, June 2010 update.

Download the training kit, June 2010 update, here.

Folding @Home

Folding@home is a distributed computing project. You’ll need both the Visual Studio project and the Folding@home client application.

Download the Visual Studio 2010 project here.

Download the Folding@home console application here.

SQL Server Management Studio 2008 R2

SSMS is the de facto SQL-editing tool. To work with SQL Azure, you’ll need the new Management Studio Express for SQL Server 2008 R2, which supports SQL Azure script generation.

Download SSMS 2008 R2 here.

Friday, June 25, 2010

SQL Azure 50GB is Live!

During Tech Ed this year, we learned about the new 50GB database limit for SQL Azure, up from 10GB. The go-live date was set for June 28th. Surprise – it’s live today!

How do I choose this new database size?

When creating your new database, select Business edition, and look at the size dropdown:

50GBdropdown

Notice that there are now five sizes to choose from. This sets the size limit, which also directly impacts monthly cost, as each 10GB increment runs $99.99 per month:

Size: Business Edition    Monthly Rate
10GB                      $99.99
20GB                      $199.98
30GB                      $299.97
40GB                      $399.96
50GB                      $499.95

But wait… there’s more! Now take a look at the Web edition sizes:
5GBdropdown

Web Edition pricing only has two tiers:

Size: Web Edition    Monthly Rate
1GB                  $9.99
5GB                  $49.95

Size and Price

The really nice thing about all these additional sizes: this sets your spending cap as well as size cap. For example: if you set your Business Edition database to 10GB, your monthly charge will never exceed $99.99 per month.

Let’s make things more interesting. Let’s say you set up a 20GB Business Edition database. You are not simply charged $199.98 per month. Rather, your monthly cost is amortized daily, with the daily rate based on the maximum size the database reaches on a given day.

Take a Web Edition example: say you set up a 5GB database, but you stay under 1GB for the first 5 days. Those days will accrue at the 1GB rate. Then one day you go over 1GB. At that point, you start accruing at the 5GB rate. If your database ever drops back under 1GB, your daily accrual rate drops back to the 1GB rate.

The same rate pattern applies to the Business edition, where the billing tiers are 10, 20, 30, 40, and 50GB.
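The accrual rules above can be sketched in a few lines. This is a Python sketch of the billing model as described in this post; the tier rates come from the tables above, and the 30-day month is an assumption for illustration, not an official billing formula:

```python
WEB_TIERS = [(1, 9.99), (5, 49.95)]   # (size cap in GB, monthly rate)
BUSINESS_TIERS = [(g, g / 10 * 99.99) for g in (10, 20, 30, 40, 50)]

def daily_rate(peak_gb, tiers, days_in_month=30):
    """Daily accrual: the monthly rate of the smallest tier that fits
    the day's peak size, divided by the days in the month."""
    for cap, monthly in tiers:
        if peak_gb <= cap:
            return monthly / days_in_month
    raise ValueError("database exceeds the largest size cap")

def monthly_bill(daily_peaks_gb, tiers):
    """Sum the per-day accruals for a month of daily peak sizes."""
    return round(sum(daily_rate(p, tiers) for p in daily_peaks_gb), 2)

# A Web Edition database that stays under 1GB for 5 days, then grows past 1GB:
peaks = [0.8] * 5 + [3.2] * 25
print(monthly_bill(peaks, WEB_TIERS))
```

Note that the bill for those 5 small days comes out well under the flat 5GB rate of $49.95, which is the whole point of the daily accrual model.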

Changing Sizes

Ok, so you set up your new database. Let’s say it’s Web Edition, 1GB. And you now realize you need the ability to grow your database to 5GB. No problem: just connect to the Master database and issue an ALTER DATABASE command:

ALTER DATABASE MyDatabase MODIFY (EDITION='WEB', MAXSIZE=5GB)

Until your actual database size exceeds 1GB, this change will not cause you to incur additional costs; you’ll still be billed at the 1GB rate.

More information

The SQL Azure blog has details about the new sizes, as well as all T-SQL for creating and altering databases, here.

Today’s launch announcement is here.

Wednesday, June 23, 2010

Azure Guest OS 1.4

On June 7, the Azure team introduced the latest SDK, version 1.2, supporting .NET 4 and other goodies. Along with the SDK, the Azure Guest OS was updated to version 1.3.

A few days ago, a new Guest OS appeared: Version 1.4. Assuming Guest OS Auto-Upgrade is enabled, you’ve automatically been upgraded. If you have any older deployments that have a specific OS version in the service configuration file, simply change the OS version to “*”. If you visit the Azure portal, you’ll see this Guest OS:

azure-os-14

If you don’t see 1.4, that means your service is set to a specific OS. You can choose 1.4 from the OS Settings… dialog:

azure-14-manual

What’s new in 1.4?

There are a few changes you should be aware of.

Azure Drive fixes

If you’re taking advantage of Azure Drives in blob storage, be aware that earlier Guest OS versions could produce I/O errors under heavy load. OS 1.4 has a fix for this.

WCF Data Services fix

Guest OS 1.3 had a URL-encoding bug affecting request URIs when using LINQ. This is now fixed.

Security Patches

The latest security patches, through April 2010, have been applied to Guest OS 1.4, bringing it in line with Windows Server 2008 SP2.

Related Links

Friday, June 11, 2010

NoVa Code Camp June 2010 Materials: Azure

On Saturday, June 12, I presented “Azure: The Essential Setup Guide” at the Northern Virginia Code Camp. There were several great questions today. Two immediately come to mind. As I think of the others, I’ll add them here.

  • Can multiple roles be combined to run on a single virtual machine? No. Each role runs in its own set of VM instances. My advice is to build worker roles that handle multiple tasks. For instance, I gave an example in class where a single worker role processes both thumbnails and PDF generation, based on different queue messages.
  • Does all code need to be added to a role, including code that used to reside in separate class libraries? Azure roles simply have to reference those class libraries – just add them to your solution and add a reference to the specific role that needs the library.
  • When creating an Azure service, does it only store the code? The service definition has a specific URL as well as the data center affinity. Affinity equates to the specific regional data center to run your code. Be sure that all of your related storage and services have the same affinity! This way, the bandwidth between them is free, and the speed is very fast (all communication stays within the data center).
  • When my virtual machine’s OS is upgraded, does Azure start a new instance before taking down the old instance? Azure will take down your instance and then re-launch it in an upgraded virtual machine. If you want to avoid downtime, run a minimum of two instances; at this point, Azure upgrades in server groups and won’t upgrade all at once.
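The multi-task worker role advice from the first answer can be sketched as a simple dispatch loop. This is a Python sketch with hypothetical message formats and handler names standing in for a real Azure queue client:

```python
# One worker loop handling multiple job types. The "type|payload" message
# format and the handlers are hypothetical; a real role would pull messages
# from an Azure queue instead of a list.

def make_thumbnail(blob_name):
    return f"thumbnail:{blob_name}"

def make_pdf(blob_name):
    return f"pdf:{blob_name}"

HANDLERS = {
    "thumbnail": make_thumbnail,
    "pdf": make_pdf,
}

def process(messages):
    """Dispatch each queue message to the handler for its job type."""
    results = []
    for msg in messages:
        job_type, _, payload = msg.partition("|")
        handler = HANDLERS.get(job_type)
        if handler is None:
            continue  # unknown job type: leave it for a poison-message policy
        results.append(handler(payload))
    return results

print(process(["thumbnail|photo1.jpg", "pdf|report.docx"]))
```

The payoff is that one deployed role (and its minimum instance count) serves several job types, rather than paying for a separate role per task.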

I also called out a few tips and pointers:

  • When running your web role locally, be sure that the cloud project is set as the startup project. Make sure you see your website running on something like port 81, and that Visual Studio tells you it’s starting up the development fabric. If you’re running on some high-numbered random port, chances are you’re running in the ASP.NET Development Server.
  • If you’re pushing code to the cloud, and you’re set up for more than one instance, be sure to delete your deployment at the end of the workday to conserve hours.
  • When debugging locally, open the dev fabric UI (the Azure flag in the system tray). You can then view each running instance and see all of your Trace statements.

Here’s the slide deck and our simple Hello Azure app:

The slide deck contains the links we discussed and visited today.

If you have any specific questions, or you recall a question or tip from class that I forgot to list above, please post a comment!

Monday, June 7, 2010

Azure SDK+Tools 1.2: Publishing and Monitoring!

Today, Microsoft announced the availability of Azure SDK v1.2, with related tools for Visual Studio.

In my last post, I covered support for .NET 4.0, as well as the integration of IntelliTrace for Azure applications. In this post, I’ll cover deployment and monitoring.

Deployment

Visual Studio makes it trivial to build an Azure application. However, unless you were using a build script configured to automate this task, publishing has been a two- (or three-) step process:

  1. Build and create a service deployment package using Visual Studio
  2. (optionally) upload the deployment package to blob storage, using an Azure storage management tool such as Azure Storage Explorer.
  3. Through the Azure portal, select a deployment package either from your local disk or from blob storage.

This sequence was time-consuming, and there was no easy way to check on deployment status without either watching the portal’s website or running something like a PowerShell script to keep checking on your deployment status (both of which take your fingers and eyeballs away from Visual Studio).
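That status-checking chore, stripped to its pattern, is just a poll-until-done loop. Here's a Python sketch; fetch_status is a stub, where a real script would query the Service Management API or a PowerShell cmdlet:

```python
import itertools
import time

def wait_for(fetch_status, target="Running", interval=0.0, max_polls=60):
    """Poll until the deployment reports the target status, or give up."""
    for _ in range(max_polls):
        status = fetch_status()
        if status == target:
            return status
        time.sleep(interval)  # a real script would wait 10-30 seconds here
    raise TimeoutError(f"deployment never reached {target!r}")

# Simulated status sequence standing in for real API responses:
states = itertools.chain(["Deploying", "Starting"], itertools.repeat("Running"))
print(wait_for(lambda: next(states)))
```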

With the new Visual Studio tools in Azure 1.2, you now have a fully-integrated publishing setup! First, there’s the publishing wizard:

azure-publish

Notice, up top, that you can choose between simply creating your service package and actually deploying your service package! You’ll need to configure the wizard to know about your subscription, which requires both a subscription ID and a certificate. Just drop down the Credentials dropdown and choose Add…

azure-configureservice

Notice that you now have an option for enabling IntelliTrace, as long as your roles target .NET 4 (see my previous post for more details, or Jim Nakashima’s post for even more details).

Once you finish filling out the Wizard and push OK, your service will be published asynchronously from Visual Studio, and its status is shown in the new Windows Azure Activity Log:

azure-deploy

Once the deployment is complete, you’ll see something like this in the History:

azure-deploy-complete

And that’s it – no switching to the portal, no manual upload to blob storage. Just… a publishing wizard.

Monitoring

Ok, if all the new Azure Tools gave us was baked-in deployment, I’d be happy. But wait – there’s more! Now there’s a baked-in service and storage viewer as well!

azure-explorer2

With the explorer, you’ll easily be able to view your service instances and storage data in blobs and tables. You even get filtering support. For instance, here’s a view of the WADLogs table, filtered to show all content dated after May 1:

azure-table-filtering

The explorer will show you the status of your services. For instance, I can see that my WebRole1 instance is running:

azure-instance-running

I can also ask for my IntelliTrace logs from here (again, see my previous post for details).

So you might be wondering: If there’s such a good explorer built into Visual Studio, why would I need a 3rd-party tool such as Azure Storage Explorer or Cerebrata’s Cloud Storage Studio? The simple answer is that the built-in explorer is read-only. You’ll be able to view your services and storage, but you won’t be able to modify anything. 3rd-party tools will give you the ability to upload content, suspend or upgrade instances, etc.

Go forth and publish!

These features are great additions to Visual Studio that simplify deployment and monitoring. Enjoy!

Azure SDK+Tools 1.2: .NET 4 Plus IntelliTrace!

Today, Microsoft announced the availability of Azure SDK v1.2, with related tools for Visual Studio.

This release is a big deal, both from an app-enabling point of view and from an ease-of-deployment point-of-view. Here’s a quick look at what’s new with Application Support, with .NET 4 and IntelliTrace. The next post will cover deployment.

Application support with .NET 4.0

The really big deal with v1.2 is support for the .NET 4 framework! Just choose .NET 4 as your target framework for any of your cloud-targeted projects:

dotnet4

Once you deploy your app to the cloud, you’ll see the all-new Guest OS 1.3 that supports .NET 4:

guestos13

Sweet! And you can still manage your OS upgrades. If you want your existing apps to use an older OS version, just click OS Settings… and configure your service for manual OS selection:

 

azure-os-manual

IntelliTrace

As long as you choose .NET 4 as your role’s target framework, you’ll be all set for IntelliTrace with your Azure app. You enable IntelliTrace within the all-new Publish wizard. Note that you can still simply create a deployment package without actually publishing to Azure. For actual publishing, you’ll need to have a Storage account configured, as that’s where the deployment package is pushed to, prior to deploying it to your actual Azure Service:

azure-publish

Note: The IntelliTrace option is disabled for .NET 3.5 deployments.

 

Once you’ve enabled IntelliTrace and deployed your app, you can access IntelliTrace data directly from Visual Studio. Just right-click your role instance and View IntelliTrace logs. Note that IntelliTrace is only available for roles with (IntelliTrace) next to the service+slot name.

ViewIntelliTraceLogs

And then… magic happens, right in Visual Studio:

itrace-2

Once this completes, you’ll see the IntelliTrace information presented in Visual Studio:

itrace-3

 

Go have fun!!!

This is a great Azure SDK release, especially with .NET 4 support. And IntelliTrace is icing on the cake. Sweet, sweet cake…

Friday, June 4, 2010

Interview with Microsoft Partner Network: Azure, November 2009

While out at PDC in November, I spent a bit of time talking with Microsoft's John McClelland about Azure and customer adoption. This interview is now online here.

Interview with Community Megaphone Podcast - May 8, 2010

During the CMAP Code Camp on May 8, The Community Megaphone dynamic duo, Andrew Duthie and Dane Morgridge, showed up and hosted their first live podcast - Episode #10. They interviewed several community speakers in attendance, including Yours Truly.

I participated in a group discussion about the ever-growing user group presence in the Washington, DC area, with its newest addition being the Washington DC .Net user group.  Following that, Joel Cochran stepped in for Dane as Guest Host, along with Kevin Griffin and Steve Andrews, and I was interviewed about Azure. We discussed pricing, target audience, and migration. We definitely had fun, as things devolved into a debate around pronunciation (a theme that continued throughout the day with other interviews). Enjoy!

Sunday, May 30, 2010

CMAP Code Camp May 2010 Materials: Azure

On May 8, I presented two Azure talks: “The Essential Setup Guide” and “Taking Advantage of the Platform.” Here’s the slide deck and sample code I used for these talks:

Thanks to all who attended. As most of you will recall, nearly everyone from the first session stayed around for the second session, so it ended up being like a double-length talk. Here are a few takeaways from the talks:

  • We went over some of the basics of Azure, and the fact that it offers everything except the app: network, computers, operating systems, failover, storage, monitoring… the works.
  • We went over some of the terms. Remember that every “role” you create for Azure is nothing more than a definition for a virtual machine. You can then deploy that role to Azure, and you can have any number of instances (essentially copies) of a given role. You pay for the number of instances you deploy, and you can scale that number up or down depending on your needs.
  • To get started, visit www.azure.com – from here, you can select Get Tools & SDK, which will have you on your way. You’ll also need to enable a few things on your local development machine. See here for more details.
  • We built and deployed a very simple Hello Azure demo, with nothing more than a web page. This demonstrated how easy it is to set up a new Hosted Service through the Azure Portal (www.azure.com). From Visual Studio, we right-clicked the cloud project and selected Publish. This packaged our entire cloud application into a single cspkg file that could then be selected through the portal.
  • During the second session, we continued on our journey by looking at an application with tables and queues. We saw how straightforward it was to use both a local (dev fabric) queue and a real-life Azure-hosted queue. The SMS demo code (provided in the link above) has comments describing how to switch between the two. The same goes for the storage table: you may choose local (dev fabric) storage or Azure-hosted storage (you’ll have to set up a storage account in your Azure account).
  • I briefly mentioned affinity. Azure has 6 data centers throughout the world, two being in the United States. When you deploy a service, you can choose where it goes (or not choose at all). Here’s the key thing: you want your service and your storage located in the same data center. By placing everything in the same data center, data transfer is super-speedy, and you won’t incur any bandwidth charges when reading and writing data with your Azure-hosted services. This is what affinity is all about, and when you create each service or storage, you’ll be able to specify its affinity.

If anyone’s interested in a 2-day Azure Deep Dive, I’ll be teaching a free 2-day Azure Bootcamp July 7-8 in Virginia Beach. Register here. If you can’t make it to my bootcamp, check out additional dates and venues here.

Saturday, May 22, 2010

Richmond Code Camp May 2010 Materials: Azure talk

On May 22, I presented “Azure: Taking Advantage of the Platform.” Here’s the slide deck, sample code, and sample PowerShell script from the talk:

Thanks to everyone who attended, and for all the great questions! Here are a few takeaways from the talk:

  • The Azure portal is http://www.azure.com. Here’s where you’ll be able to administer your account. You’ll also see a link to download the latest SDK.
  • To set up an MSDN Premium account, visit my blog post here for a detailed walkthrough.
  • Download the SDK here. Then grab the Azure PowerShell cmdlets here.
  • To understand the true cost of web and worker roles, visit my blog post here, and the follow-up regarding staging here.
  • The official pricing plan is here. MSDN Premium pricing details are here.
  • The Azure teams have several blogs, as well as voting sites for future features. I compiled a list of the blogs and voting sites here.
  • Remember to configure your service and storage to be co-located in the same data center. This is done by setting affinity when creating your services.
  • While all storage access is REST-based, the Azure SDK has a complete set of classes that insulate you from having to construct properly-formed REST-based calls.
  • We talked about the limited indexing available with table storage (partition key + row key). Don’t let this be a deterrent: Tables are scalable up to 100TB, where SQL Azure is limited to 50GB. Consider using SQL Azure for relational data, and offload content to table storage, creating a hybrid approach that offers both flexible indexing and massive scalability. You can reference partition keys in a relational table, for instance.
  • Clarifying timestamps across different data centers and time zones (a question brought up in Brian Lanham’s Azure Intro talk): Timestamps are stored as UTC.
  • Don’t forget about queue names: they must be all lower-case letters, numbers, or dash (and must start and end with letter or number)
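That last naming rule is easy to encode as a quick sanity check. This Python sketch checks only the rules quoted in the bullet; the storage service enforces name-length limits as well:

```python
import re

# Lowercase letters, numbers, or dashes, starting and ending with a letter
# or number (just the rules from the bullet above; length limits not checked).
QUEUE_NAME = re.compile(r"^[a-z0-9]([a-z0-9-]*[a-z0-9])?$")

def is_valid_queue_name(name):
    return bool(QUEUE_NAME.match(name))

for name in ("sms-messages", "SMS-Messages", "thumbs-", "2ndqueue"):
    print(name, is_valid_queue_name(name))
```

Validating names up front beats deciphering the 400 error the service returns when a name breaks the rules.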

If anyone’s interested in a 2-day Azure Deep Dive, I’ll be teaching a free 2-day Azure Bootcamp July 7-8 in Virginia Beach. Register here.

Friday, May 21, 2010

AzureUG May 2010 Materials: Azure Intro

On May 12, I presented an introduction to Azure. Here’s the source code and slides from the talk.

The code includes a simple hello-world app, along with the SMS app that takes advantage of a web role, worker role, queue, and table storage.

Saturday, May 15, 2010

.Net Rocks! in Richmond, VA

Carl Franklin and Richard Campbell, hosts of the .NET Rocks talk show, decided to take their show on the road.

They rented an RV, nerded it up with some serious GPS tracking hardware (see their complete trip), and drove it across the US, visiting a bunch of cities, recording live shows, and demonstrating some cool Visual Studio 2010 tech.

On May 5, the DotNetMobile rolled into Richmond, VA (thanks to Kevin Hazzard and the Richmond .NET User Group for getting this venue on the map!). I hitched a ride with Rusty Romaine and drove down from Maryland to see the show. Our guest host was Mark Miller, creator of CodeRush. He and his wife Karen Mangiacotti bantered back & forth onstage, covering everything from animated commas to an over-engineered Viking Ship school project (note to self: never bake vinegar-soaked wood in my oven). The recorded show can be heard here.

After the show, I snagged a group shot with Carl and Richard, along with Mark and Karen, ride-along winner Ben Dewey, and newly-anointed ride-along winner and HRNUG founder Kevin Griffin.

More photos are posted here. Enjoy!

Wednesday, May 5, 2010

Blackbird Pie: First Thoughts

Yesterday I heard about Blackbird Pie, a new tool from Twitter to create embeddable tweets. This actually sounded like a useful tool, as there are times I embed screenshots in my posts, some of those being twitter captures. My hope was that with embedded tweets, they’d be interactive (e.g. clickable if there was a link).

So off I went to http://media.twitter.com/blackbird-pie. The claim was that it was faster and easier to embed a tweet.

My initial experience was, um, slower and impossible to embed a tweet. Regardless of which tweet URL I inserted into the UI, I received notice that the tweet did not exist when, in fact, it did (sorry, no screenshot for that). And after only 3 or 4 attempts, I was told that I exceeded some limit. Eventually, the site slowed to a crawl and showed me this:

blackbird-stressed

At this point, I gave up for the day.  And I was all set to move on, until I saw a bit more positive traffic about Blackbird Pie this morning. So I gave it one more attempt.  This time it loaded extremely fast and worked on a tweet that failed to render yesterday. The UI generated an embeddable chunk of html/css:

blackbird-code

Pasting this into this blog post, “magic happens”: my tweet gets embedded, though not rendered the way I expected:

Just looked at my dev calendar: 7 #Azure talks scheduled between now and Sept, including a 2-day bootcamp. http://dmak.us/9VQX1r (less than a minute ago via web)

Just as advertised, my tweet is embedded, and it was incredibly easy. It included my twitter page background. And… all links are preserved: twitter search, site URL, and my profile. Sweet!

UPDATE: Once I published the post, I noticed that the embedded tweet looked nothing like it did inside Windows Live Writer. This is what it looked like prior to publishing, which is not how the actual embedded tweet appears:

blackbird-tweetwhileediting

There might be special tweaks required for embedding in Blogger posts…

Twitter added some disclaimers, so I suspect this tool will be going through revisions pretty quickly as it rises in popularity. For instance, here’s the disclaimer when adding a URL to bake:

blackbird-ownrisk

And here’s the disclaimer just following your baked tweet:

blackbird-disclaimer

Disclaimers aside, this is very cool! I like the idea of an interactive tweet, rather than an unclickable static image (or a static image that I have to wrap a hyperlink around). However, this is not yet ready for prime time, as my embedded tweets are not rendering properly on Blogger.

Tuesday, April 27, 2010

RockNUG April 2010 Materials: Silverlight 4

On April 14th, I presented an introduction to Silverlight 4 and its new user interface enhancements. Here is the source code from the talk.

 

Some takeaways we discussed:

Out-of-browser support

Silverlight 4 now offers more capabilities when applications are installed out-of-browser. To enable this feature, look at the Silverlight project’s Properties. You’ll then see an option for enabling out-of-browser:
oob-option
After selecting this option, view the out-of-browser settings:
oob  
Here, you can customize the shortcut name and window title, as well as a description that pops up when you float the mouse over your desktop shortcut. In our demo app, I also set the window size, based on our MainPage user control. Going further, notice the checked box highlighted. This grants elevated privileges to our application when running out-of-browser, which offers some extra features (such as using the native network stack instead of the browser’s network stack).

As an example of elevated network privileges, this app can communicate with the Klout social networking service (see www.klout.com for more details). When running out-of-browser, the app can retrieve a twitter user’s social relevance score. For example:

klout 

Note that this feature is disabled when running from the browser. Also note that Klout requires an API key; visit http://developer.klout.com/ to request one. Then open MainPageViewModel.cs, search for kloutKey, and place your key there. You should be up and running.

Another cool out-of-browser feature is the Notification Window, similar to something you see in Microsoft Outlook in the bottom-right corner of your desktop when new mail arrives.  In the demo, drop a picture onto the drop target, and you’ll see a window pop up.  If you look at the code-behind, you’ll see two samples for setting up the notification window’s content. Note: you won’t see the pop-up window if you’re running in the browser.

Debugging out-of-browser apps

While we didn’t look at this during the demo, here’s how to debug an out-of-browser app with Visual Studio 2010. First, set the Silverlight project as the startup project. Then run it, and install it out-of-browser. Now shut down the app, view the Silverlight project’s properties, and choose the Debug tab. Then simply choose the “Out-of-browser application” radio button:

Debugging

That’s it. Now when you hit F5, your out-of-browser app should load, and your breakpoints will be hit as you’d expect.

Update checks

Ok, this was around in Silverlight 3, but since I attempted to show this and it refused to work, I thought I’d explain why, and how to make it work for you. For auto-update to work, the web app needs to be the startup project, not the Silverlight app. Once you set the correct startup project, the application update-check works just fine.

Media support

You now have access to all of your audio and video devices. When I demo’d this, we had two webcams to choose from. The first time you select a video device, you’ll see the Silverlight warning box. After you agree to allow the app to use the webcam, you won’t see the warning box again. There’s also an option to remember your preference so you won’t be prompted again when re-running the application. I added a Stop button, which returns the video rectangle to a green background.

Improved mouse support

Silverlight 4 now has events for right-click actions, as well as mouse scroll wheel support. I demo’d right-click support by adding a popup window when right-clicking the drop-target button.

Drag-n-drop support

You can now drag files from your desktop or file folders to a user interface element. I showed this by setting up a button as a drop-target. This button accepts image files such as png and jpg.

Visual Studio improvements

While this isn’t specifically related to Silverlight 4, Visual Studio 2010 now has a built-in XAML Visualizer. This means you no longer need to open Expression Blend simply to lay out your user controls. Blend still has a considerable feature set beyond that of Visual Studio, especially when managing visual states and animations. For general layout, the built-in visualizer should be fine.

Printing

I showed just how easy it is to create printable output by instantiating a new PrintDocument object, setting the PageVisual property to a part of the visual tree (such as the photo button, or the entire page), and executing the Print method.

Implicit Styles

In Silverlight 3, you were required to specify a style that targeted a specific control type and give the style a name. Then, any control of that type would need to specify a style explicitly. For example, if you had 20 buttons, you’d first need to create a new button style and then reference that style from each button.

In Silverlight 4, you can create an unnamed style (again targeting a specific control type). All controls of that type will then automatically get styled with the unnamed “implicit” style you created. Don’t worry – you can still create explicitly-named styles, and controls specifying those named styles will override the implicit styling. Take a look at App.xaml to see both an explicit and implicit style:

styles 

Then look at the two buttons in the demo app:

styledbuttons

Lots more!

There are many more features in Silverlight 4, such as the new COM Interop support. For a more complete list, check out Tim Heuer’s post, where he provides descriptions or samples for each of the new features. Note that Tim has a follow-up link to updates that were introduced in the Release Candidate. All of this applies to the official Silverlight 4 Release-To-Web (RTW) that shipped on April 15.

To get started, visit the Silverlight 4 developer page to grab the latest SDK and Silverlight developer runtime. You’ll also need Visual Studio 2010, which is available from MSDN. There’s also an Express version freely available for download.