Hybrid Applications with Microsoft Azure Queues

by Brian Prince

Microsoft Azure promises a new scenario: building hybrid applications that run partly in the cloud and partly on-premises.


The cloud is exciting, it is fresh, and it brings wonderful new opportunities and capabilities we haven't had before. But there is a catch: not everything can, or should, run in the cloud. Microsoft has worked hard to make the Windows Azure Platform a broad platform that is a la carte in nature. This allows you to consume just the parts of the cloud that make sense for you, and it enables a new scenario: building hybrid applications.

Hybrid Applications in Microsoft Azure

Hybrid applications are applications that are broken into pieces, and run in a variety of runtime environments. Each piece is then able to leverage the strengths of its runtime environment to its advantage. For example, we might keep a web application on premises, and move the image storage to the cloud to take advantage of the storage costs and distribution strengths. The Windows Azure Platform lets you take advantage of just the pieces you need to use. Some applications can't run in the cloud because they need to access local hardware (a printer or scanner for example), or you might have data restrictions where you can't possibly move the data to the cloud. A hybrid approach lets you mix and match so you get the best of both worlds.

We are going to build an example application that shows how this might work. In our fictitious company we will have sales reps out in the field with a laptop. They are distributing samples of our medical products to doctors' offices and hospitals. They must track when they leave a sample with a particular office so they can report their activities and track marketing results. Since the workers are in the field they need a way to report this activity back to the corporate office without actually going there and connecting. Once the corporate office receives the data it will be processed and inserted into the marketing data warehouse.

There are several ways we can get the information to the corporate office. In the past we might have used VPNs and had the laptops synchronize a local SQL Express database to the corporate database. This approach can be very high maintenance: synchronization relationships need regular attention, and VPNs are never hands off. Even when they work well on the corporate office side, they don't always work on the user side; for example, the user might be in a hotel that doesn't allow VPN connections. This approach also requires a synchronous connection, meaning the home server must be online and ready to take connections. That can be an issue when no one is sending messages all week, and suddenly everyone wants to get their reports in on Friday evening.

Using Microsoft Azure Queues

Windows Azure has four storage options available for use: blobs, tables, queues, and SQL Azure. We are going to focus on Windows Azure Queues. The queue service provides a highly available and scalable way to asynchronously pass messages from the sales reps in the field to the processing server back at the corporate office. Since the queue is stored in the cloud, the sales reps will always be able to send messages to it.

Using the queue also allows us to stop using the VPN, at least for this application. This removes the need to punch holes through firewalls, and leaves the processing server to reach out to the cloud to grab the messages when it wants them.

Format of the Message

Messages in Windows Azure Queues are limited to 8KB in size. This means that when we architect our system we can't exactly stuff a giant document into the message and dump it in the queue. If you need to do this you will want to follow the worker ticket pattern. In this scenario you will upload the large piece of data to be worked on into shared storage (perhaps a blob container or a table) and then submit a work ticket to the queue with a pointer to the document in shared storage.
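As a sketch of that decision, the small helper below (the class and method names are mine, not from the article) checks whether a UTF-8 payload fits under the 8KB limit before choosing between a direct message and a work ticket:

```csharp
using System;
using System.Text;

static class QueuePayloadCheck
{
    // The queue limit described above: 8KB per message.
    const int MaxMessageBytes = 8 * 1024;

    // True means the payload can be enqueued directly; false means
    // upload it to shared storage (a blob or table) and enqueue a
    // small work ticket that points at it instead.
    public static bool FitsInQueueMessage(string payload)
    {
        return Encoding.UTF8.GetByteCount(payload) <= MaxMessageBytes;
    }
}
```

A small sample report easily fits; a scanned signature image would not, which is exactly the case where the worker ticket pattern applies.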

Our messages will simply contain the doctor ID from our global doctors list, the SKU of the product that was given out, the quantity given, and the ID of the sales rep. If we also needed to store the doctor's signature to prove they received the samples, we would store it in a blob container with a file name we would know (perhaps the ID of the sent message).
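Since the processing engine later splits the message body on '#', a helper pair like this keeps the building and parsing of that delimited payload in one place (the names are hypothetical, and the guard against '#' appearing inside a field value is my addition, not the article's):

```csharp
using System;

static class SampleReportFormat
{
    // Builds the delimited payload in the order the processing
    // engine expects: doctorId#quantity#sku#salesRepId.
    public static string Build(string doctorId, string quantity, string sku, string repId)
    {
        string[] fields = { doctorId, quantity, sku, repId };
        foreach (string field in fields)
            if (field.Contains("#"))
                throw new ArgumentException("Fields may not contain the '#' delimiter.");
        return string.Join("#", fields);
    }

    // Splits a payload back into its four fields.
    public static string[] Parse(string payload)
    {
        string[] parts = payload.Split('#');
        if (parts.Length != 4)
            throw new FormatException("Expected doctorId#quantity#sku#repId.");
        return parts;
    }
}
```

Validating on both ends matters with a delimited format: a stray '#' typed into a text box would otherwise silently shift every field that follows it.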

Messages in Azure must be text or serialized binary. In our sample we will simply pass the data as a string, but most teams will choose to use a POCO class to strongly type their messages, and just serialize that class to the message. If you do that you will reduce the interoperability of the queue, since each consumer will have to know how to deserialize that message back to the .NET object you defined.
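If you do go the strongly typed route, one way it might look (the SampleReport type and helper names here are hypothetical, not from the article) is to serialize a POCO to XML and pass the resulting string as the message body:

```csharp
using System;
using System.IO;
using System.Xml.Serialization;

// A hypothetical strongly typed message body. As noted above, every
// consumer of the queue must know this type to deserialize messages,
// which reduces the queue's interoperability.
public class SampleReport
{
    public string DoctorId { get; set; }
    public int Quantity { get; set; }
    public string Sku { get; set; }
    public string RepId { get; set; }
}

static class ReportSerializer
{
    static readonly XmlSerializer serializer = new XmlSerializer(typeof(SampleReport));

    // Serializes the POCO to an XML string suitable for a queue message.
    public static string ToXml(SampleReport report)
    {
        using (var writer = new StringWriter())
        {
            serializer.Serialize(writer, report);
            return writer.ToString();
        }
    }

    // Rehydrates the POCO on the consumer side.
    public static SampleReport FromXml(string xml)
    {
        using (var reader = new StringReader(xml))
        {
            return (SampleReport)serializer.Deserialize(reader);
        }
    }
}
```

Keep an eye on size with this approach: XML is verbose, and the serialized form still has to fit within the 8KB message limit.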

Build a Simple Windows Presentation Foundation (WPF) Client

We are going to build a simple WPF client that lets us enter the required data fields. A real business application would include a lot of other features. We are going to focus on just what we need to communicate the message through the queue back to the corporate office.

Figure 1

Start by creating a new solution in Visual Studio (I am using VS2010 RC, but you can use VS2008). Add a WPF project to the solution. Just like when we connect to a database, we need a connection string to the queue service in the cloud. This will include the URI of the queue service, as well as our authentication credentials. We are going to store this in our app.config file and access it using the configuration API. In our sample we are initially going to connect to the local developer storage; later we will connect to the cloud.

    <add key="DataConnectionString" value="UseDevelopmentStorage=true"/>

We will also need to add some Azure references so we can use the Storage Client Library in our code. Add references to Microsoft.WindowsAzure.StorageClient and Microsoft.WindowsAzure.ServiceRuntime in your project. Once you have done that, add the related using directives to your code.

Our first step is to add a configuration setting publisher for the StorageClient library. By default the library only knows how to read Azure service configuration files. This isn't hard to fix; with a bit of code we can show it how to read our normal app.config file. We will put this code in our MainWindow constructor so that it runs every time the window starts up.

private CloudQueueClient queueClient;
private CloudQueue inboundSampleReportsQueue;

public MainWindow()
{
    InitializeComponent();
    // Teach the StorageClient library to read our normal app.config.
    CloudStorageAccount.SetConfigurationSettingPublisher((configName, configSetter) =>
        configSetter(ConfigurationManager.AppSettings[configName]));
    try
    {
        CloudStorageAccount storageAccount =
            CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
        queueClient = storageAccount.CreateCloudQueueClient();
    }
    catch (Exception ex)
    {
        MessageBox.Show("Could not connect to storage: " + ex.Message);
        return;
    }
    inboundSampleReportsQueue = queueClient.GetQueueReference("samplereportsqueue");
    inboundSampleReportsQueue.CreateIfNotExist();
}

In order to access the queue with the client library we need to get a reference to our storage account. This is the account that holds the queue in the cloud. We will pass in the connection string we have stored in our app.config file. Once we have this account reference, we will create a queue client proxy object. The queueClient object we created will be the object we use to send commands to our queue in the cloud.

Once we have this reference we can use it to get a reference to our queue, which we called 'samplereportsqueue.' The queue doesn't have to exist yet; at this point the queue reference is purely local, and changes aren't sent to the cloud until we use certain methods. In this case, we call the CreateIfNotExist method on the next line. This asks the queue service whether a queue called 'samplereportsqueue' already exists, and creates it for us if it doesn't. We have just connected to our storage account in the cloud, connected to the queue service, and created a queue. The next step is to put a message on the end of the queue.

I placed some simple text boxes and a button on our form to collect the report data. A real application would have much more UI around collecting this data. I am avoiding putting all of that complexity into this sample because I want to focus on how to use the queue, not on how to build a great UI (which is important of course.)

To send the message to the queue we need just a little bit of code. We stored the account and queue object references in form level variables so we have easy access to them from anywhere. We don't want to have to recreate them every time we want to send a message.

private void cmdSubmitReport_Click(object sender, RoutedEventArgs e)
{
    CloudQueueMessage newMessage = new CloudQueueMessage(string.Format("{0}#{1}#{2}#{3}",
        txtDoctorId.Text, txtQuantity.Text, txtSKU.Text, "bprince"));
    inboundSampleReportsQueue.AddMessage(newMessage);
}


All messages sent to a queue must be of type CloudQueueMessage. When we create the message we will pass in the contents of the message in the constructor. There is a second constructor parameter we can use (we don't here) that defines how long the message will live. This is useful if we want the message to expire after a certain period of time.

Once we have created the new message object, we just call the AddMessage method on the queue reference we have (inboundSampleReportsQueue). This sends the message to the cloud and submits it into the queue. It is that easy.

Build a Processing Engine

Now to build the other end of the queue, the consumer. We are going to build our processing engine as a console application, to keep it really easy. When it starts up it will get a reference to the storage account, and the queue in the same way our client did. Once it has these references it will start polling the queue for messages. If it finds one it will grab the message, process it, and delete the message.

Figure 2

It is important that you do not delete the message until after you have processed it. If you delete it first and something goes awry during processing, you will have lost your message. People generally want a queue to be a durable transport: they never want to lose a message, even when there is a failure. With Windows Azure Queues, if the message is not deleted within a timeout period (30 seconds by default) it becomes visible on the queue again and is handed out to the next consumer.

Create a new console project in your solution and add the same references we did in our WPF client.

using Microsoft.WindowsAzure.ServiceRuntime;
using Microsoft.WindowsAzure.StorageClient;
using Microsoft.WindowsAzure;
using System.Configuration;
using System.Threading;

static void Main(string[] args)
{
    CloudQueueClient queueClient = null;
    CloudQueue queueToProcess = null;
    ConnectToQueue(out queueClient, out queueToProcess);
    CloudQueueMessage aMessage;
    while (true)
    {
        aMessage = queueToProcess.GetMessage();
        if (aMessage != null)
        {
            ProcessMessage(aMessage);
            queueToProcess.DeleteMessage(aMessage);
        }
        else
        {
            Console.WriteLine("No message found. Sleeping...");
            Thread.Sleep(10000);
        }
    }
}

In our Main method we are going to first create the references to the storage account and queue. We hid this away in a method called ConnectToQueue, to get it out of the way. This method does basically the same thing we did in the WPF client, so I am not listing it here, but it is available in the sample code.

The rest of this method focuses on polling the queue with an infinite while loop. This loop will just run forever, looking for messages, until we stop the console application. The first line of the loop calls the GetMessage method on our queue reference object. If there is a message, we will receive it and process it. If there isn't, our aMessage variable will be null.

If we don't receive a message, then we want to output to the console that we are sleeping, and then we call Thread.Sleep(10000) to sleep for ten seconds and try again. You can adjust the sleep period to match your particular business needs. It might be that you don't poll the queue at all during the day, and only start polling at 10pm. You can poll any way you want.
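A fixed ten-second sleep works, but one common refinement is a simple backoff: sleep longer after each empty poll, and reset when a message arrives. The helper below is an illustrative sketch (the doubling behavior and the cap are my choices, not the article's):

```csharp
using System;

static class PollingBackoff
{
    const int BaseDelayMs = 10000;  // the ten seconds used above
    const int MaxDelayMs = 160000;  // never sleep longer than this

    // Doubles the delay after an empty poll, capped at MaxDelayMs;
    // drops back to the base delay as soon as a message is found.
    public static int NextDelayMs(int currentDelayMs, bool gotMessage)
    {
        if (gotMessage) return BaseDelayMs;
        return Math.Min(currentDelayMs * 2, MaxDelayMs);
    }
}
```

In the polling loop you would call NextDelayMs after each GetMessage and pass the result to Thread.Sleep, so a quiet queue costs fewer storage transactions while a busy one is still drained quickly.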

If we do receive a message with our GetMessage method then we want to first process it. I extracted the code on how to process my message into its own method. This is how you might strip the contents out of the message and pass it to your business logic code to be processed. Our simple implementation will simply write out the contents of the message to the console.

private static void ProcessMessage(CloudQueueMessage aMessage)
{
    string[] report = aMessage.AsString.Split('#');
    Console.WriteLine(string.Format("Doctor {0} was given {1} samples of {2} by {3}.",
        report[0], report[1], report[2], report[3]));
}

To access the contents of the message we want to use either the AsString or AsBytes properties. The message object has several other useful properties. DequeueCount will tell you how many times this message has been taken off the queue, only to time out and reappear. A high count might indicate a poison message in your queue, or some other problem. Each message is also given a unique ID (a GUID), accessible through the Id property. This is handy when you need to know whether you are working with the same message, or two messages that happen to have the same content.
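One way to act on DequeueCount is a small guard in the processing loop: if a message has already been handed out more than some number of times, treat it as poison and delete (or park) it instead of retrying it forever. The helper and threshold below are illustrative choices, not from the article:

```csharp
static class PoisonMessageCheck
{
    // A message that keeps timing out and reappearing drives its
    // DequeueCount up; past this threshold it is likely unprocessable.
    const int MaxAttempts = 5;

    public static bool IsProbablyPoison(int dequeueCount)
    {
        return dequeueCount > MaxAttempts;
    }
}
```

In the Main loop you might check aMessage.DequeueCount with this helper before calling ProcessMessage, logging and deleting poison messages so they stop recycling through the queue.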

Once our processing is completed we call the DeleteMessage method on our queue reference object, queueToProcess. Behind the scenes the queue service will compare the PopReceipt we have with the most recently given out PopReceipt to see if we are the most recent owner of the message. If they match, the message is deleted. After this the loop starts all over again.

Configuring Our Application To Use The Cloud

Up until now you might have noticed in the app.config file that our data connection string says 'UseDevelopmentStorage=true'. This is a magic string (which is normally a no-no by the way) that tells the client library to connect to the local developer storage service. This saves you from having to type in the hard coded local service endpoint and storage access key.

When we want to point our application to the cloud we are going to have to build a real connection string. Below you will see a sample. It has been modified to hide my real storage account key. This key is like the password to your storage account, so you will not want to hand it out willy-nilly.

<add key="DataConnectionString" 
     value="DefaultEndpointsProtocol=https;AccountName=mystorageaccount;AccountKey=IPe3JVJL+I7h8A87Q6pbXYzsamplesamplesampleHpcRJFhLZYcptgqKgadxA==" />

You will need to create a Windows Azure account and a storage account on the Windows Azure portal. Once you have done that you will be given an account key to use. Just change the connection string in both app.configs and you will be sending messages through the cloud. If you have MSDN you should enroll to get a free allotment of Azure time as part of your subscription. You can find out more about MSDN allotments at http://msdn.microsoft.com/en-us/subscriptions/ee461076.aspx.

You will need to use a credit card when you sign up for an Azure account. If you are just running this sample application the cost to you will be close to zero: you are charged for a little bandwidth ($0.10/GB in, $0.15/GB out), a few storage transactions ($0.01 per 10,000), and the amount of active storage the queue takes up ($0.15/GB per month). Since the messages are small and are picked up immediately, you will be charged very little, if any, money at all. So go ahead, create an account and try it out.


We have just found an easy way to send some simple messages through the cloud back to our internal enterprise software. We didn't have to use VPNs, complex firewalls, or anything else. We did this by sending the message to a queue in the cloud, where it patiently waited until the consumer could connect and download it. This was just a simple example of how you can mix on-premises data centers with desktop applications and some bits in the cloud.

There are other ways we could handle this. If we wanted an even more robust way of communicating directly with our server then we could use the Service Bus, which is part of Windows Azure Platform AppFabric. When we use the service bus we would be able to bounce the message off the relay in the cloud and into our enterprise server in our internal data center.

This article was originally published on Monday Apr 5th 2010