Microsoft has recently released version 1.2 of the Microsoft Azure SDK and the Microsoft Azure Tools for Microsoft Visual Studio. They have continued to release updated SDKs and new features at an aggressive pace. In this article we will walk through the new features of the updated tools and SDK.
What is the Microsoft Azure SDK?
Many people might not be aware of the differences between the Microsoft Azure Tools and the Microsoft Azure SDK. The Microsoft Azure SDK is simply a set of binaries and helpful utilities to help you use the Microsoft Azure platform as a developer. The tools package, on the other hand, includes the SDK as well as plug-ins and templates for Microsoft Visual Studio. If you are a .NET developer you only need to download and install the latest version of the Microsoft Azure Tools for Microsoft Visual Studio. Throughout the rest of the article I will refer to both packages as the SDK interchangeably.
If you have a prior version it is easy to upgrade. Just download the new version and run the setup program. It will install over the old version and handle everything for you. To run the SDK and the Tools you do need to have Windows Vista (or better). This is because the SDK uses IIS7 under the covers to simulate the real Azure cloud locally in what is called the devFabric, and only Vista or later versions of Windows support IIS7. This does include Windows Server OSes as well, if that is how you roll.
.NET Framework 4 Now Supported
The first and biggest feature of the new SDK is support for .NET 4. Prior to this release, Microsoft Azure only supported .NET Framework 3.5 SP1. With the new support for .NET 4 you can take advantage of all of the great new features, like better WCF support and configuration, MVC2, and many more. Along with this feature comes support for Visual Studio 2010 RTM. The old version of the SDK only supported the beta versions of Microsoft Visual Studio 2010, so this makes it official. While you can still develop for Microsoft Azure in Microsoft Visual Studio 2008 SP1, you should really use Visual Studio 2010. If you don't own the Professional edition, you can use Microsoft Visual Studio 2010 Express (see http://www.microsoft.com/express/downloads/ for details) to build most types of applications.
Cloud Storage Explorer
The rest of the features are really enhancements to the Microsoft Visual Studio tooling. The first new tool is called the Cloud Storage Explorer. This adds support to the Server Explorer window in Visual Studio to help you browse your cloud data. It can connect to any cloud storage account for Microsoft Azure, as well as connect to your local devFabric storage.
It has a few limitations. First, you can't use it to inspect queues; it only works for BLOBs and tables. Second, you can only read your data; you can't use the tool to edit it. This does limit how you might use the tool, but it is still handy for watching what is happening in your storage account as your code is running. The table view does let you define custom queries so that you can limit the entities shown from your table. Also, when you double click a BLOB in the BLOB list, Visual Studio will do its best to open the document. For example, if you double clicked on an image, Visual Studio would open it in its built-in image editor.
You can have the Cloud Storage Explorer point to as many storage accounts as you would like, and it comes preconfigured for your local devFabric storage. To add your own cloud-based storage account, right click on "Microsoft Azure Storage" in the Server Explorer window of Visual Studio and choose "Add New Account..."
At this point you will see the following screen. You will need to enter your account name, and the key that goes with it. You can choose to have Visual Studio remember the key or not. You can find this information by logging into your Microsoft Azure portal and browsing to your storage account. The account information is stored in the configuration for Visual Studio, and is not associated with your solution or project file. Once you have supplied the correct data, click OK.
Once you have configured your account your storage will appear in the server explorer. You can drill through the explorer to find your BLOB containers and tables. In the screenshot below I have added a cloud storage account, and I have my local devFabric storage showing.
My FurnData storage account in the cloud has a table for furniture data for our website, as well as several BLOB containers for storing photos of the products.
When I double click on the BLOB container teleport-pads, Visual Studio will display a list of all of the BLOBs (aka files) that are stored in that container. I can't use this tool to upload or change the files, but I can double click on one to open it.
When you double click on a table you will get a spreadsheet view of the entities in the table. Again, you can't edit the data, but this can help you see what is going on as your application is running. Near the top of the list you can provide your own query to filter the entities that are displayed.
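Under the covers, the table view is issuing OData queries against the Table storage REST endpoint, so the custom queries you type are essentially $filter expressions. The following Python sketch shows the kind of URL such a filter turns into; the account and table names here are made up for illustration, and this is a simplified outline rather than a full client.

```python
from urllib.parse import quote

def build_table_query(account, table, filter_expr):
    """Build the REST URL for an Azure Table storage query.

    Table storage exposes entities over OData, so a filter such as
    "PartitionKey eq 'chairs'" is passed as the $filter parameter.
    """
    base = f"https://{account}.table.core.windows.net/{table}()"
    return f"{base}?$filter={quote(filter_expr)}"

# Hypothetical account and table names for illustration.
url = build_table_query("furndata", "FurnitureProducts",
                        "PartitionKey eq 'chairs' and Price gt 100")
```

A real request would also carry authentication headers signed with your storage key; the explorer handles all of that for you.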
Integrated Deployment
Being able to explore your cloud storage is handy, especially during development, but the new integrated deployment tool is one of my favorite new features. Once configured, this tool makes it easy to deploy your application to the cloud. You no longer have to do a local build, export it to a package, and then upload it by hand through the Azure portal. The tool is built on the Service Management API that you already have access to, and it will also help you set up the certificates needed to use that API.
Part of the challenge in setting up the certificates for Service Management API calls is that most developers have trouble remembering the complex command line needed to create them. This tool makes all of that much easier.
To configure Microsoft Visual Studio to automate your deployments you have to take a few steps. Return to the Server Explorer window, right click on "Windows Azure Compute," and select "Add slot..." Microsoft Visual Studio expects you to add each slot of each hosted service separately. Each hosted service you have created has two slots, production and staging. In this example we are only going to configure the production slot; you would follow the same steps to set up the staging slot.
Since you probably haven't configured a slot yet the window that comes up will be empty. This window displays the slots you have configured Microsoft Visual Studio to know how to talk to.
You will need to right click on "Windows Azure Accounts" and select "New." When you do, the window that helps you set up cloud authentication credentials will appear.
The next few steps will focus on filling out this form. Open the drop down in step 1. This is a list of the certificates installed on your computer. If you have already created a certificate for use with Microsoft Azure you can select it here. I am going to create a new certificate by selecting 'create' at the bottom of the list. You can use commercially signed certificates or self-signed certificates. I always use self-signed certificates because in this case we are only using the certificate to authenticate with the Service Management service, and a customer won't ever see it.
When you select 'create' you will be asked to provide a name for the certificate. You should choose something that is self-explanatory.
Next, click on the link in the window that says 'Copy the full path' near step 2. This will copy the path of the certificate you have selected or created to the clipboard. Then click the second link in step 2, 'Developer Portal.' This will open your browser to the Azure portal where you are going to upload the certificate.
One of the few things the management service can't do is deploy a management certificate. You have to do this by hand through the portal. Once you have logged in, click on the 'Account' tab in the middle of the top menu. You will see a link that says 'Manage My API Certificates' near the top. Click on it. The portal will now list for you the certificates you have deployed to your Azure account. From here you want to paste in the path to the certificate that Microsoft Visual Studio created for you, and then click upload. The certificate will be uploaded and registered with Azure.
Once it is uploaded, your certificate will appear in the list on the portal. You can have up to five certificates active at any time. You can come back to this screen on the portal to deactivate any certificate at any time, perhaps after it falls into the hands of an international villain.
The last bit of data you need is your subscription ID. This was created for you when you created your Microsoft Azure account and accepted the terms of service. The subscription ID is your account number for Azure, and you will need it for any billing or tech support issues. Visual Studio needs it to know which account to connect to, using the certificate we just uploaded, when it deploys code on your behalf. You can find your subscription ID at the bottom of the account page you were just on. Copy and paste it into step 3 of the form in Visual Studio.
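With the certificate uploaded and the subscription ID in hand, Visual Studio has everything it needs to call the Service Management API. If you are curious what that looks like outside of Visual Studio, here is a rough Python sketch of the simplest call, listing your hosted services: a plain HTTPS GET authenticated only by the client certificate, with the required x-ms-version header. The subscription ID and certificate path are placeholders, and the API version shown is the one current at the time of writing.

```python
import ssl
from http.client import HTTPSConnection

MANAGEMENT_HOST = "management.core.windows.net"

def build_request(subscription_id):
    # Every Service Management call is rooted at your subscription ID,
    # and must carry an x-ms-version header.
    path = f"/{subscription_id}/services/hostedservices"
    headers = {"x-ms-version": "2010-04-01"}
    return path, headers

def list_hosted_services(subscription_id, cert_pem_path):
    # The management certificate acts as an SSL client certificate;
    # no other credential is sent with the request.
    ctx = ssl.create_default_context()
    ctx.load_cert_chain(cert_pem_path)
    conn = HTTPSConnection(MANAGEMENT_HOST, context=ctx)
    path, headers = build_request(subscription_id)
    conn.request("GET", path, headers=headers)
    return conn.getresponse().read()  # XML describing your hosted services

# Placeholder subscription ID, just to show the shape of the request.
path, headers = build_request("00000000-0000-0000-0000-000000000000")
```

This is exactly why the certificate had to be uploaded first: without it, the TLS handshake itself fails and the API never sees your request.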
The last step is to give this set of credentials a name for easy recall, and then click 'OK.' What we have done is configured the portal and Visual Studio with the proper credentials to connect and deploy code on our behalf. The following screenshot is how the form will look when it is completed.
Once you click OK you will be returned to the 'Add Slot' window. It should now refresh with your new configuration. If it doesn't, right click on 'Windows Azure Accounts' and choose 'refresh'. In this case I will be selecting the production slot for my Furniture Shop Demo hosted service. This window will automatically show the hosted services you have created in the portal (or through scripts). If you haven't created one yet, this will remain empty.
You can come back to this window and add the staging slot as well, or any number of other Azure accounts that you might have. For example you might have an MSDN account with Azure, as well as a paid production account to run your real world applications.
Once everything is set up, the Server Explorer window should be updated to show you the status of your service accounts. I am going to finish showing you how to use the integrated deployment feature, and then come back to managing your services from the Server Explorer window. In the following screenshot you can see we have the Furniture Shop Demo hosted service, and that the production slot is empty.
Azure is sad when we have an empty slot, so let's find a sample application to fill it with. You can use any application you want at this point. I am going to create an empty ASP.NET 4 web application. If you want something meatier you might want to check out the Microsoft Azure platform training kit. You can download it at http://www.microsoft.com/downloads/details.aspx?FamilyID=413E88F8-5966-4A83-B309-53B7B77EDF78&displaylang=en.
Once you have an application loaded and you have tested it against the local devFabric, it is time to deploy it to the cloud. Before version 1.2 of the SDK we would have to do a publish, take the created files, and upload them to the portal by hand. Now it is all integrated into a few simple steps. When you select 'Publish' from your Azure project (not the solution file), a window will come up.
You can continue to deploy your app the old way if that works for you. Perhaps you have a lot of scripts that already automate your deployments. In that case, select "Create Service Package Only." For those of you interested in the integrated deployment, choose the second radio button, "Deploy your Cloud Service to Windows Azure."
You will also need to select the credentials you would like Visual Studio to use to connect to the cloud and deploy your code. Select the proper entry for each of the three drop downs, credentials, slot, and storage account.
You need to provide a storage account because when you are deploying a cloud app through the service management API (which Visual Studio is using behind the scenes) the code isn't uploaded directly. It is uploaded to a private BLOB container in your storage account, and copied from there to the Azure Fabric Controller for deployment.
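For the curious, the deployment call itself is a POST of a small XML document to the slot's URI, pointing at the package BLOB. This Python sketch builds a simplified version of that request. The element names follow the 2010-era Create Deployment operation, but treat it as an outline rather than the complete schema; the service name and package URL are invented for illustration.

```python
import base64

def deployment_path(subscription_id, service_name, slot):
    # Deployments are created per hosted service, per slot
    # ("production" or "staging").
    return (f"/{subscription_id}/services/hostedservices/"
            f"{service_name}/deploymentslots/{slot}")

def create_deployment_body(name, package_url, label, config_xml):
    # The label and the service configuration are base64-encoded
    # inside the request body.
    label_b64 = base64.b64encode(label.encode()).decode()
    config_b64 = base64.b64encode(config_xml.encode()).decode()
    return f"""<CreateDeployment xmlns="http://schemas.microsoft.com/windowsazure">
  <Name>{name}</Name>
  <PackageUrl>{package_url}</PackageUrl>
  <Label>{label_b64}</Label>
  <Configuration>{config_b64}</Configuration>
</CreateDeployment>"""

# Fabricated values, matching the FurnApp example in the text.
body = create_deployment_body(
    "furnapp-v1",
    "https://furndata.blob.core.windows.net/deployments/FurnApp.cspkg",
    "FurnApp v1",
    "<ServiceConfiguration />")
```

Visual Studio composes and signs this whole exchange for you; the point is simply that the PackageUrl element is why a storage account has to be in the picture at all.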
You will also need to provide a label for your deployment. This can be anything you want, but is usually a build number or version number. In this case I chose "FurnApp v1."
I also recommend, at least for this example, to check the box that will turn on IntelliTrace. Once you have completed the form you can click OK. You can see what my form looked like in the following screen shot.
Your code will be compiled and a cloud service package will be built. It will be copied to your BLOB container, and the Service Management API will be called with your credentials to set up the new application. While all of this is happening, a new window will pop up at the bottom of Microsoft Visual Studio. This is the Microsoft Azure Activity Log window, which shows you a list of the actions happening behind the scenes. This might be a deployment, a service status change, or the downloading of logs. In the example screenshot below you can see that Azure is busily deploying our package to production.
The green bar will move as the bits are copied to storage, deployed to the cloud, and then started up. This can take some time depending on your package and configuration, sometimes up to 20 minutes. While this is happening you can do other work in Microsoft Visual Studio. Once the status says completed, you can browse to your newly deployed application.
While integrated deployment does take a few steps to set up, it is really convenient after that. Remember that each deployment automatically costs an hour of compute time, on top of the real compute time you use in Azure. This means you don't want to deploy a new build every 15 minutes as part of your continuous integration process.
The new Microsoft Azure Compute node in the Server Explorer window also lets us track the status of our deployed apps and their instances. In this case, when I deployed my app, I chose to have nine web role instances running my code. The Server Explorer window updates automatically as the status of each instance changes. In the following series of screenshots, you can see the progression of the status from initializing to ready to stopped. The first screenshot is from a few minutes after I deployed my application. Some instances are already running, and some are still initializing. The second screenshot shows all of the instances running smoothly. The third shows all of the instances stopped after I used the portal to suspend my application.
Keep in mind that the status view in Server Explorer is read only. You cannot make changes from within Visual Studio. You will still have to use the Service Management API directly with a script, or use the portal, to start, stop, and delete instances of your application. That being said, it is nice to have the status right in Visual Studio where you can watch instances start up or shut down, instead of having to flip back to the portal all the time.
Using IntelliTrace for Remote Debugging
Now we move to my favorite feature of the new SDK: support for IntelliTrace in Microsoft Azure. IntelliTrace is the killer feature of Microsoft Visual Studio 2010 Ultimate. If you don't have Ultimate, you should download the trial and play with it. But be careful: once you do, you won't want to go back to what you had.
IntelliTrace is a great feature for debugging, regardless of where your code is running. It is basically a DVR for your code that lets you replay the running of your app in debug mode, no matter where it originally ran. The server running your code (in testing or production) needs to be running an IntelliTrace agent. For Microsoft Azure this means you need to be running a version of the Azure Guest OS that has the agent installed (versions 1.3 and 1.4 at this time).
This agent will gather all of the black box data for you when your code runs. This includes call stacks, variable assignments, everything. When you need to figure out why an error happened, you grab the IntelliTrace log and replay the log in Visual Studio as a historical debugging session. You can step through the execution of the code as if it is really happening right then. You can inspect variable values, and basically do all of the things you like to do when you are running the code in real-time debugging on your machine.
This is very handy for cloud based applications because you can't attach a normal debugger to the Microsoft Azure servers. With this new feature, when you have a problem, you can right click on an instance in the Server Explorer and have it download the IntelliTrace logs. Once you have done that you can replay the debug session, look at the error comments, etc. to help you troubleshoot the problem.
I am going to create an error in the website I uploaded so we can see this feature in action. I am going to give it a bad connection string for the diagnostics agent by randomly changing some of the characters in the storage key. In older versions of the Microsoft Azure Guest OS (before 1.3) this bad connection string would cause the whole instance to fail to start up. This was a major cause of frustration for people. Now, if the connection to the diagnostics storage fails, the instance is still allowed to start, and an error is put into the diagnostics infrastructure logs.
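The connection string itself is just a semicolon-delimited list of key=value pairs, which is why a single mangled character in the AccountKey is enough to break authentication. As a rough illustration, this Python sketch parses one and applies the only sanity check possible offline; the account name and key are fabricated, and note that a wrong but still well-formed base64 key will pass this check and only fail once the storage service rejects it.

```python
import base64
import binascii

def parse_connection_string(conn_str):
    # Split on ';', then on the FIRST '=' only, because base64
    # account keys usually end with '=' padding characters.
    parts = {}
    for pair in conn_str.split(";"):
        if pair:
            key, _, value = pair.partition("=")
            parts[key] = value
    return parts

def key_looks_valid(account_key):
    # A storage key is base64 text; randomly corrupted characters
    # often knock it out of the base64 alphabet entirely.
    try:
        base64.b64decode(account_key, validate=True)
        return True
    except binascii.Error:
        return False

# Fabricated example values, not a real account or key.
settings = parse_connection_string(
    "DefaultEndpointsProtocol=https;AccountName=furndata;AccountKey=AAAA")
```

The diagnostics agent does essentially this at startup, which is why, on the newer Guest OS, the failure shows up as a logged authentication error instead of a role that never starts.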
In the real world I would notice that my logs were not being transferred from my instances to my storage, so I would use the IntelliTrace logs and replay to find out what happened. If you right click on the instance you are worried about in your Server Explorer window you can select "View IntelliTrace logs". This will transfer the logs from that instance to your machine. You can then inspect the logs, and even replay a failure if you want.
In the following screenshot you can see the process time lines. This isn't useful with this problem, but this can be very helpful when determining timing, and how processes are related in other common debugging scenarios.
In the next screenshot I have scrolled down to the captured exception information. It shows that there were problems authenticating to the storage service, which is what you would get if you had the wrong key for your storage account. If any of those exceptions had been thrown by my code, double clicking on one would open the right file, and I could debug through that code as if it were running right now, when it is really just replaying what was recorded.
The release of Windows Azure SDK 1.2 really shows us how things will be in the future. Instead of waiting three years for new features, better tools, and easier management to be released for a product, we can start to expect releases on a faster schedule, each one including a few small bits of new features. The Microsoft Azure team is making sure that these new features are opt-in for your running applications and code. For example, if you didn't want your code running on .NET Framework 4, and wanted to stay on .NET Framework 3.5 SP1, you could for as long as you needed to. You wouldn't have to adopt a new feature until you are ready to do so.
The v1.2 release of the Microsoft Azure SDK is a strong release that really improves the development and deployment story for developers. The addition of IntelliTrace and .NET Framework 4 support are great leaps forward in working with the cloud. I encourage you to download the new SDK and take a look.