Robust Cloud Integration with Azure

For the last year I have been busy co-authoring this book on Azure Cloud Integration with my fellow co-authors Abhishek Kumar, Martin Abbott, Gyanendra Kumar Gautam, James Corbould and Ashish Bhambhani.


It is available on the Packt website here: https://www.packtpub.com/virtualization-and-cloud/robust-cloud-integration-azure

This book will teach you how to design and implement cloud integration using Microsoft Azure. It starts by showing you how to build, deploy, and secure an API app. Next, it introduces you to Logic Apps and helps you quickly start building your integration applications. We’ll then go through the different connectors available for Logic Apps to build your automated business process workflows. It’s packed with information, spanning just under 700 pages.

Don’t forget to check out another publication I co-authored back in 2015 with Mark Brimble, Johann Cooper and Colin Dijkgraaf called SOA Patterns with BizTalk Server 2013 and Microsoft Azure.


And it is still available from the Packt website here: https://www.packtpub.com/networking-and-servers/soa-patterns-biztalk-server-2013-second-edition

Hope you enjoy reading it, just as I enjoyed writing the content.

Searching through messages in Logic Apps

Unfortunately Logic Apps do not provide an easy option to view the contents of a message unless you go through each run’s log entries and view the outputs of each action.


However, there is an alternative method using Log Analytics, which comes with Operations Management Suite (OMS). Using Log Search, you can search for specific property values within your messages, for example searching the diagnostics log of a Logic App for a particular JobId.


To start using this feature we first need to set up OMS using the steps below.

1. In the Marketplace search for “Log Analytics” and select it.


2. Create the OMS Workspace using a suitable name and resource group.


3. Next we need to add a storage account for the Logic Apps to store diagnostic data. From the Marketplace, search for “Storage Account” and select it.


Create the storage account by providing a name and leave the “Account kind” as “General purpose”.


4. Once the storage account is created, we need to link it to the OMS Workspace. Click on the Log Analytics resource that was created in step 2.


In the properties blade, scroll down to the “Workspace Data Sources” and click on “Storage account logs”.


Then click the plus sign to add a storage account. Choose the storage account you created previously.


After you have chosen the storage account, set the “Data Type” to Events. Then click “OK” at the bottom of the page.


Now that all the plumbing has been configured, we can turn our attention to the Logic App. For this example we are going to create a simple logic app that receives a purchase order, sends it to RequestBin, and then returns an OK status code.


Here is an example of the purchase order we are going to post to the Logic App.

{
   "CustomerCode": "CUST1000",
   "Lines": [
      {
         "LineNo": 1,
         "Price": 68.25,
         "ProductCode": "PRD1100",
         "Qty": 1
      }
   ],
   "OrderNo": "1000",
   "Total": 68.25
}

Once we have created the logic app, select code view to add our custom tracked properties on an action. I want to be able to search for orders using either the OrderNo, CustomerCode or the Total order value.

To do this, add a “trackedProperties” section to the action, specifying the attribute name to search on and the path in the message to obtain the value from, as in the sketch below.
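Here is a minimal sketch of what that section might look like in code view. The action name is hypothetical, and the expression paths assume the action’s input body is the purchase order JSON above:

"Send_to_RequestBin": {
    "type": "Http",
    "inputs": { ... },
    "trackedProperties": {
        "OrderNo": "@action()['inputs']['body']['OrderNo']",
        "CustomerCode": "@action()['inputs']['body']['CustomerCode']",
        "Total": "@action()['inputs']['body']['Total']"
    }
}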


Now that the logic app has been created and saved, we need to turn on Diagnostics for it. Under the Monitoring section of the Logic App, click on Diagnostics and then Diagnostics Settings.


Set the “Status” to On, check “Archive to a storage account”, select the storage account that was provisioned previously, and set the retention periods to what you require.


Now check “Send to Log Analytics”, select the OMS Workspace created before, and click the Save button.


Everything should be good to go now. Use something like Postman to start sending test messages to the Logic App. After a few minutes you should see the tracked properties and their values being written to the blob store under the storage account, in a container called “insights-logs-workflowruntime”.

If you keep drilling down into the container that matches your logic app name, you will see a file called “PT1H.json”. Inside this file you will see the entries for the tracked properties.


To use OMS to search on one of your properties, click on Log Analytics under your logic app.


Once the blade opens, click the OMS Portal link to open the portal site. On the portal site, click the “Get Started” icon. Then under “Data” and “Custom Fields” you should be able to see your custom tracked properties. Take note of these field names, as they will be used in the search query.


Now click the Search icon on the left navigation pane, enter the query “Type=AzureDiagnostics resource_workflowName_s=Orders” into the search box, and click search. Note it can take a few minutes before the data turns up in OMS if you have just submitted a message to the logic app. The query will list all runs triggered for the Logic App named “Orders”.

You should get a list of all the triggers related to the “Orders” logic app. Here I found 118 events in the last day.


You can narrow your search down further by modifying the search query. To search for orders with an order number equal to 1004, you would enter “Type=AzureDiagnostics resource_workflowName_s=Orders trackedProperties_OrderNo_s=1004” into the query field. This will display only the records matching that order number.


Also, clicking the ellipsis (…) next to each field brings up a context menu with more filtering options.


In conclusion, OMS provides the ability to search for tracked properties, save common queries and create custom dashboards. I encourage you to look at all the features available in OMS, as we have only scratched the surface in this post.

Enjoy.

Securing an MVC Web API using Azure AD and the Client Credentials flow

In this blog I will show you how to add authorisation to an MVC Controller, set up Azure AD as an OAuth 2.0 JWT (JSON Web Token) provider, and use an Azure AD endpoint to obtain the access token. I will be using the Client Credentials grant flow to access a protected Web API resource.

The Client Credentials flow uses a Client Id and a Client Secret. These are first sent to an OAuth token provider to retrieve an access token, which is then presented with each call to the protected resource. This is ideal for a client application calling a web service without a user supplying their login credentials.


Registering the Web API Service

Let’s start by provisioning an Azure Active Directory using the classic portal, as we will require the generated Audience and Tenant values for our MVC Web API later.


Next, add a name for your Directory and a unique Domain Name to use. For this example I called it “connectedcircuits”.


Once the Domain has been configured it will be shown in your active directory listings.


Next we will add an Application to this domain for the Web API we are developing. Now click on Applications. It will have a default application already registered for Office 365 Management APIs.


Scroll down to the bottom of the page and click the Add button. In the dialogue box that appears, select “Add an application my organisation is developing”.


Give it a suitable name and select “Web Application and/or Web API”.


In the next screen you can set the Sign-on URL to “http://localhost”, as we won’t be using this URL for the Client Credentials flow.

Now add an App ID URI for the API we will be developing. It must be unique within your organisation’s directory. In our scenario I have chosen “https://connectedcircuits.onmicrosoft.com/DemoProtectedAPI” which represents the domain and the name of the Web API we will be building.


We can now start on building the MVC Web API. Create a new ASP.NET Web Application project in Visual Studio.


Select the Web API template, set the Authentication to “No Authentication” and uncheck “Host in the cloud”. We will setup the authentication manually and host this locally in IIS Express for now.


Once the template solution is created, install the following NuGet packages into the project.

  • Install-Package Microsoft.Owin
  • Install-Package Microsoft.Owin.Security.ActiveDirectory
  • Install-Package Microsoft.Owin.Host.SystemWeb – this package is required when hosting your API in IIS Express.
  • Install-Package Microsoft.AspNet.WebApi.Owin
  • Install-Package System.IdentityModel.Tokens.Jwt -Version 4.0.2.206221351

After the packages have been installed, add a new partial class called Startup.cs to the App_Start folder.


Add the following code into this class. It reads the configuration settings in the web.config file and initialises the OWIN authentication.

using Microsoft.Owin;
using Owin;
using System.Web.Http;
using Microsoft.Owin.Security.ActiveDirectory;
using System.Configuration;
using System.IdentityModel.Tokens;

[assembly: OwinStartup(typeof(DemoProtectedAPI.App_Start.Startup))]

namespace DemoProtectedAPI.App_Start
{
    public partial class Startup
    {
        public void Configuration(IAppBuilder app)
        {
            HttpConfiguration config = new HttpConfiguration();
            ConfigureAuth(app);
            WebApiConfig.Register(config);

            app.UseWebApi(config);
        }

        private void ConfigureAuth(IAppBuilder app)
        {
            var tokenValidationParameter = new TokenValidationParameters();
            tokenValidationParameter.ValidAudience = ConfigurationManager.AppSettings["Azure.AD.Audience"];

            app.UseWindowsAzureActiveDirectoryBearerAuthentication(
                new WindowsAzureActiveDirectoryBearerAuthenticationOptions
                {
                    Tenant = ConfigurationManager.AppSettings["Azure.AD.Tenant"],
                    TokenValidationParameters = tokenValidationParameter
                });
        }
    }
}

 

In the web.config, add the two keys to the <appSettings> section as shown below. The Tenant value is the domain name that was specified when creating the directory, and the Audience value is the App ID URI that was given when adding an application to the directory.

<appSettings>
  <add key="Azure.AD.Tenant"
       value="connectedcircuits.onmicrosoft.com" />
  <add key="Azure.AD.Audience"
       value="https://connectedcircuits.onmicrosoft.com/DemoProtectedAPI" />
</appSettings>

 

Now to protect the methods in a Controller, add the [Authorize] attribute to the class, as in the sketch below.
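A minimal sketch of the protected controller, based on the ValuesController generated by the Web API template:

using System.Collections.Generic;
using System.Web.Http;

namespace DemoProtectedAPI.Controllers
{
    [Authorize] // every action now requires a valid bearer token
    public class ValuesController : ApiController
    {
        // GET api/Values
        public IEnumerable<string> Get()
        {
            return new string[] { "value1", "value2" };
        }
    }
}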


One final step is to turn on the SSL requirement for this project, as OAuth 2.0 relies on a secure channel. Click on the project to display the Properties page and set SSL Enabled to True. Take note of the URL and port number.


Now if you try to navigate to the GET Values method, you should get an authorization error.


Registering the client app

To be able to access the methods exposed in the MVC Controller, you will need to send a bearer token in the Authorization HTTP header. To get the token, we will need to set up another application for the client in the same domain as the Web API service.

From the Azure Classic Portal, go to the Active Directory resources and select the name of the active directory that was created at the beginning of this blog. Click on Applications and then click on the Add button at the bottom of the page.


Again select “Add an application my organization is developing”. In the dialogue box give it a suitable name and set the type to Web Application and/or Web API.


In the next screen, you can set the sign-on URL to http://localhost, as it is not used for this type of authentication flow. The App ID URI should be set to a unique value; it is used as a logical identifier for your app.


Once it has been created, click “Configure”, scroll to the bottom of the page, and click the “Add application” button. This is how we associate the web service created earlier with this client application.


Set the “Show” dropdown control to “All Apps” and click the Tick icon. This should display all the apps listed in your domain. Highlight your web service and click on the plus icon.


After clicking the plus icon, the app will be shown in the Selected column. Then click the “Tick” icon at the bottom of the page to return to the Configure page.


Back on the Configure page, set the delegated permissions on the app you just added and click save.


Now we need to obtain the client key to be used for obtaining the JWT. Scroll up to the keys section and generate a new key. Note the key won’t be shown until you press the save button at the bottom of the page, and you will need to take a copy of it.


Next we will need to take a copy of the client Id, which is also on the Configure page.


The last piece we need is the endpoint of the OAuth token. This is found by scrolling to the bottom of the page and pressing the View Endpoints button.


Now take a copy of the OAuth token endpoint.


We now have the four pieces of information required to request the JWT.

  1. Web API endpoint – the App ID URI specified when registering the Web API service.
    • https://connectedcircuits.onmicrosoft.com/DemoProtectedAPI
  2. Client Id – created when registering the client app.
    • 7f1fd29a-d20f-4158-8f47-eff1ea87dc38
  3. Client Key – generated when registering the client app.
    • OPAfMI4IixoYnGAPiCpLvAvYecH92TXcC1/8wH31HD0=
  4. OAuth 2.0 token endpoint – found under the App Endpoints of either the Web API or Client App.
    • https://login.microsoftonline.com/f3301fa6-4178-44a9-b5a9-e945e69e4638/oauth2/token

Using the above information, we make a POST to the OAuth token endpoint with the following raw form body to get a bearer token:

resource=<web_api_endpoint>&grant_type=client_credentials&client_id=<client_id>&client_secret=<client_key>

Using a tool like Postman, POST the following request. Note the generated Client Key needs to be URL encoded, and the header Content-Type: application/x-www-form-urlencoded must be set.


The actual token request message is shown below.

POST /f3301fa6-4178-44a9-b5a9-e945e69e4638/oauth2/token HTTP/1.1
Host: login.microsoftonline.com
Content-Type: application/x-www-form-urlencoded
Cache-Control: no-cache

resource=https://connectedcircuits.onmicrosoft.com/DemoProtectedAPI&grant_type=client_credentials&client_id=7f1fd29a-d20f-4158-8f47-eff1ea87dc38&client_secret=OPAfMI4IixoYnGAPiCpLvAvYecH92TXcC1%2F8wH31HD0%3D
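For reference, here is a minimal C# sketch of the same token request using HttpClient, with the values from the list above (FormUrlEncodedContent takes care of the URL encoding):

using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Threading.Tasks;

class TokenClient
{
    public static async Task<string> GetTokenResponseAsync()
    {
        using (var http = new HttpClient())
        {
            var form = new FormUrlEncodedContent(new Dictionary<string, string>
            {
                { "resource", "https://connectedcircuits.onmicrosoft.com/DemoProtectedAPI" },
                { "grant_type", "client_credentials" },
                { "client_id", "7f1fd29a-d20f-4158-8f47-eff1ea87dc38" },
                { "client_secret", "OPAfMI4IixoYnGAPiCpLvAvYecH92TXcC1/8wH31HD0=" }
            });

            var response = await http.PostAsync(
                "https://login.microsoftonline.com/f3301fa6-4178-44a9-b5a9-e945e69e4638/oauth2/token", form);

            // The JSON response contains the access_token
            return await response.Content.ReadAsStringAsync();
        }
    }
}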

If all went well you should get a JSON Web Token (JWT) response like the following:

{
  "token_type": "Bearer",
  "expires_in": "3600",
  "ext_expires_in": "0",
  "expires_on": "1480196246",
  "not_before": "1480192346",
  "resource": "https://connectedcircuits.onmicrosoft.com/DemoProtectedAPI",
  "access_token": "<token string>"
}

Now to access the Web API service, we need to pass the access_token value in the request Authorization header as a bearer token. Note the Authorization header value requires the OAuth token to be prefixed with “Bearer” followed by a space.


The raw request is shown below and the bearer token has been truncated for clarity.

GET /api/Values HTTP/1.1
Host: localhost:44312
Authorization: Bearer eyJ0eXAiOiJKV1QiLCJhbGciOiJSUzI1NiI…
Cache-Control: no-cache
Postman-Token: 1eb00cbd-6eb9-c7b0-182d-a55fcd4f044a
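The same call in C# might look like this minimal sketch, where accessToken is the access_token value from the previous response:

using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

static class ApiClient
{
    public static async Task<string> CallProtectedApiAsync(string accessToken)
    {
        using (var http = new HttpClient())
        {
            // "Bearer", a space, then the raw token
            http.DefaultRequestHeaders.Authorization =
                new AuthenticationHeaderValue("Bearer", accessToken);
            return await http.GetStringAsync("https://localhost:44312/api/Values");
        }
    }
}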

 

Enjoy.

Azure SQL Database Timeout Exceptions

I came across an issue where the Azure database was throwing timeout exceptions like the one below when under heavy load.

Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding. This failure occurred while attempting to connect to the routing destination. The duration spent while attempting to connect to the original server was – [Pre-Login] initialization=2; handshake=23; [Login] initialization=0; authentication=0; [Post-Login] complete=1;

 

The project I was working on involved inserting several thousand rows of data from a single message received from an Azure Service Bus queue. A WebJob reads the messages off the queue and inserts several hundred rows at a time to reduce the transaction size and time.

I had also used the Enterprise Library Transient Fault Handling block (https://msdn.microsoft.com/en-us/library/hh680934(v=pandp.50).aspx), wrapping the code that calls the insert stored procedure to handle any network transients that may occur.

Initially I set the retry delay to 1 second. I was unsure why I was getting timeout exceptions, as the retry logic should have taken care of them. It was not until I came across this article about transient errors that it made sense: https://azure.microsoft.com/nb-no/documentation/articles/sql-database-connectivity-issues/

The sentence in the article that stood out for me was this one:

We recommend that you delay for 5 seconds before your first retry. Retrying after a delay shorter than 5 seconds risks overwhelming the cloud service. For each subsequent retry the delay should grow exponentially, up to a maximum of 60 seconds.

 

Another issue I had was inserting data into a database table from within a loop. This was causing the Azure database to throttle and the Database Throughput Unit (DTU) usage to flat-line at 100%. I resolved this by forcing a 3-second delay after each iteration of the loop, after reading this article from Microsoft: https://azure.microsoft.com/en-us/documentation/articles/sql-database-resource-limits/. With the delay in place, the DTU usage now sits around the 50-70% mark.
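Below is a minimal sketch of the corrected approach, assuming the Enterprise Library Transient Fault Handling block; the connection string, stored procedure name and batching are hypothetical:

using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using System.Threading;
using Microsoft.Practices.EnterpriseLibrary.TransientFaultHandling;

class BatchInserter
{
    public void InsertBatches(IEnumerable<DataTable> batches, string connectionString)
    {
        // Exponential backoff: first retry after 5 seconds, growing up to 60 seconds
        var strategy = new ExponentialBackoff("SqlBackoff", 5,
            TimeSpan.FromSeconds(5), TimeSpan.FromSeconds(60), TimeSpan.FromSeconds(2));
        var policy = new RetryPolicy<SqlDatabaseTransientErrorDetectionStrategy>(strategy);

        foreach (var batch in batches)
        {
            policy.ExecuteAction(() =>
            {
                using (var conn = new SqlConnection(connectionString))
                using (var cmd = new SqlCommand("usp_InsertOrderRows", conn))
                {
                    cmd.CommandType = CommandType.StoredProcedure;
                    cmd.Parameters.AddWithValue("@Rows", batch); // table-valued parameter
                    conn.Open();
                    cmd.ExecuteNonQuery();
                }
            });

            // Pause between batches so the DTU usage does not flat-line at 100%
            Thread.Sleep(TimeSpan.FromSeconds(3));
        }
    }
}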

 

In conclusion, it seems the short retry period and the tight insert loop for the large dataset caused the timeout exceptions. After increasing the retry period and adding a delay to the loop, the timeout exceptions disappeared.

Using Azure Application Insights with BizTalk. Part 1

One of the great features of Application Insights is the capability to monitor IIS websites and web services. It uses an agent installed on the IIS server to send telemetry data to Application Insights running in Azure.

In this blog I will set up monitoring of a BizTalk schema published as a WCF service, hosted on the server running BizTalk.

First you will need a Microsoft Azure subscription. Log on to Azure and create a new Application Insights service. Set the Application Type to ASP.NET web application. This will install the default IIS template on the dashboard for monitoring web sites and web services. You also have the option to add monitoring of the other available metrics, or to delete the ones added by the template.


Note the location of the data store can only be set to Central US at this time. When the blade is created, the default template will be displayed.


Once the service has been created, you will need the generated Instrumentation Key which is available from the properties blade.


Next, download the Status Monitor from https://azure.microsoft.com/en-us/documentation/articles/app-insights-monitor-performance-live-website-now/ and install it on your BizTalk server which has IIS installed. Execute the file Microsoft.Diagnostics.Agent.StatusMonitor.exe from the default installed location c:\Program Files\Microsoft Application Insights\Status Monitor.

When it comes to selecting the published WCF service and the telemetry subscription, you will get an error.


This is because the monitor application cannot read the web.config file in the web service directory created by BizTalk. The namespace added to the configuration element must be temporarily removed and the Monitor application executed again so it can correctly read the web.config file, as sketched below.
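For illustration, the namespace URI shown here is the one the publishing wizard typically adds; yours may differ. Change the root element from:

<configuration xmlns="http://schemas.microsoft.com/.NetConfiguration/v2.0">

to a plain root element while the Status Monitor runs:

<configuration>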


After the Monitor application has been executed with the namespace removed, it will successfully update the web.config file and add the ApplicationInsights.config to the web service folder. Remember to add the namespace back into the web.config file.


If you open the ApplicationInsights.config file you will see the instrumentation key already inserted under the element called InstrumentationKey.

Now when you start sending requests to the WCF web service endpoint, you will start seeing metrics being captured under the Performance investigation option.


You also have the capability to create alerts and to set up monitoring of other metrics if you so desire.

Enjoy.

Testing an Azure Relay Service Bus using Postman

One of my favourite tools for testing RESTful services is Postman. In this blog I will describe how to use this tool for testing an Azure relay service bus. Remember, if you are hosting the service in IIS you will need to ensure the service is already warm started. Here is a blog by Jeroen Maes describing how to do this.

First we need to generate the Authorization token from the SAS keys, as described in this Microsoft article: Shared Access Signature Authentication with Service Bus.

Below is some sample code to create a SAS token to be passed in the request header.

// Requires: System.Globalization, System.Security.Cryptography, System.Text, System.Web
public static string GenerateToken(string resourceUri, string sasKeyName, string sasKey)
{
    // Set the token lifespan to 1 hour
    TimeSpan sinceEpoch = DateTime.UtcNow - new DateTime(1970, 1, 1);
    var expiry = Convert.ToString((int)sinceEpoch.TotalSeconds + 3600);

    string stringToSign = HttpUtility.UrlEncode(resourceUri) + "\n" + expiry;
    HMACSHA256 hmac = new HMACSHA256(Encoding.UTF8.GetBytes(sasKey));
    var signature = Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(stringToSign)));

    // Format the SAS token
    var sasToken = String.Format(CultureInfo.InvariantCulture,
        "SharedAccessSignature sr={0}&sig={1}&se={2}&skn={3}",
        HttpUtility.UrlEncode(resourceUri), HttpUtility.UrlEncode(signature), expiry, sasKeyName);

    return sasToken;
}

The function GenerateToken() takes in 3 parameters:
  • resourceUri – the URL of the web service endpoint in Azure, e.g. https://mywebservice.servicebus.windows.net/RequestAvailability
  • sasKeyName – the name given to the authorization rule in Azure Service Bus.
  • sasKey – the generated SAS key from Azure Service Bus.

Calling the GenerateToken() function will return a string value similar to this: SharedAccessSignature sr=https%3a%2f%2fmywebservice.servicebus.windows.net%2fRequestAvailability&sig=UUEIhOkWX3FzrtBsRia9WFeYxbhMQ9FdppmFMjuJv7U%3d&se=1443495583&skn=MyKeyName
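For example, a hypothetical call would look like this, using the endpoint and key name from the list above and a placeholder key value:

var token = GenerateToken(
    "https://mywebservice.servicebus.windows.net/RequestAvailability",
    "MyKeyName",
    "<your SAS key>");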

Now we can set up the headers for the request in Postman. Add a header called Authorization and set the value to the string returned from GenerateToken(). You are now ready to send your request to the relay service bus.


Enjoy.

Reserving the IP address currently assigned to a Cloud Service

When deploying a cloud service, you will be assigned a random Public Virtual IP (VIP) address.  This address will change if you delete your Web/Worker role and then deploy another version of your code.

With the latest release of the Azure PowerShell Cmdlets, you can now reserve the VIP address that was assigned to your Web/Worker role by using the following cmdlet.

New-AzureReservedIP -ReservedIPName <name of reservation> -Location <region> -ServiceName <name of your service>

e.g. New-AzureReservedIP -ReservedIPName "myIPReservation" -Location "Australia East" -ServiceName "myWorkerRoleName"

Check if the reservation has been successful by executing the following cmdlet: Get-AzureReservedIP. Note the InUse should be True and the ServiceName is the name of your Web/Worker role.


Now to ensure that your cloud service will always use the reserved IP address with each new deployment, you will need to update the Service Configuration file (.cscfg) and add the ReservedIP element.

<ServiceConfiguration serviceName="TCPSocketListener" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration" osFamily="4" osVersion="*" schemaVersion="2015-04.2.6">
  <Role name="WorkerRole1">
    <Instances count="1" />
    <ConfigurationSettings>
    </ConfigurationSettings>
  </Role>
  <NetworkConfiguration>
    <AddressAssignments>
      <ReservedIPs>
        <!-- The name must match the ReservedIPName used with New-AzureReservedIP -->
        <ReservedIP name="myIPReservation"/>
      </ReservedIPs>
    </AddressAssignments>
  </NetworkConfiguration>
</ServiceConfiguration>

Azure WebJob error – ‘The account credentials for webjoblogging are incorrect’

 

You may encounter this error when running a WebJob locally on your workstation. This is most likely due to a proxy server on your network requiring your authentication credentials and blocking the outbound request to Azure.

To overcome this issue you can add the following proxy settings to the app.config file.

<system.net>
  <defaultProxy useDefaultCredentials="true" />
</system.net>

What to use: SOAP or REST?

Choosing between SOAP and REST style web services for an architectural solution should, in my opinion, depend on the consumers of the service. To help me make the right decision, I drew up the comparison matrix below showing the pros and cons of each service style.

| Feature | SOAP | REST |
| --- | --- | --- |
| Development effort | Comprehensive toolkits make development easier. | Toolkits are not required; however, additional work is required to map URI paths to specific handlers. |
| Describing available interface definitions | Generally a WSDL is available to describe the available contracts, and client tools can easily generate proxies. | A document is normally written manually and published as a web page. There is a machine-readable version called WADL, but it is not widely used. |
| Message format | XML-based format with SOAP and WS-* specific markup. This can make the payload quite large. | Can craft your own, however common formats are XML or JSON. Does not require XML parsing. Human-readable results. |
| Message transport | Can use a number of transport protocols such as HTTP/S, TCP, SMTP, UDP, JMS etc. | Normally HTTP/HTTPS. Other protocols are supported with extra development effort. |
| Message contracts | SOAP requires a formal contract to exist between the provider and consumer. If rigid type checking is required then use SOAP. Focus is on accessing named operations. | Has a form of dynamic contract and relies on documentation. Focus is on accessing named resources. |
| Handling of complex domain objects | Complex domain models can be easily represented using SOAP. | Not so easy to handle complex models. Excellent choice if you only require CRUD operations over an RDBMS. |
| Transactional support | The WS-* protocols support transactions, which is geared towards SOAP. | Has no built-in support. The HTTP protocol cannot provide two-phase commit across distributed transactional resources. |
| Reliable messaging | Built into the WS-* protocols, with successful/retry logic. | Clients need to deal with communication failures. |
| State management | Supports both contextual information and conversation state management. | The server cannot maintain any state; state management must be handled by the client. |
| Caching | Not supported. | HTTP GET operations can be cached. |
| Message encoding | Supports text and binary encoding. | Limited to text only. |
| Testing of services | Requires unit tests to be developed or 3rd-party test tools. | Can simply use a web browser and view the results. |
| Security | Supports enterprise security using the WS-Security protocol. Use SOAP if intermediary devices are not trusted. | Use SSL for point-to-point security. Can also easily identify the intent of a request based on the HTTP verb. |
| Client-side development complexity | Toolkits are required to easily consume the service. | Can be consumed by any client, even a web browser using Ajax and JavaScript. |
| Maintainability | Easier to maintain due to tight data contracts and standards. | In the long run can be more expensive to maintain due to lack of standards. |
| Popularity | Mainly in enterprise applications that require WS-* features. | Used by most publicly available web services. |

My conclusion is that there is no right or wrong approach for building web services with either SOAP or REST; it depends on the requirements of the consumers.

I tend to lean towards REST for CRUD type web services that integrate with websites and  SOAP for integration between critical enterprise systems that require the WS-* features such as transaction support and reliable communications.

I hope anyone reading this will find it helpful in making the correct architectural decision, and please let me know if I have left anything out.

Developing for the Microsoft Azure platform

These are my own findings whilst developing my first application to be hosted in the cloud. The Azure features I used were Azure SQL, Federated Authentication, Web roles, Worker roles and Queues.

Handling transients from SQL Server

Beware of transient connection issues when connecting to your Azure database. These conditions occur due to normal network related errors, throttling due to excessive requests and controlled load balancing by the AppFabric framework.

Fortunately these types of errors raise specific error codes which can be trapped in your code. Below are some error codes to expect.

| Error Number | Error Message |
| --- | --- |
| 40197 | The service has encountered an error processing your request. Please try again. |
| 40501 | The service is currently busy. Retry the request after 10 seconds. |
| 10053 | A transport-level error has occurred when receiving results from the server. An established connection was aborted by the software in your host machine. |
| 10054 | A transport-level error has occurred when sending the request to the server. (provider: TCP Provider, error: 0 – An existing connection was forcibly closed by the remote host.) |
| 10060 | A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: TCP Provider, error: 0 – A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond.) |
| 40613 | Database XXXX on server YYYY is not currently available. Please retry the connection later. If the problem persists, contact customer support, and provide them the session tracing ID of ZZZZZ. |
| 40143 | The service has encountered an error processing your request. Please try again. |
| 233 | The client was unable to establish a connection because of an error during connection initialization process before login. Possible causes include the following: the client tried to connect to an unsupported version of SQL Server; the server was too busy to accept new connections; or there was a resource limitation (insufficient memory or maximum allowed connections) on the server. (provider: TCP Provider, error: 0 – An existing connection was forcibly closed by the remote host.) |
| 64 | A connection was successfully established with the server, but then an error occurred during the login process. (provider: TCP Provider, error: 0 – The specified network name is no longer available.) |
| 20 | The instance of SQL Server you attempted to connect to does not support encryption. |
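As a minimal sketch, the error numbers above can be treated as a retryable set when inspecting a SqlException (the list and helper are illustrative, not exhaustive):

using System.Data.SqlClient;
using System.Linq;

static class TransientErrors
{
    // Error numbers from the table above
    static readonly int[] Retryable = { 40197, 40501, 10053, 10054, 10060, 40613, 40143, 233, 64, 20 };

    public static bool IsTransient(SqlException ex) =>
        ex.Errors.Cast<SqlError>().Any(e => Retryable.Contains(e.Number));
}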

When developing code against SQL Azure you should code for transient connection issues. To make this easy, the Microsoft patterns & practices team has developed the Microsoft Enterprise Library 5.0 Integration Pack for Windows Azure.

Logging your application

It’s important to add logging to any application deployed to Azure. I use System.Diagnostics Debug and Trace statements extensively throughout my applications to emit messages to blob storage for debugging at runtime.

To enable some basic logging you need to wire up the Azure diagnostics listener in the app.config file as shown below.

<system.diagnostics>
  <trace>
    <listeners>
      <add type="Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener,
                 Microsoft.WindowsAzure.Diagnostics, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"
           name="AzureDiagnostics">
        <filter type="" />
      </add>
    </listeners>
  </trace>
</system.diagnostics>

Add a new setting called Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString to the role file under the Azure project and set the value to your Azure storage account connection string, as sketched below.
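In the service configuration this setting might look like the following sketch; the account name and key are placeholders:

<Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString"
         value="DefaultEndpointsProtocol=https;AccountName=<youraccount>;AccountKey=<yourkey>" />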


Then add the following code in the OnStart method to persist the logs to a table called WADLogsTable in Azure storage. The data will be written to storage every minute which is configurable as shown in the code below.

public override bool OnStart()
{
    // Set the maximum number of concurrent connections
    ServicePointManager.DefaultConnectionLimit = 12;

    TimeSpan tsOneMinute = TimeSpan.FromMinutes(1);
    DiagnosticMonitorConfiguration dmc = DiagnosticMonitor.GetDefaultInitialConfiguration();

    // Transfer logs to storage every minute
    dmc.Logs.ScheduledTransferPeriod = tsOneMinute;

    // Transfer verbose, critical, etc. logs
    dmc.Logs.ScheduledTransferLogLevelFilter = LogLevel.Verbose;

    // Start up the diagnostic manager with the given configuration
    DiagnosticMonitor.Start("Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString", dmc);

    // For information on handling configuration changes
    // see the MSDN topic at http://go.microsoft.com/fwlink/?LinkId=166357.
    RoleEnvironment.Changing += RoleEnvironmentChanging;

    return base.OnStart();
}

private void RoleEnvironmentChanging(object sender, RoleEnvironmentChangingEventArgs e)
{
    // If a configuration setting is changing
    if (e.Changes.Any(change => change is RoleEnvironmentConfigurationSettingChange))
    {
        // Set e.Cancel to true to restart this role instance
        e.Cancel = true;
    }
}

Now in your code use the Trace or Debug statements to log any information you require as shown below.

 Trace.WriteLine("Entry point called");

 

Compile Messages

Always review the warnings emitted after compiling your solution. In one of my projects I was referencing an assembly that was not part of the default core installation of the .Net framework. Luckily I noticed the warning message in the output window.

Ensure you set Copy Local to true on any referenced DLL that needs to be packaged with the solution before deploying to Azure.


 

Viewing blob and table storage within VS2010

You can view both Blob and Table storage contents within Visual Studio 2010 after installing the Windows Azure Tools for VS2010.

From the menu bar choose View and then Server Explorer. This will open the server explorer window where you can add a new storage account as shown below.


Once you have added the account details, you will be able to view the contents of your stores.


 

Message Queues

One important thing to remember about Azure queues: after you get a message, you must explicitly delete it from the queue once your processing finishes. If you don’t, the message will reappear on the queue after a configurable period.

The reason the message reappears is that your instance may be shut down by the Azure AppFabric halfway through processing your data, putting you at risk of losing data before the process completes. Hence you should delete the message from the queue at the end of your process, not straight after getting it off the queue, as in the sketch below.
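A minimal sketch of this pattern using the Azure storage client library; the connection string, queue name and ProcessOrder method are hypothetical:

using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Queue;

class QueueWorker
{
    public void ProcessNextMessage(string connectionString)
    {
        var account = CloudStorageAccount.Parse(connectionString);
        var queue = account.CreateCloudQueueClient().GetQueueReference("orders");

        // The message stays invisible to other consumers for 5 minutes
        CloudQueueMessage msg = queue.GetMessage(TimeSpan.FromMinutes(5));
        if (msg != null)
        {
            ProcessOrder(msg.AsString); // do all the work first
            queue.DeleteMessage(msg);   // delete only after processing succeeds
        }
    }

    void ProcessOrder(string body) { /* hypothetical processing */ }
}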

Also remember the order of delivery is not guaranteed.

You must always build logic in your code to take into account that your process may get shutdown at any time.

 

Windows Live Federated Credentials

The Windows Live authentication only provides a “token string” as one of the SAML claims. Note that this token is unique to your application. If you need access to the user’s email address, you will have to capture that information within your application and persist it yourself.