Enforcing Ordered Delivery using Azure Logic Apps and Service Bus

When consuming messages from an Azure Service Bus, the order is not guaranteed because the brokered messaging scheme allows multiple consumers to pull messages from the bus concurrently. Sure, you can force the Logic App to execute as a single instance, but then you sacrifice performance and scalability. You can also use ReceiveAndDelete, but then you lose the transactional nature of the bus. Ultimately, to consume messages in the correct order while keeping the transactional nature of the bus, you would add a sequence number to each message and use this to enforce the ordering.

To achieve ordered delivery using Logic Apps, you need to ensure all related messages are consumed by the same Logic App instance, and for this we use the Session Id property on the Service Bus message. Below is the full workflow process to enforce ordered delivery using Logic Apps and session Ids on the Service Bus subscription.

image

This scenario is based on a financial institution that requires all monetary transfers to be processed in an ordered fashion. The key is choosing a suitable session identifier, and with this in mind the account number was the most suitable candidate, as we want a single consumer to process all the transactions for a particular account number.

Here we have created a subscription for a topic called AccountTransfers. Note that "Enable sessions" is checked.

image

Once the service bus has been configured, we can now dissect the workflow to see how we can achieve ordered delivery.

The workflow is initiated by a polling Service Bus connector trigger. The properties of this connector are shown below. The key point here is to set the Session id to "Next Available". This forces the Logic App to create a new instance for each unique session id value found on the service bus.

image

The next action, "ProcessSBMessage", calls another Logic App which does the actual processing of the message found on the bus. Here I am just passing the raw base64 encoded message from the Service Bus trigger. Applying the "separation of concerns" pattern keeps the business logic away from the plumbing that ensures ordered delivery.

image

Once the message has been sent to the chained Logic App and a response has been returned, we can complete the message on the bus with the following action.

image

Next we go into a loop until the exit condition has been satisfied. I am going to use a counter that is incremented each time no message is found on the service bus. If no more messages are found on the service bus after roughly 30 seconds, the loop will exit.

image
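As a rough sketch (the variable name and threshold here are assumptions based on the description above), the Until loop's exit condition in code view could look something like this. With a 5 second delay per empty poll, six empty iterations give roughly the 30 seconds mentioned:

    "expression": "@greater(variables('LoopCounter'), 5)",
    "limit": {
        "count": 1000,
        "timeout": "PT1H"
    }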

Inside, the loop starts with a Service Bus connector action which gets the messages from the topic subscription. Here we only want to retrieve one message at a time from the service bus, using a peek-lock action and the Session Id from the initial trigger "When a message is received in a topic subscription". We then check whether a message was found in the output body using the expression @not(equals(length(body('Get_messages_from_a_topic_subscription_(peek-lock)')), 0)).

image

If a message is found, the "If True" branch is executed, which again calls the same Logic App as before to process the message. Note the indexer used to get to the ContentData, as the Service Bus connector action above returns a collection of messages.

image

Once a successful response is received from the ProcessSBMessage Logic App, the message is completed and the LoopCounter variable is reset to zero. Note that the lock token comes from the Service Bus connector action within the loop, while the Session Id comes from the initial Service Bus trigger which started the workflow.

image

Below is the code view for setting the lockToken and SessionId of the “Complete the message” action inside the loop.  Take note of the indexer “[0]” before the LockToken element.

image
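For reference, here is a minimal sketch of those two values (the action and trigger names are assumed from the earlier steps, and the surrounding action definition is omitted):

    "lockToken": "@body('Get_messages_from_a_topic_subscription_(peek-lock)')[0]?['LockToken']",
    "sessionId": "@triggerBody()?['SessionId']"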

If no messages are found on the service bus, the "If False" branch is executed. This simply has a delay action, so as not to poll too quickly, and increments the LoopCounter.

image

The last step is to close the session when the Until loop exits, using the Session Id from the initial Service Bus trigger which started the workflow.

image

Now you are ready to send messages into the service bus. You should see a Logic App instance spin up for each unique session Id. Remember to set the Session Id property on the message to some value before sending it.

Enjoy…

Error updating AX entities using the Dynamics 365 for Operations connector in Logic Apps

When trying to update an entity via the Dynamics 365 connector you may encounter the following error.

{ "status": 400, "message": "Only 1 of 2 keys provided for lookup, provide keys for SalesOrderNumber,dataAreaId.", "source": "127.0.0.1" }

One would think that passing the ItemInternalId GUID value, which is the primary key for the entity, as the Object Id property would be adequate to find the record to update. Apparently not, judging by the error being thrown back.

image

 

Apparently you need to supply the two keys mentioned in the error response, SalesOrderNumber and dataAreaId, as the Object Id, as shown below. Note the comma between the sales order number (Sales Order) and the dataAreaId (Company).

image

So the item path for the entity to update, as seen from the code view, would look like this:

image
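As a rough illustration only (the entity name, order number and company below are made-up examples, and the exact path format depends on the connector version), the item path combines both keys into the item identifier, something like:

    "path": "/datasets/default/tables/@{encodeURIComponent(encodeURIComponent('SalesOrderHeaders'))}/items/@{encodeURIComponent(encodeURIComponent('SO-100001,usmf'))}"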

Enjoy…

Fixing syntax errors when porting Logic Apps from the web portal to Visual Studio

After initially designing your Logic App in the Azure Portal, you may wish to port it to Visual Studio 2017 to manage the template under a source control repository and to further develop it from Visual Studio.

After porting the code, you may encounter some of the following errors when trying to save or deploy your Logic App from Visual Studio.

Error: …the string character ‘@’ at position ‘0’ is not expected.

This is fairly easy to identify, as Visual Studio highlights the code where it thinks the syntax is incorrect, as shown below. Remember this code was ported over from the Azure Portal where it parsed without any issues.

image

It is complaining about an unrecognised function. To get around this issue we need to use the "concat" string function to treat this as a string literal.

The original syntax is here: "ProductCodes": "[@{outputs('Compose_Product_Detail')}]"

Surrounding the whole value "[@{outputs('Compose_Product_Detail')}]" with concat as shown below resolves this error.

"@concat('[', outputs('Compose_Product_Detail'), ']')"

Breaking the designer when parameterising the subscription Id when calling a function

You may have a call-back function defined in your Logic App which has the subscription GUID embedded in the code (blanked out for security reasons), similar to below:

image

When trying to parameterise the subscription id by simply adding the function subscription().subscriptionId in place of the GUID value as shown below:

"function": {
"id": "/subscriptions/subscription().subscriptionId/resourceGroup…

you will get the following error when trying to save the changes.

image

To overcome this issue, wrap the value in a concat function as shown below. Note the URL has been shortened with "…" to make it readable.

"function": {
"id": "[concat('/subscriptions/', subscription().subscriptionId, '/resourceGroups/…/providers/Microsoft.Web/sites/…/functions/Cmn_GuidMapNullValue')]"

You can also apply this same technique to parameterise other information in the Id key such as the website location.
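For example (a sketch only, where the resource group lookup and the function app name parameter are placeholders of my own choosing), the whole Id can be built from template functions:

    "id": "[concat('/subscriptions/', subscription().subscriptionId, '/resourceGroups/', resourceGroup().name, '/providers/Microsoft.Web/sites/', parameters('functionAppName'), '/functions/Cmn_GuidMapNullValue')]"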

Enjoy…

Robust Cloud Integration with Azure

For the last year I have been busy co-authoring this book on Azure cloud integration with my fellow co-authors Abhishek Kumar, Martin Abbott, Gyanendra Kumar Gautam, James Corbould and Ashish Bhambhani.

image

It is available on the Packt website here: https://www.packtpub.com/virtualization-and-cloud/robust-cloud-integration-azure

This book will teach you how to design and implement cloud integration using Microsoft Azure. It starts by showing you how to build, deploy, and secure the API app. Next, it introduces you to Logic Apps and helps you quickly start building your integration applications. We'll then go through the different connectors available for Logic Apps to build your automated business process workflow. It's packed with a lot of information, spanning just under 700 pages.

Don't forget to check out another publication I co-authored back in 2015 with Mark Brimble, Johann Cooper and Colin Dijkgraaf, called SOA Patterns with BizTalk Server 2013 and Microsoft Azure.

image

And it is still available from the Packt website here: https://www.packtpub.com/networking-and-servers/soa-patterns-biztalk-server-2013-second-edition

Hope you enjoy reading it, just as I enjoyed writing the content.

Searching through messages in Logic Apps

Unfortunately Logic Apps do not provide an easy option to view the contents of a message unless you go through each log entry and view the outputs as shown below.

image

However there is an alternative method using Log Analytics which comes with Operations Management Suite (OMS). By using Log Search you can search for specific property values within your messages. Below is an example of searching through the diagnostics log of a Logic App for a particular JobId and the results using OMS.

image

 

To start using this feature we need to set up OMS first, using the steps below.

1. In the Marketplace, search for "Log Analytics" and select it.

image

2. Create the OMS Workspace using a suitable name and resource group.

image

3. Next we need to add a storage account for the Logic Apps to store diagnostic data. From the Marketplace, search for “Storage Account” and select it.

image

Create the storage account by providing a name and leave the “Account kind” as “General purpose”.

image

4. Once the storage account is created, we need to link this to the OMS Workspace. Click on the Log Analytics resource that was created in step 2 as shown below.

image

In the properties blade, scroll down to the “Workspace Data Sources” and click on “Storage account logs”.

image

Then click the plus sign to add a storage account. Choose the storage account you created previously.

image

After you have chosen the storage account, select the "Data Type" and choose Events. Then click "OK" at the bottom of the page.

image

Now that all the plumbing has been configured, we can turn our attention to the Logic App. For this example we are going to create a simple Logic App that receives a purchase order, sends it to RequestBin and then returns a status code of OK.

image

Here is an example of the purchase order we are going to post to the Logic App.

{
   "CustomerCode": "CUST1000",
   "Lines": [
      {
         "LineNo": 1,
         "Price": 68.25,
         "ProductCode": "PRD1100",
         "Qty": 1
      }
   ],
   "OrderNo": "1000",
   "Total": 68.25
}

Once we have created the logic app, select code view to add our custom tracked properties on an action. I want to be able to search for orders using either the OrderNo, CustomerCode or the Total order value.

To do this add the highlighted “trackedProperties” section to the action, specifying the attribute name to search on and the path in the message to obtain the value from.

image
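Below is a sketch of what that section might look like on the HTTP action posting the order to RequestBin (the action name and URI are assumptions, and note that tracked properties can only reference the action's own inputs and outputs):

    "Send_to_RequestBin": {
        "type": "Http",
        "inputs": {
            "method": "POST",
            "uri": "https://requestb.in/<bin-id>",
            "body": "@triggerBody()"
        },
        "runAfter": {},
        "trackedProperties": {
            "OrderNo": "@action()['inputs']['body']['OrderNo']",
            "CustomerCode": "@action()['inputs']['body']['CustomerCode']",
            "Total": "@action()['inputs']['body']['Total']"
        }
    }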

Now that the logic app has been created and saved, we need to turn on Diagnostics for this Logic App. Under the Monitoring section of the Logic App, click on Diagnostics and then Diagnostics Settings shown below.

image

Set the "Status" to On, check "Archive to a storage account", select the storage account that was provisioned previously, and set the retention periods to what you require.

image

Now check "Send to Log Analytics", select the OMS Workspace created before, and then click the Save button.

image

Everything should be good to go now. Use something like Postman to start sending test messages to the Logic App. After a few minutes you should see the tracked properties and their values being written to the blob store under the storage account, in a container called "insights-logs-workflowruntime".

If you keep drilling down into the containers that match your Logic App name, you will see a file called "PT1H.json". Inside the file you will see the entries for the tracked properties.

image

To use OMS to search on one of your properties, click on Log Analytics under your Logic App.

image

Once the blade opens, click on the OMS Portal link which opens the portal site. On the portal site, click the "Get Started" icon. Then under "Data" and "Custom Fields" you should be able to see your custom tracked properties. Take note of these field names, as they will be used in the search query.

image

Now click the Search icon on the left navigation pane, enter the query Type=AzureDiagnostics resource_workflowName_s=Orders into the search box, and then click search. Note it can take a few minutes before the data turns up in OMS if you have just submitted a message to the Logic App. The query will list all runs triggered for the Logic App named "Orders".

You should get a list of all the triggers related to the “Orders” logic app as shown below. Here I found 118 events in the last day.

image

You can narrow your search down further by modifying the search query. To find orders with an order number equal to 1004, you would enter Type=AzureDiagnostics resource_workflowName_s=Orders trackedProperties_OrderNo_s=1004 into the query field. This will display the records matching the order number.

image

Also, left-clicking on the ellipsis (…) next to each field brings up a context menu providing more filtering options.

image

In conclusion, OMS provides the ability to search for tracked properties, save common queries and create custom dashboards. I encourage you to look at all the features available in OMS, as we have only scratched the surface in this post.

Enjoy.

Securing an MVC Web API using Azure AD and the Client Credentials flow

In this blog I will show you how to add authorisation to an MVC controller, set up Azure AD as an OAuth 2.0 JWT (JSON Web Token) provider, and use an Azure AD endpoint to obtain the access token. I will be using the Client Credentials grant flow to access a protected Web API resource.

The Client Credentials flow uses a Client Id and a Client Secret value. These are first sent to an OAuth token provider to retrieve an access token. This is ideal for a client application calling a web service resource without a user supplying their login credentials. The following sequence diagram shows how a website would call a web service.

image

Registering the Web API Service

Let's first start by provisioning an Azure Active Directory using the classic portal, as we will require the generated Audience and Tenant values for our MVC Web API later.

image

image

Next add a name for your Directory and a unique Domain Name to use. For this example I called it “connectedcircuits”

image

Once the Domain has been configured it will be shown in your active directory listings.

image

Next we will add an application to this domain for the Web API we are developing. Click on Applications; note there will already be a default application registered for the Office 365 Management APIs.

image

Scroll down to the bottom of the page and click the Add button. It will then show the following dialogue box. Select “Add an application my organisation is developing”

image

Give it a suitable name and select “Web Application and/or Web API”

image

In the next screen you can set the Sign-on URL to “http://localhost” as we won’t be using this URL for  Client Credential flow.

Now add an App ID URI for the API we will be developing. It must be unique within your organisation’s directory. In our scenario I have chosen “https://connectedcircuits.onmicrosoft.com/DemoProtectedAPI” which represents the domain and the name of the Web API we will be building.

image

We can now start on building the MVC Web API. Create a new ASP.NET Web Application project in Visual Studio.

image

Select the Web API template, set the Authentication to “No Authentication” and uncheck “Host in the cloud”. We will setup the authentication manually and host this locally in IIS Express for now.

image

Once the template solution is created, install the following NuGet packages into the project.

  • Install-Package Microsoft.Owin
  • Install-Package Microsoft.Owin.Security.ActiveDirectory
  • Install-Package Microsoft.Owin.Host.SystemWeb – this package is required when hosting your API in IIS Express.
  • Install-Package Microsoft.AspNet.WebApi.Owin
  • Install-Package System.IdentityModel.Tokens.Jwt -Version 4.0.2.206221351

After the packages have been installed, add a new partial class called Startup.cs to the App_Start folder.

image

Add the following code into this class. It reads the configuration settings from the web.config file and initialises the OWIN authentication.

using Microsoft.Owin;
using Owin;
using System.Web.Http;
using Microsoft.Owin.Security.ActiveDirectory;
using System.Configuration;
using System.IdentityModel.Tokens;

[assembly: OwinStartup(typeof(DemoProtectedAPI.App_Start.Startup))]

namespace DemoProtectedAPI.App_Start
{
    public partial class Startup
    {
        public void Configuration(IAppBuilder app)
        {
            HttpConfiguration config = new HttpConfiguration();
            ConfigureAuth(app);
            WebApiConfig.Register(config);

            app.UseWebApi(config);
        }

        private void ConfigureAuth(IAppBuilder app)
        {
            // Validate that the token was issued for our App ID URI (the Audience)
            var tokenValidationParameter = new TokenValidationParameters();
            tokenValidationParameter.ValidAudience = ConfigurationManager.AppSettings["Azure.AD.Audience"];

            // Validate that the token was issued by our Azure AD tenant
            app.UseWindowsAzureActiveDirectoryBearerAuthentication(
                new WindowsAzureActiveDirectoryBearerAuthenticationOptions
                {
                    Tenant = ConfigurationManager.AppSettings["Azure.AD.Tenant"],
                    TokenValidationParameters = tokenValidationParameter
                });
        }
    }
}

 

In the web.config, add the two keys shown below to the <appSettings> section. The Tenant value is the domain name that was specified when creating the directory, and the Audience value is the App ID URI that was given when adding an application to the directory.

<appSettings>
  <add key="Azure.AD.Tenant"
    value="connectedcircuits.onmicrosoft.com" />
  <add key="Azure.AD.Audience"
    value="https://connectedcircuits.onmicrosoft.com/DemoProtectedAPI" />
</appSettings>

 

Now, to protect the methods in a controller, add the [Authorize] attribute to the class as shown below.

image
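For reference, here is a minimal sketch of the stock ValuesController from the template with the attribute applied:

using System.Collections.Generic;
using System.Web.Http;

namespace DemoProtectedAPI.Controllers
{
    [Authorize] // every action in this controller now requires a valid bearer token
    public class ValuesController : ApiController
    {
        // GET api/values
        public IEnumerable<string> Get()
        {
            return new[] { "value1", "value2" };
        }
    }
}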

One final step is to turn on the SSL requirement for this project, as OAuth 2.0 relies on a secure channel. Click on the project to display the Properties page and set SSL Enabled to True. Take note of the URL and port number.

image

Now if you try to navigate to the GET Values method, you should get an authorization error as shown below:

image

Registering the client app

To be able to access the methods exposed in the MVC controller, you will need to send a bearer token in the Authorization HTTP header. To get the token, we will need to set up another application for the client in the same domain as the Web API service.

From the Azure Classic Portal, go to the Active Directory resources and select the name of the active directory that was created at the beginning of this blog. Click on Applications and then click on the Add button at the bottom of the page.

image

Again select “Add an application my organization is developing”. In the dialogue box give it a suitable name and set the type to Web Application and/or Web API as shown below.

image

In the next screen, you can set the sign-on URL to http://localhost as it is not used for this type of authentication flow. The App ID URI should be set to a unique value; it's used as a unique logical identifier for your app.

image

Once it has been created, click "Configure", scroll to the bottom of the page and click the "Add application" button. This is how we associate the web service created earlier with this client application.

image

Set the “Show” dropdown control to “All Apps” and click the Tick icon. This should display all the apps listed in your domain. Highlight your web service and click on the plus icon.

image

After clicking the plus icon, the app will be shown in the Selected column. Then click the "Tick" icon at the bottom of the page to return to the Configure page.

image

Back on the Configure page, set the delegated permissions for the app you just added as shown below, and click Save.

image

Now we need to obtain the client key that will be used for requesting the JWT. Scroll up to the Keys section and generate a new key. Note the key won't be shown until you press the Save button at the bottom of the page, and you will need to take a copy of it.

image

Next we will need to take a copy of the Client Id, which is also on the Configure page.

image

The last piece we need is the OAuth token endpoint. This is found by scrolling to the bottom of the page and pressing the View Endpoints button.

image

Now take a copy of the OAuth token endpoint.

image

We now have the 4 pieces of information required to request the JWT.

  1. Web API endpoint – the App ID URI specified when registering the Web API service.
    • https://connectedcircuits.onmicrosoft.com/DemoProtectedAPI
  2. Client Id – created when registering the client app.
    • 7f1fd29a-d20f-4158-8f47-eff1ea87dc38
  3. Client Key – generated when registering the client app.
    • OPAfMI4IixoYnGAPiCpLvAvYecH92TXcC1/8wH31HD0=
  4. OAuth 2.0 token endpoint – found under App endpoints from either the Web API or the client app.
    • https://login.microsoftonline.com/f3301fa6-4178-44a9-b5a9-e945e69e4638/oauth2/token

Using the above information, we make a POST to the OAuth token endpoint with the following raw body form content to get a bearer token: resource=<web_api_endpoint>&grant_type=client_credentials&client_id=<client_id>&client_secret=<client_key>

Using a tool like Postman, POST the following request. Note that the generated client key needs to be URL encoded, and the header Content-Type: application/x-www-form-urlencoded must be set.

image

The actual token request message is shown below.

POST /f3301fa6-4178-44a9-b5a9-e945e69e4638/oauth2/token HTTP/1.1
Host: login.microsoftonline.com
Content-Type: application/x-www-form-urlencoded
Cache-Control: no-cache

resource=https://connectedcircuits.onmicrosoft.com/DemoProtectedAPI&grant_type=client_credentials&client_id=7f1fd29a-d20f-4158-8f47-eff1ea87dc38&client_secret=OPAfMI4IixoYnGAPiCpLvAvYecH92TXcC1%2F8wH31HD0%3D

If all went well, you should get the following JSON Web Token (JWT) response:

{
  "token_type": "Bearer",
  "expires_in": "3600",
  "ext_expires_in": "0",
  "expires_on": "1480196246",
  "not_before": "1480192346",
  "resource": "https://connectedcircuits.onmicrosoft.com/DemoProtectedAPI",
  "access_token": "<token string>"
}

Now, to access the Web API service, we need to pass the access_token value in the request Authorization header as a bearer token. Take note that the value of the Authorization header requires the OAuth token to be prefixed with "Bearer" followed by a space.

image

The raw request is shown below and the bearer token has been truncated for clarity.

GET /api/Values HTTP/1.1
Host: localhost:44312
Authorization: Bearer eyJ0eXAiOiJKV1QiLCJhbGciOiJSUzI1NiI…
Cache-Control: no-cache
Postman-Token: 1eb00cbd-6eb9-c7b0-182d-a55fcd4f044a
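If you want to make the same two calls from code rather than Postman, below is a rough C# sketch. The tenant id, client id and key are the placeholders gathered earlier; error handling and proper JSON parsing are omitted for brevity.

using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Net.Http.Headers;

class TokenClientDemo
{
    static void Main()
    {
        using (var http = new HttpClient())
        {
            // Step 1: request the token. FormUrlEncodedContent URL-encodes the client key for us.
            var tokenResponse = http.PostAsync(
                "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
                new FormUrlEncodedContent(new Dictionary<string, string>
                {
                    ["resource"] = "https://connectedcircuits.onmicrosoft.com/DemoProtectedAPI",
                    ["grant_type"] = "client_credentials",
                    ["client_id"] = "<client-id>",
                    ["client_secret"] = "<client-key>"
                })).Result;
            var json = tokenResponse.Content.ReadAsStringAsync().Result;

            // Crude token extraction to keep the sketch dependency free;
            // use a real JSON parser (e.g. Json.NET) in production code.
            var accessToken = json.Split(new[] { "\"access_token\":\"" }, StringSplitOptions.None)[1]
                                  .Split('"')[0];

            // Step 2: call the protected API with the bearer token.
            http.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", accessToken);
            var apiResponse = http.GetAsync("https://localhost:44312/api/Values").Result;
            Console.WriteLine(apiResponse.Content.ReadAsStringAsync().Result);
        }
    }
}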

 

Enjoy.

Azure SQL Database Timeout Exceptions

I came across an issue where an Azure database was throwing timeout exceptions like the one below when under heavy load.

Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding. This failure occurred while attempting to connect to the routing destination. The duration spent while attempting to connect to the original server was – [Pre-Login] initialization=2; handshake=23; [Login] initialization=0; authentication=0; [Post-Login] complete=1;

 

The project I was working on involved inserting several thousand rows of data from a single message received from an Azure Service Bus queue. A WebJob reads the messages off the queue and inserts several hundred rows at a time to reduce the transaction size and time.

I had also used the Enterprise Library Transient Fault Handling Application Block (https://msdn.microsoft.com/en-us/library/hh680934(v=pandp.50).aspx) to wrap the code that calls the insert stored procedure, to handle any network transients that may occur.

Initially I set the retry interval to 1 second and was unsure why I was still getting timeout exceptions, as the retry logic should have taken care of them. It only made sense when I came across this article about transient errors: https://azure.microsoft.com/nb-no/documentation/articles/sql-database-connectivity-issues/

The sentence in the article that stood out for me is highlighted below:

We recommend that you delay for 5 seconds before your first retry. Retrying after a delay shorter than 5 seconds risks overwhelming the cloud service. For each subsequent retry the delay should grow exponentially, up to a maximum of 60 seconds.
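Following that advice, here is a sketch of how the retry policy could be configured with the Transient Fault Handling block (the retry count is an arbitrary choice of mine):

using System;
using Microsoft.Practices.EnterpriseLibrary.TransientFaultHandling;

// First retry after ~5 seconds, growing exponentially up to a 60 second ceiling.
// Arguments: retryCount, minBackoff, maxBackoff, deltaBackoff
var strategy = new ExponentialBackoff(5,
    TimeSpan.FromSeconds(5), TimeSpan.FromSeconds(60), TimeSpan.FromSeconds(5));

var retryPolicy = new RetryPolicy<SqlDatabaseTransientErrorDetectionStrategy>(strategy);
retryPolicy.ExecuteAction(() =>
{
    // call the insert stored procedure here
});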

 

Another issue I had was when inserting data into a database table from within a loop. This was causing the Azure database to throttle, and the Database Throughput Unit (DTU) usage to flat-line at 100%. I resolved this by forcing a 3 second delay after each iteration of the loop, after reading this article from Microsoft: https://azure.microsoft.com/en-us/documentation/articles/sql-database-resource-limits/. The delay caused the DTU usage to settle around the 50-70% mark.

 

In conclusion, it seems the short retry period and the tight insert loop together caused the timeout exceptions. After increasing the retry delay and adding a pause in the looping function, the timeout exceptions disappeared.

Using Azure Application Insights with BizTalk. Part 1

One of the great features of Application Insights is the capability to monitor IIS websites and web services. It uses an agent installed on the IIS server to send telemetry data to Application Insights running in Azure.

In this blog I will set up monitoring of a BizTalk schema published as a WCF service, hosted on the server running BizTalk.

First you will need a Microsoft Azure subscription. Log on to Azure and create a new Application Insights service as shown below. Set the Application Type to ASP.NET web application. This will install the default IIS template on the dashboard for monitoring websites and web services. You also have the option to add monitoring of the other available metrics, or to delete the existing ones added by the template.

image

Note the location of the data store can only be set to Central US at this time. When the blade is created, the default template will look very similar to the one below.

image

Once the service has been created, you will need the generated Instrumentation Key which is available from the properties blade.

image

Next, download the Status Monitor from https://azure.microsoft.com/en-us/documentation/articles/app-insights-monitor-performance-live-website-now/ and install it on your BizTalk server which has IIS installed. Execute the file Microsoft.Diagnostics.Agent.StatusMonitor.exe from the default install location c:\Program Files\Microsoft Application Insights\Status Monitor.

When it comes to selecting the published WCF service and the telemetry subscription, you will get the error shown below.

image

This is because the monitor application cannot read the web.config file in the web service directory created by BizTalk. The namespace added to the Configuration element, highlighted below, must be temporarily removed and the monitor application executed again so it can correctly read the web.config file.

image
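For illustration, the kind of edit involved looks like the snippet below (the namespace shown is the one BizTalk typically adds; confirm against your own web.config):

<!-- Before: the namespace on the configuration element trips up the Status Monitor -->
<configuration xmlns="http://schemas.microsoft.com/.NetConfiguration/v2.0">

<!-- After: temporarily strip the attribute, run the Status Monitor, then restore it -->
<configuration>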

After the Monitor application has been executed with the namespace removed, it will successfully update the web.config file and add the ApplicationInsights.config to the web service folder. Remember to add the namespace back into the web.config file.

image

If you open the ApplicationInsights.config file you will see the instrumentation key already inserted under the InstrumentationKey element.

Now when you start sending requests to the WCF web service endpoint, you will start seeing metrics being captured under the Performance investigation option as shown below.

image

You also have the capability to create alerts and to set up monitoring of other metrics if you so desire.

Enjoy.

Testing an Azure Relay Service Bus using Postman

One of my favourite tools for testing RESTful services is Postman. In this blog I will describe how to use this tool for testing an Azure Relay service bus. Remember, if you are hosting the service in IIS you will need to ensure the service is already warm-started. Here is a blog by Jeroen Maes describing how to do this.

First we need to generate the authorization token using the SAS keys, as described in this Microsoft article: Shared Access Signature Authentication with Service Bus.

Below is some sample code to create a SAS token to be passed in the request header.

using System;
using System.Globalization;
using System.Security.Cryptography;
using System.Text;
using System.Web;

public static string GenerateToken(string resourceUri, string sasKeyName, string sasKey)
{
    // Set the token lifespan: 1 hour from now, expressed in seconds since the Unix epoch
    TimeSpan sinceEpoch = DateTime.UtcNow - new DateTime(1970, 1, 1);
    var expiry = Convert.ToString((int)sinceEpoch.TotalSeconds + 3600);

    // Sign the encoded resource URI and expiry with the SAS key
    string stringToSign = HttpUtility.UrlEncode(resourceUri) + "\n" + expiry;
    HMACSHA256 hmac = new HMACSHA256(Encoding.UTF8.GetBytes(sasKey));
    var signature = Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(stringToSign)));

    // Format the SAS token
    var sasToken = String.Format(CultureInfo.InvariantCulture,
        "SharedAccessSignature sr={0}&sig={1}&se={2}&skn={3}",
        HttpUtility.UrlEncode(resourceUri), HttpUtility.UrlEncode(signature), expiry, sasKeyName);
    return sasToken;
}

The function GenerateToken() takes in 3 parameters:
  • resourceUri – the URL of the web service endpoint in Azure, e.g. https://mywebservice.servicebus.windows.net/RequestAvailability
  • sasKeyName – the name given to the authorization rule in Azure Service Bus.
  • sasKey – the generated SAS key from Azure Service Bus.
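For example, a call might look like this (the key value is a placeholder):

var sasToken = GenerateToken(
    "https://mywebservice.servicebus.windows.net/RequestAvailability",
    "MyKeyName",
    "<your SAS key>");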

Calling the GenerateToken() function will return a string value similar to this: SharedAccessSignature sr=https%3a%2f%2fmywebservice.servicebus.windows.net%2fRequestAvailability&sig=UUEIhOkWX3FzrtBsRia9WFeYxbhMQ9FdppmFMjuJv7U%3d&se=1443495583&skn=MyKeyName

Now we can set up the headers for the request in Postman. Add a header called Authorization and set the value to the string returned from GenerateToken(). Now you are ready to send your request to the relay service bus.

image

Enjoy.

Reserving the IP address currently assigned to a Cloud Service

When deploying a cloud service, you will be assigned a random public virtual IP (VIP) address. This address will change if you delete your Web/Worker role and then deploy another version of your code.

With the latest release of the Azure PowerShell Cmdlets, you can now reserve the VIP address that was assigned to your Web/Worker role by using the following cmdlet.

New-AzureReservedIP -ReservedIPName <name of reservation> -Location <region> -ServiceName <name of your service>

e.g. New-AzureReservedIP -ReservedIPName "myIPReservation" -Location "Australia East" -ServiceName "myWorkerRoleName"

Check whether the reservation has been successful by executing the cmdlet Get-AzureReservedIP. The InUse property should be True and the ServiceName should be the name of your Web/Worker role.

image

Now, to ensure that your cloud service will always use the reserved IP address with each new deployment, you will need to update the service configuration file (.cscfg) and add the ReservedIP element.

<ServiceConfiguration serviceName="TCPSocketListener" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration" osFamily="4" osVersion="*" schemaVersion="2015-04.2.6">
  <Role name="WorkerRole1">
    <Instances count="1" />
    <ConfigurationSettings>
    </ConfigurationSettings>
  </Role>
  <NetworkConfiguration>
    <AddressAssignments>
      <ReservedIPs>
        <ReservedIP name="MyCloudServiceName"/>
      </ReservedIPs>
    </AddressAssignments>
  </NetworkConfiguration>
</ServiceConfiguration>