Enforcing Ordered Delivery using Azure Logic Apps and Service Bus

When consuming messages from Azure Service Bus, ordering is not guaranteed because the brokered messaging model allows multiple consumers to pull messages from the bus concurrently. Sure, you can force the Logic App to execute as a single instance, but then you sacrifice performance and scalability. You can also use ReceiveAndDelete, but then you lose the transactional nature of the bus. Ultimately, to ensure messages are consumed in the correct order while keeping the transactional nature of the bus, you add a sequence number to each message and use it to enforce the ordering.

To achieve ordered delivery using Logic Apps, you need to ensure all related messages are consumed by the same Logic App instance, and for this we use the session Id property on the Service Bus. Below is the full workflow that forces ordered delivery using Logic Apps and session Ids on the Service Bus subscription.

[Image: overview of the ordered-delivery Logic App workflow]

This scenario is based on a financial institution which requires all monetary transfers to be processed in order. The key is choosing a suitable session identifier, and with this in mind the account number is the most suitable candidate, as we want a single consumer to process all the transactions for a particular account number.

Here we have created a subscription for a topic called AccountTransfers. Note that "Enable sessions" is checked.

[Image: AccountTransfers topic subscription with "Enable sessions" checked]

Once the Service Bus has been configured, we can dissect the workflow to see how ordered delivery is achieved.

The workflow is initiated by a polling Service Bus connector trigger. The properties of this trigger are shown below. The key point here is to set the Session id to "Next Available". This forces the Logic App to create a new instance for each unique session Id value found on the Service Bus.

[Image: Service Bus trigger properties with Session id set to "Next Available"]
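For reference, a minimal sketch of what this trigger can look like in code view is shown below. The topic and subscription names, the recurrence and the exact connector path are assumptions that will vary with your environment and connector version; the important part is the "Next Available" session Id.

"When_a_message_is_received_in_a_topic_subscription": {
   "type": "ApiConnection",
   "recurrence": {
      "frequency": "Second",
      "interval": 30
   },
   "inputs": {
      "host": {
         "connection": {
            "name": "@parameters('$connections')['servicebus']['connectionId']"
         }
      },
      "method": "get",
      "path": "/@{encodeURIComponent('AccountTransfers')}/subscriptions/@{encodeURIComponent('Transfers')}/messages/head/peek",
      "queries": {
         "sessionId": "Next Available"
      }
   }
}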

The next action, "ProcessSBMessage", calls another Logic App which does the processing of the message found on the bus. Here I am just passing the raw base64 encoded message from the Service Bus trigger. Using the "separation of concerns" pattern moves the business logic away from the process of ensuring ordered delivery.

[Image: ProcessSBMessage action calling the nested Logic App]
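In code view the nested call is a Workflow action along these lines; the workflow resource id is elided, and the body shape is an assumption based on the trigger's ContentData property.

"ProcessSBMessage": {
   "type": "Workflow",
   "inputs": {
      "host": {
         "triggerName": "manual",
         "workflow": {
            "id": "/subscriptions/…/resourceGroups/…/providers/Microsoft.Logic/workflows/ProcessSBMessage"
         }
      },
      "body": {
         "ContentData": "@triggerBody()?['ContentData']"
      }
   },
   "runAfter": {}
}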

Once the message has been sent to the chained Logic App and a response has been returned, we can complete the message on the bus with the following action.

[Image: "Complete the message in a topic subscription" action]
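As a rough sketch, assuming the connector's complete operation, both the lock token and session Id come straight from the trigger output at this point:

"Complete_the_message_in_a_topic_subscription": {
   "type": "ApiConnection",
   "inputs": {
      "host": {
         "connection": {
            "name": "@parameters('$connections')['servicebus']['connectionId']"
         }
      },
      "method": "delete",
      "path": "/@{encodeURIComponent('AccountTransfers')}/subscriptions/@{encodeURIComponent('Transfers')}/messages/complete",
      "queries": {
         "lockToken": "@triggerBody()?['LockToken']",
         "sessionId": "@triggerBody()?['SessionId']"
      }
   },
   "runAfter": {
      "ProcessSBMessage": [
         "Succeeded"
      ]
   }
}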

Next we go into a loop until the exit condition has been satisfied. I am going to use a counter that is incremented whenever no messages are found on the Service Bus. If no more messages are found after 30 seconds, the loop will exit.

[Image: Until loop with the LoopCounter exit condition]
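The loop skeleton can be sketched as below. The threshold of 6 is an assumption: combined with the 5 second delay in the "If false" branch further down, roughly 30 seconds must pass without a message before the loop exits. The LoopCounter variable is assumed to be initialised to zero earlier in the workflow.

"Until_no_more_messages": {
   "type": "Until",
   "expression": "@greaterOrEquals(variables('LoopCounter'), 6)",
   "limit": {
      "count": 1000,
      "timeout": "PT1H"
   },
   "actions": {
      …
   }
}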

The loop starts with another Service Bus connector action which gets the messages from the topic subscription. Here we only want to retrieve one message at a time from the Service Bus using a peek-lock action, and we use the Session Id from the initial Service Bus trigger "When a message is received in a topic subscription". We then check if a message was found in the output body using the expression "@not(equals(length(body('Get_messages_from_a_topic_subscription_(peek-lock)')), 0))".

[Image: "Get messages from a topic subscription (peek-lock)" action and the message-found condition]
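A sketch of the peek-lock action inside the loop follows; again the connector path is an assumption, while the single-message batch size and the session Id taken from the trigger are the points that matter.

"Get_messages_from_a_topic_subscription_(peek-lock)": {
   "type": "ApiConnection",
   "inputs": {
      "host": {
         "connection": {
            "name": "@parameters('$connections')['servicebus']['connectionId']"
         }
      },
      "method": "get",
      "path": "/@{encodeURIComponent('AccountTransfers')}/subscriptions/@{encodeURIComponent('Transfers')}/messages/batch/peek",
      "queries": {
         "maxMessageCount": 1,
         "sessionId": "@triggerBody()?['SessionId']"
      }
   }
}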

If a message is found, the "If true" branch is executed, which again calls the same Logic App as before to process the message. Note the indexer used to get to the content data, as the Service Bus connector action above returns a collection.

[Image: "If true" branch calling the ProcessSBMessage Logic App]

Once a successful response is received from the ProcessSBMessage Logic App, the message is completed and the LoopCounter variable is reset to zero. Note that the lock token comes from the Service Bus connector action within the loop, while the Session Id comes from the initial Service Bus trigger which started the workflow.

[Image: completing the message and resetting the LoopCounter variable]

Below is the code view for setting the lockToken and sessionId of the "Complete the message" action inside the loop. Take note of the indexer "[0]" before the LockToken element.

[Image: code view of the "Complete the message" action inside the loop]
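Reconstructed from the description above, the queries section looks something like this; the surrounding action is otherwise the same shape as the earlier complete action.

"queries": {
   "lockToken": "@{body('Get_messages_from_a_topic_subscription_(peek-lock)')[0]?['LockToken']}",
   "sessionId": "@{triggerBody()?['SessionId']}"
}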

If no messages are found on the Service Bus, the "If false" branch is executed. This simply has a delay action, so as not to poll too quickly, and increments the LoopCounter.

[Image: "If false" branch with the delay action and LoopCounter increment]
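A sketch of that branch is below; the 5 second interval is an assumption chosen to line up with the 30 second exit window mentioned earlier.

"Delay": {
   "type": "Wait",
   "inputs": {
      "interval": {
         "count": 5,
         "unit": "Second"
      }
   }
},
"Increment_LoopCounter": {
   "type": "IncrementVariable",
   "inputs": {
      "name": "LoopCounter",
      "value": 1
   },
   "runAfter": {
      "Delay": [
         "Succeeded"
      ]
   }
}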

The last step is to close the session when the Until loop exits, using the Session Id from the initial Service Bus trigger which started the workflow.

[Image: "Close a session" action using the trigger's Session Id]

Now you are ready to send messages into the Service Bus. You should see a Logic App instance spin up for each unique session Id. Remember to set the session Id property on the message to some value before sending it.

Enjoy…

Error updating AX entities using the Dynamics 365 for Operations connector in Logic Apps

When trying to update an entity via the Dynamics 365 connector you may encounter the following error.

{
   "status": 400,
   "message": "Only 1 of 2 keys provided for lookup, provide keys for SalesOrderNumber,dataAreaId.",
   "source": "127.0.0.1"
}

One would think that passing the ItemInternalId guid value, which is the primary key for the entity, as the Object Id property would be adequate to find the record to update. The error being thrown back suggests otherwise.

[Image: update action with ItemInternalId passed as the Object Id]

Apparently you need to supply the two keys mentioned in the error response message, SalesOrderNumber and dataAreaId, as the Object Id, as shown below. Note the comma between the sales order number (Sales Order) and the dataAreaId (Company).

[Image: update action with "SalesOrderNumber,dataAreaId" passed as the Object Id]

So the item path for the entity to update, as seen from the code view, would look like this:

[Image: code view of the item path for the update action]
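As an illustration only, with a hypothetical entity name, sales order number and company, the item path might take a shape like the following; the exact path segments depend on the connector version.

"path": "/datasets/default/tables/@{encodeURIComponent('SalesOrderHeaders')}/items/@{encodeURIComponent('SO-101375,usmf')}"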

Enjoy…

Fixing syntax errors when porting Logic Apps from the web portal to Visual Studio

After initially designing your Logic App in the Azure Portal, you may wish to port it to Visual Studio 2017 to manage the template under a source control repository and to further develop it from Visual Studio.

After porting the code, you may encounter some of the following errors when trying to save or deploy your Logic App from Visual Studio.

Error: "…the string character '@' at position '0' is not expected."

This is fairly easy to identify, as Visual Studio highlights the code whose syntax it thinks is incorrect, as shown below. Remember this code was ported over from the Azure Portal, where it parsed without any issues.

[Image: Visual Studio highlighting the invalid expression]

It is complaining about an unrecognised function. To get around this issue we need to use the "concat" string function to treat the value as a string literal.

The original syntax is here: "ProductCodes": "[@{outputs('Compose_Product_Detail')}]"

Surrounding the whole value "[@{outputs('Compose_Product_Detail')}]" with concat, as shown below, resolves this error.

"@concat('[', outputs('Compose_Product_Detail'), ']')"

Breaking the designer when parameterising the subscription Id when calling a function

You may have a call-back function defined in your Logic App which has the subscription guid embedded in the code (blanked out for security reasons) similar to below:

[Image: function definition with the subscription guid embedded in the id]

When trying to parameterise the subscription Id by simply adding the expression "subscription().subscriptionId" in place of the guid value as shown below:

"function": {
   "id": "/subscriptions/subscription().subscriptionId/resourceGroup…

You will get the following error when trying to save the changes.

[Image: Visual Studio error when saving the parameterised id]

To overcome this issue, wrap the value in a concat function as shown below. Note the url has been shortened with "…" to make it readable.

"function": {
   "id": "[concat('/subscriptions/', subscription().subscriptionId, '/resourceGroups/…/providers/Microsoft.Web/sites/…/functions/Cmn_GuidMapNullValue')]"

You can also apply this same technique to parameterise other information in the Id key such as the website location.

Enjoy…

Robust Cloud Integration with Azure

For the last year I have been busy co-authoring this book on Azure cloud integration with my fellow co-authors Abhishek Kumar, Martin Abbott, Gyanendra Kumar Gautam, James Corbould and Ashish Bhambhani.

[Image: Robust Cloud Integration with Azure book cover]

It is available on the Packt website here: https://www.packtpub.com/virtualization-and-cloud/robust-cloud-integration-azure

This book will teach you how to design and implement cloud integration using Microsoft Azure. It starts by showing you how to build, deploy, and secure the API app. Next, it introduces you to Logic Apps and helps you quickly start building your integration applications. We'll then go through the different connectors available for Logic Apps to build your automated business process workflows. It's packed with a lot of information, spanning just under 700 pages.

Don't forget to check out another publication I co-authored back in 2015 with Mark Brimble, Johann Cooper and Colin Dijkgraaf, called SOA Patterns with BizTalk Server 2013 and Microsoft Azure.

[Image: SOA Patterns with BizTalk Server 2013 and Microsoft Azure - Second Edition book cover]

And it is still available from the Packt website here: https://www.packtpub.com/networking-and-servers/soa-patterns-biztalk-server-2013-second-edition

Hope you enjoy reading it, just as I enjoyed writing the content.

Searching through messages in Logic Apps

Unfortunately Logic Apps do not provide an easy option to view the contents of a message unless you go through each run history entry and view the outputs, as shown below.

[Image: viewing an action's outputs in the Logic App run history]

However, there is an alternative method using Log Analytics, which comes with Operations Management Suite (OMS). Using Log Search, you can search for specific property values within your messages. Below is an example of searching through the diagnostics log of a Logic App for a particular JobId, and the results in OMS.

[Image: OMS Log Search results for a particular JobId]

To start using this feature we need to set up OMS first, using the steps below.

1. In the Marketplace, search for "Log Analytics" and select it.

[Image: Log Analytics in the Marketplace]

2. Create the OMS Workspace using a suitable name and resource group.

[Image: creating the OMS Workspace]

3. Next we need to add a storage account for the Logic Apps to store diagnostic data. From the Marketplace, search for “Storage Account” and select it.

[Image: Storage account in the Marketplace]

Create the storage account by providing a name and leaving the "Account kind" as "General purpose".

[Image: creating the storage account]

4. Once the storage account is created, we need to link this to the OMS Workspace. Click on the Log Analytics resource that was created in step 2 as shown below.

[Image: selecting the Log Analytics resource created in step 2]

In the properties blade, scroll down to the “Workspace Data Sources” and click on “Storage account logs”.

[Image: "Storage account logs" under Workspace Data Sources]

Then click the plus sign to add a storage account. Choose the storage account you created previously.

[Image: adding the storage account]

After you have chosen the storage account, select the "Data Type" and choose Events. Then click "OK" at the bottom of the page.

[Image: selecting the Events data type]

Now that all the plumbing has been configured, we can turn our attention to the Logic App. For this example we are going to create a simple Logic App that receives a purchase order, sends it to RequestBin and then returns a status code of OK.

[Image: the purchase order Logic App]

Here is an example of the purchase order we are going to post to the Logic App.

{
   "CustomerCode": "CUST1000",
   "Lines": [
      {
         "LineNo": 1,
         "Price": 68.25,
         "ProductCode": "PRD1100",
         "Qty": 1
      }
   ],
   "OrderNo": "1000",
   "Total": 68.25
}

Once we have created the Logic App, select code view to add our custom tracked properties on an action. I want to be able to search for orders using the OrderNo, the CustomerCode or the Total order value.

To do this, add a "trackedProperties" section to the action, specifying the attribute name to search on and the path in the message to obtain the value from.

[Image: the trackedProperties section in code view]
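A minimal sketch of such an action is shown below. The action name and RequestBin url are placeholders; note that tracked properties can only reference the action's own inputs and outputs, hence the @action() expressions.

"Send_to_RequestBin": {
   "type": "Http",
   "inputs": {
      "method": "POST",
      "uri": "https://requestb.in/…",
      "body": "@triggerBody()"
   },
   "trackedProperties": {
      "OrderNo": "@action()['inputs']['body']['OrderNo']",
      "CustomerCode": "@action()['inputs']['body']['CustomerCode']",
      "Total": "@action()['inputs']['body']['Total']"
   },
   "runAfter": {}
}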

Now that the Logic App has been created and saved, we need to turn on diagnostics for it. Under the Monitoring section of the Logic App, click on Diagnostics and then Diagnostics Settings, shown below.

[Image: Diagnostics Settings under the Monitoring section]

Set the "Status" to On, check "Archive to a storage account", select the storage account that was provisioned previously and set the retention periods to what you require.

[Image: archiving diagnostics to a storage account]

Now check "Send to Log Analytics" and select the OMS Workspace created before. Then click the Save button.

[Image: sending diagnostics to Log Analytics]

Everything should be good to go now. Use something like Postman to start sending test messages to the Logic App. After a few minutes you should see the tracked properties and their values being written to the blob store under the storage account, in a container called "insights-logs-workflowruntime".

If you keep drilling down into the containers that match your Logic App name, you will see a file called "PT1H.json". Inside this file you will see the entries for the tracked properties.

[Image: tracked property entries inside PT1H.json]

To use OMS to search on one of your properties, click on Log Analytics under your Logic App.

[Image: the Log Analytics link under the Logic App]

Once the blade opens, click on the OMS Portal link, which opens the portal site. On the portal site, click the "Get Started" icon. Then under "Data" and "Custom Fields" you should be able to see your custom tracked properties. Take note of these field names, as they will be used in the search query.

[Image: custom tracked property fields in the OMS portal]

Now click the Search icon on the left navigation pane, enter the query "Type=AzureDiagnostics resource_workflowName_s=Orders" into the search box and then click search. Note it can take a few minutes for the data to turn up in OMS if you have just submitted a message to the Logic App. The query will list all runs that have been triggered for the Logic App named "Orders".

You should get a list of all the triggers related to the “Orders” logic app as shown below. Here I found 118 events in the last day.

[Image: search results for the Orders Logic App]

You can narrow your search down further by modifying the search query. To search for orders with an order number equal to 1004, you would enter "Type=AzureDiagnostics resource_workflowName_s=Orders trackedProperties_OrderNo_s=1004" into the query field. This will display the records matching that order number.

[Image: search results filtered by OrderNo 1004]

Also, left clicking on the ellipsis (…) next to each field brings up a context menu with more filtering options.

[Image: context menu with additional filtering options]

In conclusion, OMS provides the ability to search for tracked properties, save common queries and create custom dashboards. I encourage you to look at all the features available in OMS, as we have only touched the surface in this post.

Enjoy.