Using an Azure APIM Policy to call an OAuth endpoint and cache the token

Recently I have been involved as the integration architect and tech lead on a project to integrate MS Dynamics 365 AX/CRM with 3rd party systems.

One of the challenges was to provide a single endpoint for internal systems calling web services outside of the organisation that are secured with OAuth 2.0. Normally the consumer has to call an OAuth token endpoint first and then append the returned token to the request before calling the actual web service.

By using APIM as a proxy together with policies, I managed to achieve the goal of providing a single URL endpoint for the consumer. The policy first gets the token from the authorisation endpoint, caches it and then passes it to the web service being called. This approach is a form of fragment caching, where parts of responses are cached for subsequent requests. Caching the bearer token also improved performance significantly on subsequent calls.

Below are the steps I used, in the Azure APIM management portal, to add a web API that creates transfer orders in Dynamics AX and the policy behind it.

1. First create the properties for the OAuth client ID and client secret. A good tip is to prefix the property names and set the Tags with the name of the API you are calling. This helps later on when you have multiple properties to manage.

 

image

The Properties page should look something like the below after adding your custom properties:

 

image
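If you prefer to script this step, the same properties can be created with the AzureRM.ApiManagement cmdlets. The sketch below is only an illustration: the resource group and service name are placeholders, and the property names and tag are the ones used in this walkthrough.

# Create an APIM context for the service (placeholder resource group and service names)
$apimContext = New-AzureRmApiManagementContext -ResourceGroupName "<resource group>" -ServiceName "<APIM service name>"

# Create the client id property, tagged with the name of the API being called
New-AzureRmApiManagementProperty -Context $apimContext -PropertyId "Dev-Web1-ClientId" `
    -Name "Dev-Web1-ClientId" -Value "<client id>" -Tag @("Web1")

# Create the client secret property and mark it as a secret
New-AzureRmApiManagementProperty -Context $apimContext -PropertyId "Dev-Web1-ClientSecret" `
    -Name "Dev-Web1-ClientSecret" -Value "<client secret>" -Secret -Tag @("Web1")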

2. Next, go to the APIs page to either add or import the definition of the API you are calling. Here I am just going to add the API manually by entering the name, the backend API endpoint address and the public-facing URL suffix. Remember to add it to an existing product.

image
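The same API definition can also be added from PowerShell if that suits your workflow better. This is only a rough sketch reusing the $apimContext from the previous step; the backend URL, URL suffix and product id are placeholders.

# Add the API manually: display name, backend service URL and public-facing URL suffix
$api = New-AzureRmApiManagementApi -Context $apimContext -Name "Dynamics AX Transfer Orders" `
    -ServiceUrl "<API endpoint address>" -Protocols @("https") -Path "<public facing URL suffix>"

# Remember to add the API to an existing product
Add-AzureRmApiManagementApiToProduct -Context $apimContext -ProductId "<existing product id>" -ApiId $api.ApiId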

3. After the API has been created, we then add the operations. For this demo I will just add one operation to create a transfer order.

image
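Again, for those scripting the setup, adding the operation from PowerShell might look something like the sketch below; the operation name is the one used in this demo and the URL template is an assumption.

# Add a single POST operation for creating a transfer order (URL template is a placeholder)
New-AzureRmApiManagementOperation -Context $apimContext -ApiId $api.ApiId -OperationId "CreateTransferOrder" `
    -Name "CreateTransferOrder" -Method "POST" -UrlTemplate "/transferorders"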

4. Now that the API we wish to call has been added to the APIM service, we can finally turn our attention to creating a policy which does the background task of obtaining the OAuth token. Here I will set up a policy on the CreateTransferOrder operation of the web API. Once the operation is selected, click the ADD POLICY button to begin creating the policy from the template.

image

5. The “<inbound>” policy section is applied to the incoming request before it is forwarded to the backend service. This is where we check the cache for the authorisation token and, if there is no hit, call the OAuth token endpoint to obtain a new token.

   1: <inbound>
   2:     <cache-lookup-value key="token-{{Dev-Web1-ClientId}}" variable-name="bearerToken" />
   3:     <choose>
   4:         <when condition="@(!context.Variables.ContainsKey("bearerToken"))">
   5:             <send-request mode="new" response-variable-name="oauthResponse" timeout="20" ignore-error="false">
   6:                 <set-url>https://login.microsoftonline.com/xxxxxx.onmicrosoft.com/oauth2/token</set-url>
   7:                 <set-method>POST</set-method>
   8:                 <set-header name="Content-Type" exists-action="override">
   9:                     <value>application/x-www-form-urlencoded</value>
  10:                     <!-- for multiple headers with the same name add additional value elements -->
  11:                 </set-header>
  12:                 <set-body>@("grant_type=client_credentials&client_id={{Dev-Web1-ClientId}}&client_secret={{Dev-Web1-ClientSecret}}&resource=https%3A%2F%2Fxxxx.xxxxx.ax.dynamics.com")</set-body>
  13:             </send-request>
  14:             <set-variable name="bearerToken" value="@((string)((IResponse)context.Variables["oauthResponse"]).Body.As<JObject>()["access_token"])" />
  15:             <!-- Store result in cache -->
  16:             <cache-store-value key="token-{{Dev-Web1-ClientId}}" value="@((string)context.Variables["bearerToken"])" duration="3600" />
  17:         </when>
  18:     </choose>
  19: </inbound>

Let's go through the key points of this definition below.

  • Line 2: Assigns the cached value to the context variable called “bearerToken”. On first entry, the cache value will be null and the variable will not be created. Note I am adding the client ID as part of the cache key name to keep it unique. Property values are accessed by surrounding the property name with double braces, e.g. {{myPropertyName}}.
  • Line 4: Checks whether the context variable collection contains a key called “bearerToken” and, if not found, executes the code between the opening and closing “<when>” XML elements.
  • Line 5: Initiates the request to the OAuth endpoint with a response timeout of 20 seconds. The response message is placed into the variable called “oauthResponse”.
  • Line 6: Sets the URL to send the request to. In this scenario I am using the Azure AD OAuth token endpoint below as our STS service: https://login.microsoftonline.com/xxxxxx.onmicrosoft.com/oauth2/token
  • Line 12: Defines the body payload for the request as a typical client credentials grant. Here I am getting the values for the client ID and secret from the user-definable properties set on the Properties page. The resource parameter is just hard-coded to the URL-encoded resource URL of the API, but it can also be parameterised. (An equivalent request made directly from PowerShell is sketched after this list.)


  • Line 14: Casts the response body as a JSON object to allow the retrieval of the “access_token” value using an indexer, and assigns it to the context variable “bearerToken”.
  • Line 16: Stores the contents of the variable “bearerToken” in the cache for a period of 3600 seconds.
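To make it clearer what the send-request in lines 5 to 12 is doing, here is a minimal sketch of the equivalent token call made directly from PowerShell. The tenant, client ID, secret and resource values are placeholders for your own.

# Request a bearer token from Azure AD using the client credentials grant
$tokenEndpoint = "https://login.microsoftonline.com/xxxxxx.onmicrosoft.com/oauth2/token"

$body = @{
    grant_type    = "client_credentials"
    client_id     = "<client id>"
    client_secret = "<client secret>"
    resource      = "https://xxxx.xxxxx.ax.dynamics.com"
}

# Invoke-RestMethod form-encodes the hashtable body for the POST
$tokenResponse = Invoke-RestMethod -Method Post -Uri $tokenEndpoint -Body $body `
    -ContentType "application/x-www-form-urlencoded"

# The access_token property is the value the policy stores in the cache
$tokenResponse.access_token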

6. Now that the “<inbound>” section has been completed, we can look at the “<backend>” section of the policy. This is where the policy forwards your request to the backend web service as defined in the API configuration page.

   1: <backend>
   2:     <send-request mode="copy" response-variable-name="transferWSResponse" timeout="20" ignore-error="false">
   3:         <set-method>POST</set-method>
   4:         <set-header name="Authorization" exists-action="override">
   5:             <value>@("Bearer " + (string)context.Variables["bearerToken"])</value>
   6:         </set-header>
   7:         <set-header name="Ocp-Apim-Subscription-Key" exists-action="delete" />
   8:         <set-header name="Content-Type" exists-action="override">
   9:             <value>application/json</value>
  10:         </set-header>
  11:     </send-request>
  12: </backend>

Let's go through this section as before.

  • Line 2: Creates the request to the backend web service. The response from the web service is placed into the variable called “transferWSResponse”.
  • Line 4: Creates the “Authorization” header to be sent with the request.
  • Line 5: Adds the bearer token value from the context variable “bearerToken” to the Authorization header.
  • Line 7: Removes the APIM subscription key header so it is not forwarded to the backend web service.

7. Now we need to return the response message from the backend web service to the caller. This is done in the “<outbound>” policy section, where we simply return the value of the variable “transferWSResponse” back to the caller.

<outbound>
    <return-response response-variable-name="transferWSResponse" />
    <base />
</outbound>

That's the whole policy defined: it calls the OAuth endpoint to get the token and caches it for subsequent calls.
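From the consumer's point of view there is now only one endpoint to call and no OAuth handshake to worry about. A minimal sketch of such a call is below; the APIM host name, path and request payload are hypothetical examples, and the subscription key is your own.

# Call the single APIM endpoint; APIM obtains and caches the OAuth token behind the scenes
$apimUrl = "https://<your-apim-name>.azure-api.net/<url suffix>"

$headers = @{ "Ocp-Apim-Subscription-Key" = "<your APIM subscription key>" }
$payload = @{ TransferOrderNumber = "TO-0001" } | ConvertTo-Json

Invoke-RestMethod -Method Post -Uri $apimUrl -Headers $headers -Body $payload -ContentType "application/json"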

Using the tracing feature in APIM, you can see that when the first request is made, the cache lookup returns nothing and the variable is not set, as shown below.

image

The next trace shows that subsequent requests hit the cache and set the context variable to the bearer token until the cached value expires.

image

One important note about retrieving data from the cache is that it is an out-of-process call and can add tens of milliseconds to a request.

Working with APIM policies can make a huge impact on your API development efforts, as concerns such as logging and access management can be off-loaded to the policies available in APIM. A full list of expressions used in policies can be found here: https://docs.microsoft.com/en-us/azure/api-management/api-management-policy-expressions

Enjoy…

APIM backup & restore using Azure Automation Services

In this post I will describe the steps for using PowerShell scripts to back up APIM and the Automation service to schedule the backup every month. The restore function also allows you to restore APIM into another resource group or APIM service, which is what I am doing on my current project to move the configuration settings between environments.

First you need to create a blob store which ideally should be Read-Access geo-redundant storage (RA-GRS). This is where the APIM backups will be stored. After the blob store has been provisioned, create a container for the backup file as shown below.

image

Once the container is created, take note of the Storage account name and Access key for the blob store. These values will be used in the PowerShell script later.

image
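If you prefer to set up the storage from PowerShell instead of the portal, a rough sketch is below. It assumes the storage account has already been provisioned in the same resource group as the APIM service and uses the account and container names that appear later in this post.

# Retrieve the access key for the existing storage account
$storageAccountName = "apimstorebackup"
$storageAccountKey = (Get-AzureRmStorageAccountKey -ResourceGroupName "APIMService" -Name $storageAccountName)[0].Value

# Build a storage context and create the container that will hold the backups
$storageContext = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
New-AzureStorageContainer -Name "backup" -Context $storageContext -Permission Off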

Next provision an Azure Automation service and ensure the Create Azure Run As account is set to “yes”.

image

Once it has been provisioned, ensure the modules have been updated by clicking on the “Modules” link on the left hand navigation panel and then “Update Azure Modules”. Note this does take a while to complete.

image

After the update has completed, click the “Browse gallery” link and type “apim” into the search textbox. Once found, double-click on the row to open the import blade.

image

Now click the Import icon to import the module. This can take several minutes to complete.

image

After the PowerShell module has been imported, create a new Runbook and ensure the type has been set to “PowerShell”. Then click the Create button at the bottom of the page.

image

This will open up a new blade where we can add and test the PowerShell script to backup the APIM settings.

image

Now add the following script into the text editor and remember to update the variables with your environment settings. Once you have added the script, click the “Save” button and then the “Test pane” button to ensure the script runs successfully.

Disable-AzureDataCollection
Write-Output "Starting backup of APIM..."

# sign in non-interactively using the service principal
$connectionName = "AzureRunAsConnection";
$storageAccountName = "apimstorebackup";
$storageAccountKey = "<storage account key>";
$resourceGroupName = "APIMService";
$apimName = "apimmanager";
$targetContainerName = "backup";
$targetBlobName = "AzureAPIM.apimbackup";

try
{
    # Get the connection "AzureRunAsConnection"
    $servicePrincipalConnection = Get-AutomationConnection -Name $connectionName

    Write-Output "Logging in to Azure..."
    Add-AzureRmAccount `
        -ServicePrincipal `
        -TenantId $servicePrincipalConnection.TenantId `
        -ApplicationId $servicePrincipalConnection.ApplicationId `
        -CertificateThumbprint $servicePrincipalConnection.CertificateThumbprint
}
catch {
    if (!$servicePrincipalConnection)
    {
        $ErrorMessage = "Connection $connectionName not found."
        throw $ErrorMessage
    } else {
        Write-Error -Message $_.Exception
        throw $_.Exception
    }
}

$sourceContext = (New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey);

Write-Output "Starting backup of APIM instance";
Backup-AzureRmApiManagement `
            -ResourceGroupName $resourceGroupName `
            -Name $apimName `
            -StorageContext $sourceContext `
            -TargetContainerName $targetContainerName `
            -TargetBlobName $targetBlobName;

Write-Output "Backup of APIM completed.";

Here is a description of the variables:

  • $connectionName = “AzureRunAsConnection” – this is the default connection account that was created when the Automation service was provisioned.
  • $storageAccountName = “apimstorebackup” – name of the blob storage account that was created in the first step.
  • $storageAccountKey = “<storage account key>” – the blob store access key obtained from the portal.
  • $resourceGroupName = “APIMService” – name of the Azure resource group.
  • $apimName = “apimmanager” – the name of the APIM service.
  • $targetContainerName = “backup” – name of the backup container in the blob store.
  • $targetBlobName = “AzureAPIM.apimbackup” – file name of the backup file. If omitted, a default filename of {apimName}-{yyyy-MM-dd-HH-mm}.apimbackup is used.

Once you have confirmed the script executes without any errors, you can now set up a recurring schedule by creating a new schedule in the Automation service blade under Shared Resources.

image

image

Next you need to link your runbook to this schedule by double-clicking on the runbook name and then clicking the Schedule button on the top menu. This will open another blade where you can select from your existing schedules.

image
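The same link can also be made from PowerShell with the AzureRM.Automation cmdlets. The sketch below is only an illustration: the resource group name is the one used earlier, and the automation account, runbook and schedule names are placeholders for your own.

# Link an existing runbook to an existing schedule
Register-AzureRmAutomationScheduledRunbook `
    -ResourceGroupName "APIMService" `
    -AutomationAccountName "<automation account name>" `
    -RunbookName "<backup runbook name>" `
    -ScheduleName "<monthly schedule name>"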

That is the automated backup process complete. Below is the PowerShell script required to restore the backup file.

# get the storage context
$sourceContext = (New-AzureStorageContext `
    -StorageAccountName "<blob storage name>" `
    -StorageAccountKey "<blob storage account key from Azure portal>")

# restore the backup
Restore-AzureRmApiManagement -ResourceGroupName "<name of resource group>" `
    -Name "<name of the APIM service>" `
    -StorageContext $sourceContext `
    -SourceContainerName "<blob storage container name>" `
    -SourceBlobName "<backup file name>"
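Before running the restore, it is worth checking that the backup blob actually exists in the container. A quick sketch using the same storage context:

# List the blobs in the backup container to confirm the backup file is there
Get-AzureStorageBlob -Container "<blob storage container name>" -Context $sourceContext |
    Select-Object Name, LastModified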

More details on these scripts can be found here: https://docs.microsoft.com/en-us/powershell/module/azurerm.apimanagement/restore-azurermapimanagement?view=azurermps-4.3.1

Enjoy.