Using Global Parameter Files in a CI/CD Pipeline

When developing a solution that has multiple projects and parameter files, these parameter files will more than likely share some common values. Examples of common values are the environment name, connection strings, configuration settings, etc.

A good example of this scenario is a Logic App solution that has multiple projects. These are typically structured as shown below, where each project may have several parameter files, one for each environment. Each of these parameter files will have different configuration settings for each of the three environments, but those settings are common across all the projects.

Keeping track of multiple parameter files can be a maintenance issue and is prone to misconfiguration errors. An alternative is to use a global parameter file which contains all the common values used across the projects. This global file will overwrite the matching parameter values in each of the referenced projects when the projects are built inside a CI/CD pipeline.

By using global parameter files, the solution will now look similar to that shown below. Here all the common values for each environment are placed in a single global parameter file. This simplifies the solution, as there is now only one parameter file under each project and all the shared parameter values sit in a global parameter file. The default parameter values under each Logic App project will typically be set to the development environment values.


The merging of the global parameter file into each project's parameter file is managed by the PowerShell script below.

# First parameter is the global (source) param file and the second is the base (destination) param file.
param ($globalParamFilePath, $baseParamFilePath)

# Read configuration files
$globalParams = Get-Content -Raw -Path $globalParamFilePath | ConvertFrom-Json
$baseParams = Get-Content -Raw -Path $baseParamFilePath | ConvertFrom-Json

# Overwrite (or add) each global parameter in the base parameter file
foreach ($i in $globalParams.parameters.PSObject.Properties)
{
  $baseParams.parameters | Add-Member -Name $i.Name -Value $i.Value -MemberType NoteProperty -Force
}

# Output to console and overwrite the base parameter file
$baseParams | ConvertTo-Json -Depth 100 | Tee-Object $baseParamFilePath
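As a sketch of how the script might be wired into an Azure DevOps YAML pipeline, a PowerShell task can run the merge for each project before deployment. The file paths and script name below are hypothetical; adjust them to your repo layout:

```yaml
steps:
  # Hypothetical paths - adjust to match your repository layout
  - task: PowerShell@2
    displayName: 'Merge global parameters into project parameter file'
    inputs:
      filePath: 'scripts/MergeParams.ps1'
      arguments: >-
        -globalParamFilePath '$(Build.SourcesDirectory)/global/parameters.sit.json'
        -baseParamFilePath '$(Build.SourcesDirectory)/LogicApp1/parameters.json'
```

Running one such task per Logic App project keeps each project's parameter file in sync with the single global file for the target environment.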

The script is run in the pipeline to merge the parameter files during the build stage. A full working CI/CD pipeline sample project can be downloaded from my GitHub repo here: https://github.com/connectedcircuits/globalParams

In the solution mentioned above, I have two Logic App projects where the parameter files have the following content.

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "environment":{
      "value": "dev"
    },
    "businessUnitName":{
      "value": "n/a"
    }
  }
}

And the contents of the global parameter file are listed here:

{
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
      "environment":{
        "value": "sit"
      },
      "businessUnitName":{
        "value": "Accounting Department"
      }
    }
  }

Running the release pipeline produces the following merged file which is used by the pipeline to deploy the Logic Apps to Azure.
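Based on the two files above, the merged output written back to the project's parameter file looks like this, with the global "sit" values having overwritten the project defaults:

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "environment": {
      "value": "sit"
    },
    "businessUnitName": {
      "value": "Accounting Department"
    }
  }
}
```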

The Logic App resource file uses these parameters to create some tags and appends the environment variable to the Logic App name as shown below.

Using environment variables available in Azure DevOps is another option, but I like to keep the parameter values in code rather than have them scattered across the repo and environment variables.

Enjoy…

Sending JSON messages from a BizTalk 2013 adaptor

I had a requirement where I needed to send a message to a web API that only accepted messages as JSON.

BizTalk 2013 provides a WCF-WebHttp adaptor for sending XML messages to a REST endpoint, but not in JSON format. There are rumours this will be remedied in BizTalk 2013 R2; unfortunately, I required a solution now.

So in the meantime I will use the Json.NET component to convert the XML message to JSON. I also found this blog post, http://blog.quicklearn.com/2013/09/06/biztalk-server-2013-support-for-restful-services-part-45/, which converts XML messages to JSON using Json.NET in a custom pipeline component, and I ended up using it with some modifications.

 

When using this pipeline, remember to set the Outbound HTTP Headers property on the send port so the message is sent with the correct content type, i.e. Content-Type: application/json.

This all worked a treat until, during the unit testing phase, I created a sample XML document with only one repeating element. This caused the following error from the Web API.

Can not deserialize instance of java.util.ArrayList out of VALUE_STRING token at [Source: java.io.StringReader@6b270743; line: 1, column: 152] (through reference chain: model.ExtContact["names"])

As it turned out, the Web API required all unbounded elements to be sent as an array, which should have looked like this: "names": ["Mahindra Morar"], but instead was being sent as "names": "Mahindra Morar".

After reading the documentation on the Json.NET component, there is an option to always force a JSON array for an element. This is done by adding the attribute json:Array="true" to the unbounded element.
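For illustration, here is a sketch of what the attribute looks like on an element (the element names are made up for this example):

```xml
<person xmlns:json="http://james.newtonking.com/projects/json">
  <!-- json:Array="true" forces this element to be serialized as a JSON array,
       even when only a single occurrence is present -->
  <name json:Array="true">Mahindra Morar</name>
</person>
```

With the attribute in place, Json.NET serializes the single element as "name": ["Mahindra Morar"] instead of "name": "Mahindra Morar".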


Now I thought I could just simply import a schema with the element and attribute, which produced the following schema output.

<ns0:Root xmlns:ns0="http://BizTalk_Server_Project1.People">
<ns1:Name Array="true" xmlns:ns1="http://james.newtonking.com/projects/json">Tom Lee</ns1:Name>
</ns0:Root>

However, the Json.NET component did not like having the namespace declared inline on the unbounded element when the XML was generated by BizTalk. The workarounds were to add the attribute to the elements in the custom pipeline component, use the XmlTranslatorStream to remove the inline namespaces, or modify the Json.NET source code. I ended up using the first option, as I already had the XML document loaded into an XmlDocument. To keep the solution a bit more flexible, I exposed a property on the pipeline component to accept a comma-separated list of XPath expressions for the elements that need to be output as JSON arrays.
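As an example of what that pipeline property might contain (the element names here are hypothetical), two XPath expressions separated by a comma, each selecting an element regardless of namespace:

```text
//*[local-name()='names'],//*[local-name()='phoneNumbers']
```

Each expression is evaluated against the message with SelectNodes, and every matching element gets the json:Array="true" attribute added.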

The Execute method in the code from http://blog.quicklearn.com/2013/09/06/biztalk-server-2013-support-for-restful-services-part-45/ was modified as shown below.

public Microsoft.BizTalk.Message.Interop.IBaseMessage Execute(Microsoft.BizTalk.Component.Interop.IPipelineContext pc, Microsoft.BizTalk.Message.Interop.IBaseMessage inmsg)
        {

            #region Handle CORS Requests
            // Detect if the incoming message is an HTTP CORS request
            // http://www.w3.org/TR/cors/

            object httpMethod = null;
            httpMethod = inmsg.Context.Read(HTTP_METHOD_PROPNAME, WCF_PROPERTIES_NS);

            if (httpMethod != null && (httpMethod as string) == OPTIONS_METHOD)
            {
                // Remove the message body before returning
                var emptyOutputStream = new VirtualStream();
                inmsg.BodyPart.Data = emptyOutputStream;

                return inmsg;
            }
            #endregion


            // Make message seekable
            if (!inmsg.BodyPart.Data.CanSeek)
            {
                var originalStream = inmsg.BodyPart.Data;
                Stream seekableStream = new ReadOnlySeekableStream(originalStream);
                inmsg.BodyPart.Data = seekableStream;
                pc.ResourceTracker.AddResource(originalStream);
            }

            // Here again we are loading the entire document into memory
            // this is still a bad plan, and shouldn't be done in production
            // if you expect larger message sizes

            XmlDocument xmlDoc = new XmlDocument();
            xmlDoc.Load(inmsg.BodyPart.Data);
            
            // Get the list of elements to be output as JSON arrays from the exposed pipeline property.
            string[] elementList = arrayXpathElementList.Split(',');

            // Add the namespace declaration required to force an element to be serialized as a JSON array.
            xmlDoc.DocumentElement.SetAttribute("xmlns:json", "http://james.newtonking.com/projects/json");

            if (xmlDoc.FirstChild.LocalName == "xml")
                xmlDoc.RemoveChild(xmlDoc.FirstChild);

            
            for (int indexElement = 0; indexElement < elementList.Length; indexElement++)
            {
                XmlNodeList dataNodes = xmlDoc.SelectNodes(elementList[indexElement]);
                foreach (XmlNode node in dataNodes)
                {
                    // Add the attribute required on the element to force the creation of a JSON array.
                    string ns = node.GetNamespaceOfPrefix("json");
                    XmlNode attr = xmlDoc.CreateNode(XmlNodeType.Attribute, "Array", ns);
                    attr.Value = "true";
                    node.Attributes.SetNamedItem(attr);
                }
            }
            // Remove any root-level attributes added in the process of creating the XML
            // (Think xmlns attributes that have no meaning in JSON)
            

            string jsonString =  JsonConvert.SerializeXmlNode(xmlDoc, Newtonsoft.Json.Formatting.Indented, true);

            #region Handle JSONP Request
            // Here we are detecting if there has been any value promoted to the jsonp callback property
            // which will contain the name of the function that should be passed the JSON data returned
            // by the service.

            object jsonpCallback = inmsg.Context.Read(JSONP_CALLBACK_PROPNAME, JSON_SCHEMAS_NS);
            string jsonpCallbackName = (jsonpCallback ?? (object)string.Empty) as string;

            if (!string.IsNullOrWhiteSpace(jsonpCallbackName))
                jsonString = string.Format("{0}({1});", jsonpCallbackName, jsonString);
            #endregion

            var outputStream = new VirtualStream(new MemoryStream(Encoding.UTF8.GetBytes(jsonString)));
            inmsg.BodyPart.Data = outputStream;

            return inmsg;
        }

        #endregion        
    }

After making the changes above, the component is now able to convert an unbounded element with only one value to a JSON array correctly.

Enjoy.