Automate Resource Management with Azure DevOps Pipeline Schedules

In today’s fast-paced business environment, managing cloud resources efficiently is crucial to optimizing costs and enhancing security. One effective strategy is to provision resources at the start of the business day and decommission them at the end of the day. This approach reduces hosting costs and shrinks the attack surface exposed to security risks.

Why Automate Resource Management?

Many businesses operate on a standard business day schedule, typically from 9 AM to 5 PM. However, resources such as virtual machines, databases, and storage accounts often remain active 24/7. This continuous operation can lead to unnecessary costs and increased security risks, especially if the resources are not adequately monitored outside business hours.

Automating the provisioning and decommissioning of resources provides several benefits:

  • Cost Reduction: By only running resources during business hours, you can significantly reduce hosting costs.
  • Security: Decommissioning resources when not in use reduces the attack surface, thus minimizing potential security risks.
  • Efficiency: Automation reduces the manual overhead of managing resources, allowing your team to focus on more critical tasks.

In this blog, we will explore how you can use Azure DevOps pipeline schedules to automate this process.

The Solution

By leveraging pipeline schedules, businesses can automate the provisioning of resources at the start of the business day and decommission them at the end of the day. You can read more about pipeline schedules in the Microsoft documentation: https://learn.microsoft.com/en-us/azure/devops/pipelines/process/scheduled-triggers?view=azure-devops&tabs=yaml

Schedules are added to your pipeline as cron expressions that specify the time and the days the pipeline will be triggered. The sample schedule below triggers the pipeline at 7:00 AM UTC every weekday.
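In YAML this looks like the following (the branch name here is an assumption):

schedules:
- cron: "0 7 * * Mon-Fri"
  displayName: Weekday 7:00 AM UTC provisioning run
  branches:
    include:
    - main
  always: true # run even if there are no code changes since the last scheduled run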

The solution involves two pipelines. The first provisions your resources and is a typical YAML pipeline like any you would normally develop. The second decommissions your resources, using Azure CLI commands to delete resources by type, which you can specify as a variable string. In the example below, the pipeline deletes all VMs, NICs and storage accounts in the specified Azure resource group.
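A sketch of that decommissioning step (the service connection, resource group and variable names are placeholders of mine, not the exact values from the sample repo):

variables:
  resourceGroup: 'rg-business-hours'
  # order matters: NICs cannot be deleted while still attached to a VM
  resourceTypes: 'Microsoft.Compute/virtualMachines Microsoft.Network/networkInterfaces Microsoft.Storage/storageAccounts'

steps:
- task: AzureCLI@2
  displayName: Decommission resources by type
  inputs:
    azureSubscription: 'my-azure-service-connection'
    scriptType: bash
    scriptLocation: inlineScript
    inlineScript: |
      for type in $(resourceTypes); do
        # collect the ids of all resources of this type in the resource group
        ids=$(az resource list --resource-group "$(resourceGroup)" --resource-type "$type" --query "[].id" -o tsv)
        if [ -n "$ids" ]; then
          az resource delete --ids $ids
        fi
      done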

Source code for a working example can be found here: https://github.com/connectedcircuits/schedule-pipelines. This sample will provision a storage account at 7:00 AM NZ time and then decommission the resources in the specified Azure resource group at 6:00 PM NZ time every working day.

You can also add approval gates to the pipelines for added control over provisioning or decommissioning the resources.
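Approvals and checks are configured on an environment in the Azure DevOps UI; referencing that environment from a deployment job makes the run wait for the approval. A minimal sketch, assuming an environment named business-hours already exists:

jobs:
- deployment: ProvisionResources
  displayName: Provision resources
  environment: business-hours # approvals configured on this environment apply here
  strategy:
    runOnce:
      deploy:
        steps:
        - script: echo 'provisioning steps go here'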

Conclusion

By leveraging Azure DevOps pipeline schedules, you can automate the provisioning and decommissioning of resources to align with your business hours. This not only helps reduce hosting costs but also minimizes your security attack surface. Implement this strategy in your workflow to ensure efficient and secure resource management.

Start automating your resource management today to maximize efficiency and security in your cloud environment!

Enjoy…

DevOps Pipeline Token Replacement Template

There are times when you simply need to replace tokens with actual values in text files during the deployment phase of a solution. These text files may be parameter or app setting files, or even infrastructure as code files such as ARM/Bicep templates.

Here I will be building a reusable template to insert the pipeline build number as a tag on some Logic Apps every time the resources are deployed from the release pipeline. The process described below can easily be adapted to replace user-defined tokens in other types of text files by supplying the path to the file to search, the token signature and the replacement value.

I used the PS script below to read in the text file contents and then search for and replace the token with the required value. The script takes several parameters so it can be reused throughout the release pipeline for many different text file types and tokens.
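A minimal version of the script (the parameter names here are mine; the repo's may differ):

# Replaces every occurrence of a token in a text file with the supplied value.
param (
    [string]$filePath,    # path of the text file to update
    [string]$tokenName,   # token signature to search for, e.g. %_buildNumber_%
    [string]$tokenValue   # replacement value, e.g. the pipeline build number
)

# Read the whole file, replace the token everywhere, and write the file back.
$content = Get-Content -Raw -Path $filePath
$content = $content.Replace($tokenName, $tokenValue)
Set-Content -Path $filePath -Value $content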

The PS script is called from the template file below, which takes a list of files to search as one of the template parameters. This allows me to search for the token across multiple files in one hit. The template iterates through each file, calling the PS script and passing the file path along with the other required parameters.
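A sketch of what that template could look like (the parameter names and the script location are assumptions rather than the repo's exact ones):

# replace-tokens.yml - reusable token replacement steps
parameters:
- name: targetFiles    # list of files to search
  type: object
  default: []
- name: tokenName      # token signature to find
  type: string
- name: tokenValue     # value to substitute
  type: string

steps:
- ${{ each file in parameters.targetFiles }}:
  - task: PowerShell@2
    displayName: 'Replace ${{ parameters.tokenName }} in ${{ file }}'
    inputs:
      targetType: filePath
      filePath: '$(System.DefaultWorkingDirectory)/scripts/replace-token.ps1'
      arguments: '-filePath "${{ file }}" -tokenName "${{ parameters.tokenName }}" -tokenValue "${{ parameters.tokenValue }}"'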

The YAML release pipeline below calls the template to replace the tokens in the LogicApp.parameters.json files, passing in the token name to search for. You need to ensure the chosen token name does not clash with any other valid text. The last step then deploys the Logic Apps to the specified resource group.
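A sketch of the pipeline (the file paths, variables and service connection name are placeholders, not the exact listing from the sample repo):

trigger:
- main

pool:
  vmImage: ubuntu-latest

steps:
# replace the build number token before deploying
- template: replace-tokens.yml
  parameters:
    targetFiles:
    - '$(System.DefaultWorkingDirectory)/LogicApps/LogicApp.parameters.json'
    tokenName: '%_buildNumber_%'
    tokenValue: '$(Build.BuildNumber)'

# deploy the Logic Apps using the updated parameter file
- task: AzureResourceManagerTemplateDeployment@3
  displayName: Deploy Logic Apps
  inputs:
    deploymentScope: 'Resource Group'
    azureResourceManagerConnection: 'my-azure-service-connection'
    subscriptionId: '$(subscriptionId)'
    action: 'Create Or Update Resource Group'
    resourceGroupName: '$(resourceGroupName)'
    location: '$(location)'
    templateLocation: 'Linked artifact'
    csmFile: '$(System.DefaultWorkingDirectory)/LogicApps/LogicApp.json'
    csmParametersFile: '$(System.DefaultWorkingDirectory)/LogicApps/LogicApp.parameters.json'
    deploymentMode: 'Incremental'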

When the pipeline executes, it reads the following LogicApp parameter file from the source code folder on the build agent and replaces the buildNumber token with the actual value.

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "environment":{
      "value": null
    },
    "businessUnitName":{
      "value": "n/a"
    },
    "buildNumber":{
      "value": "%_buildNumber_%"
    }
  }
}

After running through the token replacement step, the buildNumber is updated with the desired value.
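For example, with an illustrative build number of 20231031.1, the file ends up as:

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "environment":{
      "value": null
    },
    "businessUnitName":{
      "value": "n/a"
    },
    "buildNumber":{
      "value": "20231031.1"
    }
  }
}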

The full source code for this article is available on GitHub here: https://github.com/connectedcircuits/tokenreplacement. I tend to use this template together with the approach from my other blog about using global parameter files: https://connectedcircuits.blog/2022/11/17/using-global-parameter-files-in-a-ci-cd-pipeline/

Enjoy…

Using Global Parameter Files in a CI/CD Pipeline

When developing a solution that has multiple projects and parameter files, more than likely these parameter files will share some common values. Examples of common values are the environment name, connection strings, configuration settings, etc.

A good example of this scenario is a Logic App solution with multiple projects. These are typically structured so that each project has several parameter files, one for each environment. Each parameter file holds configuration settings specific to one of the three environments, but the values are duplicated across all the projects.

Keeping track of multiple parameter files can be a maintenance issue and prone to misconfiguration errors. An alternative is to use a global parameter file which contains all the common values used across the projects. This global file will overwrite the matching parameter values in each of the referenced projects when the projects are built inside a CI/CD pipeline.

By using global parameter files, all the common values for each environment are placed in a single global parameter file. This simplifies the solution: there is now only one parameter file under each project, and all the shared parameter values live in the global file. The default values in the parameter file under each Logic App project will typically be set to the development environment values.


The merging of the global parameter files is managed by the PowerShell script below.

# First parameter is the global param file, the second is the base (destination) param file.
param ($globalParamFilePath, $baseParamFilePath)

# Read both parameter files and convert them to objects
$globalParams = Get-Content -Raw -Path $globalParamFilePath | ConvertFrom-Json
$baseParams = Get-Content -Raw -Path $baseParamFilePath | ConvertFrom-Json

# Copy each global parameter onto the base file, overwriting any matching names
foreach ($i in $globalParams.parameters.PSObject.Properties)
{
  $baseParams.parameters | Add-Member -Name $i.Name -Value $i.Value -MemberType NoteProperty -Force
}

# Output to console and overwrite the base parameter file
$baseParams | ConvertTo-Json -Depth 100 | Tee-Object $baseParamFilePath

The script is invoked from the release pipeline to merge the parameter files during the build stage. A full working CI/CD pipeline sample project can be downloaded from my GitHub repo here: https://github.com/connectedcircuits/globalParams
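In the pipeline this is just a PowerShell task per project parameter file; the script and file paths below are illustrative, not the exact ones from the repo:

steps:
- task: PowerShell@2
  displayName: Merge global parameters into LogicApp1
  inputs:
    targetType: filePath
    filePath: '$(System.DefaultWorkingDirectory)/scripts/merge-params.ps1'
    arguments: '-globalParamFilePath "$(System.DefaultWorkingDirectory)/global.parameters.json" -baseParamFilePath "$(System.DefaultWorkingDirectory)/LogicApp1/LogicApp.parameters.json"'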

In the solution mentioned above, I have two Logic App projects where the parameter files have the following content.

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "environment":{
      "value": dev
    },
    "businessUnitName":{
      "value": "n/a"
    }
  }
}

And the contents of the global parameter file are listed here:

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "environment":{
      "value": "sit"
    },
    "businessUnitName":{
      "value": "Accounting Department"
    }
  }
}

Running the release pipeline produces the following merged file which is used by the pipeline to deploy the Logic Apps to Azure.
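Since the global file overwrites every matching parameter name, the merged file takes the global values:

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "environment":{
      "value": "sit"
    },
    "businessUnitName":{
      "value": "Accounting Department"
    }
  }
}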

The Logic App resource file uses these parameters to create some tags and appends the environment value to the Logic App name, along the lines of the sketch below.
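A fragment of what that resource definition might look like (the workflow name prefix and the tag names are placeholders of mine, not the repo's exact values; the workflow definition body is omitted):

{
  "type": "Microsoft.Logic/workflows",
  "apiVersion": "2019-05-01",
  "name": "[concat('la-sample-', parameters('environment'))]",
  "location": "[resourceGroup().location]",
  "tags": {
    "environment": "[parameters('environment')]",
    "businessUnit": "[parameters('businessUnitName')]"
  },
  "properties": {
    "definition": {}
  }
}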

Using environment variables available in Azure DevOps is another option, but I like to keep the parameter values in code rather than have them scattered across the repo and environment variables.

Enjoy…