When trying to execute a long-running SQL query using the Logic App SQL connector, you will most likely hit the connector's 2-minute execution timeout limit.
Even if you move the SQL query to an Azure Function and use ADO.NET to access the backend database, you will hit the function's 10-minute execution timeout limit when using the Consumption plan.
An Alternative Approach
The solution here is to use the Azure Automation service to execute a PowerShell script which runs the SQL query. There is now a connector available in Logic Apps to execute an Automation runbook, with the option to wait until it completes before moving on to the next action shape in the workflow.
Implementation
First, find and add an Automation Account to your Azure resource group. Set Create Azure Run As account to No.
Once the Automation Account has been created, click Credentials on the left menu pane to store the SQL Server login credentials.
Then add the SQL username and password and click Create. We will use this as the Credential parameter in our PowerShell script.
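As an aside, a stored credential asset can also be retrieved directly inside a runbook with the Get-AutomationPSCredential activity, instead of being passed in as a parameter as we do below. A minimal sketch, assuming the asset was named "SqlCredential" when it was created:

```powershell
# Retrieve the credential asset stored in the Automation Account.
# "SqlCredential" is whatever name you gave the asset - adjust to suit.
$SqlCredential = Get-AutomationPSCredential -Name "SqlCredential"

# The username and plain-text password can then be extracted as usual.
$SqlUsername = $SqlCredential.UserName
$SqlPass     = $SqlCredential.GetNetworkCredential().Password
```

Passing the credential as a PSCredential parameter (as in the script below) keeps the runbook reusable across different SQL logins, so that is the approach used in this post.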
Now click on Runbooks to create a new runbook. This is where we add a PowerShell script to execute.
Add a name for the Runbook and ensure the Runbook type is set to PowerShell Workflow, then click the Create button at the bottom of the page.
After the Runbook has been created, click on the Edit icon to add the PowerShell script.
Inside the PowerShell editor, paste the following code. We are using ADO.NET objects to create a SQL connection and execute the stored procedure. To ensure the command object does not time out, set CommandTimeout to 0 (no limit).
workflow ExecuteSQLStoredProcedure
{
    param(
        [parameter(Mandatory=$True)]
        [string] $SqlServer,

        [parameter(Mandatory=$False)]
        [int] $SqlServerPort = 1433,

        [parameter(Mandatory=$True)]
        [string] $Database,

        [parameter(Mandatory=$True)]
        [string] $Sproc,

        [parameter(Mandatory=$True)]
        [PSCredential] $SqlCredential
    )

    # Get the username and password from the SQL Credential
    $SqlUsername = $SqlCredential.UserName
    $SqlPass = $SqlCredential.GetNetworkCredential().Password

    inlinescript {
        # Define the connection to the SQL Database
        $SQLConn = New-Object System.Data.SqlClient.SqlConnection("Server=tcp:$using:SqlServer,$using:SqlServerPort;Database=$using:Database;User ID=$using:SqlUsername;Password=$using:SqlPass;Trusted_Connection=False;Encrypt=True;Connection Timeout=30;")

        # Open the SQL connection
        $SQLConn.Open()

        # Define the SQL command to run. CommandTimeout = 0 means no timeout.
        $Cmd = New-Object System.Data.SqlClient.SqlCommand
        $Cmd.Connection = $SQLConn
        $Cmd.CommandTimeout = 0
        $Cmd.CommandType = [System.Data.CommandType]::StoredProcedure
        $Cmd.CommandText = $using:Sproc
        $Cmd.ExecuteNonQuery()

        # Close the SQL connection
        $SQLConn.Close()
    }
}
After the code has been pasted into the editor, save it, and remember to publish it afterwards.
Now add a Logic App to execute the runbook. For this demo, I will simply use the scheduler connector to fire the Logic App. In the designer, add a Recurrence trigger and, for the next action, search for automation and select the Azure Automation Create job action.
The parameters defined in the PowerShell script will be shown in the properties page of the connector as shown below. I will be executing a stored procedure called "dbo.LongProcess", which contains a WAITFOR DELAY '00:15' T-SQL statement. This will simulate a long-running query of 15 minutes.
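The stored procedure itself is not shown in this post; a minimal sketch of what dbo.LongProcess might look like is below. Only the WAITFOR DELAY statement comes from the text above, and the final SELECT is a hypothetical result row added for illustration:

```sql
-- Minimal sketch of the demo stored procedure.
-- WAITFOR DELAY 'hh:mm' pauses execution, here for 15 minutes,
-- to simulate a long-running query.
CREATE PROCEDURE dbo.LongProcess
AS
BEGIN
    WAITFOR DELAY '00:15';
    SELECT 'Done' AS Result; -- hypothetical output row
END
```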
To get the results of the SQL query, add the Azure Automation Get job output shape after the Create job shape.
Testing
To confirm the Logic App and SQL query do not time out, run the trigger manually. After a period of 15 minutes, which is the delay period defined in the stored procedure, the Logic App will complete as shown below.
The output of the stored procedure will be shown in the Get job output shape.
To return a recordset as JSON from the PowerShell script, replace the line $Cmd.ExecuteNonQuery() with the following:
$reader = $Cmd.ExecuteReader()
$table = New-Object "System.Data.DataTable"
$table.Load($reader)

# Export the data table as JSON
$table | Select-Object $table.Columns.ColumnName | ConvertTo-Json
Enjoy…
How would you pass a parameter to the stored procedure?
Hi Miles,
First, you add the variables for your sproc parameters inside the "param" code block of the PS script as shown below.
param(
…
[parameter(Mandatory=$True)]
[string] $MyValue1,
[parameter(Mandatory=$True)]
[int] $MyValue2,
…
)
Then add code similar to the below for each parameter, after the line $Cmd.CommandText = $using:Sproc.
…
$Cmd.CommandText = $using:Sproc
# Add Parameters
$Cmd.Parameters.Add("@MyParam1", [System.Data.SqlDbType]"varchar", 20)
$Cmd.Parameters["@MyParam1"].Value = $MyValue1
$Cmd.Parameters.Add("@MyParam2", [System.Data.SqlDbType]"Int")
$Cmd.Parameters["@MyParam2"].Value = $MyValue2
Cheers.
Thanks mmorar, works like a dream. Is it possible to return results from the proc?
Say I do a select statement at the end of the sp and want to use the results in the Logic App?
Hi Miles, you sure can return a recordset from a sproc. This is described in the Testing section of this blog where I create a data table and fill it using a data reader before returning the data table as JSON.
If you want to reference a parameter’s value within the inline codeblock, I think you need to use $using. So the correct syntax is:
# Add Parameters
$Cmd.Parameters.Add("@MyParam1", [System.Data.SqlDbType]"varchar", 20)
$Cmd.Parameters["@MyParam1"].Value = $using:MyValue1
This is because the InlineScript activity runs a block of commands in a separate, non-workflow session and returns its output to the workflow. While commands in a workflow are sent to Windows Workflow Foundation for processing, commands in an InlineScript block are processed by Windows PowerShell.
If you wish to test your parameter values within the InlineScript activity you can write to output stream like this:
Write-Output "The MyValue1 value supplied was '$using:MyValue1'."
Trap for young players.. make sure you have the correct ASCII quotes in your statements too.. i.e. use straight quotes (") not curly ones (″).