Connecting an Azure WebApp to a SQL Server VM inside a VNet

This article describes connecting an Azure Web App to a SQL Server VM hosted inside an Azure Virtual Network (VNet). A SQL Server VM is typically hosted inside a VNet to isolate it from the internet, with a Network Security Group (NSG) blocking all inbound and outbound internet traffic. In this scenario, connectivity to the SQL Server database is achieved using the new VNet Integration feature of App Service. Using this feature removes the requirement for an App Service Environment (ASE) for the Web App, thus reducing overall hosting costs.

VNet Integration provides private outbound access from your App Service to resources in your VNet; by default it routes traffic destined for the RFC 1918 private address ranges (10.0.0.0/8, 172.16.0.0/12, 192.168.0.0/16) into the VNet.

 

Scenario

A web application hosted in an Azure Web App requires a connection to a SQL Server database hosted on a VM inside a VNet.

The network topology of this scenario is shown below. It uses the Regional VNet Integration option, where both the Web App and the SQL VM are in the same region. Here we have a VNet called Backend with two subnets: IntegDeleg, which is delegated to the VNet Integration feature, and DataStore, which hosts the SQL Server VM.

 

[Image: network topology showing the Web App integrated with the Backend VNet, which contains the IntegDeleg and DataStore subnets]

 

Configuration Walkthrough

The following sequence of steps sets up VNet Integration between a Web App and SQL Server, assuming the SQL Server VM is already hosted inside a VNet.

1. Adding a VNet subnet

2. Provisioning an App Service plan

3. Provisioning a Web App

4. Setting up the VNet Integration

5. Validating SQL Server Security Settings

6. Testing connectivity to the SQL Server

7. Updating the Web App SQL Connection String

 

1. Adding a VNet Subnet

A dedicated subnet used by the VNet Integration feature must be added to the existing VNet hosting the SQL Server VM. The subnet's IP range should accommodate the maximum number of App Service plan instances when fully scaled out, as each instance requires an IP address. For this scenario I will be using a /27 prefix, giving a total range of 32 addresses; however, 5 addresses are used internally by Azure, leaving 27 usable addresses, one per plan instance.

 

[Image: adding the dedicated integration subnet to the Backend VNet with a /27 address range]

 

2. Provisioning an App Service Plan

To use VNet Integration, you will need to provision an App Service plan using the newer V2 scale units. Note that if you are currently using V1 App Services, you will need to provision a new plan on V2 and migrate your apps to it.

To confirm you have selected the newer V2 App Services, the Premium plans should be shown as P1V2, P2V2 and P3V2. I will be using a Standard S1 plan for this scenario, highlighted below.

[Image: App Service plan pricing tiers with the Standard S1 plan highlighted]

 

3. Provisioning a Web App

Create a new Web App and ensure it is in the same region as the VNet. Also ensure you select the App Service plan created above.

[Image: creating the Web App in the same region and App Service plan as above]

 

4. Enable VNet Integration

Under the Web App that was provisioned above, click on the Networking menu item to view the networking options and then click on “Click here to configure” under the VNet Integration heading.

[Image: Networking blade of the Web App showing the VNet Integration configuration link]

 

Once the VNet Configuration form opens, click “Add VNet” to open the Network Feature Status blade. Select the VNet that hosts the SQL Server and then the subnet created earlier for the VNet Integration, then press OK to save the changes.

[Image: Network Feature Status blade with the VNet and integration subnet selected]

 

After you hit OK, the VNet Integration should be connected and ready for testing connectivity. Remember that VNet Integration routes all outbound RFC 1918 traffic from the Web App into your VNet.

 

[Image: VNet Integration showing a connected status]

 

5. Validating SQL Server Security Settings

To reduce the attack surface, ensure the SQL Server can only accept connections from within the VNet. This is done by setting the “SQL connectivity” option to Private (within Virtual Network) under the Security menu of the SQL virtual machine.

 

[Image: SQL connectivity level set to Private (within Virtual Network) on the SQL virtual machine]

 

Also check the NSG attached to the SQL Server VM to ensure there is a rule blocking all outbound internet traffic. If there is an inbound rule called default-allow-sql, as highlighted below, it is advisable to delete it if it is not required. This rule is normally created when the SQL Server VM's security settings allow SQL connectivity over Public (Internet) connections.

 

[Image: NSG inbound rules showing the default-allow-sql rule]

 

6. Testing connectivity

To check connectivity between the Web App and the SQL Server, we can use the tcpping command from a console window. Go to the Web App created previously and click on the Console menu item to open a console window similar to the one below.

[Image: console window of the Web App]

In the console window, type the command tcpping <sql vm private ip address>:1433. All going well, you should get a reply similar to the one below, where 10.0.2.4 is the private IP address of my SQL Server VM using the default port 1433.

[Image: tcpping reply from 10.0.2.4:1433 confirming connectivity]
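If you prefer to verify the path from application code rather than the console, a small probe along the following lines can be used. This is only a sketch: the IP address and port are the example values from this walkthrough, and System.Net.Sockets.TcpClient is used in place of the tcpping tool.

using System;
using System.Net.Sockets;

class TcpProbe
{
    static void Main()
    {
        const string sqlVmPrivateIp = "10.0.2.4"; // private IP of the SQL Server VM (example value)
        const int sqlPort = 1433;                 // default SQL Server port

        try
        {
            using (var client = new TcpClient())
            {
                // Succeeds only if VNet Integration routes the traffic into the VNet.
                client.Connect(sqlVmPrivateIp, sqlPort);
                Console.WriteLine($"Connected to {sqlVmPrivateIp}:{sqlPort}");
            }
        }
        catch (SocketException ex)
        {
            Console.WriteLine($"Could not reach {sqlVmPrivateIp}:{sqlPort} - {ex.Message}");
        }
    }
}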

 

7. Updating the Web App SQL Connection String

Once the connectivity has been verified, the next step is to update the connection string on the Web App to use the private IP address of the SQL Server VM. Typically the connection string should look something like this: Server=10.0.2.4;Database=coreDb;User Id=myusername;Password=mypassword;MultipleActiveResultSets=true
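As a quick sanity check of the updated setting, the connection string can be exercised directly from code. The sketch below is illustrative only: it assumes a .NET app using System.Data.SqlClient, and the hard-coded string stands in for the Web App's connection string setting.

using System;
using System.Data.SqlClient;

class SqlConnectivityCheck
{
    static void Main()
    {
        // In the deployed Web App this would come from the site's connection string settings;
        // it is hard-coded here purely for the sketch.
        const string connectionString =
            "Server=10.0.2.4;Database=coreDb;User Id=myusername;Password=mypassword;MultipleActiveResultSets=true";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("SELECT @@VERSION", connection))
        {
            connection.Open(); // traffic to the private 10.0.2.4 address flows over the VNet Integration
            Console.WriteLine(command.ExecuteScalar());
        }
    }
}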

After the connection string has been updated to use the private IP address, you should be able to test your application. Here I am just adding some new tasks in the TodoList web application and confirming the records are written to the database.

[Image: TodoList web application with new tasks saved to the database]

 

Conclusion

VNet Integration provides an easy and cost-effective way to access databases hosted within a VNet without resorting to a dedicated ASE. In addition, rules on the NSG applied to the SQL Server VM can block all internet traffic and allow access only from RFC 1918 internal addresses.

More information about VNet Integration can be found on the MS docs site https://docs.microsoft.com/en-us/azure/app-service/web-sites-integrate-with-vnet.

Enjoy…

Azure SQL Database Timeout Exceptions

I came across an issue where an Azure SQL database was throwing timeout exceptions like the one below when under heavy load.

Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding. This failure occurred while attempting to connect to the routing destination. The duration spent while attempting to connect to the original server was – [Pre-Login] initialization=2; handshake=23; [Login] initialization=0; authentication=0; [Post-Login] complete=1;

 

The project I was working on involved inserting several thousand rows of data from a single message received from an Azure Service Bus queue. A WebJob reads the messages off the queue and inserts several hundred rows at a time to reduce the transaction size and duration.

I had also used the Enterprise Library Transient Fault Handling Application Block https://msdn.microsoft.com/en-us/library/hh680934(v=pandp.50).aspx, which wrapped the code that calls the insert stored procedure to handle any transient network faults that may occur.

Initially I set the retry delay to 1 second. I was unsure why I was still getting timeout exceptions, as the retry logic should have taken care of them. It was not until I came across this article about transient errors https://azure.microsoft.com/nb-no/documentation/articles/sql-database-connectivity-issues/ that the cause became clear.

The sentence in the article that stood out for me is highlighted below:

We recommend that you delay for 5 seconds before your first retry. Retrying after a delay shorter than 5 seconds risks overwhelming the cloud service. For each subsequent retry the delay should grow exponentially, up to a maximum of 60 seconds.
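Following that guidance, the retry logic was changed to an exponential back-off starting at 5 seconds and capped at 60 seconds. The sketch below shows roughly how this can be configured; it assumes the Enterprise Library 6 version of the Transient Fault Handling block with its SQL detection strategy, and the retry count and delays are illustrative rather than taken from the original project.

using System;
using System.Data.SqlClient;
using Microsoft.Practices.EnterpriseLibrary.TransientFaultHandling;

public static class SqlRetry
{
    // 5 retries with an exponential back-off: 5 second minimum delay, 60 second cap, 2 second delta.
    private static readonly RetryPolicy Policy =
        new RetryPolicy<SqlDatabaseTransientErrorDetectionStrategy>(
            new ExponentialBackoff(5, TimeSpan.FromSeconds(5), TimeSpan.FromSeconds(60), TimeSpan.FromSeconds(2)));

    public static void Execute(string connectionString, Action<SqlConnection> callStoredProcedure)
    {
        Policy.ExecuteAction(() =>
        {
            using (var connection = new SqlConnection(connectionString))
            {
                connection.Open();
                callStoredProcedure(connection); // e.g. run the insert stored procedure here
            }
        });
    }
}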

 

Another issue I had was inserting data into a database table from within a loop. This caused the Azure database to throttle and the Database Transaction Unit (DTU) usage to flat-line at 100%. After reading this article from Microsoft https://azure.microsoft.com/en-us/documentation/articles/sql-database-resource-limits/ I resolved the issue by forcing a 3-second delay after each iteration of the loop, as sketched below. The DTU usage then settled around the 50-70% mark.
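A minimal sketch of that batching-with-delay pattern is shown below; the batch size, delay and row type are placeholders, and the actual insert call (for example the retry-wrapped stored procedure call above) is passed in as a delegate.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;

public static class BatchedInserter
{
    // Inserts the rows in batches and pauses between batches so the database
    // is not hammered continuously. Batch size and delay are illustrative only.
    public static void InsertAll<TRow>(IReadOnlyList<TRow> rows, Action<IEnumerable<TRow>> insertBatch)
    {
        const int batchSize = 500;                          // several hundred rows per call
        var delayBetweenBatches = TimeSpan.FromSeconds(3);  // keeps DTU usage well below 100%

        for (var offset = 0; offset < rows.Count; offset += batchSize)
        {
            var batch = rows.Skip(offset).Take(batchSize);
            insertBatch(batch);                             // e.g. the retry-wrapped stored procedure call

            Thread.Sleep(delayBetweenBatches);              // self-throttle instead of letting SQL throttle us
        }
    }
}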

 

In conclusion, it seems that the short retry period and the loop I had for inserting a large dataset caused the timeout exceptions to occur. Increasing the retry period and adding a delay in the looping function made the timeout exceptions disappear.

Using SQL2008 Data Table Type & BizTalk to insert parent-child rows

I wanted a way to insert a parent row with multiple child rows into a database in a single transaction using BizTalk 2010.

The solution I came up with uses the table data type in SQL 2008 to insert parent-child relationship records, together with the BizTalk WCF-SQL adapter. The data I will be inserting consists of a sales order header record with several sales item lines.

Below are the steps I used to accomplish this.

1. Create the tables for the Sales header record and the Sales Item records.

CREATE TABLE [dbo].[SalesHeader]
(
    [SalesHeader] [int] IDENTITY(1,1) NOT NULL PRIMARY KEY, -- primary key required for the FK reference below
    [OrderNumb] [nvarchar](50) NOT NULL,
    [OrderDate] [datetime] NOT NULL
)

CREATE TABLE [dbo].[SalesItem]
(
    [SaleItem] [int] IDENTITY(1,1) NOT NULL PRIMARY KEY,
    [SalesHeaderID] [int] NOT NULL,
    [ItemCode] [nvarchar](255) NOT NULL,
    [Qty] [int] NOT NULL,
    [Price] [money] NOT NULL
)

go
--add the FK constraint
ALTER TABLE [dbo].[SalesItem]  WITH CHECK ADD  CONSTRAINT [FK_SalesItem_SalesHeader] FOREIGN KEY([SalesHeaderID])
REFERENCES [dbo].[SalesHeader] ([SalesHeader])
ALTER TABLE [dbo].[SalesItem] CHECK CONSTRAINT [FK_SalesItem_SalesHeader]
go

2. Create the table data types in SQL: one for the header record and another for the order lines. These new table types in SQL 2008 allow you to pass multiple rows in as a single parameter. In the old days you would have had to pass the data in as an XML document or use updategrams.

CREATE TYPE [dbo].[SalesHeaderTableType] AS TABLE(
    [OrderNumb] [nvarchar](255) NULL,
    [OrderDate] [datetime] NULL
)
go
CREATE TYPE [dbo].[SalesItemTableType] AS TABLE(
    [ItemCode] [nvarchar](255) NULL,
    [Qty] [int] NULL,
    [Price] [money] NULL
)
go

3. Create the stored procedure to insert the data into the tables, using the table data types as the input parameters. I am also using the scope_identity function to get the parent header id. Note that table-valued parameters can only be passed as READONLY.

CREATE procedure [dbo].[InsertSalesOrder]
(
    @SalesHeader as dbo.SalesHeaderTableType READONLY,
    @SalesItem as dbo.SalesItemTableType READONLY
)
as

declare @SalesHeaderID int

--add the sales header record
insert into dbo.SalesHeader(OrderNumb,OrderDate)
select OrderNumb, OrderDate from @SalesHeader
select @SalesHeaderID = scope_identity()

--add the line items
insert into dbo.SalesItem (SalesHeaderID,ItemCode,Qty,Price)
select @SalesHeaderID,ItemCode,Qty,Price from @SalesItem

return @SalesHeaderID
go
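BizTalk will call this procedure through the WCF-SQL adapter in the steps that follow, but the table types can also be exercised directly from ADO.NET, which is handy for testing the procedure in isolation. The following is only a sketch; the connection string and sample values are assumptions and not part of the BizTalk solution.

using System;
using System.Data;
using System.Data.SqlClient;

class InsertSalesOrderTest
{
    static void Main()
    {
        // Build the header rows; column order must match dbo.SalesHeaderTableType.
        var header = new DataTable();
        header.Columns.Add("OrderNumb", typeof(string));
        header.Columns.Add("OrderDate", typeof(DateTime));
        header.Rows.Add("PO-1001", DateTime.UtcNow);

        // Build the item rows; column order must match dbo.SalesItemTableType.
        var items = new DataTable();
        items.Columns.Add("ItemCode", typeof(string));
        items.Columns.Add("Qty", typeof(int));
        items.Columns.Add("Price", typeof(decimal));
        items.Rows.Add("WIDGET-01", 2, 9.95m);
        items.Rows.Add("WIDGET-02", 1, 19.50m);

        using (var connection = new SqlConnection("Server=.;Database=Sales;Integrated Security=true")) // assumed
        using (var command = new SqlCommand("dbo.InsertSalesOrder", connection) { CommandType = CommandType.StoredProcedure })
        {
            var headerParam = command.Parameters.AddWithValue("@SalesHeader", header);
            headerParam.SqlDbType = SqlDbType.Structured;
            headerParam.TypeName = "dbo.SalesHeaderTableType";

            var itemsParam = command.Parameters.AddWithValue("@SalesItem", items);
            itemsParam.SqlDbType = SqlDbType.Structured;
            itemsParam.TypeName = "dbo.SalesItemTableType";

            var returnValue = command.Parameters.Add("@ReturnValue", SqlDbType.Int);
            returnValue.Direction = ParameterDirection.ReturnValue;

            connection.Open();
            command.ExecuteNonQuery();
            Console.WriteLine($"New SalesHeader id: {returnValue.Value}");
        }
    }
}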

4. The next step is to generate the schema for the stored procedure created above using the “Consume Adapter Service” wizard.

[Image: Consume Adapter Service wizard generating the schemas for the stored procedure]

This will create the following three files: Sales.DataSetSchema.xsd, Sales.Procedure.dbo.xsd and Sales.TableType.dbo.xsd.

5. Create the schema for the input data, which is just a simple Header-Detail data structure as shown below. Note the Detail node's ‘Max Occurs’ is set to unbounded, as there can be multiple detail line items.

[Image: input data schema with a Header record and a repeating Detail record]

6. Create a map from the input data schema to the generated schema for the stored procedure. The functoid in the map connecting the Detail node to the SalesItemTableType node is just a Looping functoid.

[Image: map from the input data schema to the generated stored procedure schema]

7. Create a simple orchestration to write the sales data to the database. The orchestration simply picks up the XML sales data from a pickup folder and inserts the data using a static request/response port. The response is simply the identity id returned from the stored procedure, which is then written to a drop folder as XML.

[Image: orchestration that receives the sales data and calls the stored procedure via a request/response port]

8. Compile and deploy the application to BizTalk. The request/response port is configured to use the WCF-SQL adapter.

[Image: WCF-SQL send port configuration]

The SOAP Action header of the WCF-SQL transport properties is configured as below.

[Image: SOAP Action header settings in the WCF-SQL transport properties]

That’s pretty much it. Just drop a test message and check the tables have been populated.

Below are the results after dropping the test message into the pickup folder a few times.

[Image: rows in the SalesHeader and SalesItem tables after processing the test messages]

 

Enjoy.