Using SQL2008 Data Table Type & BizTalk to insert parent-child rows

I wanted a way to insert a parent row with multiple child rows into a database in a single transaction using BizTalk 2010.

The solution I came up with uses the table data type in SQL 2008 together with the BizTalk WCF-SQL adapter to insert the parent-child records. The data I will be inserting consists of a sales order header record with several sales item lines.

Below are the steps I used to accomplish this.

1. Create the tables for the Sales header record and the Sales Item records.

CREATE TABLE [dbo].[SalesHeader]
(
    [SalesHeader] [int] IDENTITY(1,1) NOT NULL PRIMARY KEY,
    [OrderNumb] [nvarchar](50) NOT NULL,
    [OrderDate] [datetime] NOT NULL
)

CREATE TABLE [dbo].[SalesItem]
(
    [SaleItem] [int] IDENTITY(1,1) NOT NULL,
    [SalesHeaderID] [int] NOT NULL,
    [ItemCode] [nvarchar](255) NOT NULL,
    [Qty] [int] NOT NULL,
    [Price] [money] NOT NULL
)

go
--add the FK constraint
ALTER TABLE [dbo].[SalesItem]  WITH CHECK ADD  CONSTRAINT [FK_SalesItem_SalesHeader] FOREIGN KEY([SalesHeaderID])
REFERENCES [dbo].[SalesHeader] ([SalesHeader])
ALTER TABLE [dbo].[SalesItem] CHECK CONSTRAINT [FK_SalesItem_SalesHeader]
go

2. Create the table data types in SQL: one for the header record and another for the order lines. These table types, new in SQL 2008, allow you to pass multiple rows in as a single parameter. In the old days you would have to pass the data in as an XML document or use updategrams.

CREATE TYPE [dbo].[SalesHeaderTableType] AS TABLE(
    [OrderNumb] [nvarchar](255) NULL,
    [OrderDate] [datetime] NULL
)
go
CREATE TYPE [dbo].[SalesItemTableType] AS TABLE(
    [ItemCode] [nvarchar](255) NULL,
    [Qty] [int] NULL,
    [Price] [money] NULL
)
go

3. Create the stored procedure to insert the data into the tables, using the table data types as the input parameters. I am using the SCOPE_IDENTITY() function to capture the identity of the parent header record. Note that table-valued parameters can only be passed as READONLY.

CREATE procedure [dbo].[InsertSalesOrder]
(
    @SalesHeader as dbo.SalesHeaderTableType READONLY,
    @SalesItem as dbo.SalesItemTableType READONLY
)
as

declare @SalesHeaderID int

--add the sales header record
insert into dbo.SalesHeader(OrderNumb,OrderDate)
select OrderNumb, OrderDate from @SalesHeader
select @SalesHeaderID = scope_identity()

--add the line items
insert into dbo.SalesItem (SalesHeaderID,ItemCode,Qty,Price)
select @SalesHeaderID,ItemCode,Qty,Price from @SalesItem

return @SalesHeaderID
go
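
Outside of BizTalk you can sanity-check the table types and the stored procedure from plain ADO.NET by passing DataTables as structured (table-valued) parameters. This is only a rough test sketch, not part of the BizTalk solution; the connection string and sample values are assumptions.

using System;
using System.Data;
using System.Data.SqlClient;

class InsertSalesOrderTest
{
    static void Main()
    {
        // Column order must match the table type definitions above.
        var header = new DataTable();
        header.Columns.Add("OrderNumb", typeof(string));
        header.Columns.Add("OrderDate", typeof(DateTime));
        header.Rows.Add("SO-1001", DateTime.Today);

        var items = new DataTable();
        items.Columns.Add("ItemCode", typeof(string));
        items.Columns.Add("Qty", typeof(int));
        items.Columns.Add("Price", typeof(decimal));
        items.Rows.Add("WIDGET-A", 2, 9.95m);
        items.Rows.Add("WIDGET-B", 1, 24.50m);

        // Connection string is an assumption - point it at your own database.
        using (var conn = new SqlConnection("Server=.;Database=Sales;Trusted_Connection=yes;"))
        using (var cmd = new SqlCommand("dbo.InsertSalesOrder", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;

            var pHeader = cmd.Parameters.AddWithValue("@SalesHeader", header);
            pHeader.SqlDbType = SqlDbType.Structured;
            pHeader.TypeName = "dbo.SalesHeaderTableType";

            var pItems = cmd.Parameters.AddWithValue("@SalesItem", items);
            pItems.SqlDbType = SqlDbType.Structured;
            pItems.TypeName = "dbo.SalesItemTableType";

            // Capture the identity returned by the procedure's RETURN statement.
            var pReturn = cmd.Parameters.Add("@ReturnValue", SqlDbType.Int);
            pReturn.Direction = ParameterDirection.ReturnValue;

            conn.Open();
            cmd.ExecuteNonQuery();
            Console.WriteLine("New SalesHeader id: {0}", pReturn.Value);
        }
    }
}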

4. The next step is to generate the schema for the stored procedure created above using the “Consume Adapter Service” wizard.

image

This will create the following three files: Sales.DataSetSchema.xsd, Sales.Procedure.dbo.xsd and Sales.TableType.dbo.xsd.

5. Create the schema for the input data, which is just a simple header-detail structure as shown below. Note the Detail node's ‘Max Occurs’ is set to unbounded as there can be multiple detail line items.

image
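
As a rough illustration only (the root and element names below are assumptions based on the table columns, not the actual schema), an input instance would look something like this:

<SalesOrder>
  <Header>
    <OrderNumb>SO-1001</OrderNumb>
    <OrderDate>2011-05-01T00:00:00</OrderDate>
  </Header>
  <Detail>
    <ItemCode>WIDGET-A</ItemCode>
    <Qty>2</Qty>
    <Price>9.95</Price>
  </Detail>
  <Detail>
    <ItemCode>WIDGET-B</ItemCode>
    <Qty>1</Qty>
    <Price>24.50</Price>
  </Detail>
</SalesOrder>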

6. Create a map from the input data schema to the generated schema for the stored procedure. The functoid in the map connecting the Detail node to the SalesItemTableType is just a Looping functoid.

image

7. Create a simple orchestration to write the sales data to the database. This orchestration simply picks up the XML sales data from a pickup folder and inserts it into the database using a static request-response port. The response is the identity value returned from the stored procedure, which is then written to a drop folder as XML.

image

8. Compile and deploy the application to BizTalk. The request-response port is configured to use the WCF-SQL adapter.

image

The SOAP Action header in the WCF-SQL transport properties is configured as shown below.

image

That’s pretty much it. Just drop a test message and check the tables have been populated.

Below are the results after dropping the test message into the pickup folder a few times.

image

 

Enjoy.

Workflow 4.0 Re-Hosting

I had the privilege of working with Microsoft on extending their AppFabric workflow management capabilities.

The AppFabric Starter Kit is designed to be a reference implementation; a secondary objective is to provide sample code that developers can use to manage the life cycle of .NET 4 Workflow Services on Windows Server AppFabric. It does this by providing a complete client-side application that enables design of .NET Workflow Services outside of the Visual Studio IDE, deployment of these services from the client to the AppFabric server, versioning of the deployed service definitions, and finally management of service instances in a custom monitoring console.

You can download it from here http://code.msdn.microsoft.com/Windows-Server-AppFabric-332d1ebc

RunAs command line for launching IE

I have used this on many occasions to launch IE under different user credentials.

Below is the command line for the “RunAs.exe” utility.

C:\Windows\System32\runas.exe /user:mylogname@abc.com /savecred "C:\Program Files\Internet Explorer\iexplore.exe"

I normally create a batch file on my desktop to execute this command line. The /savecred switch prompts for the password the first time and then saves it for later use.

SQL2008 MARS & Parallel Framework in .Net4

I had a scenario where I wanted to query several database tables at once and process the data as fast as possible using a single database connection.

After some thought I decided to look at MARS (Multiple Active Result Sets), which has been available since SQL 2005. This gives the option of opening multiple data readers on the one SQL database connection and allows the data to be read in an interleaved fashion. Although not true parallel reading, it does have the advantage of keeping only one database connection open.

I had also been reading about the new parallel framework available in .NET 4.0. Because I have several data readers open, I can process each reader in parallel (using multi-core processors) and take advantage of the yield points. This is because statements which return rows to the client are allowed to interleave execution before completion while rows are being sent to the client.

Below is the sample prototype code to test my theory.

public void GetUserDetailsSharedConn()
{
    List<mdAddress> addresses;
    List<mdUser> users;

    string dbCon = "Server=.;Database=adventureworks;Trusted_Connection=yes;MultipleActiveResultSets=true;";
    using (SqlConnection conn = new SqlConnection(dbCon))
    {
        string sql1 = "SELECT * FROM [Person].[Address]";
        string sql2 = "SELECT * FROM [Person].[Contact]";

        SqlCommand cmd1 = new SqlCommand(sql1, conn);
        SqlCommand cmd2 = new SqlCommand(sql2, conn);
        cmd1.CommandTimeout = 500;
        cmd2.CommandTimeout = 500;
        conn.Open();
        SqlDataReader dr1 = cmd1.ExecuteReader();
        SqlDataReader dr2 = cmd2.ExecuteReader();

        Parallel.Invoke
            (
                () => PopulateAddressSharedConn(dr1, out addresses),
                () => PopulateUsersSharedConn(dr2, out users)
            );
    }
}

private void PopulateUsersSharedConn(SqlDataReader rd, out List<mdUser> users)
{
    users = new List<mdUser>();
    // Call Read before accessing data.
    while (rd.Read())
    {
        mdUser user = new mdUser();
        user.Title = rd["Title"] == DBNull.Value ? string.Empty : (string)rd["Title"];
        user.FirstName = (string)rd["FirstName"];
        user.LastName = (string)rd["LastName"];
        users.Add(user);
    }
}

private void PopulateAddressSharedConn(SqlDataReader rd, out List<mdAddress> addresses)
{
    addresses = new List<mdAddress>();
    // Call Read before accessing data.
    while (rd.Read())
    {
        mdAddress address = new mdAddress();
        address.Address1 = (string)rd["AddressLine1"];
        address.Address2 = rd["AddressLine2"] == DBNull.Value ? string.Empty : (string)rd["AddressLine2"];
        address.City = (string)rd["City"];
        addresses.Add(address);
    }
}
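
The mdUser and mdAddress model types are not shown in the snippet above; below is a minimal sketch, assuming they are simple POCOs with just the properties the readers populate.

// Assumed model classes - only the properties used by the readers above.
public class mdUser
{
    public string Title { get; set; }
    public string FirstName { get; set; }
    public string LastName { get; set; }
}

public class mdAddress
{
    public string Address1 { get; set; }
    public string Address2 { get; set; }
    public string City { get; set; }
}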

I used the Parallel.Invoke method because I need to wait until I have received all the data back from all the threads.

Below is the SQL Profiler trace after executing the code in quick succession. On the first query set, the ‘Address’ batch yielded until the ‘Contact’ batch completed. However, on the highlighted second query set the ‘Contact’ batch waited until the ‘Address’ batch completed.

image

How to GAC a component at compile time

Rather than manually GACing DLLs after compiling, you can add build events to the project to automatically remove and install the compiled DLLs in the Global Assembly Cache (GAC).

The following command will remove the DLL from the GAC:

  • "%programfiles%\Microsoft SDKs\Windows\v7.0A\Bin\NETFX 4.0 Tools\x64\gacutil.exe" /u "$(TargetName)"

This command will install the DLL into the GAC after it successfully compiles:

  • "%programfiles%\Microsoft SDKs\Windows\v7.0A\Bin\NETFX 4.0 Tools\x64\gacutil.exe" /i "$(TargetPath)"

Note the gacutil.exe file may be at a different path depending on the OS and SDK version installed. The path above is for a Windows 2008 x64 operating system.

Add the scripts above to the Build Events of the project as shown below.

image

 

Enjoy

Writing to the Windows Event Log from a BizTalk Orchestration.

The following code writes a message to the Windows Application event log under the existing event source “BizTalk Server”.

System.Diagnostics.EventLog.WriteEntry("BizTalk Server","This is the error message",System.Diagnostics.EventLogEntryType.Error,0);

 

The code above is normally placed inside an Expression shape in the orchestration, shown below as “Write eventlog”.

image

However, if you need to write to your own event source, you must first create the key in the registry under the following path: “HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\services\eventlog\Application”. The BizTalk host account must have read access to the Security key and read/write access to the new event source key.

I tend to create a new event source for each application I deploy, so I decided to use a PowerShell script to create the key and set the permissions for each of the environments to save time.

Use the script below, replacing the two variable values $EventName and $AccountName with the name of your event source and the account of the host instance running the orchestration.

#Define variables 

[string]$EventName = "BizTalkTest2"

[string]$AccountName = "vm-tpdev01\zx_BTHost1Svc"


#Create the new key

$keyName = "HKLM:\SYSTEM\CurrentControlSet\services\eventlog\Application\" + $EventName

md $keyName


#set the permission

$acl = Get-Acl $keyName


# grant service full control to this key

$person = [System.Security.Principal.NTAccount]$AccountName

$access = [System.Security.AccessControl.RegistryRights]"FullControl"

$inheritance = [System.Security.AccessControl.InheritanceFlags]"None"

$propagation = [System.Security.AccessControl.PropagationFlags]"None"

$type = [System.Security.AccessControl.AccessControlType]"Allow"

$rule = New-Object System.Security.AccessControl.RegistryAccessRule($person,$access,$inheritance,$propagation,$type)

$acl.AddAccessRule($rule)

Set-Acl $keyName $acl


#Grant read-only to BTS Host account

$acl = Get-Acl HKLM:\SYSTEM\CurrentControlSet\services\eventlog\Security

$person = [System.Security.Principal.NTAccount]$AccountName

$access = [System.Security.AccessControl.RegistryRights]"ReadKey"

$inheritance = [System.Security.AccessControl.InheritanceFlags]"None"

$propagation = [System.Security.AccessControl.PropagationFlags]"None"

$type = [System.Security.AccessControl.AccessControlType]"Allow"

$rule = New-Object System.Security.AccessControl.RegistryAccessRule($person,$access,$inheritance,$propagation,$type)

$acl.AddAccessRule($rule)

Set-Acl HKLM:\SYSTEM\CurrentControlSet\services\eventlog\Security $acl


#Check permissions

get-acl  $keyName | Format-Table -wrap

get-acl HKLM:\SYSTEM\CurrentControlSet\services\eventlog\Security | Format-Table -wrap




Next open a PowerShell window as “Run as Administrator” and execute the script to create the key and permissions.
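
If you would rather create the event source from .NET code instead of the registry part of the script, below is a minimal sketch using EventLog.CreateEventSource (run once under an elevated account). Note it does not grant the host account any registry permissions, so the ACL section of the PowerShell script above is still required.

using System.Diagnostics;

class CreateSource
{
    static void Main()
    {
        const string source = "BizTalkTest2";   // matches $EventName in the script above

        // CreateEventSource writes the key under ...\eventlog\Application on our behalf.
        if (!EventLog.SourceExists(source))
        {
            EventLog.CreateEventSource(source, "Application");
        }
    }
}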

Enjoy.

Content based routing with WCF 4.0

This example shows how easy it is to route messages based on message headers using WCF 4.

For this example I built two similar request-response WCF services with different service contract namespaces. One simply adds the two numbers together while the other service subtracts the two numbers and then returns the result.

A Windows Forms application was used as the client; it accepts two numbers and the type of mathematical operation to perform, either addition or subtraction. Once the operation is chosen, the message header is updated with the operation and the message is submitted to the WCF routing service. The routing service reads the header information and routes the message to the correct WCF service, and the result is displayed on the Windows form.
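
As a hedged sketch only (the real contracts live in the Adder and Subtractor services and are not shown here), the SOAP trace later in this post suggests a message contract along these lines, with “Operation” carried as a SOAP header that the router can inspect:

using System.ServiceModel;

[MessageContract(WrapperNamespace = "http://www.examples.com/Adder/2010/10")]
public class EquateRequestMessage
{
    // Read by the router's XPath filters: "Add" or "Sub".
    [MessageHeader(Namespace = "http://www.examples.com/Adder/2010/10")]
    public string Operation { get; set; }

    [MessageBodyMember(Namespace = "http://www.examples.com/Adder/2010/10")]
    public int Value1 { get; set; }

    [MessageBodyMember(Namespace = "http://www.examples.com/Adder/2010/10")]
    public int Value2 { get; set; }
}

// Client side (proxy and operation names are assumptions); the client endpoint points at the routing service:
// var response = client.Equate(new EquateRequestMessage { Operation = "Add", Value1 = 2, Value2 = 3 });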

The main focus of this blog will be the WCF routing service, which I have hosted under IIS. It consists of only two files: the .svc file and the web.config file. Routing is based on an element called “Operation” in the header section of the received SOAP message.

image

The .svc file contains only the following entry, which is very similar to a typical WCF web service except that the service type is the framework's RoutingService.

<%@ ServiceHost Language="C#" Debug="true" Service="System.ServiceModel.Routing.RoutingService,System.ServiceModel.Routing, version=4.0.0.0, Culture=neutral,PublicKeyToken=31bf3856ad364e35"  %>

 

 

The real business is in the web.config file below where the routing table and filters are defined. The key points are the routing and client endpoint sections which I will describe in more detail below.

<?xml version="1.0"?>
<configuration>

  <system.web>
    <compilation debug="true" targetFramework="4.0" />
  </system.web>
  <system.serviceModel>

    <serviceHostingEnvironment multipleSiteBindingsEnabled="true" />
    <services>
      <service behaviorConfiguration="RoutingBehavior" name="System.ServiceModel.Routing.RoutingService">
        <endpoint address="" binding="basicHttpBinding" bindingConfiguration=""
           name="MathRouter" contract="System.ServiceModel.Routing.IRequestReplyRouter" />
      </service>
    </services>

    <!-- Behaviour section -->
    <behaviors>
      <serviceBehaviors>
        <behavior>
          <!-- To avoid disclosing metadata information, set the value below to false and remove the metadata endpoint above before deployment -->
          <serviceMetadata httpGetEnabled="true"/>
          <!-- To receive exception details in faults for debugging purposes, set the value below to true.  Set to false before deployment to avoid disclosing exception information -->
          <serviceDebug includeExceptionDetailInFaults="false"/>
        </behavior>


        <!-- Routing behaviour-->
        <behavior name="RoutingBehavior">
          <routing routeOnHeadersOnly="true" filterTableName="filterTable1" />
          <serviceDebug includeExceptionDetailInFaults="true"/>
          <serviceMetadata httpGetEnabled="true" />
        </behavior>

      </serviceBehaviors>
    </behaviors>


    <!-- Routing section-->
    <routing>
      <namespaceTable>
        <add prefix="ha" namespace="http://www.examples.com/Adder/2010/10"/>
        <add prefix="hs" namespace="http://www.examples.com/Subtractor/2010/10"/>
        <add prefix="s" namespace="http://schemas.xmlsoap.org/soap/envelope/"/>
      </namespaceTable>

      <filters>
        <filter name="AddFilter" filterType ="XPath" filterData="/s:Envelope/s:Header/ha:Operation/text() = 'Add'"/>
        <filter name="SubFilter" filterType="XPath" filterData="/s:Envelope/s:Header/hs:Operation/text() = 'Sub'"/>
      </filters>

      <filterTables>
        <filterTable name="filterTable1" >
          <add filterName="AddFilter" endpointName="Adder" priority="0"/>
          <add filterName="SubFilter" endpointName="Subtractor" priority="0"/>
        </filterTable>
      </filterTables>
    </routing>


    <!-- Client endpoints for the services-->
    <client>
      <endpoint address="http://adderservice/adder.svc" binding="basicHttpBinding" bindingConfiguration="" contract="*" name="Adder"/>
      <endpoint address="http://subtractorservice/subtractor.svc" binding="basicHttpBinding" bindingConfiguration="" contract="*" name="Subtractor"/>
    </client>
  </system.serviceModel>


  <system.webServer>
    <modules runAllManagedModulesForAllRequests="true"/>
  </system.webServer>

</configuration>

 

 

Below is what the SOAP message looks like over the wire for the Adder web service, captured using the WCF Test Client.

<s:Envelope xmlns:s="http://schemas.xmlsoap.org/soap/envelope/">
  <s:Header>
    <Action s:mustUnderstand="1" xmlns="http://schemas.microsoft.com/ws/2005/05/addressing/none">http://EquateRequestMessage/Action</Action>
    <h:Operation i:nil="true" xmlns:i="http://www.w3.org/2001/XMLSchema-instance" xmlns:h="http://www.examples.com/Adder/2010/10" />
  </s:Header>
  <s:Body>
    <EquateRequestMessage xmlns="http://www.examples.com/Adder/2010/10">
      <Value1>0</Value1>
      <Value2>0</Value2>
    </EquateRequestMessage>
  </s:Body>
</s:Envelope>

 

In the routing section below I have used a namespace table to alias the namespaces of the service contracts. The first two entries are aliases for the two WCF services that I built; the third is the standard SOAP envelope namespace.

<namespaceTable>
  <add prefix="ha" namespace="http://www.examples.com/Adder/2010/10"/>
  <add prefix="hs" namespace="http://www.examples.com/Subtractor/2010/10"/>
  <add prefix="s" namespace="http://schemas.xmlsoap.org/soap/envelope/"/>
</namespaceTable>

The other section of interest is the filters. Here I am using XPath queries to read the type of mathematical operation; depending on which filter matches, the message is routed to the endpoint defined in the filter table. Note that the XPath expressions use the namespace aliases.

<filters>
  <filter name="AddFilter" filterType="XPath" filterData="/s:Envelope/s:Header/ha:Operation/text() = 'Add'"/>
  <filter name="SubFilter" filterType="XPath" filterData="/s:Envelope/s:Header/hs:Operation/text() = 'Sub'"/>
</filters>

 

The filter tables determine which endpoint to reroute the message to.

<filterTables>
  <filterTable name="filterTable1">
    <add filterName="AddFilter" endpointName="Adder" priority="0"/>
    <add filterName="SubFilter" endpointName="Subtractor" priority="0"/>
  </filterTable>
</filterTables>

 

These are the endpoint addresses for the WCF services that I had built.

<!-- Client endpoints for the services-->
<client>
  <endpoint address="http://adderservice/adder.svc" binding="basicHttpBinding" bindingConfiguration="" contract="*" name="Adder"/>
  <endpoint address="http://subtractorservice/subtractor.svc" binding="basicHttpBinding" bindingConfiguration="" contract="*" name="Subtractor"/>
</client>

 

That's it: routing using nothing but a configuration file. This is just one example of using the new routing features in WCF 4.0.

Enjoy.

Creating a composite response schema for the BizTalk SQL WCF Adapter

I had a scenario where a BizTalk application requested several different complex XML documents from a SQL database. Returning XML from the stored procedures was chosen because of the relational structure of the data and to avoid having to use multiple maps to get to the final XML structure.

Instead of having to create a new send port for every query because the response schemas were different for each request, I decided to use a composite schema that allowed me to use a single send port.

Below are the steps I used to develop the solution for BTS 2009 and SQL 2008. I will use the sample AdventureWorks database and a simple XML structure for each entity involved.

1. Create the required stored procedures

Create procedure [BTS].[Sel_Contacts_XML]

as

WITH XMLNAMESPACES ('http://BT.Composite.Schemas.WcfSQL/ContactResponse' as nsql1)

select Title,FirstName,LastName,EmailAddress

from Person.Contact

for XML PATH('nsql1:Contact')

 

Create procedure [BTS].[Sel_Products_XML]

as

WITH XMLNAMESPACES ('http://BT.Composite.Schemas.WcfSQL/ProductResponse' as nsql2)

select ProductNumber,Name,ListPrice

from Production.Product

for XML PATH('nsql2:Product')
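
For reference, each procedure returns a stream of row elements shaped roughly like the fragment below (the values are illustrative only). Note that WITH XMLNAMESPACES causes the prefix declaration to be repeated on every row element produced by FOR XML PATH.

<nsql1:Contact xmlns:nsql1="http://BT.Composite.Schemas.WcfSQL/ContactResponse">
  <Title>Mr.</Title>
  <FirstName>Gustavo</FirstName>
  <LastName>Achong</LastName>
  <EmailAddress>gustavo0@adventure-works.com</EmailAddress>
</nsql1:Contact>
<nsql1:Contact xmlns:nsql1="http://BT.Composite.Schemas.WcfSQL/ContactResponse">
  ...
</nsql1:Contact>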

 

2. Create the schemas for each stored procedure in the BizTalk project. Ensure that the target namespace matches the namespace defined in the stored procedure. For this example I called them ContactResponse and ProductResponse.

3. Create a new schema that will import the above schemas. This schema will be used as the common composite response schema for each of the WCF SQL requests.

image

Below is the final composite schema with the imported ContactResponse and ProductResponse schemas. Note that the target namespace and the root element of this composite schema will be specified in the WCF adapter properties later on.

image

4. Create the orchestration to call the stored procedures. Note there is only one send port, but there is a separate operation for each stored procedure. Each operation was created manually, and the name of the operation is important as it is used in the SOAP action header configuration later.

For this demonstration the orchestration first requested all the contacts from the database and wrote the response to a file; a second request was then made for the products and its response was also written to a file.

image

5. Once the application is deployed to BizTalk, the WCF-Custom adapter must be configured.

When using stored procedures that output XML, the action property must be defined as follows: XmlProcedure/<schema_name>/<procedure_name>

The operation name must match the name of the operation inside the orchestration.

image

 

Below is the action mapping. Note the two operations that match the port operations in the orchestration.

<BtsActionMapping xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">

  <Operation Name="SelContacts_OP" Action="XmlProcedure/BTS/Sel_Contacts_XML" />

  <Operation Name="SelProducts_OP" Action="XmlProcedure/BTS/Sel_Products_XML" />

</BtsActionMapping>

Next, the binding properties for stored procedures that output XML (the XML stored procedure root node name and namespace) must be set to the root element name and target namespace of the composite schema that was created.

image

 

This is a sample output extract from the first request, for the contacts.

image

And this is the output from the second request for the products.

image