Cross-Application Subscription

Applications in BizTalk 2006 are, to me, just logical buckets that allow you to place like-minded artifacts (maps, schemas, orchestrations etc.) together, so that your environment looks clean and tidy when viewed through the BizTalk Admin Console and can be easily managed.

Although resources (BizTalk assemblies) and ports can be moved between applications, they cannot exist in multiple applications at the same time*. So, is it possible to have one application respond to events caused by another application? Yes it is, through cross-application subscription.

Create the most basic of subscriptions: a receive port/location and a send port subscribed to the receive port on the ReceivePortName property; stop the send port (but leave it enlisted) and push a message through. The message will suspend because it is waiting for the send port to be started, but this gives us an opportunity to look at the context properties of the message:

Looking through the context properties (both promoted and non-promoted), there is no property for the originating application, nor any ‘special’ properties that only the internals of BizTalk understand, such as those seen in messages traversing request/response ports.

Given that applications are these ‘logical buckets’ and that they sit above the Message Box and the subscription engine, there is therefore nothing stopping us from subscribing, in ‘Application B’, to messages that originated in ‘Application A’. This means we can quite easily perform cross-application subscription:

In a new application, create a send port that subscribes to the ReceivePortName property as we did above and start it; drop a new message in and you should now receive two output copies of the message, processed by different applications!
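In Admin Console terms, the subscription on each of the two send ports is just the same simple filter on the receive port name; something like the following (‘MyReceivePort’ is a stand-in for whatever your receive port is actually called):

```
Property:  BTS.ReceivePortName
Operator:  ==
Value:     MyReceivePort
```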

There is at least one drawback I can see here: Orchestrations in Application B still won’t be able to be bound to Receive Ports in Application A as the Admin Console won’t let you (without some brave database hacking…); it would however work if Orchestration receive ports were directly bound to the Message Box.

I can see that this approach could have beneficial uses across business-streams, or an ESB, where the on-ramp functionality lives in one application and end-point subscriptions live in other, separate applications.

Any other drawbacks or use-cases that I’m missing here?

* this isn’t necessarily true, as applications can reference other applications.

Error when expanding ‘Application’ in the BizTalk 2006 Admin Console

Originally posted by Nick Heppleston on his blog at: http://www.modhul.com/2008/08/05/error-when-expanding-application-in-the-biztalk-2006-admin-console/

Update: There have been a few comments on this post (see below) and it would appear that this problem is due to either WMI or the DTS service. If you are encountering this problem, I suggest that you investigate restarting these two services before going for the full-blown server restart. Thanks to Mike, Hema and Per for their help in resolving this one.
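If you want to try the service restarts before reaching for a reboot, the WMI restart looks something like this from a command prompt on the affected server (locate the other service in services.msc, as its display name varies by SQL Server version):

```
:: Restart WMI (Windows Management Instrumentation)
net stop winmgmt
net start winmgmt
```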

Yesterday we encountered a worrying blocking error when expanding the Application node under a BizTalk Group in the 2006 Administration Console:

Failed to create ApplicationNode (Microsoft.BizTalk.SnapIn.Framework)

Additional Information:
Exception has been thrown by the target of an invocation. (mscorlib)
Failed to enable constraints. One or more rows contain values violating non-null, unique or foreign-key constraints. (System.Data)

All of the servers in the Group were affected; given that we had been performing a deployment of new assemblies just before the error occurred, this suggested to me that there was some corruption in the management database.

One possible resolution to the problem – repairing the BizTalk installation – was discussed on MSDN, but that didn’t appear to work in our case; instead, restarting all of the servers in the Group (3 servers in total) resolved the problem.

Not a satisfactory resolution, but one which worked nonetheless; we’ll be keeping a close eye on the environment in future….

SOAP Adapter – Confusing error with an untrusted Root Certificate Authority

I recently wrote about problems with two-factor authentication and installing a certificate in the correct certificate store. Well, we’ve had similar problems, this time securing the transport link on a SOAP adapter, which resulted in the following error:

Could not establish trust relationship for the SSL/TLS secure channel with authority ‘your machine name’. The underlying connection was closed: Could not establish trust relationship for the SSL/TLS secure channel.

After some digging, it turned out that the problem was due to an untrusted certificate issuer (Certificate Authority) for the cert we were trying to use. Adding the issuer’s root certificate to the list of Trusted Root Certification Authorities resolved the issue.
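One quick way to do that on the BizTalk server – assuming you have the CA’s certificate exported to a .cer file; the filename below is just an example – is certutil, although the Certificates MMC snap-in works equally well:

```
:: Add the issuing CA's certificate to the local machine's Trusted Root store
certutil -addstore Root ca-root.cer
```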

Correctly Installing a Certificate for Two-Factor Authentication via the HTTP Send Adapter

I spent several hours last week banging my head against the proverbial brick wall while trying to identify the correct certificate store to be used for authentication by the HTTP Send Adapter – as the answer is a little obscure on the interweb, I’m posting the information here to help any weary BizTalk traveller in the future….

First, the obligatory background: the HTTP Send Adapter can use a public key certificate to identify itself as part of a two-factor authentication process when accessing a website (two-factor authentication ensures you are who you say you are by asking for something you know (i.e. a username/password) and something you have (i.e. an RSA SecurID token or a public key certificate)). The certificate used to perform this authentication is identified by the ‘SSL Client Certificate Thumbprint’ value on the Authentication tab of the ‘HTTP Transport Properties’ dialog box:

The adapter looks in the Personal certificate store of the user under which the BizTalk Windows service is running, as detailed on the HTTP Send Adapter page on MSDN. Note that this is different to the certificates used on Send and Receive Ports and on Hosts; the settings for these can be found on MSDN at Certificate Stores that BizTalk Server 2006 Uses (which, annoyingly, doesn’t document the HTTP adapter).

More information on using two-factor authentication can be found at WindowsSecurity.com; however, that article focuses on an end-to-end solution for securing a corporate web-site and is not very BizTalk-specific – probably a good excuse to write a post on it in the future!

Why Archive and Purge when you can just Purge?

In BizTalk 2004 SP2, the BizTalk Team gave us the Archive and Purge SQL Server Maintenance Job for managing the size of the Tracking Database. This was a great tool and really took away some of the admin headaches in maintaining this particular database.

The job allows administrators to archive old tracking data and verify the integrity of the archive before purging it from the live database (for detailed instructions, see the MSDN website). This is a good, pro-active practice for the health of any BizTalk environment, in that:

  • BizTalkDTADb database growth is constantly checked, allowing the TDDS service to run effectively, thereby maintaining a healthy Message Box.
  • Maintaining the size of the BizTalkDTADb data and log files ensures that the database doesn’t just eat all of your available disk space.
  • The tracking data can be restored in a dedicated OLAP environment, allowing reports to be run without affecting the live BizTalk environment.

However, imagine a scenario where you don’t want the archived data – you simply want to purge. Where that’s the case, you can either manually run a truncate on the BizTalkDTADb (see my posts detailing how to do this under BizTalk 2004 or BizTalk 2006 – there are some subtle differences), or run the hidden admin gem dtasp_PurgeTrackingDatabase (the stored procedure used by dtasp_ArchiveAndPurgeTrackingDatabase, which just purges) on a scheduled basis, so that you no longer need to worry about manually purging.

Configuring the dtasp_PurgeTrackingDatabase Stored Procedure

The Purge stored procedure is used in a very similar manner to dtasp_ArchiveAndPurgeTrackingDatabase, taking four parameters:

  • Live Hours – Any completed instance older than the live hours + live days…
  • Live Days – …will be deleted along with all associated data.
  • Hard Days – all data older than this will be deleted. The time interval specified here should be greater than the live window of data.
  • Last Backup – UTC Datetime of the last backup. When set to NULL, data is not purged from the database.

As an example of its usage, if you wanted to purge any tracking data older than two hours and hard-delete any data older than one day, you would use the following T-SQL:


-- Change to the BizTalk Tracking database.
USE BizTalkDTADb
GO

-- Prune tracking data older than two hours.
DECLARE @dtLastBackup DATETIME
SET @dtLastBackup = GetUTCDate()
EXEC dtasp_PurgeTrackingDatabase 2, 0, 1, @dtLastBackup

The @dtLastBackup parameter exists to ensure that records that have not been backed-up are not deleted by the purge procedure; we set it to the current UTC date/time so that, whatever live hours/days you specify, records are deleted. I’m not too sure why the development team exposed this as a parameter: the procedure is a wrapper around dtasp_PurgeTrackingDatabase_Internal (which is also called by the dtasp_ArchiveAndPurgeTrackingDatabase procedure), and since the value is always defaulted to the current UTC date/time during purges, it could simply have been set inside the wrapper.

One other thing to note is that the wrapper script also modifies the ‘prune before’ date (the date built from the live hours/days specified): ten minutes are subtracted from it to ensure some redundancy in the remaining data. In the example above, rather than keeping two hours of data, there will in fact be two hours and ten minutes once the code has run.

Using the dtasp_PurgeTrackingDatabase Stored Procedure

To start using this procedure, I would suggest that you first truncate your database – this ensures that the first run of the job doesn’t take several hours – and then configure a SQL Server Agent job to run it every X hours/days, depending on your environmental requirements.
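The job step itself can then be little more than the purge call shown earlier; something along these lines (the two-hour/one-day intervals are purely illustrative – choose values that match your own tracking-retention requirements):

```sql
-- SQL Server Agent job step, scheduled (say) hourly against the tracking database:
-- keep two hours of completed instances, hard-delete anything over one day.
USE BizTalkDTADb

DECLARE @dtLastBackup DATETIME
SET @dtLastBackup = GetUTCDate()
EXEC dtasp_PurgeTrackingDatabase 2, 0, 1, @dtLastBackup
```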

Further details about the dtasp_PurgeTrackingDatabase stored procedure can be found online at MSDN.

Truncating the BizTalk 2006 Tracking Database

In Truncating the BizTalk 2004 Tracking Database I discussed how to truncate the tracking database in BizTalk 2004. Over on the BizTalk Gurus forums, user Nick Busy wanted to do the same thing for BizTalk Server 2006 – he’s kindly allowed me to repost his instructions here for the community:

0. Before you start, ensure you have database admin privileges on the database

1. Stop all BizTalk Server Host Instances

2. Take a full backup of the BizTalkDTADb database (just in case)

3. Script out the following views so that they can be recreated later (MANDATORY):

dbo.dtav_ServiceFacts
dbo.dtav_MessageFacts
dbo.dtav_FindMessageFacts

4. Run SQL script:

use BizTalkDTADb
GO

-- Drop the views (before you perform this, ensure you have taken copies of these views!)
-- Unfortunately this is necessary for SQL 2000, but you can skip it for SQL 2005
Drop View dbo.dtav_ServiceFacts
Drop View dbo.dtav_MessageFacts
Drop View dbo.dtav_FindMessageFacts
Go

-- Truncate the necessary tables
Truncate Table dta_CallChain
Truncate Table dta_DebugTrace
Truncate Table dta_MessageInOutEvents

Truncate Table dta_ServiceInstanceExceptions
Truncate Table dta_ServiceInstances

Truncate Table Tracking_Fragments1
Truncate Table Tracking_Parts1
Truncate Table Tracking_Spool1

Truncate Table dta_MessageFieldValues

-- End of the script

5. Update statistics on BizTalkDTADb database

-- Update statistics
exec sp_updatestats

6. Run the saved scripts (see step 3) to recreate the dropped views from your own environment.

7. Shrink the BizTalkDTADb database (sometimes this doesn’t work from the GUI, so using a SQL command will help)

-- Shrink the database
dbcc shrinkdatabase (BizTalkDTADb, 10)

8. Start BizTalk Server Host Instances

9. Configure and enable SQL Agent job “DTA Purge and Archive” (to avoid over-growing the database in the future)

P.S. The script above does not truncate Rule Engine related tables.

Thanks Nick, much appreciated.

XmlAssembler TargetCharset Property Error in BizTalk 2006 & 2006 R2

Update: Tomas Restrepo added a comment to this post in which he explains why this particular ‘feature’ is present – I think it’s worth repeating here:

…the problem was that the Target Charset property of the assembler actually gets written as two separate elements when serialized to the pipeline XML file, and both must be present for the assembler to actually figure out the encoding to use… So what happens is that the metadata in the actual design time properties for the assembler component only actually represents one of those elements in the serialized format, and this is what the per-instance pipeline configuration dialog uses to ask the user to enter the values. So in essence, the dialog only allows you to edit one of the values and not the other, so you always end up with the component incorrectly configured.

Thanks Tomas!

While researching a post about message encoding in BizTalk, I came across this Microsoft Knowledge Base article regarding the TargetCharset property of the XmlAssembler component – it would appear that if the property is set through the Admin Console, it won’t take effect. This is strange, because the bug appears to have been fixed in BizTalk 2004 SP1, but it somehow made it back into all versions of BizTalk 2006 and 2006 R2. Furthermore, for BizTalk 2006 Microsoft only offer a work-around rather than a hotfix, which is frustrating.

So, what happens when you attempt to set the TargetCharset property? The default output when using the XmlTransmit pipeline – without the TargetCharset property set – is a UTF-8 encoded Xml message:

UTF-8 Encoded Message

If you change the TargetCharset property on the XmlAssembler component of the default XmlTransmit pipeline in the BizTalk Admin Console (as shown below), the change is not picked up and the same UTF-8 encoded message (as above) is returned; in this case, we’re trying to change it to UTF-16:

XmlAssembler-AdminConsole-TargetCharsetProperty

To change the encoding, you must create a custom pipeline and use the XmlAssembler with the TargetCharset property set appropriately, as follows:

XmlAssembler Custom Pipline Component TargetCharset Property

Using this new custom pipeline with the TargetCharset property correctly configured to output UTF-16 encoded messages, we achieve our desired encoding, as follows:

UTF-16 Encoded Message

Encoding can also be achieved by setting the XMLNORM.TargetCharset context property on the outgoing message in an Orchestration Message Assignment shape, as follows:

<MessageName>(XMLNORM.TargetCharset) = "UTF-16";

Note, this testing was performed on BizTalk 2006 R2.

Testing Pipeline Components – The Pipeline Testing Library

I’ve spent some time recently working almost exclusively with BizTalk pipeline components and took the opportunity to use the excellent Pipeline Testing Library produced by Tomas Restrepo (more info is available here and here). If Microsoft were writing this post, they would probably say that ‘It Rocks!’. I’m British, so I’ll be a little more reserved and simply say that, imho, this is the single best tool for developing and testing custom pipeline components, period.

The library provides a framework for unit-testing components in the standard NUnit or MSTest manner, without having to resort to Pipeline.exe or do a full deployment to a BizTalk instance – neither of which is particularly quick or agile. Furthermore, the library lets you seamlessly integrate with MSBuild, without jumping through lots of hoops to get a similar effect.

The library allows you to programmatically create pipelines, create instances of components and assign them to their respective stages (including your own custom components as well as the familiar out-of-the-box components), or load an existing BizTalk pipeline without needing to deploy the supporting BizTalk project.

A very simple example from my recent testing looks something like the following:

Pipeline Testing Example - Xml Disassembler

Here, I create an instance of my archiving component (the component under test) and the Xml disassembler (specifying the document type it should accept). I create a new receive pipeline and assign the archiving component to the decode stage and the XmlDasm to the disassemble stage. After creating a sample message using the MessageHelper class, I add context properties that mimic the adapter I’m trying to test (in this case the FILE adapter) and execute the pipeline. Finally, I check that the archiving component wrote the file to the archive in the expected location, with the expected filename.
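The snippet image hasn’t survived, but the test reads roughly as follows – a sketch from memory of the library’s API, in which my ArchivingComponent, its ArchivePath property and the MySchemas.Order schema type are all hypothetical stand-ins for your own artifacts:

```csharp
using System.IO;
using Microsoft.BizTalk.Component;          // XmlDasmComp
using Microsoft.BizTalk.Message.Interop;    // IBaseMessage
using NUnit.Framework;
using Winterdom.BizTalk.PipelineTesting;    // PipelineFactory, MessageHelper

[TestFixture]
public class ArchivingComponentTests
{
    [Test]
    public void DecodeStage_WritesMessageToArchive()
    {
        // The component under test (hypothetical) and the stock Xml disassembler
        ArchivingComponent archiver = new ArchivingComponent();
        archiver.ArchivePath = @"C:\Archive";
        XmlDasmComp xmlDasm = new XmlDasmComp();

        // Build an empty receive pipeline and place each component in its stage
        ReceivePipelineWrapper pipeline = PipelineFactory.CreateEmptyReceivePipeline();
        pipeline.AddComponent(archiver, PipelineStage.Decode);
        pipeline.AddComponent(xmlDasm, PipelineStage.Disassemble);
        pipeline.AddDocSpec(typeof(MySchemas.Order));   // document type to accept

        // Create a sample message and mimic the FILE adapter's context properties
        IBaseMessage msg = MessageHelper.CreateFromString(
            "<ns0:Order xmlns:ns0='http://mydomain/schemas/order'/>");
        msg.Context.Write("ReceivedFileName",
            "http://schemas.microsoft.com/BizTalk/2003/file-properties",
            @"C:\Drop\Order-001.xml");

        pipeline.Execute(msg);

        // Did the archiving component write the file where we expected?
        Assert.IsTrue(File.Exists(@"C:\Archive\Order-001.xml"));
    }
}
```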

If you want to test using Flat-Files, simply use a flat-file disassembler and specify the schema:

Pipeline Testing Example - Flat-File Disassembler Snippet
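That snippet image is also missing; the change amounts to swapping the disassembler and the schema type (again, names indicative and the flat-file schema hypothetical):

```csharp
// Flat-file variant: same pipeline shape, but with the flat-file disassembler
FFDasmComp ffDasm = new FFDasmComp();

ReceivePipelineWrapper pipeline = PipelineFactory.CreateEmptyReceivePipeline();
pipeline.AddComponent(archiver, PipelineStage.Decode);
pipeline.AddComponent(ffDasm, PipelineStage.Disassemble);
pipeline.AddDocSpec(typeof(MySchemas.OrderFlatFile));  // flat-file schema type
```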

Powerful stuff – no more long-winded MSBuild tasks to deploy your solution, configure several possible adapters and push messages to those adapters over different protocols, simply to test a custom pipeline component on your development workstation (and lose 5-10 minutes in the process). Don’t get me wrong, all this should be happening in your smoke and UAT environments, but for the developer (attempting) to work in an agile manner on a BizTalk project, Tomas’ Pipeline Testing library is priceless.

Yet more Recommendations for Organisations Getting Started with BizTalk

Richard Seroter has a great article entitled You’ve Just Bought BizTalk Server. Congrats. Now What? over on his blog, in which he lists his top ten recommendations for organizations getting started with BizTalk. He highlights some important points – which I wholeheartedly endorse – including: committing to, and enforcing, naming standards; setting up a standard release-management plan; defining source code and server access control; and identifying your disaster recovery / archive / purge strategy.

However, I’d like to add a few more to this list (I know it was meant to be a ‘top 10’ but I believe these are equally important):

  • Invest in a good quality infrastructure – Even though this will probably blow your departmental budget, invest in a good quality infrastructure for your BizTalk environment: ensure you have SQL Server clustered in (at least) an active/passive configuration – active/active if you can handle some performance degradation in the event of a node failure – and cluster your Enterprise SSO environment; a good quality underlying SAN (ensuring that your BizTalk SQL Server data and log files reside on separate LUNs); and sufficient BizTalk servers to accommodate a receive tier, a processing tier (orchestrations) and a send tier, which will mean six BizTalk servers minimum in a good enterprise setup. When buying servers, always purchase hardware with extra capacity (even if it is initially unused), such as additional CPU sockets, giving you the option of a quick win when you need to scale up your existing hardware before starting to scale out. Given that BizTalk 2006, BizTalk 2006 R2 and SQL Server 2005 can run almost natively on 64-bit hardware, go for 64-bit CPUs. Finally, invest in good quality sys-admins who know what they are doing with all this expensive kit.
  • Invest in smoke, test and staging environments – Don’t deploy from your dev environment straight into live. Investing in smoke, test and staging environments is worth the cost incurred when compared to the cost of your live environment going down because the latest solution release or hotfix introduced some unexpected bug or configuration error. Furthermore, these environments don’t have to be expensive: using virtualisation such as VMware’s ESX Server or Microsoft Virtual Server 2005, environments can be provisioned relatively cheaply and on demand – several environments can run on the same hardware, for example.
  • Get your DBAs involved in the project – BizTalk has SQL Server at its core, and that environment needs to be performing well to ensure QoS for your BizTalk environment. Richard touched on this point in his article, however it can’t be stressed enough: although BizTalk is to a great extent a ‘black box’, DBAs need to ensure that the SQL Server jobs are running correctly, that databases (such as the tracking database) are regularly archived or purged, that valid backups are being created, and that logs are shipped to your DR site and successfully restored. One tip I have learned the hard way: engage your DBAs early, otherwise they will become disillusioned and the dev team will end up nursing your databases.
  • Read the BizTalk blogs and newsgroups – There are now more than 60 active BizTalk bloggers who have either solved, or can offer advice on, a number of BizTalk development, deployment and operational problems that you will encounter during the life of a BizTalk solution. You can download my OPML file listing 60+ active BizTalk bloggers, or go to the BizTalk Community Blogs via Syndication on the BizTalk Gurus website. If these resources don’t offer the information you need, try posting to the BizTalk newsgroups; if you’re an MSDN subscriber, you’re also entitled to use the managed newsgroups, which (I believe) are guaranteed a response from Microsoft.

Subversion & BizTalk: Setting Committer Auto-Properties

In my last post, I discussed how to configure the version control system Subversion to correctly handle BizTalk files. Although I talked about how to set the correct file-type properties and how to mass-update files already in the repository, I neglected to detail how to set those properties when files are first added to a repository.

As a committer, you won’t want to keep updating your newly added files with Subversion properties every time you add or import a file. To resolve this problem, Subversion uses the concept of ‘auto-properties’ – properties that are automatically set on a new file (based on its file type) when it is first added to the repository. Auto-properties can be set either in the Subversion configuration file or in the Windows Registry.

Auto-Properties in the Subversion Configuration File

To set auto-properties in the Subversion configuration file, open the config file

%USERPROFILE%\Application Data\Subversion\config

and add lines similar to the following under the [auto-props] section of the file:

[auto-props]
*.btm = svn:mime-type=text/xml
*.btp = svn:mime-type=text/xml
*.odx = svn:mime-type=text/xml
*.xsd = svn:mime-type=text/xml;svn:eol-style=native
*.xml = svn:mime-type=text/xml;svn:eol-style=native
*.xsn = svn:mime-type=application/octet-stream

This sets the MIME-types defined in my earlier post for several common BizTalk file types; note that multiple properties can be set per file type, each separated by a semi-colon (e.g. *.xsd and *.xml file-types).

In order to enable these auto-properties, enable-auto-props under [miscellany] must be set to ‘yes’:

[miscellany]
enable-auto-props = yes

Auto-Properties in the Registry

The same auto-properties settings can be achieved through the Windows Registry. A sample .reg file is shown below that sets the same auto-properties detailed earlier:

REGEDIT4

[HKEY_CURRENT_USER\Software\Tigris.org\Subversion\Config\auto-props]
"*.btm"="svn:mime-type=text/xml"
"*.btp"="svn:mime-type=text/xml"
"*.odx"="svn:mime-type=text/xml"
"*.xsd"="svn:mime-type=text/xml;svn:eol-style=native"
"*.xml"="svn:mime-type=text/xml;svn:eol-style=native"
"*.xsn"="svn:mime-type=application/octet-stream"

[HKEY_CURRENT_USER\Software\Tigris.org\Subversion\Config\miscellany]
"enable-auto-props"="yes"

For a full list of auto-properties that can be set, see the Special Properties section of the Subversion Online Help.