Announcing the New BizTalk Message Archiving Pipeline Component

It gives me great pleasure to announce Version 1.0 of the BizTalk Message Archiving Pipeline Component and my new venture, Atomic-Scope.

The component allows you to archive Xml, flat-file, EDI and binary messages as they pass through receive and send pipelines, without complex ‘archiving’ Send Port configurations or additional load on the MessageBox database, and with zero developer time – just plug in, configure and use! Archive filenames are created at runtime from Context Properties written to the message by receive adapters, custom properties defined in your solution, or the standard Send Port ‘macros’. A free 14-day trial is available – download your copy now.

This new version of the component builds on the excellent feedback received following the launch of the original version on CodePlex and has spent many months in development – this release is a complete re-write and incorporates many of the community-requested features, including:

  • The ability to enable and disable the component at runtime;
  • The promotion of the archive filename into the Message Context for use in orchestrations, Send Port filters etc.;
  • Full streaming support, even for messages that fail when being processed by downstream components (such as flat-file or EDI messages);
  • Support for a wide-range of adapter Context-Properties ‘out-of-the-box’ and custom, user-defined Context-Properties;
  • Native 32- and 64-bit versions;
  • Support for BizTalk Server 2006, BizTalk Server 2006 R2 and BizTalk Server 2009.

The component is also backed by our business-class Enterprise Support Agreement. You can also read more about Atomic-Scope and see how we are supported by the Microsoft BizSpark Programme.

A short video is available which gives an overview of how the component works when archiving files in receive pipelines (opens in a new window):

I plan on posting more information about the Message Archiving Component over the coming weeks. In the meantime, why not download the component and try it out yourself?

Handling a Web-Service Null Response: The ‘CreateBodyPart’ Pipeline Component

Originally posted by Nick Heppleston at:

Today I’m releasing a small component that addresses an interesting problem I’ve never come across before – a null response from a web-service.

The web-service in question is provided by a third-party and unfortunately cannot be changed. Their WSDL defines the root response element with maxOccurs="1" and minOccurs="0", which allows a null response to be returned. So instead of a result element with no child nodes, as I would expect:

<?xml version="1.0" encoding="UTF-8"?>
<soapenv:Envelope xmlns:soapenv="..." xmlns:xsd="..." xmlns:xsi="...">
    <PendingSalesOrdersResult xmlns="..." />

we’re receiving an empty response (note the missing element inside the <GetPendingSalesOrdersResponse> element):

<?xml version="1.0" encoding="UTF-8"?>
<soapenv:Envelope xmlns:soapenv="..." xmlns:xsd="..." xmlns:xsi="...">

When the SOAP adapter receives this null response, it creates a message but neglects to create a body part – I presume this is because there is nothing to create it from. As a result, the message fails in both an XmlReceive and PassThruReceive pipeline; when using the PassThruReceive pipeline, the XLANG/s engine throws the following error when it tries to read the body part:

The XLANG/s message has no part at index ‘0’.  The total number of parts found in the message is ‘0’. If you expect a multipart message, check that the pipeline supports multipart messages such as MIME.

To address this problem, I’ve created a very simple receive pipeline component which lives on the receive side of the web-service solicit-response port and interrogates a message to determine whether it has a message body. If a message body is not present, it creates one based on the component properties (which mimic the expected – valid – response from the web-service), allowing the response message to be passed into the Message Box without issue. If a message body is present, it passes the message through unchanged.

The CreateBodyPart Pipeline Component

The component lives in the Decode stage of the receive pipeline and simply checks for the presence of a message body. If the body part is null, a new message part is created using the component properties BodyPartName and BodyPartContent and assigned as the body part of the message, before being passed onto the next stage of the pipeline. The BodyPartName and BodyPartContent properties specify the name of the body part and the Xml you want the part to contain – given that the content of the message is likely to be small-ish (i.e. <PendingSalesOrdersResult xmlns=”…” /> in the example above).
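The core of that check can be sketched against the BizTalk pipeline interop API as follows. This is illustrative code based on the description above, not the shipped implementation; BodyPartName and BodyPartContent stand in for the component's configured properties:

```csharp
using System.IO;
using System.Text;
using Microsoft.BizTalk.Component.Interop;
using Microsoft.BizTalk.Message.Interop;

// Sketch only: create a body part from configured content when the adapter didn't.
public IBaseMessage Execute(IPipelineContext pContext, IBaseMessage inmsg)
{
    if (inmsg.BodyPart == null)
    {
        // No body part was created by the adapter - build one from the configured
        // replacement content (e.g. an empty <PendingSalesOrdersResult /> element).
        IBaseMessagePart newPart = pContext.GetMessageFactory().CreateMessagePart();
        newPart.Data = new MemoryStream(Encoding.UTF8.GetBytes(BodyPartContent));
        inmsg.AddPart(BodyPartName, newPart, true);   // true = make this the body part
    }

    // A message that already has a body part passes through unchanged
    return inmsg;
}
```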

There are a few enhancements that I can envisage for this component already, including the ability to specify whether the part should be a body part – you might just want to create extra parts on the fly, not just a body part; and also the ability to specify the encoding and character set being used with your part content. However, I’ll leave these enhancements for a later day.

You can find binaries and source for the component at Codeplex:

As a final note, my client is currently running BizTalk 2006 (not R2) and we are running on the out-of-box SOAP adapter. Does anyone know whether this issue is a problem in the WCF-SOAP adapter or the WSE adapter?


Writing Unit Tests for Pipeline Components with NCover

I discovered NCover over the weekend and wow, am I a fan!

NCover is a code coverage tool which, to quote the website, “helps you test intelligently by revealing which tests haven’t been written yet.” In layman’s terms, NCover graphically shows you which lines of code were not touched during the execution of your unit tests, allowing you to create tests accordingly to achieve 100% code coverage, as shown below:

Pipeline Component Testing

Armed with Tomas Restrepo’s excellent Pipeline Testing library, we can now comprehensively test our Pipeline Components:

1. Develop unit tests to test your pipeline and ensure that the tests execute and are successful;

2. Invoke your testing framework from the command line to check that your tests will run correctly in the NCover environment (I’m using NUnit here, but you could also invoke VSTS):

"C:Program FilesNUnit 2.4.8binnunit-console.exe" BizTalkMessageArchivingComponent.Tests.dll

Which should produce the following output on the command line:

3. With our unit tests successfully executing, we can wrap the NUnit invocation in NCover loveliness, which will inspect our code coverage while those tests execute:

ncover.console "C:\Program Files\NUnit 2.4.8\bin\nunit-console.exe" BizTalkMessageArchivingComponent.Tests.dll //ea Modhul.BizTalk.Pipelines.ArchiveMessages.Attributes.NCoverExcludeCoverage //et "Winterdom.*;BizTalkMessageArchivingComponent.Tests.*"

Producing command-line output similar to the following (notice the NUnit tests running between the ***Program Output*** and ***End Program Output*** text):

NCover has now determined the code coverage based on your unit tests and produces a Coverage.Xml file. This file contains the coverage information and can be loaded into the NCover.Explorer tool to produce a VS-like environment that displays the lines of code that were touched and un-touched by your unit tests.

4. Load the NCover.Explorer tool and open the Coverage.Xml file generated above; you will be presented with a screen detailing your code coverage. In the example below, you can see that the PerformImmediateCopy and StreamOnReadEvent methods do not have full code coverage – both contain code which was not executed in our unit tests:

Clicking on one of the methods in the tree view loads the offending method, displaying the lines which were not executed by our tests in red; lines that were executed are displayed in blue:

Based on this information, we can now create tests to cater for these exceptions, ensuring we have 100% code coverage.

NCover is an excellent tool and although it isn’t free, I personally think it’s a must-have for any developer’s tool-kit.

BizTalk 2006 Message Archiving Component – More Efficient Stream Handling

Update: The Message Archiving Component is now maintained by Atomic-Scope, visit for more information.

I’ve re-worked the BizTalk 2006 Message Archiving Component as a true streaming pipeline component – utilising a forward only eventing stream – to more efficiently use process resources when archiving large files, as suggested by Mikael Håkansson. The updated version is on CodePlex as version 0.3.

Writing of the archive file is now accomplished during the read-event of the FOE stream, essentially spooling the archive file to disk (or network share) as the stream is read by downstream components or the Messaging Agent itself. The BizTalk process memory footprint using this new code-base (while testing with a large (10Mb) message) is 90% more efficient than previous versions.
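The idea can be illustrated with a minimal wrapper stream (the class and member names here are mine, purely illustrative, not the component's internals): every byte handed to a downstream reader is also written to the archive stream, so the archive is spooled as a side-effect of the normal pipeline read.

```csharp
using System;
using System.IO;

// A forward-only wrapper stream that spools every byte to an archive
// as downstream components read it - an illustrative sketch only.
public class ArchivingReadStream : Stream
{
    private readonly Stream _source;
    private readonly Stream _archive;

    public ArchivingReadStream(Stream source, Stream archive)
    {
        _source = source;
        _archive = archive;
    }

    public override int Read(byte[] buffer, int offset, int count)
    {
        int bytesRead = _source.Read(buffer, offset, count);
        if (bytesRead > 0)
            _archive.Write(buffer, offset, bytesRead);   // spool to the archive as we go
        return bytesRead;
    }

    public override bool CanRead { get { return true; } }
    public override bool CanSeek { get { return false; } }
    public override bool CanWrite { get { return false; } }
    public override long Length { get { throw new NotSupportedException(); } }
    public override long Position
    {
        get { return _source.Position; }
        set { throw new NotSupportedException(); }
    }
    public override void Flush() { _archive.Flush(); }
    public override long Seek(long offset, SeekOrigin origin) { throw new NotSupportedException(); }
    public override void SetLength(long value) { throw new NotSupportedException(); }
    public override void Write(byte[] buffer, int offset, int count) { throw new NotSupportedException(); }
}
```

Because only one buffer's worth of data is in memory at any time, the archive cost stays flat regardless of message size.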

All other functionality remains unchanged.

Announcing a new BizTalk 2006 Message Archiving Component

Update: The Message Archiving Component is now maintained by Atomic-Scope, visit for more information.

A couple of days ago I uploaded a revised version of my BizTalk 2006 Message Archiving Component onto CodePlex – version 0.2 can be downloaded from the CodePlex website at

The component works with BizTalk 2006 & BizTalk 2006 R2 and allows messages to be written to the file system (either locally or over a network share) for archiving, using message context-properties to define the archive path. It is written in a streaming fashion, is designed for large message consumption and can handle Xml, binary and flat-files.

The component can be executed in either the Decode (receive pipeline) or Encode (send pipeline) stages – when used on the receive side, the original message can be archived before BizTalk has started to manipulate the message; on the send side, the final message before delivery via the send adapter can be captured. If message archiving fails, the message will still continue downstream.

As per the previous version, the archive path is constructed using ‘macros’ (e.g. %ReceivePortName%) which relate to message Context Properties. In the previous version, these macros were hard-coded into the component, so changing or adding a macro was painful, requiring a rebuild of the code base. The new version continues the theme of macros, however they have moved into an Xml configuration file, which can be configured at run-time, allowing any context-property (either a known BizTalk context property or a custom adapter/pipeline property) to be used. An example of the configuration file is shown below:

Macro Definition File Sample
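As an illustration of the idea (the element and attribute names here are my own, not the component's actual schema), such a file might map macro names to context property names and namespaces:

```xml
<MacroDefinitions>
  <Macro Name="ReceivePortName"
         PropertyName="ReceivePortName"
         PropertyNamespace="http://schemas.microsoft.com/BizTalk/2003/system-properties" />
  <Macro Name="ReceiveLocationName"
         PropertyName="ReceiveLocationName"
         PropertyNamespace="http://schemas.microsoft.com/BizTalk/2003/system-properties" />
  <Macro Name="ReceivedFileNameFILE"
         PropertyName="ReceivedFileName"
         PropertyNamespace="http://schemas.microsoft.com/BizTalk/2003/file-properties" />
</MacroDefinitions>
```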

These macros are used when defining the archive file path in the component configuration (in the BizTalk Server Admin Console) – see the screenshot below. The archive file path is constructed using the ArchiveDirectory and ArchiveFilename properties. Macros can be repeated and/or used in both properties at the same time (however, I can think of scenarios where you may not wish to use, say, ReceivePortName in the archive filename).

Note: When configuring the component properties, you are specifying the macro name, not the context property name. This is useful when you are using a single configuration file and want to read the ReceivedFileName context property from both the FTP and FILE adapters, for example (both use the same property name, but different property namespaces) – in this scenario, you would call the macros ‘ReceivedFileNameFILE’ and ‘ReceivedFileNameFTP’ to avoid ambiguity.

The example Macro Definition Xml file shown above details macros for the System Context Properties ReceivePortName and ReceiveLocationName. To use these macros in the pipeline component, assign them in the Archive Directory or Archive Filename property, as below.

Component Configuration - Macro Examples

These macros (%ReceivePortName%, %ReceiveLocationName% and %MessageID%) will be expanded to their real-world values as a message is delivered via the port, writing the archived message to (for example):

..\Test Xml Dasm Decode Stage\Test Xml Dasm Decode Stage Location\364a4dee-c698-4ddd-af14-d44e66f3bd5d.xml

In the example above we’re using the %MessageID% macro to build the archive filename. This is a special macro as the value does not come from the message Context, but is taken directly from the message itself, similar to using %MessageID%.xml in a Send FILE adapter.

The OverwriteExistingFile property indicates to the component whether an existing file should be overwritten if, once the archive path has been constructed from the Context Property macros, the full archive path resolves to an existing file. If set to true, the existing file will be overwritten; if false, the new file will be written to the same directory, but with an additional GUID before the file extension.
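The OverwriteExistingFile behaviour described above can be sketched as follows (the class and method names are mine, for illustration only):

```csharp
using System;
using System.IO;

// Illustrative sketch: when overwriting is disabled and the resolved archive
// path already exists, insert a GUID before the file extension.
public static class ArchivePathHelper
{
    public static string ResolveArchivePath(string path, bool overwriteExistingFile)
    {
        if (overwriteExistingFile || !File.Exists(path))
            return path;

        string directory = Path.GetDirectoryName(path);
        string name = Path.GetFileNameWithoutExtension(path);
        string extension = Path.GetExtension(path);   // includes the leading '.'

        // e.g. Order.xml -> Order_364a4dee-....xml
        return Path.Combine(directory, name + "_" + Guid.NewGuid().ToString() + extension);
    }
}
```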

Finally, the MacroDefinitionsFile property defines the location of the Xml configuration file detailing the macro Context Property substitutions. A different file can be used per pipeline, or a single global file can be used. A more detailed example than the one shown above – which includes macros for commonly used Message Tracking, System and File adapter properties – is available to download.

With regards to permissions, you will need to ensure that the user running the BizTalk service has read permissions on the macro definitions Xml configuration file and read/write permissions on the root of the archive directory.

Although the component is production-ready, it is not yet feature-complete. If you are using the previous version, please download a copy and try it out – and let me have any feedback via the comments on this post, or via the Issue Tracker pages of the CodePlex project.

Testing Pipeline Components – The Pipeline Testing Library

I’ve spent some time recently working almost exclusively with BizTalk pipeline components and took the opportunity to use the excellent Pipeline Testing Library produced by Tomas Restrepo (more info is available here and here). If Microsoft were writing this post, they would probably say that ‘It Rocks!’. I’m British, so I’ll be a little more reserved and simply say that, imho, this is the single best tool for developing and testing custom pipeline components, period.

The library provides a framework for unit-testing components in the standard NUnit or MSTest manner, without having to go to Pipeline.exe or doing a full deployment to a BizTalk instance itself – both of which aren’t particularly quick or agile. Furthermore, the library lets you seamlessly integrate with MSBuild, without jumping through lots of hoops to get a similar effect.

The library allows you to programmatically create pipelines, create instances of components and assign them to their respective stages (including your own custom components and the familiar out-of-the-box components), or load an existing BizTalk pipeline without the need to deploy the supporting BizTalk project.

A very simple example from my recent testing looks something like the following:

Pipeline Testing Example - Xml Disassembler

Here, I’m creating an instance of my archiving component (the component I’m testing) and the Xml Disassembler (specifying the document type it should accept). I create a new receive pipeline and assign the archiving component to the decode stage and the XmlDasm to the disassembler stage. After creating a sample message using the MessageHelper helper, I add context properties that mimic the adapter I’m trying to test (in this case the FILE adapter) and execute the pipeline. I finally test to check that the archiving component wrote the file to the archive in the expected location, with the expected filename.
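The steps above can be sketched in code. This is written from memory of the Pipeline Testing library's API – names such as CreateEmptyReceivePipeline, AddComponent and MessageHelper.CreateFromString should be verified against the library itself – and ArchivingComponent, the schema type and the sample Xml are placeholders for your own solution:

```csharp
using Microsoft.BizTalk.Component;
using Microsoft.BizTalk.Message.Interop;
using Winterdom.BizTalk.PipelineTesting;

// Build an empty receive pipeline and assign components to stages
ReceivePipelineWrapper pipeline = PipelineFactory.CreateEmptyReceivePipeline();

ArchivingComponent archiver = new ArchivingComponent();   // the component under test
pipeline.AddComponent(archiver, PipelineStage.Decode);

XmlDasmComp xmlDasm = new XmlDasmComp();
pipeline.AddComponent(xmlDasm, PipelineStage.Disassemble);
pipeline.AddDocSpec(typeof(TestSchemas.SimpleDocument));  // document type to accept

// Create a sample message and mimic the FILE adapter's context properties
IBaseMessage input = MessageHelper.CreateFromString("<SimpleDocument xmlns='...' />");
input.Context.Write("ReceivedFileName",
    "http://schemas.microsoft.com/BizTalk/2003/file-properties",
    @"C:\Drop\TestArchive.xml");

// Execute the pipeline, then assert that the archive file was written
// to the expected location with the expected filename.
MessageCollection output = pipeline.Execute(input);
```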

If you want to test using Flat-Files, simply use a flat-file disassembler and specify the schema:

Pipeline Testing Example - Flat-File Disassembler Snippet

Powerful stuff – no more long-winded MSBuild tasks to deploy your solution, configure several possible adapters and push messages to those adapters over different protocols, simply to test a custom pipeline component on your development workstation (and lose 5-10 minutes in the process). Don’t get me wrong, all this should be happening in your smoke and UAT environments, but for the developer (attempting) to work in an agile manner on a BizTalk project, Tomas’ Pipeline Testing library is priceless.

Generating Microsoft Word Documents Natively using BizTalk 2006

In this post I’ll discuss how to generate Word 2007 documents natively from BizTalk 2006 using the Office Open Xml System.IO.Packaging API recently released by the Microsoft Office Team under .Net 3.0.


Unless you’ve lived under a rock during the last year, you’ll know that the Office Open XML (OOXML) format is the new Xml format for the Office 2007 suite, namely Word, Excel and PowerPoint. OOXML uses a file package conforming to the Open Packaging Convention and contains a number of individual files that form the basis of the document; the package is then zipped to reduce the overall size of the resulting file (either a .docx, .xlsx or .pptx).

Generating Word Documents – Overview

Generating a Word document is relatively simple and only requires a custom send pipeline component that generates our OOXML package.

In this post I will be using a Sales Report scenario, generating a Word document from the output of a fictional ERP system; to that extent, I’ll also be mapping from a fictional sales summary Xml message to the required OOXML format before generating the final .docx. The final document will look something like the following (note that the areas in red will be replaced with content from our ERP sales summary message – click on the image for a larger version):

Proposed Sales Summary Document - Small

Before we start, I need to present a quick crash-course in the structure of OOXML packages. A minimal OOXML WordprocessingML document contains three parts: a part that defines the main document body, usually called document.xml; a part detailing the Content Types (which indicates to the consumer what type of content can be expected in the package); and a Relationships part (which ties the document parts and Content Types together). When using the System.IO.Packaging API we only need to concern ourselves with the main document body – the API takes care of creating the Content Types and Relationship parts. It’s this feature of the API that allows us to create Word documents in BizTalk – all we need to do is create the Xml for the main document and squirt it at a custom pipeline component which does the packaging stuff for us using the API.

Note that the structure of an OOXML document is outside of the scope of this post (but a good understanding is fundamental when working with these documents) and I would recommend that you read the excellent Open Xml Markup Explained by Wouter van Vugt.

Generating Word Documents – The ‘Main’ Document

The main document body (i.e. document.xml) is the only part that is generated in the BizTalk solution. We don’t actually create a file called document.xml – the packaging API does this for us – instead we simply create a message that conforms to the OOXML schema and pass this into the custom Send pipeline.

In our scenario, we are generating a Sales Report document for distribution to the finance department – we will receive an Xml sales summary document from our fictional ERP system that resembles the following:

<?xml version="1.0" encoding="utf-8"?>
<ns0:SalesReport xmlns:ns0="">
    <Author>Nick Heppleston</Author>
    <SalesStart>10th January 2008</SalesStart>
    <SalesEnd>17th January 2008</SalesEnd>

which needs to be mapped into our OOXML main document body message (I think the layout of the OOXML message is pretty self explanatory, however I would point you at Open Xml Markup Explained if you’re after a more detailed explanation):

<?xml version="1.0" encoding="utf-8" ?>
<w:document xmlns:w="">
                    <w:b />
                    <w:sz w:val="52" />
                        <w:rFonts w:ascii="Cambria" />
                <w:t xml:space="preserve">Sales Summary for: </w:t>
                <w:t>Nick Heppleston</w:t>
                    <w:i />
                    <w:sz w:val="52" />
                        <w:rFonts w:ascii="Cambria" />
                        <w:spacing w:val="15" />
                        <w:color w:val="48FDB2" />
                <w:t xml:space="preserve">Sales from: </w:t>
                <w:t>10th January 2008</w:t>
                <w:t xml:space="preserve"> to </w:t>
                <w:t>17th January 2008</w:t>
                <w:t xml:space="preserve"> - </w:t>
                <w:t xml:space="preserve">Contact: </w:t>
                <w:t>Nick Heppleston</w:t>
                <w:t xml:space="preserve"> | </w:t>

This transformation can be performed anywhere: in the sample solution I’ve put the map on the Receive Port. Also, because I can’t think of a way to generate this type of message using a standard BizTalk map – how do I graphically say ‘map from this source node to this destination node’ when all of the destination nodes simply repeat themselves? – I am using custom XSLT to drive the map.

Note: I’ve yet to find a satisfactory XSD for the WordprocessingML markup, so the solution contains an OOXML schema that was automagically generated from the above destination format. I’m working on sourcing the schema – I have a number of ‘feelers’ out with the Office Team and I hope to be able to provide a reference in the next couple of days.

With our Sales Summary message now mapped and in the necessary OOXML format, we can send it to the custom pipeline / pipeline component for it to do its work and generate our .docx package.

Generating Word Documents – The Custom Pipeline Component

The custom pipeline component is relatively simple. It uses the System.IO.Packaging API introduced in .Net 3.0, which can be found in windowsbase.dll (C:\Program Files\Reference Assemblies\Microsoft\Framework\v3.0\windowsbase.dll); full documentation regarding this namespace can be found online at MSDN. The API is invoked in the pipeline component Execute() method as follows:

   1:  public IBaseMessage Execute(IPipelineContext pc, IBaseMessage inmsg)
   2:  {
   3:      XmlDocument InputXmlDocument = new XmlDocument();
   4:      InputXmlDocument.XmlResolver = null;
   6:      // Define bodypart instances
   7:      IBaseMessagePart bodyPart = inmsg.BodyPart;
   9:      // Define stream instances
  10:      Stream originalStream = null;
  11:      MemoryStream odfStream = new MemoryStream();
  13:      string docContentType = "application/vnd.openxmlformats-officedocument.wordprocessingml.document.main+xml";
  14:      string docRelationshipType = "";
  16:      if (null != bodyPart)
  17:      {
   18:          // Get a reference to the original message stream
  19:          originalStream = bodyPart.Data;
  21:          // Check that the original stream is not null
  22:          if (null != originalStream)
  23:          {
  24:              // Load the original message stream into our input xml document 
  25:              // to be used as the basis of the OOXML document.
  26:              InputXmlDocument.Load(originalStream);
  28:              try
  29:              {
  30:                  // Create a new OOXML package
  31:                  Package pkg = Package.Open(odfStream, FileMode.Create, FileAccess.ReadWrite);
  33:                  // Create a Uri for the document part
  34:                  Uri docPartUri = new Uri("/word/document.xml", UriKind.Relative);
  36:                  // Create the document part
  37:                  PackagePart mainPart = pkg.CreatePart(docPartUri, docContentType);
  39:                  // Add the data from the Xml Document to the document part
  40:                  Stream partStream = mainPart.GetStream(FileMode.Create, FileAccess.Write);
  41:                  InputXmlDocument.Save(partStream);
  42:                  partStream.Close();
  43:                  pkg.Flush();
  45:                  // Create the relationship between the part and the package.
  46:                  PackageRelationship pkgRelationship = pkg.CreateRelationship(docPartUri, TargetMode.Internal, docRelationshipType, "rId1");
  48:                  // Flush the changes then close the package
  49:                  pkg.Flush();
  50:                  pkg.Close();
  51:              }
  52:              catch (Exception Ex)
  53:              {
  54:                  EventLog.WriteEntry("BizTalk 2006 - Build ODF Package", "Error encountered building the package: " + Ex.Message, EventLogEntryType.Error);
  55:              }
  57:              try
  58:              {
  59:                  // Rewind the new OOXML stream
  60:                  odfStream.Seek(0, System.IO.SeekOrigin.Begin);
  61:              }
  62:              catch (Exception Ex)
  63:              {
  64:                  EventLog.WriteEntry("BizTalk 2006 - Build ODF Package", "Error encountered rewinding the stream: " + Ex.Message, EventLogEntryType.Error);
  65:              }
  66:              finally
  67:              {
  68:                  // Add the new OOXML stream into the return message.
  69:                  bodyPart.Data = odfStream;
  70:                  pc.ResourceTracker.AddResource(odfStream);
  71:              }
  72:          }
  73:      }
  75:      return inmsg;
  76:  }

A quick overview of the code is as follows:

  • Line 26: We load a copy of the original message data part stream into an XmlDocument to use as the main document body (the document.xml) when building the package.
  • Line 31: Create a new OOXML package in a new MemoryStream.
  • Line 34: Create a URI to the main document body (calling it document.xml).
  • Line 37: Create the main document body part (using docPartUri and docContentType).
  • Lines 40 – 43: Save the contents of our BizTalk message to the main document body part (the message we created in the BizTalk map).
  • Line 46: Create a package relationship for the main document body part.
  • Lines 60 & 69 – 70: Rewind the MemoryStream and overwrite the original message with our new OOXML package.
  • Line 75: We return the message containing the OOXML package.

The final message is sent via the FILE adapter and written to the file system. The end result looks like this (click on the image for a larger version):

Finished Sales Summary Document - Small

The complete solution – containing the pipeline component and a BizTalk proof-of-concept project – is available to download and can be found archived in the downloads area of this blog. Grab a copy and try it out for yourself; comments and suggestions are welcome.


In this post I hope I’ve shown you the tools necessary to generate Word 2007 documents natively using BizTalk 2006. The example I presented is extremely simple and does not include styles, themes, images, headers and footers, font tables etc. that would exist in a real-life document, but I hope it has presented a starting-point for your own custom development.

These same techniques can also be applied to create Excel spreadsheets or PowerPoint presentations – in fact, while writing this post I have had a number of ideas for enhancements to the pipeline component and will endeavour to create a CodePlex project if I can find the time.


This work is licensed under a Creative Commons Attribution 2.5 License – you can use it commercially and modify it as necessary, but you must give the original author credit. Furthermore, sample projects and code are provided “AS IS” with no warranty. Click the image below to view further details of the licence.

Creative Commons License

Message Archiving Receive Pipeline Component

Update: The Message Archiving Component is now maintained by Atomic-Scope, visit for more information.

Today I’m releasing a message archive receive pipeline component (BTS06 only) – I’ve been meaning to release the code for this component for some time, but never got around to it.

The component is designed to be used in the decode stage of a receive pipeline to archive a message before any BizTalk processing (with the exception of the adapter) is started. An output directory and filename can be specified in the component properties, with the following macros (which will be expanded to their real-world values) available in the filename:

  • %MessageID% – Expands to the unique Message Id;
  • %ReceivedFileName% – Expands to the received filename, less the extension;

the remainder all expand to their context property values:

  • %InboundTransportLocation%
  • %InterchangeID%
  • %ReceivePortID%
  • %ReceivePortName%
  • %AuthenticationRequiredOnReceivePort%
  • %InboundTransportType%
  • %MessageExchangePattern%
  • %PortName%
  • %ReceivePipelineID%

For example, specifying an archive filename of:


with a file called ‘TestArchive.xml’ will produce an output filename of:


The VS project (including full source-code) can be found in the downloads area under ‘Message Archiving Receive Pipeline Component’ (a 14Kb zip file). Details of how to make the compiled binary available in a BizTalk project can be found here.
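The macro expansion the component performs can be sketched as a simple token replacement over the configured filename (the class and method names here are mine, purely illustrative of the idea):

```csharp
using System.Collections.Generic;
using System.Text.RegularExpressions;

// Illustrative sketch: expand %MacroName% tokens from a set of resolved
// context property values; unknown macros are left untouched.
public static class MacroExpander
{
    public static string Expand(string pattern, IDictionary<string, string> contextValues)
    {
        return Regex.Replace(pattern, "%(\\w+)%", match =>
        {
            string value;
            return contextValues.TryGetValue(match.Groups[1].Value, out value)
                ? value
                : match.Value;   // leave unknown macros as-is
        });
    }
}
```

For example, expanding "%ReceivePortName%_%MessageID%.xml" against the message's context values yields the final archive filename.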

Custom Disassembler Component – Remember the Context Properties!

This is really a post to remind myself not to be quite so stupid in future….

I’ve been developing a custom pipeline disassembling component over the last few days to take a single purchase order from a customer and split into multiple orders based on (potentially) multiple delivery addresses. The component worked well through the Pipeline.exe tool but on testing using messaging (Send Port subscribing to messages from a Receive Port configured with the new component) I kept receiving:

The "FILE" adapter is suspending a message coming from Source URL:"D:\BizTalk Messages\Inbound\Sales Orders\BAA Sales Orders\Order Splitter Test Input\*.xml". Details:"Could not find a matching subscription for the message. ".

In my wisdom, I’d managed to forget that BizTalk is based on a pub/sub model – if I don’t copy the input message properties onto the resultant output messages and promote the message type, the Send Port might struggle to subscribe…

So, note to self – copy context properties:

// Iterate through inbound message context properties and copy to the new outbound message
for (int i = 0; i < pInMsg.Context.CountProperties; i++)
{
    string Name, Namespace;
    object PropertyValue = pInMsg.Context.ReadAt(i, out Name, out Namespace);

    // If the property has been promoted, respect the settings
    if (pInMsg.Context.IsPromoted(Name, Namespace))
        outMsg.Context.Promote(Name, Namespace, PropertyValue);
    else
        outMsg.Context.Write(Name, Namespace, PropertyValue);
}

and promote the message type:

// Promote the MessageType property to ensure subscription by orchestrations
string systemPropertiesNamespace = "http://schemas.microsoft.com/BizTalk/2003/system-properties";
string messageType = ""; // solution-specific, e.g. "http://mynamespace#RootElement"
outMsg.Context.Promote("MessageType", systemPropertiesNamespace, messageType);


Add DOCTYPE Declaration Component

Our current infrastructure supports over 200 interfaces with external trading partners in the B2B role. Unfortunately, some of the cXML partners cannot handle Xml Namespaces and insist on receiving DOCTYPEs.

Although DOCTYPEs can be added through the BizTalk mapper, we needed a mechanism to remove the Xml Namespace from the resultant message before passing it through to a client. As a result, I created the Add DOCTYPE Declaration encoding component (send pipeline) to add DOCTYPEs and remove Xml namespaces.

Encode Component on Pipeline

The component uses an Xml Document object to parse the message and add the DOCTYPE declaration. In addition, the Xml Namespace can be removed if required using the [boolean] ‘RemoveNamespace’ property (if this is set to false, the namespace will remain in the final message).
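The two jobs can be sketched as a simplified, string-in/string-out helper (the real component operates on the pipeline message stream, and the DOCTYPE values come from its properties; the helper and parameter names here are mine). This sketch assumes a default, unprefixed namespace:

```csharp
using System.Text.RegularExpressions;
using System.Xml;

// Illustrative sketch: strip the default xmlns declaration if requested,
// then insert a DOCTYPE declaration before the root element.
public static class DoctypeHelper
{
    public static string AddDoctype(string xml, string rootElement, string systemId, bool removeNamespace)
    {
        XmlDocument doc = new XmlDocument();
        doc.LoadXml(xml);

        if (removeNamespace)
        {
            // XmlDocument nodes cannot change namespace in place, so rebuild the
            // document from its markup with the default xmlns declaration removed.
            string stripped = Regex.Replace(doc.OuterXml, @"\sxmlns=""[^""]*""", "");
            doc.LoadXml(stripped);
        }

        // Insert the DOCTYPE declaration before the root element
        XmlDocumentType doctype = doc.CreateDocumentType(rootElement, null, systemId, null);
        doc.InsertBefore(doctype, doc.DocumentElement);

        return doc.OuterXml;
    }
}
```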

Add DOCTYPE Declaration Component Properties

A project containing source-code and binaries for BizTalk 2004 can be found in the downloads area of this blog, or via this link.

This really is the first release of the component (think of it as beta) and although it is ready for production, there are a number of enhancements I want to make, including:

  1. Produce a version in .Net 2.0 for BizTalk 2006 (our production environment is currently 2004, hence the current code is targeted to .Net 1.1);
  2. For BizTalk 2006, have the component configurable at run-time rather than design-time;
  3. Identify the root element programmatically, rather than through a component property;
  4. Remove DOCTYPE addition from the component and use the mapper to add the declaration;
  5. Create a receive pipeline component to add an Xml Namespace based on a DOCTYPE declaration;

I hope this component will be of use, either in your projects or as a starter for enhancements. I will continue to update the code and release enhancements through this blog.

Happy BizTalking!