    The opinions expressed herein are my own personal opinions and do not represent my employer's view in any way.

    Architecture Exploration with VSTA2010

    Did some playing today with the Architecture Explorer in 2010, which generates some nice interactive, navigable graphs from code or assemblies.

    I used the Architecture Explorer on different kinds of application architectures, actually different kinds of UI implementations, just as I did a while ago in the post Rosario - Architecture Explorer's Dependency Matrix and in the much older post Visual Studio 2008 Features: Code Metrics.

    You can analyze relationships at the assembly, class and namespace level. The first implementation doesn't show much magic [it’s just a drag-and-drop WinForms implementation, see previous posts] and, as you can probably see in the graph below, there is a form with two classes, Employees and Employee, directly coupled to this form.


    The namespace graph doesn't add much information to this; it only visualizes the namespaces, and it just looks like a layer diagram.


    The default class diagram view gets a little complex due to all the methods and properties in the code-behind file. You can also see the Employee class with all its properties and the very thin Employees class with only two methods.


    This is another view, where I selected all the interesting classes and methods.
    At the top middle of the graph you can very easily see the interaction between the employee form, cmdLookup and the Employees lookup method.


    But this isn’t a very interesting implementation, so let’s look at a kind of Model-View-Controller implementation. If you have read the “Rosario - Architecture Explorer's Dependency Matrix” post, you already know that this one has a cyclic dependency.
    The namespace view nicely shows this dependency and the three namespaces used in this solution.


    With a more detailed view we can find the methods responsible for this dependency [click to enlarge]. Actually, all the methods at the top have a dependency on both the form and the controller.


    Next, a more complex implementation: the adapter pattern together with the command pattern and the observer pattern. In this kind of application the Architecture Explorer becomes valuable. During courses I give all these implementations to the attendees, with the assignment to figure out how each implementation works and what its pros and cons are. This one I often have to explain, just because they only have the code; with a diagram like this [this is the namespace view] it will be easy for them.


    In more detail it looks like this: a more loosely coupled solution, where the controller doesn't know the view.


    And finally, an MVP implementation with interfaces…



    So, that's it… Is it useful? Yes, definitely, especially with more complex solutions. Architecture validation is hard, but the Architecture Explorer isn't meant for validation; it’s meant for discovery, and that works, although the default views contain a lot of noise from the code-behind files, properties and settings files. With manual selection of classes, though, we get some nice overviews. I do think there are preferred filter methods and practices when you want to find something specific; at this time I select every possible option until I find something that fits my needs… have to work on that.

    Posted: Nov 23 2008, 19:15 by Clemens | Comments (0) RSS comment feed |
    Filed under:

    VS2010 UML Profiles update… C#4.0 dynamic type in DTE

    It’s not really a UML Profile update; it’s actually more about Visual Studio add-ins.

    The UML Profile implementation and functionality in the previous post was built with the CTP12 bits, and it doesn’t work anymore in the currently available CTP. Why not? VS add-ins won’t work anymore… so you need a different approach to fire your commands. I explained the use of add-ins with the UML diagrams in the post “Rosario – Create Custom Team Architect UML Diagram-MenuItems”, but for some reason add-ins won’t work anymore. So, back to the progression providers; see the post “Rosario – Create your own Progression Provider”.

    A nice thing, found while digging into the problem, is that they are already using the C# 4.0 dynamic type in the DTE namespace…


    The use of this property should be something like this. I didn’t test it, so I don’t know if it is going to work [I found the get_CommandBars method using Reflector]:
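    A rough sketch of what that could look like [untested; it assumes `dte` is an EnvDTE.DTE instance inside a running Visual Studio, and that CommandBars is now exposed as dynamic]:

```csharp
// Sketch only: with CommandBars exposed as the C# 4.0 dynamic type,
// the members late-bind at runtime and the explicit cast to the
// interop CommandBars type is no longer needed.
dynamic commandBars = dte.CommandBars;

// Late-bound indexer and property access, resolved at runtime:
dynamic menuBar = commandBars["MenuBar"];
System.Diagnostics.Debug.WriteLine(menuBar.Name);
```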


    In the previous version this command looked like this [it won’t compile anymore in VS2010]:


    This is the old DTE interface…

    dynamic command (old)


    Anyway, not that important… although I had planned some fun use of UML Profiles for today; that will have to wait now.
    For more information on C# 4.0 dynamic types see www.microsoftpdc.com and Channel 9.

    Posted: Nov 20 2008, 17:17 by Clemens | Comments (5) RSS comment feed |
    Filed under:

    VSTA 2010 – UML Profiles [make your own…]

    Has anyone seen that there is the capability to attach UML Profiles to the UML diagrams in the new CTP?
    Just take a look at the property window of the component diagram, the use case diagram or any other diagram, and you can see that there are three out-of-the-box ‘trial’ profiles available.


    You can find this ‘Profiles’ property on the design surface, and after selecting one [or more] profiles you get a new property on the shape named ‘Stereotypes’, with values depending on the profile you selected.


    That’s it for Profiles and Stereotypes in the VSTA diagrams. It doesn’t seem that valuable at first sight, but you can do magic with it.

    UML Profiles
    First a brief explanation of UML Profiles, or better and less time-consuming, a copy-paste from Wikipedia and the OMG [which is a little fuzzier than the wiki].

    A profile in the Unified Modeling Language provides a generic extension mechanism for customizing UML models for particular domains and platforms. Profiles are defined using stereotypes, tagged values, and constraints that are applied to specific model elements, such as Classes, Attributes, Operations, and Activities. A Profile is a collection of such extensions that collectively customize UML for a particular domain (e.g., aerospace, healthcare, financial) or platform (J2EE, .NET).   

      • Identifies a subset of the UML metamodel.
      • Specifies “well-formedness rules” beyond those specified by the identified subset of the UML metamodel.
        “Well-formedness rule” is a term used in the normative UML metamodel specification to describe a set of constraints written in UML’s Object Constraint Language (OCL) that contributes to the definition of a metamodel element.
      • Specifies “standard elements” beyond those specified by the identified subset of the UML metamodel.
        “Standard element” is a term used in the UML metamodel specification to describe a standard instance of a UML stereotype, tagged value or constraint.
      • Specifies semantics, expressed in natural language, beyond those specified by the identified subset of the UML metamodel. 
      • Specifies common model elements, expressed in terms of the profile.

    A UML Profile for VSTA.
    Why would you actually want a profile? You can add additional information to the diagrams and shapes and use this extra information for anything you can think of: not just visualization and communication, but also generation and validation.

    For example, the component diagram is often used for visualizing the structure [the components] of the solution and the interfaces between those components. As Scott Ambler writes:

    UML component diagrams are great for doing this as they enable you to model the high-level software components, and more importantly the interfaces to those components.  Once the interfaces are defined, and agreed to by your team, it makes it much easier to organize the development effort between subteams.

    While in UML 1.1 a component referred to files, UML 2.x describes components more as autonomous, encapsulated units with interfaces [see UML basics: The component diagram], which gives us a wider view of what we can describe with them. Thinking of subteams, development effort, autonomous encapsulated units and file structures, we can make a useful profile for our solution structure. Some time ago I wrote something about Autonomous Develop Services for SOA Projects with Team Architect and Service Factory; it’s about how you want to set up your solution structure to be sure development teams [subteams] don’t interfere with each other.
    So, with the component diagram you can design your solution as “autonomous, encapsulated units with interfaces”. With a profile attached to this diagram, which gives these units extra meaning about the solution structure and what kind of units they are, we have valuable information about how the solution should be structured. Even more interesting, we can make a ‘Solution Structure Generator’ for the component diagram.

    Make your own
    The UML Profiles in VSTA are XML files in the “C:\Program Files\Microsoft Visual Studio 10.0\Common7\IDE\PrivateAssemblies\UmlProfiles” folder; below is the .NET Profiles profile file. Just copy your own “.Profile” file to this folder and it will appear in the combo box [after restarting VS].


    When looking at the property window, the structure becomes really clear. I selected the .NET Profiles profile on the diagram [first yellow marker]; the .NET Assembly stereotype [second yellow marker] can be used with components [next yellow marker], and this stereotype has some more properties [vertical yellow marker].


    So, now we know how to make a profile… or actually, how to add some additional information to a diagram. We can make it useful by creating a menu item for the diagram with some functionality that takes this additional information and creates the solution structure for you.

     // Walk the profile instances on the component model and inspect the
     // stereotypes applied to the element.
     foreach (Microsoft.VisualStudio.Uml.Classes.ProfileInstance pi in componentmodel.ProfileInstances)
     {
         if (pi.Name != "Project Structure") continue;
         foreach (var item in elm.AppliedStereotypes)
         {
             switch (item.Name)
             {
                 case "Single":      /* ... */ break;
                 case "Partitioned": /* ... */ break;
                 case "Multiple":    /* ... */ break;
             }
         }
     }

    As you can see, I used the same solution structure types as described in Chapter 3 – Structuring Projects and Solutions in Source Control of the TFS Guidance from the patterns & practices group. [I still like the "Partitioned" solution structure.]


    And some code to create the projects…

     foreach (ComponentProxy cp in componentmodel.ComponentProxies)
     {
         var elm = cp as Microsoft.VisualStudio.Uml.Classes.Element;

         // Read the project type from the stereotype's property instances
         string projectType = "";
         if (cp.AppliedStereotypes[0].PropertyInstances[3] != null)
             projectType = cp.AppliedStereotypes[0].PropertyInstances[3].Value;

         string projectTemplate = "";
         Solution2 soln = (Solution2)vsApp.Solution;
         System.IO.FileInfo fi = new System.IO.FileInfo(soln.FileName);
         string solutionDirectory = fi.Directory.FullName;

         switch (projectType)
         {
             case "ASP.NET Web Application":
                 projectTemplate = soln.GetProjectTemplate("WebApplicationProject.zip", "csproj");
                 break;
             // ...
         }
     }

    Final Notes.
    The profile, and the use of the profile in the ‘solution structure’ example, has some value, although if you want to use it in the real world it would probably get more complex, with regeneration, versioning and that kind of thing. But the fact that we can add additional information to the diagrams, through a very simple mechanism, is a very powerful extensibility point. Scenarios like using a Data Modeling UML Profile for logical modeling and upgrading that model to the physical level with the Entity Framework are very interesting, and I need stereotypes on the use case diagrams to implement the estimation scenario.

    More on this in the future…

    Posted: Nov 20 2008, 12:11 by Clemens | Comments (0) RSS comment feed |
    Filed under:

    Three Model Archetypes for Oslo…

    It took me several days and a lot of discussions at the PDC to understand the value of “Oslo”, and I still didn’t get it after the PDC. But now, finally, after watching all the sessions/videos, walking through almost all the walkthroughs and reading the complete documentation, I think I’m there, although I’m still not sure...

    The piece that confused me was the first demo of the “A Lap around Oslo” session.


    First a little background. Several weeks ago I wrote a post about different modeling approaches, to discuss Oslo’s place in the modeling world. The most important feature of Oslo, in that post, is the “Model Execution” part, and I was really thinking of this feature as: draw/write your model and it will be interpreted into running applications on the server. So what I was looking for at the PDC, what I was expecting, was “how to make an interpreter for Oslo”.
    It didn’t get any clearer after asking the question: “What if I want to translate the MService model [a great example of modeling and executing services] from English to Dutch? The execution/interpretation will break, so I have to change something in the interpreter… how?”. How can I make my external/business DSLs, or a translated version of the MService model, or a custom internal DSL, execute at runtime? There isn’t any explanation around building an interpreter for my ‘not-out-of-the-box’ model.

    It took some mental resetting to figure out what the value of Oslo is. Finally, after a deep dive into the documentation, I found three different Model Archetypes. See the picture below…


    So, here are the three archetypes.

    1. Model Data Aware Application.

    The “A Lap Around” demo type, which got me confused.
    This type is the same as what Steve Kelly writes on his blog:

    Overall, it looks like Oslo is primarily just a way to provide configuration information for Microsoft applications

    When during that demo the presenters promised they were going to build a runtime for the model, I was really excited. Too bad it turned out to be an ASP.NET form with a grid view, reading the XML file that was generated from the models. For sure, it is pretty exciting what you can do with M, MGraph, MGrammar, MSomething and Quadrant, but this was kind of a disappointment.

    This is indeed something that can be used for the configuration of applications. A more interesting scenario is company-wide use of the same data structure, but there the business problems are more challenging than replacing the Canonical Data Model with Oslo.

    2. Model Runtime Aware Application

    Type 2 gets more interesting, but is very tightly scoped.
    This archetype is almost the same as the one David Chappell uses in his whitepaper “A First Look at WF 4.0, Dublin, and Oslo”.


    Step 4 should be skipped for this type: after adding the extra or changed activity to the (also changed) workflow, the model is updated in the repository, and an application which uses this model starts running with the new flow. Workflow uses XAML, and this is interpreted at runtime from the repository.
    This type makes it easier to change or add flows and other kinds of pieces of an application [you could, for example, make a model and an interpreter for the menu bar]. The application understands the models provided by the Oslo system [which are interpreted at runtime].
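    To make the “XAML interpreted at runtime” part concrete, here is a rough WF 3.x sketch [the file name and the idea that it was fetched from the repository are assumptions on my side, not part of the Oslo bits]:

```csharp
using System.Workflow.Runtime;
using System.Xml;

// Sketch: assumes "flow.xoml" is a XAML workflow definition pulled
// from some store [e.g. the Oslo repository] at startup. The runtime
// creates the workflow from the markup instead of a compiled type,
// so a changed flow takes effect without recompiling the host.
using (WorkflowRuntime runtime = new WorkflowRuntime())
{
    runtime.StartRuntime();
    using (XmlReader reader = XmlReader.Create("flow.xoml"))
    {
        WorkflowInstance instance = runtime.CreateWorkflow(reader);
        instance.Start();
    }
}
```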

    This archetype will help with several deployment and maintenance problems of a typical type of business application. Business processes change over time, and the business can change them without the interference of IT people. [Workflow Foundation already has this capability, but it isn’t used by the business.] I can imagine that over time more and more interpreters will become available for different kinds of ‘models’, and that IT can more and more stitch everything together. For now the scope is very narrow, but promising. Although I think Steven will still call it “configuring Microsoft applications” ;-) [you are right, it is…]

    An interesting scenario for this type is using the Oslo models together with the Team Architect models. The models in Oslo will probably be used company-wide or even worldwide; it’s a big effort to make an interpreter for these models, so they will have a wide usage scope. So, when you can put constraints from the Team Architect models [your specific solution] on what the business can do with the Oslo models, so that they still fit your business solution… an interesting scenario; maybe more on this later.

    3. Model Interpreted Application

    This is the archetype I expected from Oslo. Define your language, execute it at runtime and use that model on different platforms/runtimes: Oslo helping you build your own runtime. It is possible to accomplish this with the current bits [maybe not on all platforms, due to SQL Server].

    Anyway, first let’s start playing with archetype 2. I do think that’s a valuable solution, maybe in conjunction with Team Architect… and later on with a self-made model and interpreter. So, another thought about Oslo, next to the hundreds of thoughts already out there in the cloud. … cloud? Azure? Live? Mesh? SSDS?… everything in Oslo!

    Posted: Nov 02 2008, 18:37 by Clemens | Comments (1) RSS comment feed |
    Filed under:

    PDC Sessions Download

    I like this picture, found on Flickr [see all photos tagged PDC2008].




    I was LIVE at the PDC, but didn’t see that many sessions. Most of the time I was at the VSTS and Oslo booths, talking and discussing with a lot of people. The main reason… all the sessions are available for download at Channel 9 after 24 hours. So, back at the hotel I started the downloads…

    Too bad KLM flights don’t have any power plugs in economy class, so at the PDC I had to find some kind of device that could play videos in the airplane. I don’t think it’s a surprise that it’s a Zune 120… I uploaded all the sessions and I really like the screen… even after watching more sessions than I could ever have attended LIVE.

    Posted: Nov 01 2008, 11:26 by Clemens | Comments (1) RSS comment feed |
    Filed under:

    Exposing orchestrations with WCF and headers

    [There were still people at work back in Holland while I was visiting LA to attend the PDC. One who stayed home made a really great solution for the BizTalk WCF Adapter and WSDL headers, too interesting a solution to keep offline. So, here is a guest post from Ronald Kuijpers. LinkedIn profile]

    Recently I was asked to expose an orchestration (or its schemas) using WCF. Due to company standards, the WCF service had to use a custom header for inbound and outbound messages. However, the orchestration did not access the header, it just had to be copied from inbound to outbound.

    The problem is that the WCF Publishing Wizard does not support headers, as can be read on MSDN:

    The BizTalk WCF Service Publishing Wizard does not include custom SOAP header definitions in the generated metadata. To publish metadata for WCF services using custom SOAP headers, you should manually create a Web Services Description Language (WSDL) file. You can use the externalMetadataLocation attribute of the <serviceMetadata> element in the Web.config file that the wizard generates to specify the location of the WSDL file. The WSDL file is returned to the user in response to WSDL and metadata exchange (MEX) requests instead of the auto-generated WSDL.

    Luckily, the answer is presented as well… but I don’t like creating a WSDL manually. Several other sources show how to change the generation of the WSDL; for me, the most important were written by Tomas Restrepo [here and here] and Patrick Wellink [here], but none of them added a header.
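    [For reference, the externalMetadataLocation workaround from the MSDN quote above comes down to a single attribute in the Web.config the wizard generates; the service behavior name and WSDL location below are placeholders:]

```xml
<system.serviceModel>
  <behaviors>
    <serviceBehaviors>
      <behavior name="ServiceBehavior">
        <!-- Return a hand-written WSDL (containing the custom header)
             instead of the auto-generated one for WSDL/MEX requests. -->
        <serviceMetadata httpGetEnabled="true"
                         externalMetadataLocation="http://example.org/MyService.wsdl" />
      </behavior>
    </serviceBehaviors>
  </behaviors>
</system.serviceModel>
```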

    For adding the header messages, I dug into System.ServiceModel.dll.

    The second thing I wanted, since I was going to create my own EndpointBehavior anyway, was to copy the header from the inbound to the outbound message. This way, I could concentrate in my orchestration on the things that matter. This solution fits very well into the WCF architecture. Some nice blog posts about message inspectors were written by Paolo Pialorsi [here and here].

    First, create a class that derives from BehaviorExtensionElement and implements IWsdlExportExtension and IEndpointBehavior. You have to derive from BehaviorExtensionElement to make the component configurable, implement IWsdlExportExtension to change the generated wsdl and implement IEndpointBehavior for copying the header.

    public class CustomHeaderEndpointBehavior : BehaviorExtensionElement, IWsdlExportExtension, IEndpointBehavior
    {
        #region BehaviorExtensionElement Overrides
        public override Type BehaviorType
        {
            get
            {
                return typeof(CustomHeaderEndpointBehavior);
            }
        }

        protected override object CreateBehavior()
        {
            return new CustomHeaderEndpointBehavior();
        }
        #endregion

        #region IEndpointBehavior Members
        public void ApplyDispatchBehavior(ServiceEndpoint endpoint, EndpointDispatcher endpointDispatcher)
        {
            CustomHeaderMessageInspector headerInspector = new CustomHeaderMessageInspector();
            endpointDispatcher.DispatchRuntime.MessageInspectors.Add(headerInspector);
        }
        #endregion

        #region IWsdlExportExtension Members
        public void ExportEndpoint(WsdlExporter exporter, WsdlEndpointConversionContext context)
        {
            CustomerHeaderWsdlExport.ExportEndpoint(exporter, context);
        }
        #endregion
    }

    Note that only a few interface methods have to be implemented. Copying the header is taken care of by the CustomHeaderMessageInspector (credits to Paolo):

    public class CustomHeaderMessageInspector : IDispatchMessageInspector
    {
        #region Message Inspector of the Service
        public void BeforeSendReply(ref Message reply, object correlationState)
        {
            // Look for my custom header in the request
            Int32 headerPosition = OperationContext.Current.IncomingMessageHeaders.FindHeader(CustomHeaderNames.CustomHeaderName, CustomHeaderNames.CustomHeaderNamespace);

            // FindHeader returns -1 when the header is absent
            if (headerPosition < 0)
                return;

            // Get an XmlDictionaryReader to read the header content
            XmlDictionaryReader reader = OperationContext.Current.IncomingMessageHeaders.GetReaderAtHeader(headerPosition);

            // Read through its static method ReadHeader
            CustomHeader header = CustomHeader.ReadHeader(reader);

            if (header != null)
            {
                // Copy the header from the request to the reply
                reply.Headers.Add(header);
            }
        }
        #endregion
    }

    Adding the header messages is a bit more complicated:

    1. Add the header schema;
    2. Add the header schema namespace;
    3. Create and add a header message description;
    4. Add header to operation description.

    A piece of code says more than a thousand words:

    public class CustomerHeaderWsdlExport
    {
        public static void ExportEndpoint(WsdlExporter exporter, WsdlEndpointConversionContext context)
        {
            // Read the schema of the custom header message
            XmlSchema customSoapHeaderSchema = XmlSchema.Read(
                Assembly.GetExecutingAssembly().GetManifestResourceStream("CustomHeaderBehavior.CustomSoapHeader.xsd"),
                new ValidationEventHandler(CustomerHeaderWsdlExport.ValidationEventHandler));

            // Create the HeaderMessage to add to wsdl:message AND to refer to from wsdl:operation
            System.Web.Services.Description.Message headerMessage = CreateHeaderMessage();

            foreach (WsdlDescription wsdl in exporter.GeneratedWsdlDocuments)
            {
                // Add the schema of the CustomSoapHeader to the types AND add the namespace to the list of namespaces
                wsdl.Types.Schemas.Add(customSoapHeaderSchema);
                wsdl.Namespaces.Add("ptq0", "http://.../CustomSoapHeader/V1");

                // The actual adding of the message to the list of messages
                wsdl.Messages.Add(headerMessage);
            }

            addHeaderToOperations(headerMessage, context);
        }
    }

    The following code generates the header message description:

    private static System.Web.Services.Description.Message CreateHeaderMessage()
    {
        // Create the message
        System.Web.Services.Description.Message headerMessage = new System.Web.Services.Description.Message();

        // Set the name of the header message
        headerMessage.Name = "CustomHeader";

        // Create the message part and add it to the header message
        MessagePart part = new MessagePart();
        part.Name = "Header";
        part.Element = new XmlQualifiedName("CustomSoapHeader", "http://.../CustomSoapHeader/V1");
        headerMessage.Parts.Add(part);

        return headerMessage;
    }

    The method addHeaderToOperations adds the header to the input and output message bindings using a SoapHeaderBinding.

    private static void addHeaderToOperations(System.Web.Services.Description.Message headerMessage, WsdlEndpointConversionContext context)
    {
        // Create an XmlQualifiedName based on the header message; this will be used for binding the header message and the SoapHeaderBinding
        XmlQualifiedName header = new XmlQualifiedName(headerMessage.Name, headerMessage.ServiceDescription.TargetNamespace);

        foreach (OperationBinding operation in context.WsdlBinding.Operations)
        {
            // Add the SoapHeaderBinding to the MessageBinding
            ExportMessageHeaderBinding(operation.Input, context, header, false);
            ExportMessageHeaderBinding(operation.Output, context, header, false);
        }
    }

    private static void ExportMessageHeaderBinding(MessageBinding messageBinding, WsdlEndpointConversionContext context, XmlQualifiedName header, bool isEncoded)
    {
        // For brevity, assume Soap12HeaderBinding for SOAP 1.2
        SoapHeaderBinding extension = new Soap12HeaderBinding();
        extension.Part = "Header";
        extension.Message = header;
        extension.Use = isEncoded ? SoapBindingUse.Encoded : SoapBindingUse.Literal;

        messageBinding.Extensions.Add(extension);
    }

    This might seem like a lot of code, but it is almost a full working solution. Patrick's blog post does an excellent job of explaining how to put this to work for BizTalk.

    Posted: Nov 01 2008, 09:35 by Clemens | Comments (1) RSS comment feed |
    Filed under: