Adatis BI Blogs

Debug Custom MDS Workflows

The custom MDS workflows, AKA External Action Business Rules or custom MDS assemblies, are one of the more advanced features of MDS. They allow you to extend the standard MDS business rules by getting MDS to fire entity member data at an external assembly of your choice. There are a lot of moving parts to this feature: MDS business rules, the custom .NET assembly, Service Broker and a Windows service! When a business rule meets your condition, the rule is written to a Service Broker queue. The Windows service interrogates the queue and then passes the information to the custom assembly. I’m not going to show how to do this; it’s already covered by Johan Machielse and also on MSDN. What I would like to show is how to debug the use of the custom business rules, just due to the number of moving parts and also a couple of quirks that I’ve encountered when using them.

Custom Assembly Deployment

One of the first things that may catch you out is the deployment of the assembly. To troubleshoot this, I recommend the “-console” argument of the workflow service. If you run it in command line mode you may encounter a few errors:

“Could not load file or assembly 'YourAssembly' or one of its dependencies. This assembly is built by a runtime newer than the currently loaded runtime and cannot be loaded.”

This means that you need to go to the properties of your assembly in Visual Studio and change the Target Framework from .NET Framework 4 to 3.5.

Another error that you may get is “ERROR: Could not create type. Ignored.” Assuming that you have actually put the assembly in the bin directory, this most likely means that either the namespace or the class name in your DLL doesn’t match what you have in your config file. Check that the contents of Microsoft.MasterDataServices.Workflow.exe.config match your DLL.
You may also get, “ERROR: Type does not implement IWorkflowTypeExtender.” This means that your assembly has been picked up correctly, but the class named in the config file does not implement IWorkflowTypeExtender.

If the assembly does start up OK without errors, you may still get errors when the exe tries to take a message off the Service Broker queue. An error such as “Could not dispatch due to missing extender <tag>” means that the tag that you’ve given to the assembly in the config file does not match the “Workflow Type” value of the MDS business rule:

<setting name="WorkflowTypeExtenders" serializeAs="String">
  <value>MDSTEST=MDS.CustomWorkflow, CustomWorkflow</value>
</setting>

Debugging the Custom Assembly

Once you do get the assembly working, you may find that the code isn’t quite doing what you want. A good way to troubleshoot this is to choose “Attach To Process” from the Debug menu in Visual Studio, attaching to the workflow service process. If you do this, then you can add breakpoints to your .NET code as normal and step through in order to troubleshoot issues.

To conclude, this probably isn’t an exhaustive list of all the sorts of errors that could occur with custom workflows, but with any luck it will save someone a few hours of troubleshooting.
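To make the tag matching concrete: the WorkflowTypeExtenders value is a delimited list of tag=type pairs, where the tag on the left of the “=” is what must match the business rule’s “Workflow Type”. A small sketch of my own (ExtenderConfig is a hypothetical helper, not part of MDS) that parses a setting of that shape:

```csharp
using System;
using System.Collections.Generic;

static class ExtenderConfig
{
    // Parse a WorkflowTypeExtenders-style value, e.g.
    //   "MDSTEST=MDS.CustomWorkflow, CustomWorkflow"
    // into a tag -> assembly-qualified type name map.
    // The tag (left of '=') must match the business rule's "Workflow Type".
    public static Dictionary<string, string> Parse(string setting)
    {
        var map = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);
        foreach (string pair in setting.Split(new[] { ';' }, StringSplitOptions.RemoveEmptyEntries))
        {
            int eq = pair.IndexOf('=');
            if (eq < 0) continue; // malformed entry: no tag
            string tag = pair.Substring(0, eq).Trim();
            string type = pair.Substring(eq + 1).Trim();
            map[tag] = type;
        }
        return map;
    }
}
```

If the business rule’s “Workflow Type” isn’t a key in this map, you get exactly the “missing extender” dispatch error above.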

Managing SSAS Named Sets with Master Data Services Collections

Master Data Services Collections are probably not the first feature that comes to mind when you think of the MDS product. The hierarchies and member grids tend to be the core functionality but, as this post will hopefully show, MDS Collections are useful too. Collections are essentially managed lists of members that can come from multiple different MDS explicit hierarchies, or from another collection. The idea is that this “subset of members” can be maintained in MDS by a business user and then fed to external applications for reporting or other purposes.

Analysis Services Named Sets

One example of how collections can be used is to maintain Analysis Services named sets. Some named sets, such as a Top 50 Customers, don't require any maintenance, as it’s just the top 50 of all customers based on a particular measure. On the other hand, named sets can sometimes be static lists, e.g. a list of core products that the user wants to see for reporting. In the latter example, if a user wants the definition of a named set to change, they have to get IT to change the MDX script. MDS collections can help by allowing the user to control the named set definition, reducing the burden on IT.

Example Scenario

Here’s an example of how this is done. First of all, the end game is that I have a named set in Analysis Services that currently contains just 3 products, so a user can easily drop this set into an Excel report to get quick access to the products that are important to them. The challenge is that we need to find some way of extending this named set without doing it manually. This is where MDS comes in: using the sample Product model that comes with MDS, I’ve created a new collection called Favourite Products. If I go and edit the Favourite Products collection in MDS, then I can drag and drop more products into this collection, or remove some existing members.
In this case, I’ve chosen to add two new products.

C#, AMO and the MDS API

So the named set represents the target that we need to get to, whereas the MDS collection is the source. To get the contents of the MDS collection to update the named set, one way of doing it is to use the MDS API to pick up the collection members, and then to use AMO in order to write the named set into the MDX script. I’m just doing this in a C# Windows application, but you could do it via a batch process, such as SSIS. For this post I’m just going to show the code, so here goes.

This post is already starting to feel too long, so I’m not going to show the basics of the MDS API. For that, take a look at a good posting by the MDS Team blog here. Also, as anyone who knows me will no doubt agree, I don’t tend to get involved in doing C#, so don’t consider this to be production ready! It should give you an idea of the basics though. Anyway, assuming that we’re now connected to the MDS Web Service, I’ve created the following method that will return the members from the collection:

private HierarchyMembers ObtainHierarchyMembers(string entityId, string hierarchyId, string modelId, string versionId)
{
    HierarchyMembersGetRequest request = new HierarchyMembersGetRequest();
    request.HierarchyMembersGetCriteria = new HierarchyMembersGetCriteria();
    //State that our hierarchy type is a collection and that we want collection members
    request.HierarchyMembersGetCriteria.HierarchyType = HierarchyType.Collection;
    //Pass in the key search criteria to identify the correct collection in MDS
    request.HierarchyMembersGetCriteria.ParentEntityId = new Identifier { Name = entityId };
    request.HierarchyMembersGetCriteria.HierarchyId = new Identifier { Name = hierarchyId };
    request.HierarchyMembersGetCriteria.ModelId = new Identifier { Name = modelId };
    request.HierarchyMembersGetCriteria.VersionId = new Identifier { Name = versionId };
    request.HierarchyMembersGetCriteria.RowLimit = 50;
    request.International = new International();
    OperationResult result = new OperationResult();
    //Return the hierarchy members from the service
    return service.HierarchyMembersGet(request.International, request.HierarchyMembersGetCriteria, out result);
}

Before we use the above method, we need to connect to SSAS using AMO. That can be done quite easily with the following code:

Server server = new Server();
string connection = "Data Source=.;Catalog=Adventure Works DW 2008R2;";
server.Connect(connection);

After we’ve done all the usual error handling associated with connecting to a database, we need to pick up the SSAS database and cube:

Database database = server.Databases["Adventure Works DW 2008R2"];
Cube cube = database.Cubes["Adventure Works"];

Now we’re ready to call our ObtainHierarchyMembers method, before looping through the results to build the named set:

StringBuilder mdx = new StringBuilder("\n//Auto generated named set at " + DateTime.Now.ToString() +
    "\nCREATE SET CurrentCube.[Favourite Products] AS {");
int count = 1;
//Loop through the collection to build the mdx
foreach (ParentChild pc in hm.Members)
{
    //Add the members to the MDX string
    //This references the member by name
    //It would be possible to reference by member key, but would require more work
    mdx.Append("[Product].[Product].[" + pc.Child.Name + "]");
    if (count < hm.Members.Count())
    {
        mdx.Append(", ");
    }
    count++;
}
mdx.Append("};");

Now we need to insert the named set in the correct place within the existing MDX script, bearing in mind it could already exist:

string currentScript = cube.MdxScripts[0].Commands[0].Text;
//Find the correct place to insert the named set within the MDX script:
int start = currentScript.IndexOf("\n//Auto generated named set at");
int end = 0;
StringBuilder newScript = new StringBuilder(currentScript);
if (start != -1)
{
    //If the named set already exists, remove it
    end = currentScript.IndexOf(";", start);
    newScript.Remove(start, end - start + 1);
}
else
{
    start = currentScript.Length;
}
//Insert the named set in the correct place
newScript.Insert(start, mdx.ToString());
//Update the cube's MDX script
cube.MdxScripts[0].Commands[0].Text = newScript.ToString();

Finally we just need to update the cube in order to write the MDX back to the cube:

//Call the update methods to update the cube
cube.MdxScripts[0].Update();
cube.Update();

User Reports

Now that the cube has been updated, if the Excel report is refreshed, the two new products will appear in the list.

Summary

Controlling SSAS named sets like this won’t be for everybody. I’ve certainly worked at a few clients where a strict process has to be followed to update this sort of thing, but I can think of other companies where this would be really useful. Managing Analysis Services named sets is just one use for collections. Another example might be managing default multi-select parameters for SSRS reports. As collections are just lists of members that can be extracted easily, what you do with them is up to you!
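As a footnote, the MDX-building and script-splicing logic in the post is plain string manipulation, so it can be pulled out into a small pure helper and exercised without touching SSAS. This is my own sketch (NamedSetBuilder and the member names are hypothetical), mirroring the loop and IndexOf/Remove/Insert pattern above:

```csharp
using System;
using System.Collections.Generic;
using System.Text;

static class NamedSetBuilder
{
    const string Marker = "\n//Auto generated named set at";

    // Build the CREATE SET command from a list of member names.
    public static string BuildSet(IList<string> memberNames)
    {
        var mdx = new StringBuilder(Marker + " " + DateTime.Now.ToString()
            + "\nCREATE SET CurrentCube.[Favourite Products] AS {");
        for (int i = 0; i < memberNames.Count; i++)
        {
            mdx.Append("[Product].[Product].[" + memberNames[i] + "]");
            if (i < memberNames.Count - 1) mdx.Append(", ");
        }
        return mdx.Append("};").ToString();
    }

    // Replace a previously generated set in the script, or append if absent.
    public static string Splice(string script, string setCommand)
    {
        int start = script.IndexOf(Marker, StringComparison.Ordinal);
        var result = new StringBuilder(script);
        if (start != -1)
        {
            // The set already exists: remove it up to its terminating semicolon
            int end = script.IndexOf(";", start, StringComparison.Ordinal);
            result.Remove(start, end - start + 1);
        }
        else
        {
            start = script.Length;
        }
        return result.Insert(start, setCommand).ToString();
    }
}
```

Isolating it like this also makes it easy to check the re-run case: splicing a second time replaces the previous auto-generated set rather than appending a duplicate.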

Audit Trail in PerformancePoint Planning

I've noticed that the PPS TechNet documentation has been updated recently to include an official Microsoft method to carry out auditing in PPS Planning. PPS will do some basic auditing out of the box, namely to the audit.log file on the server. This will automatically capture key events that occur on the server, e.g. creation of a model, updating of a dimension etc. The audit file does not, however, track changes to the model fact data. There has been a custom solution around for this for a while now - Sacha has written an excellent post here that details what you need to do in order to implement your own PPS audit trail.

Like Sacha's method, the Microsoft approach involves creating auditing tables, which should then be populated by running a custom stored procedure. The stored procedure should then be scheduled on a periodic basis (e.g. hourly) to capture any new activity. This is a bit different to Sacha's method, where triggers are used to capture changes in real time as they occur. In both cases the idea is to use something like Reporting Services to view detailed auditing reports on your PPS data.

One thing that did catch my eye in the TechNet documentation is a method to decode the binary 'change list' column that's held in the dbo.Submissions table. Although you can manually export the change list to a CSV file, there has historically been no way to take what's in the change list column and automatically decode it into a useful format.
The following C# code will read the change list, and then insert it into your newly created auditing table:

DataSet ds = new DataSet();
DataLayer dl = new DataLayer("PPSConnection");
ds = dl.ExecuteDataSetFromSQL("SELECT [SubmissionID] FROM [_AppDB].[dbo].[Submissions] s1 " +
    "WHERE s1.SubmissionID NOT IN (SELECT SubmissionID FROM [_StagingDB].[dbo].[SubmissionsAudited]) " +
    "AND s1.[Status] = 0");
string sSQL = "";
foreach (DataRow r in ds.Tables[0].Rows)
{
    // RETRIEVE THE CHANGELIST FOR THIS SUBMISSION
    DataSetWrapper dsw = new DataSetWrapper((Byte[])r["ChangeList"]);
    foreach (DataRow cldr in dsw.DataSet.Tables[0].Rows)
    {
        sSQL = @"INSERT INTO SubmissionsAudited(… ) VALUES(";
        // SUBMISSION ROW DATA
        sSQL += r[0].ToString() + ", " + r[1].ToString() + ", " + r[2].ToString() + ", "
              + r[3].ToString() + ", '" + r[4].ToString() + "', ";
        // CHANGELIST ROW DATA
        foreach (object o in cldr.ItemArray)
        {
            sSQL += "," + o.ToString();
        }
        sSQL += ")";
        // STORE EACH CHANGE TO THE AUDIT TABLE
        dl.ExecuteNonQuery(sSQL);
    }
}

Click here to view the TechNet documentation.

C# Coalesce

Although this has been around for a long time and is slightly off topic, I needed it this week and think it is worth mentioning. With objects you occasionally need to know if they are null, and if they are, get something else or do something else. This used to be very convoluted with .NET 1.1:

if (a != null)
{
    return a;
}
else if (b != null)
{
    return b;
}
else if (c != null)
{
    return c;
}
else
{
    return new object();
}

Now (.NET 2.0 and above) you can simply use the null-coalescing operator:

return a ?? b ?? c ?? new object();

Note that you cannot use this with non-nullable value types such as int or bool, as they can never be null; it does work with their nullable equivalents (int?, bool?). Still very useful.
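For value types, ?? pairs naturally with Nullable&lt;T&gt; - a small example of my own (CoalesceDemo is just an illustrative name):

```csharp
using System;

static class CoalesceDemo
{
    // ?? chains through nullable value types; once a non-nullable
    // fallback appears on the right, the whole expression is a plain int.
    public static int FirstValue(int? a, int? b, int fallback)
    {
        return a ?? b ?? fallback;
    }
}
```

So FirstValue(null, 5, -1) yields 5, and if both nullable arguments are null you get the fallback.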

Maximum Message Size For Web Services (.NET 3.5)

New in .NET 3.5 is the ability to limit the size of incoming messages when using web services, apparently to help combat Denial of Service (DoS) attacks. However, it is not obvious how to change this setting; it's simple when you know how. In your app.config or web.config you should have a bindings section for each of your web service references. Within this there are all sorts of useful settings, but by default the maximum message size is quite small, so to alter it you must change maxBufferSize and maxReceivedMessageSize. Don't go crazy though - just up it to what you may need, which may be quite large if you are building all your internal applications through a web service layer.
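As an illustration, the two attributes sit on the binding element in the config file. The binding name and the 2 MB value here are hypothetical (the default for maxReceivedMessageSize is 65536 bytes):

```xml
<system.serviceModel>
  <bindings>
    <basicHttpBinding>
      <!-- Illustrative values: 2 MB instead of the 65536-byte default -->
      <binding name="MyServiceBinding"
               maxBufferSize="2097152"
               maxReceivedMessageSize="2097152" />
    </basicHttpBinding>
  </bindings>
</system.serviceModel>
```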

LINQ and SQL Server

Some time ago I reported on LINQ whilst Visual Studio 2008 was still in beta. On Monday, Nov. 19 2007, Microsoft announced that Visual Studio 2008 and the .NET Framework 3.5 were released to manufacturing (RTM). Since then I am sure many of you have had many hours playing with VS 2008, WPF, Expression Blend, and all that these new products contain. Having finally got my hands on it, and found the time, LINQ was the first stop.

We have a database that we need to run complex queries on, where stored procedures just would not be flexible enough. Further to this, inline SQL is an obvious no-no, so it's a perfect scenario for a LINQ to SQL implementation. Deciding that a DBML (Database Markup Language) structure would work best, generating this file, and getting this file to be update-able, was crucial. After some research we found a small, little-known command line application called SqlMetal. This allows you to drop and re-create DBML files, based entirely on a given database. SqlMetal will always produce a consistent data access layer:

SqlMetal is a command line tool that can generate the data access layer in seconds.
SqlMetal will produce either C# or VB.Net code.
SqlMetal generates a strongly typed data access layer. This is great for reducing runtime errors: they now pop up in the development cycle as compile errors, reducing stress on the QA dept, Support dept and upper management later in the life cycle.
SqlMetal can generate strongly typed interfaces for stored procedures and user-defined functions. From the developer's point of view, it is now a complete breeze to call either a stored proc and/or a user-defined function.
To generate source code from a SQL Server database directly, execute the following:

C:\Program Files\Microsoft SDKs\Windows\v6.0A\Bin> SqlMetal /server:baker /database:AdventureWorks /namespace:AdventureWorks /dbml:AdventureWorks.dbml

This creates a DBML file that can be added to your project, giving you an entire database structure accessible via LINQ. One thing to be aware of though: ensure you have a solid database design, as SqlMetal will transform your database design as-is, identifying all keys and relationships. Once you have the database as an object model, you can use LINQ as you wish. LINQ to SQL uses ADO.NET under the covers but offers several additional benefits: reduced complexity, fewer lines of code and strong typing.
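Once the DBML classes are generated, querying is just LINQ. To keep this self-contained the query below runs against an in-memory list, but the same query shape works against the SqlMetal-generated DataContext (the Product class and property names here are illustrative stand-ins, not the real AdventureWorks schema):

```csharp
using System.Collections.Generic;
using System.Linq;

class Product
{
    public string Name { get; set; }
    public decimal ListPrice { get; set; }
}

static class QueryDemo
{
    // The same shape you would write against context.Products in LINQ to SQL.
    public static List<string> ExpensiveProducts(IEnumerable<Product> products, decimal minPrice)
    {
        var query = from p in products
                    where p.ListPrice >= minPrice
                    orderby p.Name
                    select p.Name;
        return query.ToList();
    }
}
```

Against a real DataContext, the where and orderby clauses are translated to SQL and executed on the server, which is where the strong typing pays off: a mistyped column name is a compile error, not a runtime one.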

Embedding Images into DLLs

I am sure this is already well documented; however, it's something that is really very straightforward with .NET 2.0, and not clearly written about so far on the web, so here is (hopefully) a good example:

Step 1: Embed the resource, by setting the image file's Build Action to "Embedded Resource".
Step 2: Reference the resource with the exact name and namespace.
Step 3: Retrieve the embedded resource:

Image.ImageUrl = Page.ClientScript.GetWebResourceUrl(typeof(_Default), "[YourNamespace].Contributor.png");

It is that easy!
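One detail worth spelling out for step 2: for GetWebResourceUrl to actually serve the image, the assembly also needs a WebResource attribute declaring the resource name and its MIME type. A minimal sketch, typically placed in AssemblyInfo.cs ("[YourNamespace]" is a placeholder for your assembly's default namespace plus any folder path, exactly as in the retrieval call above):

```csharp
using System.Web.UI;

// Expose the embedded image through the WebResource.axd handler.
[assembly: WebResource("[YourNamespace].Contributor.png", "image/png")]
```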

The PerformancePoint WebServices - Retrieving Data

In a previous post, I discussed retrieving information from the PPS web services, namely the system information, which returns information on the Applications stored within PPS. To retrieve any further data from the PPS web services, we need to know the application ID, which is returned from the GetSystem() method.

Once you have retrieved the applications, you may notice that there is not a huge number of web service methods available for use. This is because one web service is used to retrieve most data from PPS. Again we are going to use the MetaDataManager.asmx web service, and the method we can use is the Request method, which expects an object called a PerformancePointRequest and in turn returns a PerformancePointResponse. A PerformancePointRequest can be made to retrieve all sorts of information from PPS, and this is always returned in a PerformancePointResponse. Here is an example of code that returns cycle instance information:

MetaDataManager.QueryFilter ppQueryFilter = new MetaDataManager.QueryFilter();
ppQueryFilter.ResultDataType = MetaDataManager.ReferenceDataType.CycleInstances;

MetaDataManager.QueryWorkflowData performancePointRequest = new MetaDataManager.QueryWorkflowData();
performancePointRequest.TargetObjectId = ApplicationID;
performancePointRequest.QueryFilter = ppQueryFilter;

MetaDataManager.PerformancePointResponse ppsResp = ppsMDM.Request(performancePointRequest);
return ppsResp;

So the PerformancePointResponse can be successfully cast to a number of different objects. I have not used all of them as yet; we have most experience with the 'WorkflowDataResponse' object. Casting the PerformancePointResponse to a 'WorkflowDataResponse', we now have a Workflow Data object that is a set of data tables that are compressed. So, to get the data, we need to decompress it.
The following shows how I have gone about decompressing this. First we need to get the table schema, which is itself compressed:

public static DataTable UnpackTableSchema(byte[] packedTableSchema)
{
    if (packedTableSchema == null)
    {
        return null;
    }
    XmlSerializer dataTableSerializer = new XmlSerializer(typeof(DataTable));
    DataTable table = null;
    using (MemoryStream stream = new MemoryStream(packedTableSchema))
    {
        using (GZipStream stream2 = new GZipStream(stream, CompressionMode.Decompress, true))
        {
            table = dataTableSerializer.Deserialize(stream2) as DataTable;
        }
    }
    return table;
}

Once we have the unpacked schema, we can unpack the data and load it against the schema. There are quite a few methods involved in actually unpacking the data, so if you are interested please do not hesitate to contact me. Also, once we have finished the product it will be available for download; should you like to take a look at our BETA, please contact us.
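The packing scheme itself is just GZip over serialized XML, so the unpack pattern above can be exercised standalone. A self-contained round trip of my own (GZipRoundTrip packs a plain string rather than a real PPS table schema, but the MemoryStream + GZipStream decompression shape is the same):

```csharp
using System.IO;
using System.IO.Compression;
using System.Text;

static class GZipRoundTrip
{
    // Compress a UTF-8 string into a byte array.
    public static byte[] Pack(string text)
    {
        using (var output = new MemoryStream())
        {
            using (var gzip = new GZipStream(output, CompressionMode.Compress, true))
            {
                byte[] bytes = Encoding.UTF8.GetBytes(text);
                gzip.Write(bytes, 0, bytes.Length);
            }
            return output.ToArray();
        }
    }

    // Mirrors UnpackTableSchema: wrap the packed bytes in a MemoryStream,
    // decompress with GZipStream, then hand the stream to a reader/deserializer.
    public static string Unpack(byte[] packed)
    {
        using (var input = new MemoryStream(packed))
        using (var gzip = new GZipStream(input, CompressionMode.Decompress, true))
        using (var reader = new StreamReader(gzip, Encoding.UTF8))
        {
            return reader.ReadToEnd();
        }
    }
}
```

In UnpackTableSchema the reader is an XmlSerializer deserializing a DataTable; here it is a StreamReader, but the GZip layer is identical.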

The PerformancePoint Planning Web Service

So.... you want to use the PerformancePoint web service so you can extend PerformancePoint, and integrate it into bespoke applications and SharePoint. We needed to create a web part that could show a user's PerformancePoint assignments in SharePoint. This should be simple: we know that Excel uses the web services to get all of its PerformancePoint information, so let's just tap into that and away we go... or not!

We were after creating a web part that did not require any further references and was self-contained. This is where life got tricky. We needed to get all the applications within PerformancePoint, and then, for each application, get all the assignments related to the requesting user. After looking through the available web services, it is not clear how to retrieve the amount of data that is required to run PPS, so that was a complication. However, for now let's look at how to get the applications. We can get the whole Biz system from the web service using this (where ppsMDM is the MetaDataManager web service):

//Get the System Data
MetaDataManager.BizSystem bizSystem = ppsMDM.GetSystem(false);

This should return the whole system to a BizSystem object. However, as we found, the web service attempts to cast the array of applications into a dataset and fails to do so, meaning that we cannot get the application data. After some degree of head scratching, it was decided to change the web service Reference.cs so that the application list was returned as an object. This enables us to manually deserialize the XML into our own object and get access to the entire Application object, as we should have had. It does seem that if you are willing, and able, to use the PPS client DLLs, you can hook straight into them to get this to work less painfully (thanks to Casper and his post here); you need to reference Microsoft.PerformancePoint.Planning.Client.Common and any other dependent DLLs.
This is how we have deserialized this, having created the objects ArrayOfBizApplication and BizApplication:

sb.Append("<ArrayOfBizApplication>");
foreach (XmlNode xmlNode in xmlNodes)
{
    if (xmlNode.Name.ToLower() == "arrayofbizapplication")
    {
        sb.Append(xmlNode.InnerXml.ToString());
    }
}
sb.Append("</ArrayOfBizApplication>");

XmlSerializer serializer;
System.IO.StringReader sr = new System.IO.StringReader(sb.ToString());
XmlTextReader xmlTReader = new XmlTextReader(sr);
serializer = new XmlSerializer(typeof(ArrayOfBizApplication));
arrayOfBizApplication = (ArrayOfBizApplication)serializer.Deserialize(xmlTReader);

Now that we have the array of applications, we should be able to get the application IDs and then the assignments from PPS, without the need to reference any Microsoft PPS DLLs.
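The manual deserialization relies only on XmlSerializer, so the pattern can be shown end-to-end with stand-in classes. In this sketch of mine, BizApplication is a two-property stub and BizApplicationReader a hypothetical helper, not the real PPS types:

```csharp
using System.Collections.Generic;
using System.IO;
using System.Xml;
using System.Xml.Serialization;

public class BizApplication
{
    public string Name { get; set; }
    public string Id { get; set; }
}

public class ArrayOfBizApplication
{
    [XmlElement("BizApplication")]
    public List<BizApplication> Applications { get; set; }
}

public static class BizApplicationReader
{
    // Wrap the extracted inner XML in the expected root element,
    // then deserialize into the strongly typed object model.
    public static ArrayOfBizApplication Read(string innerXml)
    {
        string xml = "<ArrayOfBizApplication>" + innerXml + "</ArrayOfBizApplication>";
        var serializer = new XmlSerializer(typeof(ArrayOfBizApplication));
        using (var reader = new XmlTextReader(new StringReader(xml)))
        {
            return (ArrayOfBizApplication)serializer.Deserialize(reader);
        }
    }
}
```

The real code does the same thing: rebuild a valid root element around the InnerXml pulled from the response, then let XmlSerializer map it onto hand-written classes.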