Jeremy Kashel

Jeremy Kashel's Blog

Debug Custom MDS Workflows

Custom MDS workflows, AKA External Action Business Rules or custom MDS assemblies, are one of the more advanced features of MDS. They allow you to extend the standard MDS business rules by getting MDS to fire entity member data at an external assembly of your choice.

There are a lot of moving parts to this feature: MDS business rules, the custom .Net assembly, Service Broker and a Windows service! When a business rule’s condition is met, the rule writes a message to a Service Broker queue. The Windows service interrogates the queue and then passes the information to the custom assembly. I’m not going to show how to set all of this up, as that’s already been covered by Johan Machielse and also on MSDN. What I would like to show is how to debug the use of the custom business rules, purely because of the number of moving parts and a couple of quirks that I’ve encountered when using them.

Custom Assembly Deployment

One of the first things that may catch you out is the deployment of the assembly. To troubleshoot this, I recommend the “-console” argument of the workflow service, i.e. running Microsoft.MasterDataServices.Workflow.exe -console from a command prompt. If you run it in command line mode you may encounter a few errors:

“Could not load file or assembly 'YourAssembly' or one of its dependencies. This assembly is built by a runtime newer than the currently loaded runtime and cannot be loaded.”

This means that you need to go to the properties of your assembly in Visual Studio and change the Target Framework from .NET Framework 4 to 3.5.

Another error that you may get is “ERROR: Could not create type. Ignored.” Assuming that you have actually put the assembly in the bin directory, this most likely means that either the namespace or the class name in your DLL doesn’t match what you have in your config file. Check that the contents of Microsoft.MasterDataServices.Workflow.exe.config match your DLL.

You may also get “ERROR: Type does not implement IWorkflowTypeExtender.” This means that your assembly has been picked up correctly, but the class specified in the config file does not implement the IWorkflowTypeExtender interface.
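
For reference, a skeleton of a class that satisfies this looks roughly like the one below. This is only a minimal sketch based on the MSDN walkthrough: I’m assuming the interface lives in the Microsoft.MasterDataServices.WorkflowTypeExtender assembly and exposes a single StartWorkflow method, so check against the version you have installed. The namespace and class names here match the config example further down:

using System.Xml;
using Microsoft.MasterDataServices.WorkflowTypeExtender;

namespace MDS
{
    public class CustomWorkflow : IWorkflowTypeExtender
    {
        //Called by the workflow service for each message taken off the Service Broker queue
        public void StartWorkflow(string workflowType, XmlElement dataElement)
        {
            //Process the member data passed over from the MDS business rule
        }
    }
}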

If the assembly does start up OK without errors, you may still get errors when the exe tries to take a message off the Service Broker queue. An error such as “Could not dispatch due to missing extender <tag>” means that the tag that you’ve given to the assembly in the config file (the part before the equals sign below) does not match the “Workflow Type” value of the MDS business rule:

<setting name="WorkflowTypeExtenders" serializeAs="String">
<value>MDSTEST=MDS.CustomWorkflow, CustomWorkflow</value>
</setting>
[Screenshot: the “Workflow Type” value on the MDS business rule]
Debugging the Custom Assembly

Once you do get the assembly working, you may find that the code isn’t quite doing what you want. A good way to troubleshoot this is to choose “Attach To Process” from the Debug menu in Visual Studio, as shown below:

[Screenshot: the Attach to Process option on the Visual Studio Debug menu]

If you do this and attach to the workflow service process, then you can add breakpoints to your .Net code as normal and step through in order to troubleshoot issues.

To conclude, this probably isn’t an exhaustive list of all the sorts of errors that could occur with custom workflows, but with any luck it will save someone a few hours of troubleshooting.

Managing SSAS Named Sets with Master Data Services Collections

Master Data Services Collections are probably not the first feature that comes to mind when you think of the MDS product. The hierarchies and member grids tend to be the core functionality, but as this post will hopefully show, MDS Collections are also useful.

Collections are essentially managed lists of members that can come from multiple different MDS explicit hierarchies, or even from other collections. The idea is that this “subset of members” can be maintained in MDS by a business user and then fed to external applications for reporting or other purposes.

Analysis Services Named Sets

One example of how collections can be used is to maintain Analysis Services named sets. Some named sets, such as a Top 50 Customers, don't require any maintenance, as it’s just the top 50 of all customers, based on a particular measure. On the other hand, sometimes named sets can be static lists, e.g. a list of core products that the user wants to see for reporting.

In the latter example, if a user wants the definition of a named set to change, they have to get IT to change the MDX script. MDS collections can help by allowing the user to control the named set definition, reducing the burden on IT.

Example Scenario 

Here’s an example of how this is done. First of all, the end game is that I have a named set in Analysis Services that currently contains just three products. A user can therefore easily drop this set into an Excel report to get quick access to the products that are important to them:

[Screenshot: the Favourite Products named set shown in an Excel report]

So the challenge is that we need to find some way of extending this named set without doing it manually. This is where MDS starts to come in, so using the sample Product model that comes with MDS, I’ve created a new collection called Favourite Products, as shown in the image below:

[Screenshot: the new Favourite Products collection in Master Data Services]

If I go and edit the Favourite Products collection in MDS, then I can drag and drop more products into this collection, or remove some existing members. In this case, I’ve chosen to add two new products:

[Screenshot: editing the Favourite Products collection to add two new products]

C#, AMO and the MDS API

So the named set represents the target that we need to get to, whereas the MDS collection that’s shown is the source. To get the contents of the MDS collection to update the named set, one way of doing it is to use the MDS API to pick up the collection members, and then to use AMO to write the named set into the MDX script. I’m just doing this in a C# Windows application, but you could do it via a batch process, such as SSIS. For this post I’m just going to show the code, so here goes:

This post is already starting to feel too long, so I’m not going to show the basics of the MDS API. For that, take a look at a good post on the MDS Team blog here. Also, as anyone who knows me will no doubt agree, I don’t tend to get involved in doing C#, so don’t consider this to be production ready! It should give you an idea of the basics though. Anyway, assuming that we’re now connected to the MDS Web Service, I’ve created the following method that will return the members from the collection:

private HierarchyMembers ObtainHierarchyMembers(string entityId, string hierarchyId, string modelId, string versionId)
{
    HierarchyMembersGetRequest request = new HierarchyMembersGetRequest();
    HierarchyMembersGetResponse response = new HierarchyMembersGetResponse();
    request.HierarchyMembersGetCriteria = new HierarchyMembersGetCriteria();

    //State that our hierarchy type is a collection and that we want collection members
    request.HierarchyMembersGetCriteria.HierarchyType = HierarchyType.Collection;
    //Pass in the key search criteria to identify the correct collection in MDS
    request.HierarchyMembersGetCriteria.ParentEntityId = new Identifier { Name = entityId };
    request.HierarchyMembersGetCriteria.HierarchyId = new Identifier { Name = hierarchyId };
    request.HierarchyMembersGetCriteria.ModelId = new Identifier { Name = modelId };
    request.HierarchyMembersGetCriteria.VersionId = new Identifier { Name = versionId };

    request.HierarchyMembersGetCriteria.RowLimit = 50;
    request.International = new International();
    OperationResult result = new OperationResult();
    //Return the hierarchy members from the service
    return service.HierarchyMembersGet(request.International, request.HierarchyMembersGetCriteria, out result);
}
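
As an aside, the service object used above is the MDS web service proxy, which I’m not covering here (see the MDS Team post linked above). As a rough sketch, and assuming that you’ve added a service reference that generates a ServiceClient class, creating the proxy looks something like this; the URL and security settings below are for a default local install, so adjust them to your environment:

//Create a WCF client for the MDS web service (sketch only)
BasicHttpBinding binding = new BasicHttpBinding();
binding.Security.Mode = BasicHttpSecurityMode.TransportCredentialOnly;
binding.Security.Transport.ClientCredentialType = HttpClientCredentialType.Windows;
EndpointAddress address = new EndpointAddress("http://localhost/MDS/Service/Service.svc");
ServiceClient service = new ServiceClient(binding, address);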

Before we use the above method, we need to connect to SSAS using AMO. That can be done quite easily with the following code:

//Requires a reference to the AMO assembly (Microsoft.AnalysisServices)
Server server = new Server();
string connection = "Data Source=.;Catalog=Adventure Works DW 2008R2;";
server.Connect(connection);

After we’ve done all the usual error handling associated with connecting to a database, we need to pick up the SSAS database and cube:

Database database = server.Databases["Adventure Works DW 2008R2"];
Cube cube = database.Cubes["Adventure Works"];

Now we’re ready to call our ObtainHierarchyMembers method, before looping through the result to build the named set.
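
For completeness, the call itself would look something like the following; the entity, hierarchy, model and version names are just placeholders, so substitute whatever matches your own MDS model:

//Retrieve the collection members (parameter values below are placeholders)
HierarchyMembers hm = ObtainHierarchyMembers("Product", "Favourite Products", "Product", "VERSION_1");

With the members returned in hm, we can loop through them to build the MDX for the named set: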

StringBuilder mdx = new StringBuilder("\n//Auto generated named set at " + DateTime.Now.ToString() + 
    "\nCREATE SET CurrentCube.[Favourite Products] AS {");
int count = 1;
//Loop through the collection to build the mdx
foreach (ParentChild pc in hm.Members)
{
    //Add the members to the MDX string
    //This references the member by name
    //It would be possible to reference by member key, but would require more work
    mdx.Append("[Product].[Product].[" + pc.Child.Name + "]");
    if (count < hm.Members.Count())
    {
        mdx.Append(", ");
    }
    count++;
}
mdx.Append("};");
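
Just to make this concrete, the string built up above ends up looking something like the following (the timestamp and product names are purely illustrative):

//Auto generated named set at 01/01/2012 12:00:00
CREATE SET CurrentCube.[Favourite Products] AS {[Product].[Product].[Product A], [Product].[Product].[Product B], [Product].[Product].[Product C]};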

Now we need to insert the named set in the correct place within the existing MDX script, bearing in mind it could already exist:

string currentScript = cube.MdxScripts[0].Commands[0].Text;
//Find the correct place to insert the named set within the MDX script:
int start = currentScript.IndexOf("\n//Auto generated named set at");
int end = 0;
StringBuilder newScript = new StringBuilder(currentScript);
if (start != -1)
{
    end = currentScript.IndexOf(";", start);
    //If the named set already exists, remove it
    newScript.Remove(start, end - start + 1);
}
else
{
    start = currentScript.Length;
}
//Insert the named set in the correct place
newScript.Insert(start, mdx.ToString());
//Update the cube's MDX script
cube.MdxScripts[0].Commands[0].Text = newScript.ToString();

Finally, we just need to call the AMO update methods in order to write the MDX back to the cube:

//Call the update methods to update the cube
cube.MdxScripts[0].Update();
cube.Update();

User Reports

Now that the cube has been updated, if the Excel report is refreshed, then the two new products will appear in the list:

[Screenshot: the refreshed Excel report showing the two new products in the named set]

Summary

Controlling SSAS named sets like this won’t be for everybody. I’ve certainly worked with a few clients where a strict process has to be followed to update this sort of thing, but I can think of other companies I know where this would be really useful.

Managing Analysis Services named sets is just one use for collections. Another example might be managing default multi-select parameters for SSRS reports. As collections are just lists of members that can be extracted easily, what you do with them is up to you!

Audit Trail in PerformancePoint Planning

I've noticed that the PPS Technet documentation has been updated recently to include an official Microsoft method to carry out auditing in PPS Planning.

PPS will do some basic auditing out of the box, namely to the audit.log file on the server. This will automatically capture key events that occur on the server, e.g. creation of a model, updating of a dimension etc. The audit file does not, however, track changes to the model fact data. There has been a custom solution around for this for a while now - Sacha has written an excellent post here that details what you need to do in order to implement your own PPS audit trail.

Like Sacha's method, the Microsoft approach involves creating auditing tables, which should then be populated by running a custom stored procedure. The stored procedure should then be scheduled on a periodic basis (e.g. hourly) to capture any new activity. This is a bit different to Sacha's method, where triggers are used to capture changes in real time as they occur. In both cases the idea is to use something like Reporting Services to view detailed auditing reports on your PPS data.

One thing that did catch my eye in the Technet documentation is a method to decode the binary 'change list' column that's held in the dbo.Submissions table. Whereas you can manually export the change list to a CSV file, there has historically been no way to take what's in the change list column and automatically decode it into a useful format. The following C# code will read the change list, and then insert it into your newly created auditing table:

DataSet ds = new DataSet();
DataLayer dl = new DataLayer("PPSConnection");
ds = dl.ExecuteDataSetFromSQL("SELECT [SubmissionID]FROM [_AppDB].[dbo].[Submissions] s1 where 
s1.SubmissionID not in (select SubmissionID from [_StagingDB].[dbo].[SubmissionsAudited])
and s1.[Status] = 0"
); string sSQL = ""; foreach (DataRow r in ds.Tables[0].Rows) { sSQL = @"INSERT INTO SubmissionsAudited(… ) VALUES("; // RETRIEVE THE CHANGELIST FOR THIS SUBMISSION DataSetWrapper dsw = new DataSetWrapper((Byte[])r["ChangeList"]); foreach (DataRow cldr in dsw.DataSet.Tables[0].Rows) { // SUBMISSION ROW DATA sSQL += r[0].ToString() + ", " + r[1].ToString() + ", " + r[2].ToString() + ", " + r[3].ToString() + ", '" + r[4].ToString() + "', "; // CHANGELIST ROW DATA foreach (object o in cldr.ItemArray) { sSQL += "," + o.ToString(); } sSQL += ")"; } // STORE EACH CHANGE TO THE AUDIT TABLE dl.ExecuteNonQuery(sSQL);

Click here to view the Technet documentation.