
Jeremy Kashel's Blog

Debug MDX with Generate

On occasion you may find that your MDX queries don’t quite give you the results that you expect, which can be because a certain bit of MDX has changed the context of part of your query.

This can be tough to figure out, especially with functions like Sum() or Filter(), where we loop over a set and only return the end product. For example, take the following query, where I’m trying to find the count of UK male customers who have bought Road Bikes in both the current and the previous year:

WITH MEMBER [Measures].[Repeat Customer Count] AS
SUM(
    [Customer].[Customer].[Customer].Members,
    IIF([Measures].[Internet Sales Amount] <> 0 AND
    ([Measures].[Internet Sales Amount], [Date].[Calendar Year].PrevMember) <> 0, 1, NULL)
    )

SELECT      {[Measures].[Repeat Customer Count]} ON 0,
            [Date].[Calendar Year].[Calendar Year] ON 1
FROM        [Adventure Works]
            --Road Bikes, UK, Male
WHERE       ([Product].[Subcategory].&[2], [Sales Territory].[Sales Territory Country].&[United Kingdom],
            [Customer].[Gender].&[M])

It returns the following result set, which I know to be incorrect:

[Image: query results]

Unlike debugging either of the axes, where of course we can see what gets returned, debugging examples such as the above can be difficult. The MDX Generate function can help here, as it works in a similar way to Sum(), iterating over a set and evaluating an argument once per iteration. The idea is that Generate builds a new set, using the looping context of the first set that you pass it. In this case, however, we can use it to return a concatenated string based on our input set.
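As a quick illustration of this string form of Generate (a minimal sketch, assuming the standard Adventure Works cube), the following returns all of the calendar year captions as a single delimited string:

WITH MEMBER [Measures].[Year List] AS
GENERATE(
    [Date].[Calendar Year].[Calendar Year].Members,
    [Date].[Calendar Year].CurrentMember.MEMBER_CAPTION,
    ", ")
SELECT {[Measures].[Year List]} ON 0
FROM   [Adventure Works]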

What I’ve done here is to take the CurrentMember of the attributes that I think might be causing problems (Customer, Gender) and print them out as a string, in order to see whether those attributes have the context that I expect. Therefore I’ve replaced the ‘1’ in my original query with the following string:

" | " + [Customer].[Customer].CurrentMember.MEMBER_CAPTION + ", " 
    + [Customer].[Gender].CurrentMember.MEMBER_CAPTION

The full query and results then become:

WITH MEMBER [Measures].[Repeat Customer Count] AS
GENERATE(
        [Customer].[Customer].[Customer].Members,
        IIF([Measures].[Internet Sales Amount] <> 0 AND
        ([Measures].[Internet Sales Amount], [Date].[Calendar Year].PrevMember) <> 0, 
        " | " + [Customer].[Customer].CurrentMember.MEMBER_CAPTION + ", "
        + [Customer].[Gender].CurrentMember.MEMBER_CAPTION, NULL)
        )

SELECT        {[Measures].[Repeat Customer Count]} ON 0,
            [Date].[Calendar Year].[Calendar Year] ON 1
FROM        [Adventure Works]
            --Road Bikes, UK, Male
WHERE        ([Product].[Subcategory].&[2], [Sales Territory].[Sales Territory Country].&[United Kingdom],
            [Customer].[Gender].&[M])

[Image: query results showing the concatenated customer strings]

I can spot straight away that female customers have been returned in 2007, which is due to the fact that sets specified inside the WITH MEMBER section are not evaluated in the context of the slicer. In this case the problem can be solved by using the EXISTING keyword inside the Sum, so that the customers get sliced by the gender attribute:

SUM(
    EXISTING [Customer].[Customer].[Customer].Members,
    IIF([Measures].[Internet Sales Amount] <> 0 AND
    ([Measures].[Internet Sales Amount], [Date].[Calendar Year].PrevMember) <> 0, 1, NULL)
    )

This won’t be applicable in all MDX debugging scenarios, but hopefully it will help someone out.

Managing SSAS Named Sets with Master Data Services Collections

Master Data Services Collections are probably not the first feature that comes to mind when you think of the MDS product. The hierarchies and member grids tend to be the core functionality but, as this post will hopefully show, MDS Collections are useful too.

Collections are essentially managed lists of members, which can come from multiple different MDS explicit hierarchies, or even from another collection. The idea is that this “subset of members” can be maintained in MDS by a business user and then fed to external applications for reporting or other purposes.

Analysis Services Named Sets

One example of how collections can be used is to maintain Analysis Services named sets. Some named sets, such as a Top 50 Customers, don't require any maintenance, as it’s just the top 50 of all customers, based on a particular measure. On the other hand, sometimes named sets can be static lists, e.g. a list of core products that the user wants to see for reporting.

In the latter example, if a user wants the definition of a named set to change, they have to get IT to change the MDX script. MDS collections can help by allowing the user to control the named set definition, reducing the burden on IT.
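For reference, a static named set of this kind is nothing more than a CREATE SET statement in the cube's MDX script. A minimal sketch (with purely illustrative product names) would be:

CREATE SET CurrentCube.[Core Products] AS
{ [Product].[Product].[Mountain-200 Black, 38],
  [Product].[Product].[Road-150 Red, 62] };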

Example Scenario 

Here’s an example of how this is done. First of all, the end game is that I have a named set in Analysis Services that is currently just for 3 products. Therefore, a user can easily drop this set into an Excel report to get quick access to the products that are important to them:

[Image: Excel report using the Favourite Products named set]

So the challenge is that we need to find some way of extending this named set without doing it manually. This is where MDS starts to come in, so using the sample Product model that comes with MDS, I’ve created a new collection called Favourite Products, as shown in the image below:

[Image: the Favourite Products collection in MDS]

If I go and edit the Favourite Products collection in MDS, then I can drag and drop more products into this collection, or remove some existing members. In this case, I’ve chosen to add two new products:

[Image: editing the collection to add two new products]

C#, AMO and the MDS API

So the named set represents the target that we need to get to, whereas the MDS collection shown above is the source. To get the contents of the MDS collection into the named set, one way of doing it is to use the MDS API to pick up the collection members, and then use AMO to write the named set into the cube’s MDX script. I’m just doing this in a C# Windows application, but you could do it via a batch process, such as SSIS. For this post I’m just going to show the code, so here goes:

This post is already starting to feel too long, so I’m not going to cover the basics of the MDS API; for that, take a look at a good post on the MDS Team blog here. Also, as anyone who knows me will no doubt agree, I don’t tend to get involved in doing C#, so don’t consider this to be production ready! It should give you an idea of the basics though. Anyway, assuming that we’re now connected to the MDS web service, I’ve created the following method that will return the members from the collection:

private HierarchyMembers ObtainHierarchyMembers(string entityId, string hierarchyId, string modelId, string versionId)
{
    HierarchyMembersGetRequest request = new HierarchyMembersGetRequest();
    request.HierarchyMembersGetCriteria = new HierarchyMembersGetCriteria();

    //State that our hierarchy type is a collection and that we want collection members
    request.HierarchyMembersGetCriteria.HierarchyType = HierarchyType.Collection;
    //Pass in the key search criteria to identify the correct collection in MDS
    request.HierarchyMembersGetCriteria.ParentEntityId = new Identifier { Name = entityId };
    request.HierarchyMembersGetCriteria.HierarchyId = new Identifier { Name = hierarchyId };
    request.HierarchyMembersGetCriteria.ModelId = new Identifier { Name = modelId };
    request.HierarchyMembersGetCriteria.VersionId = new Identifier { Name = versionId };

    request.HierarchyMembersGetCriteria.RowLimit = 50;
    request.International = new International();
    OperationResult result = new OperationResult();
    //Return the hierarchy members from the service
    return service.HierarchyMembersGet(request.International, request.HierarchyMembersGetCriteria, out result);
}

Before we use the above method, we need to connect to SSAS using AMO. That can be done quite easily with the following code:

Server server = new Server();
string connection = "Data Source=.;Catalog=Adventure Works DW 2008R2;";
server.Connect(connection);
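I'm glossing over the error handling here; purely as a sketch, wrapping the Connect call in a try/catch (ConnectionException lives in the Microsoft.AnalysisServices namespace) might look like this:

try
{
    server.Connect(connection);
}
catch (ConnectionException ex)
{
    //The connection failed - log the reason and stop
    Console.WriteLine("Could not connect to SSAS: " + ex.Message);
    return;
}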

After we’ve done all the usual error handling associated with connecting to a database, we need to pick up the SSAS database and cube:

Database database = server.Databases["Adventure Works DW 2008R2"];
Cube cube = database.Cubes["Adventure Works"];

Now we’re ready to call our ObtainHierarchyMembers method, before looping through it to build the named set:
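The snippet below assumes that the method has already been called along the following lines - the identifier values here are placeholders, so substitute the entity, collection, model and version names from your own MDS model:

//Placeholder identifiers - use the names from your own MDS model
HierarchyMembers hm = ObtainHierarchyMembers("Product", "Favourite Products", "Product", "VERSION_1");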

StringBuilder mdx = new StringBuilder("\n//Auto generated named set at " + DateTime.Now.ToString() + 
    "\nCREATE SET CurrentCube.[Favourite Products] AS {");
int count = 1;
//Loop through the collection to build the mdx
foreach (ParentChild pc in hm.Members)
{
    //Add the members to the MDX string
    //This references the member by name
    //It would be possible to reference by member key, but would require more work
    mdx.Append("[Product].[Product].[" + pc.Child.Name + "]");
    if (count < hm.Members.Count())
    {
        mdx.Append(", ");
    }
    count++;
}
mdx.Append("};");
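The string that this builds should end up looking something like the following (the timestamp and product names are illustrative):

//Auto generated named set at 01/11/2011 09:30:00
CREATE SET CurrentCube.[Favourite Products] AS {[Product].[Product].[Mountain-200 Black, 38], [Product].[Product].[Road-150 Red, 62]};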

Now we need to insert the named set in the correct place within the existing MDX script, bearing in mind it could already exist:

string currentScript = cube.MdxScripts[0].Commands[0].Text;
//Find the correct place to insert the named set within the MDX script:
int start = currentScript.IndexOf("\n//Auto generated named set at");
int end = 0;
StringBuilder newScript = new StringBuilder(currentScript);
if (start != -1)
{
    end = currentScript.IndexOf(";", start);
    //If the named set already exists, remove it
    newScript.Remove(start, end - start + 1);
}
else
{
    start = currentScript.Length;
}
//Insert the named set in the correct place
newScript.Insert(start, mdx.ToString());
//Update the cube's MDX script
cube.MdxScripts[0].Commands[0].Text = newScript.ToString();

Finally we just need to update the cube in order to write the MDX back to the cube:

//Call the update methods to update the cube
cube.MdxScripts[0].Update();
cube.Update();

User Reports

Now that the cube has been updated, if the Excel report is refreshed, then the two new products will appear in the list:

[Image: refreshed Excel report showing the two new products]

Summary

Controlling SSAS named sets like this won’t be for everybody. I’ve certainly worked with a few clients where a strict process has to be followed to update this sort of thing, but I can think of other companies that I know where this would be really useful.

Managing Analysis Services named sets is just one use for collections. Another example might be managing default multi-select parameters for SSRS reports. As collections are just lists of members that can be extracted easily, what you do with them is up to you!

Master Data Services Training in the UK

This is just a quick post to announce a range of SQL Server training courses, organised (and in some cases delivered) by Chris Webb. To start off there’s a SQL Server course delivered by Christian Bolton in December, followed by an Analysis Services course delivered by Chris in February. I’ll be delivering a Master Data Services course in February, before Chris delivers an MDX course in March.

The details for all the courses are:

  • SQL Server Internals and Troubleshooting Workshop – Christian Bolton – 6th–7th December 2011

The Advanced Troubleshooting Workshop for SQL Server 2005, 2008 and R2 provides attendees with SQL Server internals knowledge, practical troubleshooting skills and a proven methodical approach to problem solving. The workshop will enable attendees to tackle complex SQL Server problems with confidence.

Full details and registration here.

  • Real World Cube Design and Performance Tuning with Analysis Services – Chris Webb – February 2012

A two day course that draws on real world experience to show you how to build a best practice Analysis Services cube, covering design issues such as data warehouse design and complex cube modelling. Day two then covers performance optimisation for Analysis Services, including MDX optimisation and cube processing.

Full details and registration here.

  • Introduction to Master Data Services with Jeremy Kashel – February 2012

An end to end look inside Master Data Services, this full day course will begin with a synopsis of Master Data Management before moving on to an overview of Microsoft SQL Server 2008 R2 Master Data Services (MDS). The remainder of the course will cover the major MDS topics, such as modelling and business rules, which will include a number of practical exercises.

More details and registration here.

  • Introduction to MDX with Chris Webb – March 2012

The Introduction to MDX course aims to take you from being a complete beginner with no previous knowledge of MDX to the point where you can write 90% of the MDX calculations and queries you’ll ever need to write. The three day course covers the basics, such as sets, tuples and members, through to more advanced concepts such as scoped assignments and performance tuning.

Full details and registration here.

MDX with Microsoft SQL Server 2008 R2 Analysis Services Cookbook - Book Review

As you may have already seen, Packt have released a new MDX book, namely MDX with Microsoft SQL Server 2008 R2 Analysis Services Cookbook by Tomislav Piasevoli.

The book is written as a cookbook, meaning that you can choose the individual ‘recipes’ that apply to whatever problem you’re trying to solve. Each recipe starts off with a section called ‘Getting Ready’, essentially giving you the base query to use, before moving on to ‘How to do it’, which covers the main instructions for the recipe. There are then further sections, which explain how the example works and also suggest other functions/concepts that you may want to consider. This cookbook style makes the book really easy to follow; each recipe is displayed very clearly.

A wide range of MDX problems are covered, from time calculations to context-aware calculations. Not every piece of MDX functionality is covered, which is to be expected given the size of the book. It also doesn’t include the usual introduction to members/sets etc. that MDX books tend to start with, but it’s clearly stated that a working knowledge of MDX is a prerequisite for the book.

I found the copy that I’ve been reading in the Adatis office, but I really do like it, so I’ll definitely be ordering my own copy!

[Image: book cover]

Master Data Services Training

As I’m sure you’ve seen, registration for SQLBits 9 – Query Across the Mersey is now open.

This time around I’ll be running a deep-dive training day with Tim Kent on Master Data Services. This will be a full day of training, showing you how MDS can be used to manage the master data in your organisation. We’re going to start by giving an overview of Master Data Management, before moving on to covering the following MDS topics in detail:

  • Installing, configuring, and maintaining Master Data Services
  • Creating and using models
  • Version management
  • Business rules and workflow
  • Importing data into Master Data Services
  • Integrating with other systems
  • Security

You can register for the MDS Training day, or one of the other 10 training days, by using the following registration page.

Remember, the early bird discount expires at midnight on Friday this week!

Update – We are running another MDS course in February 2012. Click here for the details.

Allocations in PowerPivot Using DAX

Although I didn't mention it, the inspiration for my last post was the excellent MDX Solutions 2nd Edition. Whilst flicking through the book a while back, I thought it would be interesting to see how DAX and PowerPivot could handle the so-called 'Common Calculations and Selections in MDX'.

This post continues that theme, focusing on one calculation in particular that's in the aforementioned MDX book, namely 'Unweighted Allocations down the Hierarchy'. Essentially this business problem is dealing with allocating data that is entered at a higher level (e.g. Quarter) down to a lower level (e.g. Day). It varies from business to business what the users actually want to see at the day level in this situation. Some expect to see the Quarter amount, some want to see a blank value and then some users want the amount in the quarters allocated down to the day level.

If the expectation is that data should be allocated, then one way of doing the allocation is to use a ratio. In the MDX book, this is achieved by the following MDX:

1.0 / 
Descendants(
            Ancestor(
                    [Date].[Calendar].CurrentMember,
                    [Date].[Calendar].[Calendar Quarter]
                    ),
            [Date].[Calendar].CurrentMember.Level,
            SELF
            ).Count
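Wrapped up as a calculated member, a sketch of a query to test this (assuming the standard Adventure Works cube, and using the usual composite key format for the quarter member) might be:

WITH MEMBER [Measures].[Allocation Ratio] AS
1.0 /
Descendants(
            Ancestor(
                    [Date].[Calendar].CurrentMember,
                    [Date].[Calendar].[Calendar Quarter]
                    ),
            [Date].[Calendar].CurrentMember.Level,
            SELF
            ).Count
SELECT    {[Measures].[Allocation Ratio]} ON 0,
          Descendants([Date].[Calendar].[Calendar Quarter].&[2007]&[1], , SELF_AND_AFTER) ON 1
FROM      [Adventure Works]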

Running this against the Calendar hierarchy in AdventureWorksDW gives the following results:

[Image: query results showing the ratio at each level of the Calendar hierarchy]

Therefore our quarter-entered value can simply be multiplied by this ratio to get the correct value at each level in the Calendar hierarchy.
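To make that concrete: a value of 90,000 entered at the quarter level would show as 30,000 against each month (90,000 x 1/3) and 1,000 against each day (90,000 x 1/90).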

PowerPivot

So the challenge is how to be able to do this in PowerPivot. I started off by looking for some data in AdventureWorksDW that would fit the bill. A good candidate seems to be the FactSalesQuota table, which I imported into PowerPivot along with its related dimensions:

[Image: FactSalesQuota and related dimensions in PowerPivot]

Once in the pivot table, I wanted to be able to calculate the correct ratio at the daily and monthly levels. For the daily level this is 1/90, as there are 90 days in Q1 of that year, and for the monthly level it is 1/3, as there are 3 months in the quarter.

Given that there's no MDX style Ancestor() function in DAX, I ended up having to have a different calculation at different levels. Something that I learnt from this forum post is that you need to use IF() in order to isolate a calculation at a different level. Therefore the DAX that I came up with was:

=IF( COUNTROWS( VALUES('DimDate'[FullDateAlternateKey]) ) = 1,
     1 / CALCULATE( COUNTROWS('DimDate'), ALL('DimDate'[FullDateAlternateKey], 'DimDate'[EnglishMonthName]) ),
     1 / CALCULATE( COUNTROWS( DISTINCT('DimDate'[EnglishMonthName]) ), ALL('DimDate'[EnglishMonthName]) )
   )

The COUNTROWS(VALUES('DimDate'[FullDateAlternateKey])) = 1 allows me to check that we're at the day level, and if we are at the day level, then count the number of days in the quarter. Otherwise, if we're at the month level we calculate the number of months in the current quarter. When this DAX is put into a new measure we get the following results:

[Image: pivot table showing the ratio measure]

This produces the same results as the MDX statement. The next step was then to apply this ratio to the Sales Amount Quota for the current quarter. This was achieved using the following DAX:

=IF(
      COUNTROWS( VALUES('DimDate'[FullDateAlternateKey]) ) < 90,
      CALCULATE( Sum('FactSalesQuota'[SalesAmountQuota]), 
      All('DimDate'[FullDateAlternateKey], 'DimDate'[EnglishMonthName]) ) * 'FactSalesQuota'[Ratio],
      Sum('FactSalesQuota'[SalesAmountQuota])
   )

The < 90 at the start of the statement allows me to check if we're at the day level or the month level. If we're at either of these levels, then we want to pick up the Quarter-level SalesAmountQuota and multiply by the ratio. Otherwise, at the Quarter and Year levels, we just apply the original SalesAmountQuota value. This produces the following results:

[Image: pivot table showing the allocated sales quota]

Conclusion

The use of the MDX Scope statement would have been nice, to pinpoint exactly where I wanted the calculation to occur, but then again the use of IF() keeps it quite simple, which is better for an Excel power user. Whilst MDX is very powerful, can you imagine a business user getting to grips with MDX statements such as Scope or Freeze?

The bottom line is that the DAX calculation produces exactly what I want - the new AllocatedSalesQuota measure is correct at all levels in the Calendar hierarchy. Even with an IF() statement, like everything else in PowerPivot, the calculation performs very quickly.

Entering Dates in PPS Planning Assignments

In the recent PPS Planning projects that I've been involved in, the challenges have often been around subjects such as business rules, hence the often recurring theme of this blog. Recently the tables were turned though, as I was told by a user that they wanted to enter dates into a PPS assignment.

I was initially a bit concerned that the Excel add-in might not be able to deliver here - after all, it's great at capturing numbers, but knowing the rigid structure of the fact tables, I couldn't see how it would manage to store a date. Then I remembered something from my VBA days many years ago - namely that Excel stores dates as a number of days from 30/12/1899, meaning that in theory it should be possible to get dates working in PPS. Thankfully it is possible, as this post explains.
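To illustrate, typing 1st January 2000 into a cell and then formatting it as a number gives 36526, the number of days since 30/12/1899.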

Excel Setup

The first step to get this working when designing your form template is to set the matrix to have a matrix style of 'none'. If you don't do this, then the built-in matrix styles will override your formatting changes to the required cells. Speaking of formatting, the next step is to format the data entry cells that will contain dates, just using the standard Excel formatting window:

[Image: the Excel Format Cells window]

Once these few simple steps are done, then the assignment will behave just like any other. As the date is stored as a number, the numeric representation of the date will end up in the fact table just as any other piece of data.

Dates in Business Rules

Once the numbers are in the fact table, we need to convert them to dates to use them in business rules in some way. We can't do much in PEL unfortunately, so the options are either NativeMDX or NativeSQL.

As Analysis Services can pick up some of the VBA functions, it's possible to use the VBA DateAdd() function to convert the stored number back into a date. So in the example below, I'm using the DateAdd() function to convert the number to a date, before comparing the resulting date against another date using the VBA DateDiff() function:

WITH 
MEMBER    [Measures].[DateExample] AS VBA!DateAdd("d", [Measures].[Value], "30/12/1899")
MEMBER    [Measures].[DateDiff] AS VBA!DateDiff("d", [Measures].[DateExample], "01/07/1987")
SELECT    Descendants([Time].[Monthly].[Year].&[2008],,leaves) ON 0
FROM      [Strategic Planning]
WHERE     ([Account].[Profit and Loss].&[5010], [Measures].[DateDiff], [Entity].[Divisions].&[5003])

Although the above is just a simple example, it should give you the idea of the kind of calculations that can be performed in Analysis Services. It's possible to use these functions via a NativeMDXScript or a NativeMDXQuery.

It's a similar story with SQL, as it also has its own DateAdd() function, as shown in the simple select statement below:

SELECT    DateAdd(d, [Value], '30/12/1899')
FROM      dbo.[MG_Strategic Planning_MeasureGroup_default_partition]
WHERE     Scenario_memberid = 4 AND Account_MemberId = 5010

So it's a shame that PEL can't work with dates, but the fact that both the database engine and Analysis Services have a DateAdd function means that it's possible to use dates for logic in both definition and procedural business rules.

NativeMDXQuery Business Rules in PerformancePoint Planning

Having posted about NativeSql business rules a while back, I thought that I might as well cover NativeMdxQuery business rules also, especially as there isn't much documentation available on the web for this subject.

NativeMdxQuery is a rule implementation type that can be used with assignment rules in PPS Planning. Rather than writing PEL and having the compiler convert your PEL into SQL or MDX, the idea is that you write the raw MDX directly if you pick NativeMdxQuery.

Why Native Rules?

When I posted about NativeSql, I mentioned a scenario or two where using NativeSql was very useful. The advantages of writing a NativeMdxQuery, however, are less obvious. This is especially the case when you consider that a) standard MDX rules in PPS Planning are less restrictive than the SQL rules, and b) PEL is very efficient - you can write a very concise MDX PEL rule versus writing the equivalent in MDX itself.

So what advantages are there? Although a fair number of MDX functions are included in PEL, it's not possible to use all MDX functions/statements/operators, e.g. Case() and Item(). This is one situation where you might want to write a NativeMdxQuery.
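As a tiny illustration, a banding calculation like the sketch below relies on Case(), which PEL can't express directly (the member name and thresholds are purely illustrative):

WITH MEMBER [Measures].[Quota Band] AS
Case
    When [Measures].[Value] > 100000 Then "High"
    When [Measures].[Value] > 50000  Then "Medium"
    Else "Low"
End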

Also, the ability to filter and restrict data in a raw MDX statement is far more powerful than the options available in PEL. For example, you can use the Filter()/IIF() functions in PEL, but they're quite slow and you're restricted as to where you can put them. In raw MDX, you can use a WHERE clause to slice the query and/or a HAVING clause for filtering.

Finally, when writing queries against an SSAS cube, a technique that's sometimes used is to create query-scoped calculated members, using the MDX WITH MEMBER statement. This allows you to have calculation steps in your MDX query, and can effectively be used as a temporary variable store for calculations. If you're trying to do complex calculations in PEL, you have to assign everything to the 'this' keyword. It's much cleaner to have a block of code where you can define any calculations that the main query depends on, which is exactly what WITH MEMBER lets you do.

How to Write a NativeMdxQuery

The requirements for a business rule using NativeMdxQuery are that you need to write an MDX select statement, which specifies the target cells that will be written to. Unlike NativeSql statements, you do not need to handle how the data gets inserted - PerformancePoint handles all that for you, as long as you produce the MDX select statement.

An example select statement that makes up the entire content of a NativeMdxQuery assignment rule is shown below. The comments show what the query is doing:

--[Measures].[Last Year] just makes the rule a bit more readable
WITH MEMBER [Measures].[Last Year] AS ([Measures].[Value], [Time].[Monthly].CurrentMember.Lag(12))
CELL CALCULATION queryCalc 
FOR 
 --This specifies the cells that we are overwriting with an expression or value
 '([Measures].[Value],
 [Scenario].[All Members].[Scenario].&[1],
 {[Time].[Monthly].[Month].&[200901],[Time].[Monthly].[Month].&[200902]},
 {[Account].[Profit and Loss].[Level 07].&[5010],[Account].[Profit and Loss].[Level 07].&[5009]},
 [BusinessProcess].[Standard].[Level 06].&[8],
 [TimeDataView].[All Members].[TimeDataView].&[1],
 Descendants([Entity].[Divisions].[(All)].&[0], ,leaves),
 Descendants([Currency].[All Members].[(All)].&[0], ,leaves),
 Descendants([Product].[Product Category].[(All)].&[0], ,leaves))' 
AS
 --The 100 is the value that we are giving the cells above
 100
--This is the select statement which cells will receive the value above
SELECT NON EMPTY 
 ([Measures].[Value], 
 ({[Scenario].[All Members].[Scenario].&[1]},
 {{[Time].[Monthly].[Month].&[200901],[Time].[Monthly].[Month].&[200902]}},
 {{[Account].[Profit and Loss].[Level 07].&[5010],[Account].[Profit and Loss].[Level 07].&[5009]}},
 {[BusinessProcess].[Standard].[Level 06].&[8]},
 {[TimeDataView].[All Members].[TimeDataView].&[1]},
 {Descendants([Entity].[Divisions].[(All)].&[0], ,leaves)},
 {Descendants([Currency].[All Members].[(All)].&[0], ,leaves)},
 {Descendants([Product].[Product Category].[(All)].&[0], ,leaves)}))
 --Ensure we only write to cells with a certain value by using HAVING
 HAVING ([Measures].[Last Year] > 100000)
 properties [Scenario].[All Members].Key ,
 [Time].[Monthly].Key ,
 [Account].[Profit and Loss].Key ,
 [BusinessProcess].[Standard].Key ,
 [Entity].[Divisions].Key ,
 [TimeDataView].[All Members].Key ,
 [Currency].[All Members].Key ,
 [Product].[Product Category].Key 
 ON COLUMNS 
FROM [Strategic Planning]
--Filter on a dimension member property by using a WHERE CLAUSE
WHERE ([Entity].[Region].&[North])

The points to note about the above statement are:

  • You must connect to the correct cube for the current model;
  • You don't need to include a cell calculation - but it's the way that Microsoft implement business rules that use MDX, and it's hard to see how you would get a rule to be of any use without it;
  • You must include the member keys as properties, otherwise the rule will error.

Conclusion

Writing MDX is clearly not for all users of PerformancePoint, but it does provide the ultimate in flexibility when compared to PEL. Most MDX queries written in PPS will use cell calculations. If you're not used to how these work, or you just want to save some time, remember that an easy way to get started is to use the debug button on an MDX PEL rule in PBM - this will output a query similar to the one shown above.

Analysis Services Properties for PerformancePoint MDX Rules

One of the great features of PEL is that you can choose either a SQL or MDX implementation for your rules, depending on what you want to achieve. Whilst the MDX rules are much less restrictive than the SQL rules, they can sometimes run slower, depending of course on how your environment is set up.

When the MDX rules do take a long time to run, it's possible that you might see this message:

[Image: Analysis Services error message]

What has essentially happened within Analysis Services is that an object (e.g. a cube or dimension) has been processed and is waiting to commit. Unfortunately this is not possible whilst an existing query is running, so AS waits for the query to complete. It will, however, only wait so long, as defined by an Analysis Services property called ForceCommitTimeout. Once this threshold has been reached, the offending query is cancelled, resulting in the error message above.
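For reference, ForceCommitTimeout is an advanced server property - it's visible in the Management Studio server properties dialog if you tick 'Show Advanced (All) Properties', and it's also stored in msmdsrv.ini. The value is in milliseconds, with a default of 30000; the corresponding ini entry looks something like this (the 60-second value below is purely illustrative):

<ForceCommitTimeout>60000</ForceCommitTimeout>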

Finding the right balance between the Analysis Services ForceCommitTimeout and the PerformancePoint PAC 'OLAP Cube Refresh Interval' setting is key. If you have set PPS to re-process its cubes too often, then you may well see the above message. On the other hand, if you set the ForceCommitTimeout too high, then queries executed whilst the cube is waiting to commit will be made to wait, meaning your big query will get through ok, but other users may see bad performance.

Darren Gosbell has written an excellent post here that provides a thorough explanation of ForceCommitTimeout and other related properties.

BI Evening, TVP, 17th July 2008

I'll be giving a talk titled 'PEL Vs MDX' at the UK SQL Server Community's BI Evening on the 17th July at Microsoft's TVP offices in Reading.

The content will be geared towards MS BI developers, and will highlight the differences between the two languages, with the aim that those with MDX experience will be able to make a fast start with PEL.

Also speaking is Jeremy Kirkup from TAH, on 'Using Excel Services with Analysis Services and MOSS'.

If you are just starting to explore delivering BI solutions with Excel Services and SharePoint then there are some issues that it is wise to be aware of in advance. This session will describe some real world lessons gained while creating BI solutions which expose Analysis Services data through the Excel Web Access web part.

The Excel 2007 client has first-class support for some of the advanced features of Analysis Services, such as drill-through. However, when exposing pivot tables to a browser through the Excel Web Access web part, this feature is not available. The session will discuss a couple of approaches to implementing drill-through functionality for EWA-based pivot tables using the MOSS SmartPart, AJAX and the Excel Web Services.

You can register for the event here.