Adatis BI Blogs

Analysis Services Properties for PerformancePoint MDX Rules

One of the great features of PEL is that you can choose either a SQL or an MDX implementation for your rules, depending on what you want to achieve. Whilst the MDX rules are much less restrictive than the SQL rules, they can sometimes run more slowly, depending of course on how your environment is set up. When an MDX rule does take a long time to run, you might see an error message telling you that your query has been cancelled.

What has essentially happened within Analysis Services is that an object (e.g. a cube or dimension) has been processed and is waiting to commit. The commit cannot happen whilst an existing query is running, so Analysis Services waits for the query to complete. It will, however, only wait so long, a period defined by an Analysis Services server property called ForceCommitTimeout. Once this threshold has been reached, the offending query is cancelled, resulting in the error message above.

Finding the right balance between the Analysis Services ForceCommitTimeout and the PerformancePoint PAC 'OLAP Cube Refresh Interval' setting is key. If you have set PPS to re-process its cubes too often, then you may well see the above message. On the other hand, if you set ForceCommitTimeout too high, then queries executed whilst the cube is waiting to commit will be made to wait, meaning your big query will get through OK, but other users may see poor performance. Darren Gosbell has written an excellent post here that provides a thorough explanation of ForceCommitTimeout and other related properties.
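If you want to check or adjust the threshold, ForceCommitTimeout is exposed as an advanced server property in SQL Server Management Studio, and it also lives in msmdsrv.ini on the Analysis Services server. As a rough sketch only, the relevant fragment looks something like the below; the value shown is the SSAS 2005 default of 30000 (the property is measured in milliseconds), and the exact layout and location of msmdsrv.ini will depend on your installation:

<!-- Sketch of a fragment of msmdsrv.ini (location varies by install) -->
<ConfigurationSettings>
  <!-- How long (in ms) a pending commit waits before the blocking queries are cancelled -->
  <ForceCommitTimeout>30000</ForceCommitTimeout>
</ConfigurationSettings>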

Troubleshooting the PEL Allocate Statement

The PEL Allocate statement can sometimes throw unexpected error messages, especially when you're setting up a pull or push cross model allocation rule. I've been keeping track of a few of the more obscure ones, which I've listed below. Hopefully this will help anyone out there who's struggling to get cross model rules working.

Error: "No hierarchy 'xyz' exists in [dimension]"
Explanation: Check that you have specified the correct member set in the source, target and scope parts of the rule.

Error: "Member set is not defined correctly for cross model mapping. The number of members in source and target member sets in cross model mapping must match."
Explanation: This message can have two meanings. Firstly, as it suggests, it can be that the number of members in the source and target mapping arguments do not match. Or, prior to SP1, it can be this rather more serious problem - http://support.microsoft.com/kb/942640

Error: "Execution of rule 'xyz' failed. This may have occurred because changes have been made to the model structure without redeploying the model site."
Explanation: This can occur at execution time when you try to map the same member twice, e.g. if your source is {2009, 2009} and your target is {Jan, Feb}. It doesn't like the 2009 being in there twice.

Error: "Each dimension scope in cross model mapping must be a list of concrete dimension members"
Explanation: Unfortunate but true - you can't use relative member functions such as Descendants in the 4th and 5th arguments of an Allocate statement.

Error: "Some of the model's dimension members are invalid or missing. The model may need to be redeployed."
Explanation: This is a run-time error that can occur when you are trying to write to non-leaf members. The first argument of the Allocate statement will contain the problem, as this is where the cells to receive the allocation are specified. Often it's because 'this' has been entered as the first argument, which can result in non-leaf members being used, as it will inherit any non-leaf members specified in the rule's scope statement.

Error: "Allocation statement must have 5 parameters when they appear in Cross Model Allocation Rule."
Explanation: Cross model allocation rules expect five parameters, in the correct format, so if you don't have five parameters then there's the problem. However, be careful how you specify the parameters. For example, if you specify two members in the source mapping, such as [Time].[Monthly].[2007], [Time].[Monthly].[2008], be sure to enclose them in braces, otherwise the second member will be interpreted as your 5th argument (see the fragment after this list).

Error: "Unable to compile the cross model allocation rule. The target contains the dimension 'xyz' but no explicit mapping has been provided."
Explanation: If an extra dimension exists in the target model, the Allocate statement requires you to specify which member you expect the data to be written to in that dimension.
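To illustrate the point about braces in the mapping arguments, the two fragments below show how the argument list is parsed. This is deliberately incomplete pseudo-PEL: the other arguments of the Allocate statement are elided, and the member names are just the ones from the example above. It is only there to show where the braces go.

// Parsed as two separate arguments - the second Time member is treated as the 5th argument:
..., [Time].[Monthly].[2007], [Time].[Monthly].[2008], ...

// Parsed as a single argument - the braces turn the two members into one set:
..., {[Time].[Monthly].[2007], [Time].[Monthly].[2008]}, ...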

BI Evening, TVP, 17th July 2008

I'll be giving a talk titled 'PEL Vs MDX' at the UK SQL Server Community's BI Evening on the 17th July at Microsoft's TVP offices in Reading. The content will be geared towards MS BI developers, and will highlight the differences between the two languages, with the aim that those with MDX experience will be able to make a fast start with PEL.

Also speaking is Jeremy Kirkup from TAH, on 'Using Excel Services with Analysis Services and MOSS'. If you are just starting to explore delivering BI solutions with Excel Services and SharePoint, then there are some issues that it is wise to be aware of in advance. This session will describe some real-world lessons gained while creating BI solutions which expose Analysis Services data through the Excel Web Access web part. The Excel 2007 client has first-class support for some of the advanced features of Analysis Services, such as drill-through. However, when exposing pivot tables to a browser through the Excel Web Access web part, this feature is not available. The session will discuss a couple of approaches to implementing drill-through functionality for EWA-based pivot tables using the MOSS SmartPart, AJAX and the Excel Web Services.

You can register for the event here.

Executing Rule Sets in PerformancePoint to Move Data

As I mentioned in my post a few weeks ago, the PEL Allocate statement is a powerful method of moving data between PerformancePoint models. Although it's powerful, the end result of an Allocate statement is simply that the destination model will contain data that has been queried from the source model. What if the destination model already contains data for the target scope? If it does, then you will have double counting.

When creating the Allocate rule, there is unfortunately no option that lets you decide what to do with existing data. Ideally you would have the 'Existing Data' option that you get when running an association, which gives you the 'Append', 'Scoped Replacement' and 'Full Replacement' options. To get around this, I tend to place two rules inside a rule set. The first rule is a SQL implementation assignment rule with the same scope as the Allocate rule, which deletes the existing data by using the PEL statement this = NULL. The second rule is the Allocate statement itself, which appends the data. The two rules are sketched at the end of this post.

The idea is then to execute the rule set, which will execute each of its rules in order. The advantage of doing this is that your whole data movement process can be executed in one clean step (perhaps scheduled, or via Excel), without the hassle of executing several individual rules.
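As a rough sketch, and assuming an invented scope purely for illustration (in practice the scope is whatever your Allocate rule is scoped to), the pair of rules looks something like this:

// Rule 1 - SQL implementation assignment rule, scoped identically to the Allocate rule.
// Clearing the target scope first means the allocation that follows cannot double count.
scope ( [Scenario].[All Members].[Budget],
        Descendants([Time].[Monthly].[Year2008], 0, after) );
    this = NULL;
end scope;

// Rule 2 - the Allocate rule itself, with the same scope, which then appends the
// data queried from the source model.

Executing the rule set runs the clear-down first and the allocation second, so the order of the rules within the set matters.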

Assumption Model Design Techniques and Business Rules

PerformancePoint assumption models are very useful if you want the data in one model to drive and affect another model. However, if you've been using assumption models, the chances are that you've noticed they can sometimes be a bit awkward to use. This post highlights a couple of issues that I've experienced with assumption models, and provides a few workarounds.

Firstly, when you initially link the assumption model to the main model, you may get the warning 'the assumption model uses different member sets for the following dimensions'. Let's presume that you ignore this warning and continue to link to the assumption model. Further down the line, when you come to write your PEL rule to pick up the assumption model data, you may find that you get an error when validating the rule. This will occur if you want to populate a member in one member set by referencing a member in another member set that resides in the assumption model. To illustrate this, if we want to populate an account called 'Total Sales' in the 'Operational' member set of the main model by using an account called 'Price' in the 'Drivers' member set of the assumption model, we would have the following PEL rule:

scope ( [Scenario].[All Members].[Forecast], [Account].[Operational].[Sales] );
    this = [Account].[Operational].[Volume] *
        (MODEL([Sales Assumptions]), [Account].[Drivers].[Price]);
end scope;

However, when validating this rule we get told "No hierarchy Drivers exists in dimension Account". The only Account member set that PerformancePoint will let you use in the above rule is 'Operational'. Note: you'll be OK if 'Price' exists in the main model, as you can simply refer to it as [Account].[Operational].[Price]. But if 'Price' doesn't exist in the main model, then a different approach is needed.

The simple way around it is to use an approach found in the Alpine Ski House sample application. Add the accounts (if that's the dimension causing you problems) into another dimension called Business Driver. You can then use the Business Driver dimension in your PEL rule, as PerformancePoint is more than happy for you to reference assumption-model-specific dimensions. This is shown below:

scope ( [Scenario].[All Members].[Forecast], [Account].[Operational].[Sales] );
    //The rule is almost identical, but validates OK as we're using the BusinessDriver dimension.
    this = [Account].[Operational].[Volume] *
        (MODEL([Sales Assumptions]), [BusinessDriver].[Drivers].[Price]);
end scope;

Another problem that you may experience is a difference in granularity between the member sets used in the assumption model and the main model. For example, you may have an Entity member set called 'Regional' used in the assumption model, but a more detailed member set called 'Company' used for the Entity dimension in the main model. In this situation we want to populate members in the more detailed 'Company' member set by using a value from each member's parent in the higher level 'Regional' member set. To do this, we can just use the Parent function on the current member of the main model's Entity member set. Even though we've referenced the main model member set inside the MODEL function, if the current member's parent exists in the assumption model then it will be picked up without any problems:

scope ( [Scenario].[All Members].[Forecast], [Account].[Operational].[Sales],
        Descendants([Entity].[Company].[TOT], 1073741823, leaves) );
    this = [Account].[Operational].[Volume] *
        //It's OK here to use [Dimension].[Main Model Member Set].CurrentMember.Parent
        (MODEL([Sales Assumptions]), [Entity].[Company].CurrentMember.Parent,
         [BusinessDriver].[Drivers].[Price]);
end scope;

So both are very simple design techniques, but it's worth being aware of them before you actually start building your models.

PEL Business Rule Re-use and Pre-defined What If Scenarios

You've built a rule set to satisfy an original set of business requirements for calculating budget, and all is good with the world. Then a change order is raised and approved that requires, for want of a better description, a set of pre-defined what-if* scenarios to apply to the calculated budget. The pre-defined what-ifs are essentially the same business rule calculations with a slight factor or variance applied, so it makes sense to want to re-use the existing rules. Not an unreasonable request. What's the best way to implement this change request without introducing the burden of duplicated rules, while keeping your model clean and well structured?

Background

The rule set in question contains a handful of budget calculation rules. These rules utilise an assumption model that contains values such as salary levels, training days, tax assumptions etc., and calculate various budget lines based on headcount information collected from a variety of regional cost centres. The rules are pretty straightforward, as the Salary rule outlined below shows:

scope( [Scenario].[All Members].[Budget],
       [Account].[Detail].[Salary Costs],
       Descendants([Role].[Role].[All Roles], 0, after),
       Descendants([Time].[Quarterly].[Year2008], 0, after),
       Descendants([Region].[Region].[Global], 0, after) );
    this = (
        (Model([Drivers]), [Account].[Detail].[Annual Salary],
         [Time].[Quarterly].CurrentMember.Ancestor([Time].[Quarterly].[Year]),
         [Role].[Role].CurrentMember, [Region].[Region].CurrentMember,
         [Scenario].[All Members].[Budget]) / 4
    ) * (
        [Account].[Detail].[Headcount], [Time].[Quarterly].CurrentMember,
        [Role].[Role].CurrentMember, [Region].[Region].CurrentMember,
        [Scenario].[All Members].CurrentMember
    );
end scope;

Essentially, for each quarter, for each role within each region, we are multiplying the appropriate annual salary from the assumption model (divided by 4 for quarters) by the appropriate headcount to give us the budget salary figure.

New Requirement

Our change order requires a view of the budget as if we were to increase or decrease the headcount for each role within each region by 30%. Two new members have therefore been added to the Scenario dimension to cope with this requirement: 'Budget +30% Headcount' and 'Budget -30% Headcount'. You could argue that these members should exist in a separate 'what-if' style dimension that could then be applied to any of the scenarios independently. I actually prefer that approach, but for simplicity let's just limit this explanation to the Scenario dimension. Okay, we have our new scenario members; we now need to wire up some rules to calculate these alternate budgets.

Design

Our first problem is how to make the existing rules generic across budget scenarios. We have the budget scenario factors (+30% and -30%) to work in somewhere. We could hard-code these values into the rules, but that would require a rule set per budget scenario, thus duplicating all the rules. Nasty. We could store the factors in an assumption model against the appropriate budget scenario, which could then be referenced within the rule. That would potentially allow us to re-use the business rules. However, I'm not keen on this approach at all, as I feel that the budget factors are fixed, scalar values, and storing them in an assumption model is overkill that would potentially require much more implementation effort.

So, what options are we left with? Not many. In this situation I would be tempted to create a member property on the Scenario dimension that holds the factor against the appropriate member. The business rules could then simply (yeah right, see later!) reference the member property and apply the factor. So, in the case of the main Budget scenario we would have a factor of 1, for Budget +30% Headcount we would store a factor of 1.3, and 0.7 would be set against Budget -30% Headcount.

On the face of it, you can then update each rule to multiply every reference to Headcount by the appropriate factor held in the member property. However, there is a problem, and that problem lies with the PEL function 'Properties'. Unlike the MDX equivalent, the PEL version always returns a string, irrespective of the data type you assign to the member property itself. You cannot multiply by a string value, and PEL does not contain any data type conversion functions to overcome this issue.

Implementation

The workaround: use a Native MDX implementation. That sounds worse than it is; I'm not suggesting that you re-write all the business rules in MDX, although that is certainly an option. Alternatively, you can add a single MDX business rule to the top of the rule set that re-calculates Headcount using the value of the member property:

scope ( {[Scenario].[All Members].[Scenario].&[402], [Scenario].[All Members].[Scenario].&[501]},
        Descendants([Time].[Quarterly].[Year].&[2008], 0, AFTER),
        [Account].[Detail].[Level 02].&[5001],
        Descendants([Region].[Region].[Level 02].&[5201], 0, AFTER),
        Descendants([Role].[Role].[Level 02].&[101], 0, AFTER) );
    ([Measures].[Value]) = (
        [Scenario].[All Members].[Budget], [Time].[Quarterly].CurrentMember,
        [Account].[Detail].[Headcount], [Region].[Region].CurrentMember,
        [Role].[Role].CurrentMember
    ) * [Scenario].[All Members].CurrentMember.Properties('OffsetPercent');
end scope;

Apart from the fact that the MDX rule is very PEL-like, notice the last line of the rule. Here the headcount for each role, region and quarter is multiplied by the Scenario member property OffsetPercent. With this in place, the subsequent rules will calculate the budget scenarios based on the appropriately revised headcount value (due to the natural solve order of the rules in the rule set).

The final step is to update the scope of each of the rules to include each of the budget scenarios, leaving the main bulk of the rule in place, untouched. Here's the updated version of the Salary rule we looked at earlier (the modification is the expanded set of Scenario members at the start of the scope):

scope( {[Scenario].[All Members].[Budget],
        [Scenario].[All Members].[BudgetPlus30PcntHCount],
        [Scenario].[All Members].[BudgetMinus30PcntHCount]},
       [Account].[Detail].[Salary Costs],
       Descendants([Role].[Role].[All Roles], 0, after),
       Descendants([Time].[Quarterly].[Year2008], 0, after),
       Descendants([Region].[Region].[Global], 0, after) );
    this = (
        (Model([Drivers]), [Account].[Detail].[Annual Salary],
         [Time].[Quarterly].CurrentMember.Ancestor([Time].[Quarterly].[Year]),
         [Role].[Role].CurrentMember, [Region].[Region].CurrentMember) / 4
    ) * (
        [Account].[Detail].[Headcount], [Time].[Quarterly].CurrentMember,
        [Role].[Role].CurrentMember, [Region].[Region].CurrentMember
    );
end scope;

For completeness, the modified rule set now has a new (MDX) rule called 'Headcount' at the top. This will be calculated first, for each of the budget scenarios, ensuring the subsequent rules use the appropriately factored headcount figure. Support for business rule re-use is limited in PPS-P v1, but as this example hopefully illustrates, with a little thought and application, re-use is occasionally achievable. I hope there will be better support for re-use in future versions!
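As a footnote to the Properties issue: the MDX Properties function (unlike its PEL counterpart) also accepts a TYPED flag, which returns the member property in its underlying data type rather than as a string. If you would rather not rely on an implicit string-to-number conversion in the MDX rule above, the assignment could be written along these lines; this is an untested variation that assumes OffsetPercent has been defined with a numeric data type:

    //Same assignment as the Headcount rule above, but asking for the typed property value
    ([Measures].[Value]) = (
        [Scenario].[All Members].[Budget], [Time].[Quarterly].CurrentMember,
        [Account].[Detail].[Headcount], [Region].[Region].CurrentMember,
        [Role].[Role].CurrentMember
    ) * [Scenario].[All Members].CurrentMember.Properties('OffsetPercent', TYPED);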
---------------------------------------
* Generally, what-ifs are used dynamically: you change a few numbers in a data entry worksheet, re-calculate the sheet and view the results, repeating the process as necessary. This is a great feature and really quite powerful; however, there are some situations where pre-defined what-ifs are required to show standard what-if scenarios easily and frequently.