Sacha Tomey

Sacha Tomey's Blog

PerformancePoint Suggestion box

Inspired by Adrian's post on the TechNet Planning Forum, here's my list of 'I wants' for PPS-P.  I've cross-posted this to the forum discussion too.  A few of these have been suggested before but there are some new ones in there too.  In no particular order:

Application Migration Wizard
There are several methods for doing this currently but nothing slick.  I was hopeful that the new Data Migration Tool was the answer to this but alas, no.  It only migrates data (don't get me wrong, this is useful).  What we really need is an Application Migration Tool that migrates everything, supporting both initial 'go-live' migrations and 'update release' migrations.  It should support making a copy of the application on the same server too (with a different name, of course!).

Developer Edition
The PerformancePoint Server components can only be installed on Windows Server.  This is a tad annoying for development and demo purposes.  Currently we have a laptop built on Windows Server for demos and sales activities.  For development, when connecting to the server is not feasible, we have a couple of pre-built PPS-P 'development' hard disk drives that we swap into our non-server development boxes, migrating manually to the official servers periodically.  It would be great if, like SQL Server, there was a 'Developer Edition' that could be installed on XP and/or Vista.

Improved Support for Business Rule Re-Use
There are several ways that Business Rule re-use could be implemented but initially I'd like to see a 'formula' tab where, for example, you can define global formulas and expressions that can be selected (re-used) within various scope statements.  Taking this a step further, how cool would it be if the global formulas were actually functions that accepted parameters, so you could produce a smaller set of generic formulas to re-use throughout the application?

Improved Rule Editing
The editor is a little clunky at present.  We need something more akin to SQL/MDX Query Editor in Management Studio.

Web-Based Data Entry
Whether via Excel Services or some other 'pivot table' style entry form.  A couple of customers are banking on Microsoft providing this in the future.  We at Adatis would love to know if they've actually started something like this in Redmond, as we'd be up for developing something in-house if not!

Extensible PEL
Much like custom tasks can be developed for SSIS and stored procedures can be created for Analysis Services, it would add some real power if PEL keywords could be added via custom .NET assemblies registered with the PPS-P server.  Not only could complex, vertical-specific logic be encapsulated, but intellectual property could be protected, opening up a third-party plug-in market.

Process Cycle Independent Data Entry Forms
It would really aid productivity if data entry forms could be created without the need to create a Process Management Cycle and associated assignment.

Application/Model Templates
Imagine a 'Retail Budgeting Template' or a 'Transportation Forecasting Template' and how that could potentially wipe weeks off an implementation.  This is a tall order due to the global nature of the product and the diversity of requirements within industry verticals, so it might be better to provide lower-level, component-style templates that can be plugged into a larger solution, e.g. UK Tax and NI, or salary assumption models.

Protect Sensitive Data
Some data that is forecast and budgeted is extremely sensitive, salary information for example.  It should be possible to mark assignments as sensitive so that opening them challenges the user for some additional form of authentication.  Integrated security is not secure!

Spread Across Time
Not having the ability to spread across time, or to enter data at non-leaf level (as you can with the other dimensions), results in more models than you would ideally choose, increasing complexity.

Custom Spread Methods
Even and Ratio spread don't satisfy every client's spreading requirement.  If custom spread logic could be developed on the server and exposed through the Excel add-in for specific assignments, every requirement could be fulfilled.
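As a sketch of what the two built-in behaviours amount to (Python used purely as illustration; PPS exposes no such API, and the function names here are my own):

```python
def even_spread(total, periods):
    # Split the total equally across the target periods.
    share = total / len(periods)
    return {p: share for p in periods}

def ratio_spread(total, existing):
    # Split the total in proportion to the values already held in
    # each period (e.g. last year's actuals or the current forecast).
    base = sum(existing.values())
    return {p: total * v / base for p, v in existing.items()}

print(even_spread(1200, ["Q1", "Q2", "Q3", "Q4"]))   # 300 per quarter
print(ratio_spread(1200, {"Q1": 10, "Q2": 20, "Q3": 30, "Q4": 40}))
```

A custom spread method would simply be a third function of this shape, with client-specific logic inside, registered on the server and selectable per assignment.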

Unlimited Undo/Versioning
In the absence of the ability to copy applications, proper versioning and unlimited undo/redo (across sessions) would help.  Adopting the Visual SourceSafe approach of versioning, pinning, branching and merging would greatly assist with development/deployment/testing cycles.

Validation
Centralised validation rules that can prevent invalid data from being submitted to the server would help reduce approval/review workload.

Dimension Member Creation
Excel Add-In support for dimension member creation would assist greatly for certain budgeting/forecasting scenarios.  Workflow and data collisions would need to be considered, otherwise data integrity could be jeopardised.

Multiple Model Data Entry
Allowing an assignment to capture data for multiple models would make it possible to simplify the implemented process and provide greater support for, amongst other things, assumption-model-based what-ifs.

Submission Triggered Jobs
It's common to want to run a job once all data has been submitted.  Currently the job has to be run manually or scheduled.  Allowing jobs to be triggered via data submissions would provide greater automated support for implemented processes.

PEL Business Rule Re-use and Pre-defined What If Scenarios

You’ve built a rule set to satisfy an original set of business requirements for calculating budget, and all is good with the world.  Then, a change order is raised and approved that requires, for want of a better description, a set of pre-defined what if* scenarios to apply to the calculated budget.

The pre-defined what-ifs are essentially the same business rule calculations with a slight factor or variance applied, so it makes sense to want to re-use the existing rules. Not an unreasonable request.

What’s the best way to implement this change request without introducing the burden of duplicated rules, while keeping your model clean and well structured?

Background

This is the rule set in question:

image

These rules utilise an assumption model that contains values such as salary levels, training days and tax assumptions, and calculate various budget lines based on headcount information collected from a variety of regional cost centres.

The rules are pretty straightforward as the Salary rule outlined below shows:

scope(
    [Scenario].[All Members].[Budget],
    [Account].[Detail].[Salary Costs],
    Descendants([Role].[Role].[All Roles], 0, after),
    Descendants([Time].[Quarterly].[Year2008], 0, after),
    Descendants([Region].[Region].[Global], 0, after)
);

this =
    (
        (Model([Drivers]),
        [Account].[Detail].[Annual Salary],
        [Time].[Quarterly].CurrentMember.Ancestor([Time].[Quarterly].[Year]),
        [Role].[Role].CurrentMember,
        [Region].[Region].CurrentMember,
        [Scenario].[All Members].[Budget]) / 4
    )
    *
    (
        [Account].[Detail].[Headcount],
        [Time].[Quarterly].CurrentMember,
        [Role].[Role].CurrentMember,
        [Region].[Region].CurrentMember,
        [Scenario].[All Members].CurrentMember
    );

end scope;

Essentially, for each quarter, and for each role within each region, we multiply the appropriate annual salary from the assumption model (divided by 4 for quarters) by the appropriate headcount to give the budgeted salary figure.
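In plain arithmetic terms, the rule amounts to the following (a Python sketch of the calculation only; the figures are invented for illustration):

```python
def quarterly_salary_budget(annual_salary, headcount):
    # Quarterly salary cost = (annual salary / 4) * headcount,
    # evaluated per role, per region, per quarter.
    return (annual_salary / 4) * headcount

# e.g. a role with a 40,000 annual salary and 5 heads in the quarter:
print(quarterly_salary_budget(40000, 5))  # 50000.0
```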

New Requirement

Our change order requires a view of the budget as if we were to increase or decrease the headcount for each role within each region by 30%. Our new scenario dimension will look like this:

image

The last two members have been added to cope with this new requirement – you could argue that these members could exist in a separate ‘what-if’ style dimension that could then be applied to any of the scenarios independently. I actually prefer that approach, but for simplicity let’s just limit this explanation to the scenario dimension.

Okay, we have our new scenario members, we now need to wire up some rules to calculate these alternate budgets.

Design

Our first problem is how we can make the existing rules generic across budget scenarios. We have the budget scenario factors (a +30% and a -30%) to work in somewhere. We could hard-code these values into the rules but that would require a rule set per budget scenario thus duplicating all rules. Nasty.

We could store the factors in an assumption model against the appropriate budget scenario; this could then be referenced within the rule. That would potentially allow us to re-use the business rules. However, I’m not keen on this approach at all, as I feel that the budget factors are fixed, scalar values; storing them in an assumption model is overkill and would potentially require much more implementation effort.

So, what option(s) are we left with? Not many. In this situation I would be tempted to create a member property on the Scenario dimension that holds the factor against the appropriate member. The business rules could then simply (yeah right, see later!) reference the member property and apply the factor. So, in the case of the main Budget scenario we would have a factor of 1; for Budget +30% Headcount we would store a factor of 1.3; and 0.7 would be set against Budget -30% Headcount.
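The intent of the member-property approach can be sketched as follows (Python purely as illustration; the factor values are the ones proposed above, and the dictionary stands in for the member property):

```python
# Factor held as a member property against each scenario member.
scenario_factor = {
    "Budget": 1.0,
    "Budget +30% Headcount": 1.3,
    "Budget -30% Headcount": 0.7,
}

def adjusted_headcount(headcount, scenario):
    # Each budget scenario re-uses the same rule logic; only the
    # factor applied to headcount differs per scenario member.
    return headcount * scenario_factor[scenario]

print(adjusted_headcount(10, "Budget +30% Headcount"))  # 13.0
print(adjusted_headcount(10, "Budget -30% Headcount"))  # 7.0
```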

So, on the face of it you can then update each rule to multiply every reference of Headcount by the appropriate factor held in a member property.

However, there is a problem. That problem lies with the PEL function ‘Properties’. Unlike the MDX equivalent, the PEL version always returns a string, irrespective of the data type you assign to the member property itself. You cannot multiply by a string value and PEL does not contain any data type conversion functions to overcome this issue.

Implementation

The workaround: Use a Native MDX implementation. That sounds worse than it is; I’m not suggesting that you re-write all the business rules in MDX, although that is certainly an option.

Alternatively, you can add a single MDX business rule to the top of the rule set that re-calculates Headcount, using the value of the member property:

scope(
    {[Scenario].[All Members].[Scenario].&[402], [Scenario].[All Members].[Scenario].&[501]},
    Descendants([Time].[Quarterly].[Year].&[2008], 0, AFTER),
    [Account].[Detail].[Level 02].&[5001],
    Descendants([Region].[Region].[Level 02].&[5201], 0, AFTER),
    Descendants([Role].[Role].[Level 02].&[101], 0, AFTER)
);

([Measures].[Value]) =
    (
        [Scenario].[All Members].[Budget],
        [Time].[Quarterly].CurrentMember,
        [Account].[Detail].[Headcount],
        [Region].[Region].CurrentMember,
        [Role].[Role].CurrentMember
    )
    *
    [Scenario].[All Members].CurrentMember.Properties('OffsetPercent');

end scope;

Apart from the fact that the MDX rule is very PEL-like, notice the last line of the rule. Here the headcount for each role, region and quarter is multiplied by the Scenario member property OffsetPercent.

With this in place, the subsequent rules will calculate the budget scenarios based on the appropriate revised headcount value, due to the natural solve order of the rules in the rule set.

The final step is to update the scope of each of the rules to include each of the budget scenarios (leaving the main bulk of the rule in place, untouched).

Here’s the updated version of the Salary rule we looked at earlier (modification in green):

scope(
    {
        [Scenario].[All Members].[Budget],
        [Scenario].[All Members].[BudgetPlus30PcntHCount],
        [Scenario].[All Members].[BudgetMinus30PcntHCount]
    },
    [Account].[Detail].[Salary Costs],
    Descendants([Role].[Role].[All Roles], 0, after),
    Descendants([Time].[Quarterly].[Year2008], 0, after),
    Descendants([Region].[Region].[Global], 0, after)
);

this =
    (
        (Model([Drivers]),
        [Account].[Detail].[Annual Salary],
        [Time].[Quarterly].CurrentMember.Ancestor([Time].[Quarterly].[Year]),
        [Role].[Role].CurrentMember,
        [Region].[Region].CurrentMember) / 4
    )
    *
    (
        [Account].[Detail].[Headcount],
        [Time].[Quarterly].CurrentMember,
        [Role].[Role].CurrentMember,
        [Region].[Region].CurrentMember
    );

end scope;

For completeness, our modified rules set will look like this:

image 
 
Notice the new (MDX) rule ‘Headcount’; this will be calculated first, for each of the budget scenarios, ensuring the subsequent rules use the appropriately factored headcount figure.

The support for Business Rule re-use is limited in PPS-P v1, but as this example hopefully illustrates, with a little thought and application, re-use is occasionally achievable.  I hope there will be better support for re-use in future versions!

---------------------------------------

* Generally, what-ifs are used dynamically; you change a few numbers in a data entry worksheet, re-calculate the sheet and view the results, repeating the process as necessary. This is a great feature and really quite powerful; however, there are some situations where pre-defined what-ifs are required to easily and frequently show standard what-if scenarios.

New Microsoft Insider PPS Planning Blog

A great new PPS-P focused blog has been up and running for a good few weeks now.  Peter Eb., a developer on the PPS Excel add-in, has started blogging some real gold: a mixture of definitions, 'how-tos' and workarounds.  He's one of the resident experts on the Planning TechNet Forum and really knows his stuff!

Just as a taster, here are the headings of his last few posts:

What does this PPS Planning workflow action mean?

HOWTO: Select a time range that is dynamic (updates over time for a rolling forecast)

HOWTO: Hide filters for dimensions that confuse users

What does this status "xyz" mean for my assignments?

What does the error "filter selection for dimension XYZ is not in the filter scope" mean?

It's what Technical Blogging should be all about, brilliant!

Directions EMEA 2008

I'm traveling to Paris next week to present PerformancePoint Planning at the Directions EMEA 2008 conference.  For those that don't know, it's a conference specifically for (and run by) Dynamics NAV partners.

http://www.directionsemea.com/

I'm presenting with Nigel Geary of PrecisionPoint Software, who have developed a fully reconciled, fully functional business warehouse for the Dynamics suite.  If you are a Dynamics user or a Dynamics partner and you haven't seen the PrecisionPoint offering then you are missing a trick!

Our presentation is focusing on budgeting and I'll be demonstrating most of the features that PerformancePoint Planning includes to simplify the budgeting process.

If you're attending, be sure to pop by and say hi!

Book Review: Planning with Microsoft PerformancePoint Server 2007

I've had this book for a while and have been meaning to publish a review for some time.  I reviewed the companion book, Monitoring and Analyzing with Microsoft PerformancePoint Server 2007, by the same authors, Adrian Downes and Nick Barclay of B(iQ), who have already established themselves as the authority on Microsoft PerformancePoint Server 2007; just like the companion book, this one too reinforces that position.

As I have come to expect from these guys, the book is crammed to the 250-page Rational Guide limit with essential information.  It achieves several objectives: it provides an overview of the theory of Performance Management, it introduces the reader to the Microsoft PerformancePoint Planning product, and finally it holds your hand through the relatively steep learning curve of implementing a planning solution.

Those who have been involved with PPS-P since the early CTP days, or later, will, I'm sure, have had several frustrating sessions trying to understand how and what is happening.  I truly believe that this book would have removed much, if not all, of that frustration.

But, in the interests of providing a fair and balanced review, I do have one gripe.  It would not surprise me if Adrian and Nick had a couple of sleepless nights and some 'discussions' with the Rational editors on which chapters to print and which to provide as bonus material online (yes, just like the companion book there is a whole bunch of bonus material, including additional chapters, sample downloads and source code).  The Process Management chapter is one of the online bonus chapters that I definitely feel should have been printed; Process Management falls into the 'essential' category and you can't actually do any Performance Management without it.  Of the printed chapters, the one that could arguably be better suited to online is the chapter concerning data integration, particularly when it contains approximately 16 valuable printed pages of plain T-SQL script.  So be sure to read the essential online bonus material.

I think the authors would agree that the book does not set out to cover the whole of the product; the product is far too big to cram into 250 pages.  It is definitely attempting to be a 'get up and running quickly with the essentials' level book and, to that end, my hair-splitting gripe aside, it absolutely nails it.  Easy to read, perfectly structured and comprehensive enough to give you the confidence to chase that 'early adopter' opportunity.

It was always going to be a challenge to write a Rational Guide on PerformancePoint Planning but the boys have come good once again.  I've added this book, along with the companion book on M&A, to our internal 'essential reading list' - Top job chaps !

PerformancePoint Planning Account Type Crib Sheet - Update

I've updated my Account Type Crib Sheet.  It now has a back page where other PerformancePoint Planning types have been detailed.

Included are:

- Flow
- Biz System Flag
- Business Process
- Consolidation Method
- Entity Type
- Exchange Rate
- Time Data View

I've given it a new name too - it's now the 'PerformancePoint Planning Predefined Types Crib Sheet'.  Imaginative or what!?

image image


You can download it using the link below

PPS-P PTC.xps (236.90 kb)

 

PPS Predefined Types Crib Sheet

The latest version of the PPS Planning Crib Sheet.
PPS-P PTC.xps (236.90 kb)

Contains a quick look up of the PPS Predefined Types, included are:
- Account 
- Flow
- Biz System Flag
- Business Process
- Consolidation Method
- Entity Type
- Exchange Rate
- Time Data View

Essential Adatis Blogs

I've been speaking to a couple of people recently who, although they subscribe to my blog, are not aware of our PPS Planning SharePoint Web Part that Tim has blogged about here and here.  This post is more of an awareness that we have other bloggers on board, and much of the content is of the 'essential' variety.

Currently blogging are Tim, Martyn and Jeremy with our aggregated feed that includes the lot.

Share Calculations and Effective Ownership

In PerformancePoint Planning Server there are several pre-defined model types that satisfy different requirements. One of these is the Financial Model with Shares Calculations; its primary use is the financial consolidation of a group of companies where the ownership of the companies within the group is not 100%.

Take the following international group of companies:

image

Adatis (UK) is the ultimate parent (and in this case the holding company too) and owns a 90% shareholding of both Adatis (Italy) and Adatis (US). Adatis (Canada) is only 60% owned, although notice that Adatis (Italy) owns the remaining 40% of Adatis (Canada). Adatis (France) and Adatis (Switzerland) are both 50% owned by Adatis (Italy), which also owns 45% of Adatis (Germany).

In order to work out the consolidated accounts of the Adatis (UK) Group, one of the first steps is to determine the effective ownership of each of the companies within the group. This needs to be done from the perspective of the holding company.

What do I mean by effective ownership? I mentioned earlier that Adatis (Canada) is 60% owned directly by Adatis (UK) and 40% owned by Adatis (Italy). So you would be forgiven for thinking that Adatis (Canada) is 100% owned within the group, albeit split between two of the companies.

However, notice that Adatis (Italy) is itself only 90% owned. This means that Adatis (UK) only effectively owns 90% of the 40% of Adatis (Canada), plus the 60% it owns directly. Therefore, the effective ownership is (90% of 40%) + 60% = 36% + 60% = 96%.
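The effective ownership calculation itself is easy to sketch (Python purely as illustration; the direct holdings are taken from the group structure above, and company names are shortened):

```python
# Direct shareholdings: parent -> {subsidiary: fraction owned directly}
direct = {
    "UK":    {"Italy": 0.9, "US": 0.9, "Canada": 0.6},
    "Italy": {"Canada": 0.4, "France": 0.5, "Switzerland": 0.5, "Germany": 0.45},
}

def effective_ownership(holder, company):
    # Sum, over every chain of ownership from holder down to company,
    # the product of the direct percentages along that chain.
    total = direct.get(holder, {}).get(company, 0.0)
    for intermediate, stake in direct.get(holder, {}).items():
        total += stake * effective_ownership(intermediate, company)
    return total

print(round(effective_ownership("UK", "Canada"), 4))   # 0.96
print(round(effective_ownership("UK", "Germany"), 4))  # 0.405
```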

Below outlines the effective ownership from the perspective of Adatis (UK) for the entire group.

image

PerformancePoint Server’s ‘With Shares’ Financial Model contains a built-in Shares Calculation job that should, in theory, determine the effective ownership for a group of companies.

In order to test the PerformancePoint Shares Calculations you need to load data into two of the three additional measure groups that are automatically created in a ‘With Shares’ Financial Model.

Shares Outstanding

Each company needs a set of shares issued and the number of those shares that have voting rights. To keep everything simple I’ve issued all companies with 10,000 shares, all of which have voting rights.

image

Shares Owned

For each of the parent companies the share ownership needs specifying. We have two parent companies, Adatis (UK) and Adatis (Italy). The shares owned have been set according to our company breakdown illustrated above.

Adatis (UK) owned companies

image 

Adatis (Italy) owned companies

image 

Control and Ownership

With the data loaded the shares calculation job can be executed. The results are posted to the third of the additional measure groups, Control and Ownership.

image

It’s the POWN columns, highlighted, that contain the effective ownership, and encouragingly this matches our expectation. So it looks like, under these simple conditions, the Shares Calculation job does what it should.

Incidentally, there are a few additional columns that are of interest.

- The PCON columns are the percentage that will be used in the consolidation.

- The PCTRL (Ultimate Percent Control) columns specify the controlling percentage. This is essentially controlled by the number of voting shares compared to shares owned and effective ownership.

- The ‘Overridden’ equivalents can be used to override the results of the shares calculation job and are the actual values used in the consolidation process. The non-overridden columns exist to show the outcome of the calculation. These can be altered but will have no effect as they are not used in the consolidation process itself.

- The final additional column represents the Consolidation Method. There is a simple formula that PPS uses to determine the Consolidation Method:

- If the ultimate percent control (PCTRL) is >= 0.0 and < 0.2, the consolidation method is None.

- If the ultimate percent control (PCTRL) is >= 0.2 and < 0.5, the consolidation method is Equity.

- If the ultimate percent control (PCTRL) is >= 0.5 and <= 1.0, the consolidation method is Full.

- If the entity is the holding company for the model, the consolidation method is Holding.

It’s important to note that there is no automatic way of setting a proportionally consolidated company. The consolidation method can be changed, and will need to be if, for example, you wanted to proportionally consolidate Switzerland and/or France as opposed to the default Full consolidation.
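The determination formula above is simple enough to sketch in code (Python, purely as an illustration of the listed thresholds, not a PPS API):

```python
def consolidation_method(pctrl, is_holding_company=False):
    # Map ultimate percent control (PCTRL, expressed as 0.0 - 1.0)
    # to the default consolidation method PPS assigns.
    if is_holding_company:
        return "Holding"
    if pctrl < 0.2:
        return "None"
    if pctrl < 0.5:
        return "Equity"
    return "Full"

print(consolidation_method(0.96))  # Full - e.g. Adatis (Canada)
print(consolidation_method(0.45))  # Equity
```

Proportional consolidation never comes out of this mapping, which is why it always has to be set by overriding the method manually.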

PerformancePoint Server Job Parameters

Just a quick tip to save some mouse clicks. 

When you are setting parameters for period start/end type variables you only have to set the end period if it differs from the start period.  I used to think all parameters were mandatory.

It's particularly handy for a Consolidation Job as there are two sets of periods to set.  It annoys me slightly that there is no dynamic time selection available for job parameters, so you have to scroll through the time dimension to find a specific member; at least now I only have to do this twice, not four times!

ConsolidationJob