Sacha Tomey

Sacha Tomey's Blog

RIP PerformancePoint Planning

It's nearly a week since the announcement that shook the (PPS) world! It's been a bit difficult to report on; generally the Adatis blogs try to offer solutions to problems we have encountered out in the real world. Now, I could say something crass here about the real world and the decision makers involved... but that would be childish, right?

If I were to offer up my feelings, they wouldn't be far from Alan Whitehouse's excellent post on the subject. If I had an ounce of class about me, they would be much more aligned with Adrian's poignant discussion opener, the one with the sharp-witted title, but alas...

We've spent the best part of the week speaking to customers, partners and Microsoft about what to do next. The timing was choice - would you believe, we actually had three new PerformancePoint Planning phases kicking off this week; according to my project plan, I should be setting up Kerberos as we speak... [There is always a positive, right?]

Some customers are carrying on regardless; they...

...already have planning deployments and are too far invested and dependent to back out at this stage or, 

...have a short-term view (that's not a criticism) and need a "quick" fix with a low TCO to get them through some initial grief. (Typically these customers are going through rapid organisational change, or form part of a recent acquisition, and, to help them see the wood for the trees during the transition, require short, sharp solutions.)

Other customers, with longer-term views, feel the product, or more importantly the suitably skilled resource pool, will drain away far quicker than the life-span of the much-touted Microsoft product support. I have to agree - fact: Adatis will not be employing or training any more PerformancePoint Planning consultants. I doubt many other consulting firms will either.

It's those customers with the longer-term view that are currently in limbo - they are experiencing pain and they need pain relief, so what should they do - wait and see what Office 14/15 offers? (There is talk of some planning functionality appearing in future Office versions - though who knows what truth there is in that?)

The Dynamics customers could wait for the resurrection of Forecaster - I have it on good authority that Forecaster will be developed to be closer, in terms of flexibility, to PPS Planning. I had originally heard the opposite: that Forecaster would be replaced with a cut-down version of PPS Planning. Either way, I'm sure some of the PPS Planning code-base will be utilised, which could end rumours of PPS Planning being 'given' to the community under some form of community/open-source arrangement. An arrangement that is, in my opinion, a non-starter anyway: "Hey, Mr FD, we've got this great open-source budgeting and forecasting product we think you should implement!" - yeah, right!

Another rumour (and mixed message) is that Service Pack 3 will contain some of the requested features that were earmarked for version 2 (after all, the code has already been written, right?). This rumour was actually started by Guy Weismantel in his announcement video. However, the information I have since received clearly states that Service Pack 3 will contain stability and bug fixes only - so which is it to be? It's unlikely for a service pack to contain new features, but it's not unheard of; anyone remember the original release of Reporting Services? That arrived as part of a service pack for SQL Server 2000.

The burning question I cannot get answered is: have Microsoft actually stepped out of the BPM market for good? We are told that Excel, SharePoint and SQL Server provide BPM - I can't see, without Planning, how they can. Short of hard-coded values, renewed SharePoint/Excel hell, another vendor or a bespoke planning solution, businesses can't set plans, and that has further-reaching implications: Planning's demise effectively shelves the Scorecard/KPI functionality of the M&A toolset too! It will be interesting to see the new Monitoring & Analytics marketing - will they still demo Strategy Maps and Scorecards, or will they now focus on decomposition trees and heat maps? Monitoring & Analytics may, in practice, just become Analytics...

I would have thought the cost of continuing to develop the product (even if it were a lemon, which Planning certainly wasn't) is far less than the potential loss of revenue that Microsoft will face: not only from the loss of confidence of its customers (who are going to think twice about investing in any Microsoft product now, let alone a V1) but, perhaps more significantly, from the doors it opens to its competitors, who can now offer a complete BI/BPM stack.

Planning was a foot in the customer's door for BI - once you put Planning in, the customer had already bought the full BI stack, and in most cases our customers were wowed by what they could now achieve.

I suspect Cognos and SAP are still partying now!

PerformancePoint SP2 - Planning Fixes and a mini-feature

Jeremy has already announced the release of PerformancePoint Server SP2 and it's great to see that the PPS dev team hit their target release date! I've spent a little commute time this morning checking out the documentation. Admittedly, I've initially focused on the Planning component and there are no great surprises (Tim has already told you about the new bits), but I have spotted what could arguably be described as a mini-feature surrounding form validation that I'm sure will come in useful.

As you would expect, previously released hot fixes have been packaged up into this service pack:

954710 Description of the PerformancePoint Server 2007 hotfix package: July 1, 2008

955432 Description of the PerformancePoint Server 2007 hotfix package: July 14, 2008

955751 Description of the PerformancePoint Server 2007 hotfix package: July 28, 2008

956553 Description of the PerformancePoint Server 2007 hotfix package: August 21, 2008

Plus fixes to issues not previously addressed:

Excel Add-In Related

  • You locally save and close a form in PerformancePoint Add-in for Excel. When you reopen the form, you are prompted to update the form. However, you expect that you are not prompted to update the form because the form is already up to date.
  • In PerformancePoint Add-in for Excel, you open an offline form assignment. In the form assignment, you add a link to an external Excel worksheet in a cell. Then, you submit the changes to the PerformancePoint Planning Server database. However, when you reopen the assignment, the link that you added is not retained.
  • After you install PerformancePoint Server 2007 Service Pack 1, you create a page filter in PerformancePoint Add-in for Excel. You have a user in PerformancePoint Server 2007 that does not have permission to the default member of the page filter. However, the user has permission to other leaf members in the page filter. When the user opens a report that uses this page filter, the user receives the following error message:

    Cannot render the <MatrixName> matrix. The server returned the following error: The <CubeName> cube either does not exist or has not been processed.

    However, in the release version of PerformancePoint Server 2007, the next member that the user has access to will be automatically selected for use in the page filter.
  • You define data validation in a worksheet of Excel. However, you can still submit a form in PerformancePoint Add-in for Excel if data in the form is not validated.
  • You have a matrix that is based on a large and complex model in PerformancePoint Add-in for Excel. You open the Select Filters dialog box to change a page filter for this matrix. When you click the Value column of the filter, the dialog box that displays the dimension members takes a long time to display.

Business Rules Related

  • After you migrate an application in PerformancePoint Server 2007 from one server to another server, the order of user-defined business rules and system business rules in models is not preserved.
  • You cannot use the datamember function in the ALLOCATE statement and in the TRANSFER statement.
  • Consider the following scenario. You create an automatic rule that uses MdxQuery implementation or Native MdxQuery implementation in Planning Business Modeler. Then you submit changes to the source data that the rule uses from an assignment form. The submission causes the model to be reprocessed. Because model reprocess causes rules in the automatic rule set to be executed, you expect that the target data of the automatic rule will reflect the change by the form submission. However, after the model is reprocessed, the target data of the automatic rule does not reflect the change.
  • Rule expression of system business rules uses dimension member names instead of dimension member labels in PerformancePoint Server 2007.

Planning Business Modeler Related

  • You have a model that contains many form templates and assignments. When you try to change objects in the model in Planning Business Modeler, Planning Business Modeler crashes.
  • You create a member property of the Date data type in a dimension in PerformancePoint Server 2007. Additionally, you specify the Set value to Null option when you create the member property. When you retrieve the value of this member property, you obtain a value of 1899-12-31T00:00:00. However, you expect that you obtain a value of blank.
  • You cannot schedule recurring jobs for a frequency that is less than an hour.
  • When a user updates a business rule in Planning Business Modeler, the audit log file of PerformancePoint Server 2007 logs the user ID of the user that created the rule. However, you expect that the audit log file logs the user ID of the user that updated the rule.
  • Consider the following scenario. You create a dimension that has no hierarchy in a localized version of PerformancePoint Server 2007. Then you perform one of the following operations:
    • You run the bsp_DI_CreateHierarchyLabelTableForDimension stored procedure to create label-based hierarchy table for the dimension.
    • You perform the Prepare the Staging DB operation in PerformancePoint Planning Data Migration Tool.
      In this scenario, you receive the following error message:
      A problem was encountered while attempting to connect to, or Execute BSP on, the specified Database
      For more information regarding this error please review the Application Event Log on the SQL Server for any "MSSQLSERVER ERRORS"
      Please check that all parameters in the UI are correct and try again

PerformancePoint Planning: Deleting a Custom Member Property - A Solution

I had a bit of a rant yesterday about the fact I have had to compromise naming member properties when I've inadvertently created them with the wrong data type.  As I mentioned, I found a Dimension attribute collection method in the Planning client assemblies that hinted that it might allow me to delete a member property so I decided to give it a go.

Below is some really rough and ready C# code that actually does delete a dimension member property. I will improve the code and probably add it into my PPSCMD GUI interface as a 'feature pack' bonus at some stage. However, if you are in desperate need of code to delete a member property and you can't wait for PPSCMD GUI v0.2 or PerformancePoint version 2 (I'm not sure which will come first!), the code is below (use at your own risk!).

Note:  Replace "MyApp", "MyDimension", "MyAttribute", oh, and the server address, accordingly..

    using Microsoft.PerformancePoint.Planning.Client.Common;
    using Microsoft.PerformancePoint.Planning.Bmo.Core;

    // Set up the PPS Application Metadata Manager
    ServerHandler serverHandler = new ServerHandler("http://localhost:46787");
    MetadataManager manager = new MetadataManager();
    manager.ServerHandler = serverHandler;
    manager.ServerHandler.Connect();

    // Get the system metadata
    BizSystem system = manager.GetSystem(true);

    // Get hold of the PPS Application
    BizApplication ppsApp = system.Applications["MyApp"];

    // Obtain the root model site from the application
    BizModelSite site = ppsApp.RootModelSite;

    // Obtain the dimension that contains the member property
    BizDimension dimension = site.Dimensions["MyDimension"];

    // Obtain the member property
    BizDimensionAttribute attribute = dimension.Attributes["MyAttribute"];

    // Check out the dimension
    manager.CheckOut(dimension.Id, dimension.ParentModelSite.Id);

    // Perform the delete
    dimension.DeleteDimensionAttribute(attribute, null);

    // Submit the change
    manager.SubmitModelSite(ppsApp.Id, dimension.ParentModelSite,
        Microsoft.PerformancePoint.Planning.Bmo.Interfaces.SubmissionType.Update);

    // Check in the dimension
    manager.CheckIn(dimension.Id, dimension.ParentModelSite.Id);
Update:  I've since discovered that you can obtain an unsupported utility from Microsoft Support that reportedly does the same thing, doh!
Oh well, it's always nice to have the code... :)

PerformancePoint Planning: Deleting a Custom Member Property..

Update:  I've posted a solution to Deleting a Custom Member Property here

I've done this countless times; I've created my perfectly named Custom Member Property when it suddenly dawns on me that I've forgotten to give it the right data type.  No problem, right?  Wrong!  From within PBM, can you change the data type?  No!  Can you delete the member property? No!  Can you rename the member property?  No!

So, what are the options?  Well, you could wait for version 2 (I truly hope you can edit/delete member properties in V2!), you could hack the back end database in the vague hope of removing the member property safely, or, as I have been doing in the past, create a new member property with a less than perfect name and try not to clench teeth and fists every time I glance at the original.

Well, I've had enough, and decided I'm going to take action.

Strangely, the Microsoft.PerformancePoint.Planning.BMO assembly contains a method called DeleteDimensionAttribute on the Dimension attribute collection. 


I wonder...

Anyone tried?

New PerformancePoint Contoso Demo - Released

Amidst my write up of the first day of the Microsoft BI Conference, I mentioned a new planning demo suite was imminent, and I would post more information about the demos soon.  Well, as it has now been officially released (27th October) I can spill the beans...

Taken directly from the PPS Planning Forum announcement, the demo..

.. consists of Planning and Financial Consolidation demo. It shows how the fictitious Contoso Group uses Microsoft Office PerformancePoint Server for planning, statutory consolidation and data analysis.

Well, I'm proud to announce that Adatis, in the shape of my colleague Jeremy Kashel, designed and built the PerformancePoint Planning element of the suite.  The PerformancePoint Financial Consolidation element was conceived and developed by our friends at Solitwork of Denmark.

The demo can be downloaded from here...

...and is part of the next 'All Up BI VPC' (Version 7).

Great work guys!

Microsoft BI Conference - Day 3 - 8th October 2008

The last day of the conference came round quickly and, due to my relatively early flight, I only attended a couple of sessions and spent most of the day meeting more people in and around the BI community. A shout out to Peter Eberhardy (PeterEb), a real highlight; Barry Tousley, test dev on the PPS Server - thanks for listening to my grumbles about the PPS Data Migration Tool and explaining why it does what it does; Norm Warren of NormBI fame; Patrick Husting, who I actually met on Day 2; and Brian Berry of BlumShapiro, who I met on Day 1 and reportedly follows this blog!

I thought the conference was great. The organisation was slick and right on the button, from registration, meals and session offerings right up to the party. I think last year the main criticism was that the sessions were not technical enough; they appear to have raised the technical level of some of them, but I still found most to be a bit dated, and apart from a couple of choice sessions most BI people wouldn't have learnt a great deal - nothing official at all about PPS v2 :o(  Also, a couple of the sessions I wanted to attend clashed, so I'll have to find the time to watch them on the reported DVD containing a video of every session. However, I did feel the standard of presentation was excellent: well practised, clear, funny and engaging.

I'll definitely be vying for a place at next year's, where they really should have lots to show off!

Microsoft BI Conference - Day 2 - 7th October 2008

Day 2 kicked off with some good keynotes and, still full of steak from Day 1, I hauled myself to TK Anand and Akshai Mirchandani's session on optimising query performance in AS 2008. For me this was one of the best sessions of the conference, as I spend a fair bit of time trying to tune and optimise MDX queries. They gave a really good explanation of the difference between the cell-by-cell calculation and subspace calculation (or block computation) methods - the latter relies on a sparse cube, the most important aspect of speeding up queries using subspace calculations.

Another point they raised, particularly from an AS2008 perspective is that "Non Empty Behaviour is Evil!" - their words!

There was a good set of tips and tricks, some of which can also be applied and adopted to AS2005.
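To put the "Non Empty Behaviour is evil" remark in context: NON_EMPTY_BEHAVIOR is a hint attached to a calculation telling the engine which measure to check for emptiness. In AS2005 it was often essential for acceptable performance, but in AS2008 the block computation engine usually works this out itself, and an incorrect hint can silently produce wrong results. A sketch, using illustrative cube and measure names (not from any real model):

    -- AS2005-era habit: hint the engine that the calc is empty wherever Sales is empty
    CREATE MEMBER CURRENTCUBE.[Measures].[Margin] AS
        [Measures].[Sales] - [Measures].[Cost],
        NON_EMPTY_BEHAVIOR = { [Measures].[Sales] };

    -- AS2008 guidance: drop the hint and let the block computation engine decide
    CREATE MEMBER CURRENTCUBE.[Measures].[Margin] AS
        [Measures].[Sales] - [Measures].[Cost];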

The afternoon started with what I thought would be the busiest session of the conference - New Horizons for BI with Self Service Analysis technologies, effectively the deep-dive presentation on project 'Gemini'. It really is impressive: not only the tool and its capability, but the supporting framework. They have implemented an extremely rich administration console that keeps track of user-created models on the server along with a history of usage, query time and so on. It allows IT to see who is using what, by how much, and what impact it is having on servers and other models, and to take appropriate action - for example, bringing a model in-house into IT by upgrading to PerformancePoint. We've got a few clients that would just go nuts for this stuff!

That evening, the Attendee Appreciation Party was held at Qwest Field stadium where I have to say, they put on a great party.  I've never been on the field of a huge stadium like that, most impressive, and I've never eaten so many chocolate covered marshmallows, cookies or brownies in my life!


Da Boyz! Jeremy and Tim

Microsoft BI Conference - Day 1 - 6th October 2008

So, although later than the trail blazers, I thought I'd write up a brief note about day one of the Microsoft BI Conference.  The 'Kilimanjaro' announcements have been done to death although I've noticed a couple of crossed wires.  Kilimanjaro is NOT the next version of SQL Server - it sounds more like an interim release, whether that comes as part of a service pack or a new type of feature pack delivery method I guess we'll have to wait and see.  However it arrives, we have to wait until the first half of calendar year 2010.

With regard to 'Gemini', I'm hoping they make the in-memory (column-based storage?) engine part of the SQL Server engine proper, as this can then benefit PPS and any SQL Server-dependent app, not just the 'Gemini' framework. Imagine PPS-P data entry/reporting running in memory! It's certainly a game-changer and it will be interesting to see where and how it's positioned. I can't help thinking that it's verging on promoting data silos and 'multiple versions of the truth', and it wouldn't surprise me if it's misused by some customers: "We don't need a data warehouse, we'll just use Gemini"... although Tim did quiz the team on this. Having said all that, it's pretty damn exciting and will change the face of BI for both customers and BI implementers.

The first breakout session I attended was a chalk and talk by the engaging Peter Bull on moving from the spreadsheet world to PerformancePoint Planning. He outlined a suggested method for converting existing Excel-based 'business applications' into PerformancePoint models; he was open and honest about some of the shortcomings of the product but also brought our attention to the built-in features that aid design and productivity.

The following tips were core to the session:

- Don't replicate the current Excel 'models'.

- Use filters to reduce scope and size of input matrices.

- Limit definition rules (Don't build cell by cell calculations)

- Don't use flat hierarchies.

- Don't assume all calculations need to be real time.

- Performance test by cutting and pasting MDX.

Another chalk and talk followed, held by Michael Bower and Scott Sebelsky, on using PPS-P for financial consolidation. They discussed the consolidation functionality available in PPS-P and, using a two-model-site application, walked us through the implementation using US GAAP (Corporate model site) and IFRS (EMEA model site).

The demo, the supporting white-paper, and a new planning demo will be available shortly; they were shown off in the hands-on labs at the conference. I'll shortly be able to post more information on these new demos...

My third session of the day effectively covered some elements, mainly Report Builder 2.0, of the SQL 2008 feature pack that is out later this month. One of the features demonstrated was component-based report building from a self-service perspective, and it did look quite slick. The session was presented by the SSRS PM team, who had a clever way of obtaining feedback from the audience on which features they would like to see the most: they handed out a crib sheet of features and asked us to allocate a $100 budget across them, then collected in the sheets to use as a basis for which features to focus on. In addition to component-based self-service reporting, features such as Office integration using the acquired SoftArtisans technology, Notify Me, interactive reports and rich presentation were shown off to good effect.

Steve Hoberecht and Srini Nallapareddy were next on my list, taking us through Advanced PPS Planning rules and calculations.  There was some good stuff - I always thought the fact the ALLOCATE statement appended data was a bug, but now I know why it does what it does and warrants a separate post.  Some other tips, particularly for definition rules, some new some old, were also presented:

- Reduce Scope

- Avoid Hard-coded member labels to avoid security restriction errors

- Consider automatic rules/ scheduled assignments.

- Rule order is important / Avoid infinite loops

- Consider moving calcs to Excel

- Consider input/Reporting models

- Locate badly performing rules by commenting out the rules in BIDS and reintroducing them one by one (from within BIDS), rather than toggling the active flag from within PBM, which is more tedious.

The day was rounded off by a great steak with the other UK BI partners at Ruth's Chris Steak House.

Dynamic Range, Time Property Filter = Empty Matrix - A bug?

I think I've found a bug in the way the Excel Add-In generates MDX under certain 'rolling' conditions. The requirement I have is to be able to forecast at the day level for a rolling six months: starting from the current period (which is to be updated each week) and running for a period of 180 days (~6 months).

To prevent requiring 180 columns, a dimension property based filter must be available to select the month in which to forecast.  This will provide a more concise data entry form detailing up to 31 days of the selected month in which to add forecast values.

My form is dimensioned up as follows:

Dimension       Position
Employee        Filter
Time (Month)    Filter (Dimension Property)
Scenario        Filter
Location        Rows
Time (Day)      Columns

I set up the columns as a dynamic range to ensure that the forecast 'rolls' with changes in current period. The range was set from current member id + 0 : current member id + 180 (Current Period is set to 16th September 2008 - today).

The simplified MDX that this produces is below:

    select
    {
        Ancestor([Time].[Base View].[MemberId].&[20080916], [Time].[Base View].[MemberId]).Lag(0)
        :
        Ancestor([Time].[Base View].[MemberId].&[20080916], [Time].[Base View].[MemberId]).Lag(-180)
    } on columns,
    {
        descendants([Location].[Standard].[All].[All Locations],,after)
    } on rows
    from (select {[Time].[Month].[All].[September 2008]} on columns from [LocationPlan])
    where (
        {[Employee].[Employee].[All].[John Doe]},
        {[Scenario].[All Members].[All].[Forecast]}
    )

The first element to notice is that the columns have been set to a range using ancestor at the member id level and lag to cover the 180 days:

Ancestor([Time].[Base View].[MemberId].&[20080916], [Time].[Base View].[MemberId]).Lag(0)
Ancestor([Time].[Base View].[MemberId].&[20080916], [Time].[Base View].[MemberId]).Lag(-180)
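As an aside, Lag with a negative argument simply walks forward along the level, so (assuming the same member key as above) the end of the generated range could equally have been written with Lead:

    -- These two expressions resolve to the same member, 180 days after current period:
    Ancestor([Time].[Base View].[MemberId].&[20080916], [Time].[Base View].[MemberId]).Lag(-180)
    Ancestor([Time].[Base View].[MemberId].&[20080916], [Time].[Base View].[MemberId]).Lead(180)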

The next point to highlight is the sub-query that represents the selected time dimension property value (September 2008):

{[Time].[Month].[All].[September 2008]} on columns from [LocationPlan])

When you run this in SSMS, the following data set is returned:


The Locations appear on the rows, the days appear on the columns - exactly as required.

Changing the sub-query filter to October 2008 - the next month in the range, and definitely covered by the -180 day lag (not sure why the Lead function isn't used here?) - results in a problem: the results returned are now missing the day-level columns:


The root of this problem is the column expression - if you replace the column expression with a direct lag on the current period member the expected results are returned:

    select
    {
        [Time].[Base View].[MemberId].&[20080916].Lag(0)
        :
        [Time].[Base View].[MemberId].&[20080916].Lag(-180)
    } on columns,
    {
        descendants([Location].[Standard].[All].[All Locations],,after)
    } on rows
    from (select {[Time].[Month].[All].[September 2008]} on columns from [LocationPlan])
    where (
        {[Employee].[Employee].[All].[John Doe]},
        {[Scenario].[All Members].[All].[Forecast]}
    )


Now, the only workaround I can come up with is to build the form using a custom MDX formula, so I reckon this warrants raising a bug on Connect - which I've logged here: