Sacha Tomey

Sacha Tomey's Blog

RIP PerformancePoint Planning

It's nearly a week since the announcement that shook the (PPS) world!  It's been a bit difficult to report on; generally the Adatis blogs try to offer solutions to problems we have encountered out in the real world.  Now I could say something crass here about the real world and the decision makers involved... but that would be childish, right?

If I were to offer up my feelings, they wouldn't be far from Alan Whitehouse's excellent post on the subject.  If I had an ounce of class about me, they would be much more aligned with Adrian's poignant discussion opener, the one with the sharp-witted title, but alas...

We've spent the best part of the week speaking to customers, partners and Microsoft about what to do next.  The timing was choice - would you believe, we actually had three new PerformancePoint Planning phases kicking off this week; according to my project plan, I should be setting up Kerberos as we speak.  [There is always a positive, right?]

Some customers are carrying on regardless, they...

...already have planning deployments and are too far invested and dependent to back out at this stage or, 

...have a short-term view (That's not a criticism) and need a "quick" fix with a low TCO to get them through some initial grief.  (Typically these customers are going through rapid organisational change, or form part of a recent acquisition and, to help them see the wood from the trees during the transition, require short/sharp solutions)

Other customers, with longer-term views, feel the product, or more importantly the suitably skilled resource pool, will drain away far quicker than the life-span of the much-touted Microsoft product support.  I have to agree - fact: Adatis will not be employing or training any more PerformancePoint Planning consultants.  I doubt many other consulting firms will either.

It's those customers with the longer-term view that are currently in limbo - they are experiencing pain, they need pain relief, so what should they do - wait and see what Office 14/15 offers?  (There is talk of some planning functionality appearing in future Office versions - though who knows how much truth there is in that?)

The Dynamics customers could wait for the resurrection of Forecaster - I have it on good authority that Forecaster will be developed to be closer, in terms of flexibility, to PPS Planning.  I had originally heard the opposite: that Forecaster would be replaced with a cut-down version of PPS Planning.  Either way, I'm sure some of the PPS Planning code-base will be utilised, which could end rumours of PPS Planning being 'given' to the community as some form of community/open-source arrangement.  An arrangement that is, in my opinion, a non-starter anyway: "Hey, Mr FD, we've got this great open-source budgeting and forecasting product we think you should implement!" - yeah, right!

Another rumour (and mixed message) is that Service Pack 3 will contain some of the requested features that were earmarked for version 2 (after all, the code has already been written, right?).  This rumour was actually started by Guy Weismantel in his announcement video.  However, the information I have since received clearly states that Service Pack 3 will contain stability and bug fixes only - so which is it to be?  It's unlikely for a service pack to contain new features, but it's not unheard of; anyone remember the original release of Reporting Services?  That arrived as part of a service pack for SQL Server 2000.

The burning question I cannot get answered is: have Microsoft actually stepped out of the BPM market for good?  We are told that Excel, SharePoint and SQL Server provide BPM - without Planning, I can't see how they can.  Short of hard-coded values, renewed SharePoint/Excel hell, another vendor, or a bespoke planning solution, businesses can't set plans, which has further-reaching implications: Planning's demise effectively shelves the Scorecard/KPI functionality from the M&A toolset too!  It will be interesting to see the new Monitoring & Analytics marketing - will they still demo Strategy Maps and Scorecards, or will they now focus on Decomposition Trees and Heat Maps?  Monitoring & Analytics may, in practice, just become Analytics..

I would have thought the cost of continuing to develop the product (even if it were a lemon, which Planning certainly wasn't) is far less than the potential loss of revenue Microsoft will face, due not only to the loss of confidence among its customers (who are going to think twice about investing in any Microsoft product now, let alone a v1) but, perhaps more significantly, to the doors it opens to its competitors, who can offer a complete BI/BPM stack.

Planning was a foot in the customer's door for BI - once you put Planning in, the customer had already bought the full BI stack, and in most cases our customers were wowed by what they could now achieve.

I suspect Cognos and SAP are still partying now!

PerformancePoint SP2 - Planning Fixes and a mini-feature

Jeremy has already announced the release of PerformancePoint Server SP2, and it's great to see that the PPS dev team hit their target release date!  I've spent a little commute time this morning checking out the documentation; admittedly I've initially focused on the Planning component, and there are no great surprises (Tim has already told you about the new bits), but I have spotted what could arguably be described as a mini-feature surrounding form validation that I'm sure will come in useful.

As you would expect, previously released hot fixes have been packaged up into this service pack:

954710 Description of the PerformancePoint Server 2007 hotfix package: July 1, 2008

955432 Description of the PerformancePoint Server 2007 hotfix package: July 14, 2008

955751 Description of the PerformancePoint Server 2007 hotfix package: July 28, 2008

956553 Description of the PerformancePoint Server 2007 hotfix package: August 21, 2008


Plus fixes to issues not previously addressed:

Excel Add-In Related

  • You locally save and close a form in PerformancePoint Add-in for Excel. When you reopen the form, you are prompted to update the form. However, you expect that you are not prompted to update the form because the form is already up to date.
  • In PerformancePoint Add-in for Excel, you open an offline form assignment. In the form assignment, you add a link to an external Excel worksheet in a cell. Then, you submit the changes to the PerformancePoint Planning Server database. However, when you reopen the assignment, the link that you added is not retained.
  • After you install PerformancePoint Server 2007 Service Pack 1, you create a page filter in PerformancePoint Add-in for Excel. You have a user in PerformancePoint Server 2007 that does not have permission to the default member of the page filter. However, the user has permission to other leaf members in the page filter. When the user opens a report that uses this page filter, the user receives the following error message:

    Cannot render the <MatrixName> matrix. The server returned the following error: The <CubeName> cube either does not exist or has not been processed.

    However, in the release version of PerformancePoint Server 2007, the next member that the user has access to will be automatically selected for use in the page filter.
  • You define data validation in a worksheet of Excel. However, you can still submit a form in PerformancePoint Add-in for Excel if data in the form is not validated.
  • You have a matrix that is based on a large and complex model in PerformancePoint Add-in for Excel. You open the Select Filters dialog box to change a page filter for this matrix. When you click the Value column of the filter, the dialog box that displays the dimension members takes a long time to display.

Business Rules Related

  • After you migrate an application in PerformancePoint Server 2007 from one server to another server, the order of user-defined business rules and system business rules in models is not preserved.
  • You cannot use the datamember function in the ALLOCATE statement and in the TRANSFER statement.
  • Consider the following scenario. You create an automatic rule that uses MdxQuery implementation or Native MdxQuery implementation in Planning Business Modeler. Then you submit changes to the source data that the rule uses from an assignment form. The submission causes the model to be reprocessed. Because model reprocess causes rules in the automatic rule set to be executed, you expect that the target data of the automatic rule will reflect the change by the form submission. However, after the model is reprocessed, the target data of the automatic rule does not reflect the change.
  • Rule expression of system business rules uses dimension member names instead of dimension member labels in PerformancePoint Server 2007.

Planning Business Modeler Related

  • You have a model that contains many form templates and assignments. When you try to change objects in the model in Planning Business Modeler, Planning Business Modeler crashes.
  • You create a member property of the Date data type in a dimension in PerformancePoint Server 2007. Additionally, you specify the Set value to Null option when you create the member property. When you retrieve the value of this member property, you obtain a value of 1899-12-31T00:00:00. However, you expect that you obtain a value of blank.
  • You cannot schedule recurring jobs for a frequency that is less than an hour.
  • When a user updates a business rule in Planning Business Modeler, the audit log file of PerformancePoint Server 2007 logs the user ID of the user that created the rule. However, you expect that the audit log file logs the user ID of the user that updated the rule.
  • Consider the following scenario. You create a dimension that has no hierarchy in a localized version of PerformancePoint Server 2007. Then you perform one of the following operations:
    • You run the bsp_DI_CreateHierarchyLabelTableForDimension stored procedure to create label-based hierarchy table for the dimension.
    • You perform the Prepare the Staging DB operation in PerformancePoint Planning Data Migration Tool.
      In this scenario, you receive the following error message:
      A problem was encountered while attempting to connect to, or Execute BSP on, the specified Database
      For more information regarding this error please review the Application Event Log on the SQL Server for any "MSSQLSERVER ERRORS"
      and\or
      Please check that all parameters in the UI are correct and try again

PerformancePoint Planning: Deleting a Custom Member Property - A Solution

I had a bit of a rant yesterday about the fact that I've had to compromise on naming member properties when I've inadvertently created them with the wrong data type.  As I mentioned, I found a method on the Dimension attribute collection in the Planning client assemblies that hinted it might allow me to delete a member property, so I decided to give it a go.

Below is some really rough-and-ready C# code that actually does delete a dimension member property.  I will improve the code and probably add it to my PPSCMD GUI interface as a 'feature pack' bonus at some stage.  However, if you are in desperate need of code to delete a member property and you can't wait for PPSCMD GUI v0.2 or PerformancePoint version 2 (I'm not sure which will come first!), the code is below.  (Use at your own risk!)

Note:  Replace "MyApp", "MyDimension", "MyAttribute", oh, and the server address, accordingly..

    using Microsoft.PerformancePoint.Planning.Client.Common;
    using Microsoft.PerformancePoint.Planning.Bmo.Core;

    ..

    // Setup the PPS Application Metadata Manager
    ServerHandler serverHandler = new ServerHandler("http://localhost:46787");
    MetadataManager manager = new MetadataManager();
    manager.ServerHandler = serverHandler;
    manager.ServerHandler.Connect();

    // Get the system metadata
    BizSystem system = manager.GetSystem(true);

    // Get hold of the PPS Application
    BizApplication ppsApp = system.Applications["MyApp"];

    // Obtain the root model site from the application
    BizModelSite site = ppsApp.RootModelSite;

    // Obtain the dimension that contains the member property
    BizDimension dimension = site.Dimensions["MyDimension"];

    // Obtain the member property
    BizDimensionAttribute attribute = dimension.Attributes["MyAttribute"];

    // Check out the dimension
    manager.CheckOut(dimension.Id, dimension.ParentModelSite.Id);

    // Perform the delete
    dimension.DeleteDimensionAttribute(attribute, null);

    // Submit the change
    manager.SubmitModelSite(ppsApp.Id, dimension.ParentModelSite,
        Microsoft.PerformancePoint.Planning.Bmo.Interfaces.SubmissionType.Update);

    // Check in the dimension
    manager.CheckIn(dimension.Id, dimension.ParentModelSite.Id);
Update:  I've since discovered that you can obtain an unsupported utility from Microsoft Support that reportedly does the same thing - doh!
Oh well, it's always nice to have the code...

PerformancePoint Planning: Deleting a Custom Member Property..

Update:  I've posted a solution to Deleting a Custom Member Property here

I've done this countless times; I've created my perfectly named Custom Member Property when it suddenly dawns on me that I've forgotten to give it the right data type.  No problem, right?  Wrong!  From within PBM, can you change the data type?  No!  Can you delete the member property? No!  Can you rename the member property?  No!

So, what are the options?  Well, you could wait for version 2 (I truly hope you can edit/delete member properties in V2!), you could hack the back end database in the vague hope of removing the member property safely, or, as I have been doing in the past, create a new member property with a less than perfect name and try not to clench teeth and fists every time I glance at the original.

Well, I've had enough, and decided I'm going to take action.

Strangely, the Microsoft.PerformancePoint.Planning.BMO assembly contains a method called DeleteDimensionAttribute on the Dimension attribute collection. 

image

I wonder...

Anyone tried?

New PerformancePoint Contoso Demo - Released

Amidst my write-up of the first day of the Microsoft BI Conference, I mentioned that a new planning demo suite was imminent and that I would post more information about the demos soon.  Well, as it has now been officially released (27th October), I can spill the beans...

Taken directly from the PPS Planning Forum announcement, the demo..

.. consists of Planning and Financial Consolidation demo. It shows how the fictitious Contoso Group uses Microsoft Office PerformancePoint Server for planning, statutory consolidation and data analysis.

Well, I'm proud to announce that Adatis, in the shape of my colleague Jeremy Kashel, designed and built the PerformancePoint Planning element of the suite.  The PerformancePoint Financial Consolidation element was conceived and developed by our friends at Solitwork of Denmark.

The demo can be downloaded from here...

http://www.microsoft.com/downloads/details.aspx?FamilyId=00B97AC5-8B69-4F4D-AA0C-ACBFBFB9B48E&displaylang=en

...and is part of the next 'All Up BI VPC' (Version 7).

Great work guys!

Dynamic Range, Time Property Filter = Empty Matrix - A bug?

I think I've found a bug in the way the Excel Add-In generates MDX under certain 'rolling' conditions.  The requirement I have is to be able to forecast at the day level for a rolling six months: starting from the current period (which is updated each week) and running for 180 days (roughly six months).

To prevent requiring 180 columns, a dimension-property-based filter must be available to select the month in which to forecast.  This provides a more concise data entry form, detailing up to 31 days of the selected month in which to add forecast values.

My form is dimensioned up as follows:

Dimension      Position
Employee       Filter
Time (Month)   Filter (Dimension Property)
Scenario       Filter
Location       Rows
Time (Day)     Columns

I set up the columns as a dynamic range to ensure that the forecast 'rolls' with changes in current period.  The range was set from current member id + 0 : current member id + 180.  (Current Period is set to 16th September 2008 - today.)

The simplified MDX that this produces is below:

select 
    {
        
        Ancestor([Time].[Base View].[MemberId].&[20080916], [Time].[Base View].[MemberId]).Lag(0)
        :
        Ancestor([Time].[Base View].[MemberId].&[20080916], [Time].[Base View].[MemberId]).Lag(-180)
    }
    *
    {
        [Measures].[Value]
    } on columns, 
    
    {
        descendants([Location].[Standard].[All].[All Locations],,after)
    } on rows 
from 
(
    select 
        {[Time].[Month].[All].[September 2008]} on columns from [LocationPlan]) 
where 

    {[Employee].[Employee].[All].[John Doe]}
    *
    {[Scenario].[All Members].[All].[Forecast]} 

The first element to notice is that the columns have been set to a range using ancestor at the member id level and lag to cover the 180 days:

Ancestor([Time].[Base View].[MemberId].&[20080916], [Time].[Base View].[MemberId]).Lag(0)
:
Ancestor([Time].[Base View].[MemberId].&[20080916], [Time].[Base View].[MemberId]).Lag(-180)

The next point to highlight is the sub-query that represents the selected time dimension property value (September 2008):

{[Time].[Month].[All].[September 2008]} on columns from [LocationPlan])

When you run this in SSMS, the following data set is returned:

image

The Locations appear on the rows, the days appear on the columns - exactly as required.

By changing the sub-query filter to October 2008 - the next month in the range, and definitely covered by the -180 day lag (not sure why the Lead function isn't used here?) - a problem appears: the results returned are now missing the day-level columns:

image
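As an aside on that parenthetical: in MDX, Lag with a negative argument is simply a forward Lead - member.Lag(-n) resolves to the same member as member.Lead(n) - so, assuming standard MDX semantics, the generated range could equivalently have been written as:

Ancestor([Time].[Base View].[MemberId].&[20080916], [Time].[Base View].[MemberId]).Lag(0)
:
Ancestor([Time].[Base View].[MemberId].&[20080916], [Time].[Base View].[MemberId]).Lead(180)

Either form should resolve to the same forward range of day members; Lag(-180) is just how the add-in chooses to generate it.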

The root of this problem is the column expression - if you replace the column expression with a direct Lag on the current period member, the expected results are returned:

select 
    {
        
        [Time].[Base View].[MemberId].&[20080916].Lag(0)
        :
        [Time].[Base View].[MemberId].&[20080916].Lag(-180)
    }
    *
    {
        [Measures].[Value]
    } on columns, 
    
    {
        descendants([Location].[Standard].[All].[All Locations],,after)
    } on rows 
from 
(
    select 
        {[Time].[Month].[All].[September 2008]} on columns from [LocationPlan]) 
where 

    {[Employee].[Employee].[All].[John Doe]}
    *
    {[Scenario].[All Members].[All].[Forecast]} 

image

Now, the only workaround I can come up with is to build the form using a custom MDX formula, so I reckon this warrants raising a bug on Connect - which I've logged here:

https://connect.microsoft.com/feedback/ViewFeedback.aspx?FeedbackID=368206&SiteID=181

Unofficial PerformancePoint Planning Tips and Tricks

Wavesmash has posted a series of tips and tricks shared at a train-the-trainer event that took place in Denver recently.  As suggested, most of the 'nuggets' are from the attendees themselves rather than the course material, so on the plus side there are some real experience-based tips; however, I wouldn't treat them all as official tips and tricks - I certainly frowned at a couple, though that could be down to the explanation rather than the intent.

There's certainly some goodness, and one that made me smile:  Regular Refresh of model = happy modeler

http://performancepointing.blogspot.com/2008/08/train-trainer-helpful-tricks.html

http://performancepointing.blogspot.com/2008/08/tips-from-train-trainer-sessions-day-2.html

http://performancepointing.blogspot.com/2008/08/tips-from-train-trainer-sessions-day-3.html

http://performancepointing.blogspot.com/2008/08/tips-from-train-trainer-sessions-day-3_26.html

http://performancepointing.blogspot.com/2008/08/tips-from-train-trainer-sessions-day-4.html

PerformancePoint Server 2007 PPSCMD GUI

I've built a really simple GUI for a couple of commands of the PPSCMD utility.  I always take far too long to work out the syntax and navigate to the appropriate directory (yes, I ought to update the PATH environment variable), so I felt I could justify building a simple front end to help speed up usage.

So far I've only implemented the MIGRATE and REPROCESS commands - I use these quite a lot outside of any automated SSIS package, so they seemed the most sensible to implement in the GUI first.  I do intend to extend it to encompass some of the other commands, and I would welcome any feedback on prioritisation, usage, features and the inevitable bugs.  It's currently version 0.1 and more or less ready for 'Community Preview' - there are some omissions, such as full error handling and validation, that I intend to address over time along with the other commands.

It's a .NET 3.5 application so you will need to deploy it to a client where you are happy to install .NET 3.5 if it's not already present.

You can download version 0.1 from the link at the bottom.

Below are the screen shots:

Migrate

The migrate command: both import and export variations can be set and executed directly from the GUI.  In addition, the command line is generated so you can cut and paste into a command window, batch file or SSIS package.

image

Reprocess

Need to reprocess a model quickly?  Rather than wait for PBM/SSMS to open you can reprocess a model directly from the GUI.  Just like Migrate, the command is generated for cut and paste.

image

Console

Any output you would normally see in the command window is reported in the console as the command is being executed.

image

Log

You can enable logging to a log file of your choice to record all commands processed through the GUI.  Useful for additional auditing and for creating batch files of multiple PPSCMD operations.

image

Preferences

Preferences and options are set on the preferences dialog.

image

Here's the link to the download:

PPSCMD GUI Installer.zip (1.53 mb)

PerformancePoint Server Planning SP1 - Clear Changes After Workflow Action

There's a new workbook property that was introduced in PPS Service Pack 1.  The 'Clear Changes After Workflow Action' effectively splats the change list for the workbook once the assignment has been submitted (either draft or final).

The property can only be reached through the Report Properties dialog, and is at the workbook level:

image
This property defaults to false, which under certain circumstances can hinder performance.

Whenever you change data in a matrix, the slice that you affected is saved to a change list.  You can view what's on the change list by choosing 'View -> Show Current Changes' in the PPS Add-in for Excel.

image

Here's an example change list: two budget accounts for the same time period and department have been updated with new values.

image
The default behaviour (and the behaviour prior to SP1) is that, for the life of the assignment, the change list is maintained for every cell that is updated.  The change list is simply appended to, so you can imagine that, on a large workbook with several matrices spanning several filter slices, the change list can become quite large.

Submitting the assignment effectively submits the change list for processing by the server, first updating/inserting the appropriate records into the fact table and subsequently re-processing the Analysis Services partition.  It follows then, that the larger the change list, the slower the submission process.

Before SP1, this forever-growing change list issue was resolved with a little user training.  As part of the submission process, you would invite your users to manually clear the change list:

image

By 'Clearing Current Changes' you throw away the changes to the cells and have to rely on the data being safe and sound on the server.  This process helped keep the change list to a more manageable size, thus improving submission performance.

The new 'Clear Changes After Workflow Action' property in SP1, if set to true, will perform the 'Clear Current Changes' step for you automatically.  This helps keep the change list lightweight (providing, of course, the user submits regularly).  However, as I have already implied, there is one issue to be wary of: with the property set to clear changes, if your submission fails, the change list is lost and there is a real danger of losing data.