Sacha Tomey

Sacha Tomey's Blog

PerformancePoint Planning: Deleting a Custom Member Property - A Solution

I had a bit of a rant yesterday about the fact that I've had to compromise on naming member properties when I've inadvertently created them with the wrong data type.  As I mentioned, I found a method on the dimension attribute collection in the Planning client assemblies that hinted it might allow me to delete a member property, so I decided to give it a go.

Below is some really rough-and-ready C# code that actually does delete a dimension member property.  I will improve the code and probably add it into my PPSCMD GUI as a 'feature pack' bonus at some stage.  However, if you are in desperate need of code to delete a member property and can't wait for PPSCMD GUI v0.2 or PerformancePoint version 2 (I'm not sure which will come first!), the code is below.  Use at your own risk!

Note:  Replace "MyApp", "MyDimension", "MyAttribute", oh, and the server address, accordingly..

    using Microsoft.PerformancePoint.Planning.Client.Common;
    using Microsoft.PerformancePoint.Planning.Bmo.Core;

    ..

    // Setup the PPS Application Metadata Manager
    ServerHandler serverHandler = new ServerHandler("http://localhost:46787");
    MetadataManager manager = new MetadataManager();
    manager.ServerHandler = serverHandler;
    manager.ServerHandler.Connect();

    // Get the system metadata
    BizSystem system = manager.GetSystem(true);

    // Get hold of the PPS Application
    BizApplication ppsApp = system.Applications["MyApp"];

    // Obtain the root model site from the application
    BizModelSite site = ppsApp.RootModelSite;

    // Obtain the dimension that contains the member property
    BizDimension dimension = site.Dimensions["MyDimension"];

    // Obtain the member property
    BizDimensionAttribute attribute = dimension.Attributes["MyAttribute"];

    // Check out the dimension
    manager.CheckOut(dimension.Id, dimension.ParentModelSite.Id);

    // Perform the delete
    dimension.DeleteDimensionAttribute(attribute, null);

    // Submit the change
    manager.SubmitModelSite(ppsApp.Id, dimension.ParentModelSite,
        Microsoft.PerformancePoint.Planning.Bmo.Interfaces.SubmissionType.Update);

    // Check in the dimension
    manager.CheckIn(dimension.Id, dimension.ParentModelSite.Id);
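If you plan on running this more than once, it's probably worth guarding the checkout so the dimension isn't left checked out when the delete or submit throws.  Here's a minimal, untested sketch using only the calls from the snippet above - the try/finally is my own addition, not anything prescribed by the Planning API:

    // Hedged sketch: the same check-out/delete/submit calls as above, but
    // the check-in is guaranteed to run even if a step fails part-way through.
    manager.CheckOut(dimension.Id, dimension.ParentModelSite.Id);
    try
    {
        dimension.DeleteDimensionAttribute(attribute, null);
        manager.SubmitModelSite(ppsApp.Id, dimension.ParentModelSite,
            Microsoft.PerformancePoint.Planning.Bmo.Interfaces.SubmissionType.Update);
    }
    finally
    {
        // Always check the dimension back in, even on failure
        manager.CheckIn(dimension.Id, dimension.ParentModelSite.Id);
    }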
Update:  I've since discovered that you can obtain an unsupported utility from Microsoft Support that reportedly does the same thing.  Doh!
Oh well, it's always nice to have the code anyway :)

PerformancePoint Planning: Deleting a Custom Member Property..

Update:  I've posted a solution to Deleting a Custom Member Property here

I've done this countless times: I create my perfectly named Custom Member Property, and then it suddenly dawns on me that I've forgotten to give it the right data type.  No problem, right?  Wrong!  From within PBM, can you change the data type?  No!  Can you delete the member property?  No!  Can you rename the member property?  No!

So, what are the options?  Well, you could wait for version 2 (I truly hope you can edit/delete member properties in V2!), you could hack the back-end database in the vague hope of removing the member property safely, or, as I have been doing in the past, create a new member property with a less-than-perfect name and try not to clench my teeth and fists every time I glance at the original.

Well, I've had enough, and decided I'm going to take action.

Strangely, the Microsoft.PerformancePoint.Planning.BMO assembly contains a method called DeleteDimensionAttribute on the Dimension attribute collection. 

[Screenshot: the DeleteDimensionAttribute method on the dimension attribute collection]

I wonder...

Anyone tried?

New PerformancePoint Contoso Demo - Released

Amidst my write-up of the first day of the Microsoft BI Conference, I mentioned that a new planning demo suite was imminent and that I would post more information about the demos soon.  Well, as it has now been officially released (27th October), I can spill the beans...

Taken directly from the PPS Planning Forum announcement, the demo..

.. consists of Planning and Financial Consolidation demo. It shows how the fictitious Contoso Group uses Microsoft Office PerformancePoint Server for planning, statutory consolidation and data analysis.

Well, I'm proud to announce that Adatis, in the shape of my colleague Jeremy Kashel, designed and built the PerformancePoint Planning element of the suite.  The PerformancePoint Financial Consolidation element was conceived and developed by our friends at Solitwork of Denmark.

The demo can be downloaded from here...

http://www.microsoft.com/downloads/details.aspx?FamilyId=00B97AC5-8B69-4F4D-AA0C-ACBFBFB9B48E&displaylang=en

...and is part of the next 'All Up BI VPC' (Version 7).

Great work guys!

Microsoft BI Conference - Day 3 - 8th October 2008

The last day of the conference came round quickly and, due to my relatively early flight, I only attended a couple of sessions and spent most of the day meeting more people in and around the BI community.  Shout out to Peter Eberhardy (PeterEb), a real highlight.  Barry Tousley, Test Dev on the PPS Server - thanks for listening to my grumbles about the PPS Data Migration Tool and explaining why it does what it does.  Norm Warren of NormBI fame.  Patrick Husting, who I actually met on Day 2, and Brian Berry of BlumShapiro, who I met on Day 1 and reportedly follows this blog!

I thought the conference was great.  The organisation was slick and right on the button, from registration, meals and session offerings right up to the party.  I think last year the main criticism was that the sessions were not technical enough; this year they appear to have raised the technical level of some of them, but I still found most to be a bit dated, and apart from a couple of choice sessions most BI people wouldn't have learnt a great deal - nothing official at all about PPS v2 :o(  Also, a couple of the sessions I wanted to attend clashed, so I'll have to find the time to watch them on the reported DVD containing a video of every session.  However, I did feel the standard of presentation was excellent: well practiced, clear, funny and engaging.

I'll definitely be vying for a place at next year's, where they really should have lots to show off!

Microsoft BI Conference - Day 2 - 7th October 2008

Day 2 kicked off with some good keynotes and, still full of steak from day 1, I hauled myself to TK Anand and Akshai Mirchandani's session on Optimising Query Performance in AS 2008.  For me this was one of the best sessions of the conference, as I spend a fair bit of time trying to tune and optimise MDX queries.  They gave a really good explanation of the difference between the cell-by-cell and subspace calculation (or block computation) methods - the latter relies on cube sparsity, the most important aspect of speeding up queries with subspace calculations.

Another point they raised, particularly from an AS2008 perspective, is that "Non Empty Behaviour is Evil!" - their words!

There was a good set of tips and tricks, some of which can also be applied to AS2005.

The afternoon started with what I thought would be the busiest session of the conference - New Horizons for BI with Self-Service Analysis technologies, effectively the deep-dive presentation on Project 'Gemini'.  It really is impressive: not only the tool and its capability, but the supporting framework.  They have implemented an extremely rich administration console that keeps track of user-created models on the server, along with a history of their usage, query times, etc.  It allows IT to see who is using what, by how much, and what impact it is having on servers and other models, and to take appropriate action by, for example, bringing a model in-house by upgrading to PerformancePoint.  We've got a few clients that would just go nuts for this stuff!

That evening, the Attendee Appreciation Party was held at Qwest Field stadium where, I have to say, they put on a great party.  I've never been on the field of a huge stadium like that - most impressive - and I've never eaten so many chocolate-covered marshmallows, cookies or brownies in my life!

[Photo: Da Boyz! Jeremy and Tim]

Microsoft BI Conference - Day 1 - 6th October 2008

So, although later than the trail-blazers, I thought I'd write up a brief note about day one of the Microsoft BI Conference.  The 'Kilimanjaro' announcements have been done to death, although I've noticed a couple of crossed wires.  Kilimanjaro is NOT the next version of SQL Server - it sounds more like an interim release; whether that comes as part of a service pack or a new type of feature-pack delivery method, I guess we'll have to wait and see.  However it arrives, we have to wait until the first half of calendar year 2010.

With regard to 'Gemini', I'm hoping they make the in-memory (column-based storage?) engine part of the SQL Server engine proper, as this could then benefit PPS and any SQL Server-dependent app, not just the 'Gemini' framework.  Imagine PPS-P data entry/reporting running in memory!  It's certainly a game-changer and it will be interesting to see where and how it's positioned.  I can't help thinking that it verges on promoting data silos and 'multiple versions of the truth', and it wouldn't surprise me if some customers misuse it: "We don't need a data warehouse, we'll just use Gemini"... although Tim did quiz the team on this.  Having said all that, it's pretty damn exciting and will change the face of BI for both customers and BI implementers.

The first breakout session I attended was a Chalk and Talk by the engaging Peter Bull on moving from the spreadsheet world to PerformancePoint Planning.  He outlined a suggested method for converting existing Excel-based 'business applications' into PerformancePoint models.  He was open and honest about some of the shortcomings of the product, but also drew our attention to the built-in features that aid design and productivity.

The following tips were core to the session:

- Don't replicate the current Excel 'models'.

- Use filters to reduce the scope and size of input matrices.

- Limit definition rules (don't build cell-by-cell calculations).

- Don't use flat hierarchies.

- Don't assume all calculations need to be real-time.

- Performance-test by cutting and pasting MDX.

Another Chalk and Talk followed, held by Michael Bower and Scott Sebelsky, on using PPS-P for Financial Consolidation.  They discussed the consolidation functionality available in PPS-P and, using a two-model-site application, walked us through an implementation using US GAAP (Corporate Model Site) and IFRS (EMEA Model Site).

The demo, supporting white-paper, and a new planning demo will be available shortly; they were shown off in the hands-on labs at the conference.  I'll shortly be able to post more information on these new demos...

My third session of the day covered some elements, mainly Report Builder 2.0, of the SQL 2008 feature pack that is out later this month.  One of the features demonstrated was component-based report building from a self-service perspective, and it did look quite slick.  The session was presented by the SSRS PM team, who had a clever way of obtaining feedback from the audience on which features they would like to see most: they handed out a crib sheet of features and asked us to allocate a $100 budget across them - they collected in the sheets and will use the results as a basis for which features to focus on.  In addition to component-based self-service reporting, features such as Office integration using the acquired SoftArtisans technology, Notify Me, interactive reports and rich presentation were shown off to good effect.

Steve Hoberecht and Srini Nallapareddy were next on my list, taking us through advanced PPS Planning rules and calculations.  There was some good stuff - I always thought the fact that the ALLOCATE statement appends data was a bug, but now I know why it does what it does, and it warrants a separate post.  Some other tips, particularly for definition rules, some new, some old, were also presented:

- Reduce scope.

- Avoid hard-coded member labels, to avoid security-restriction errors.

- Consider automatic rules/scheduled assignments.

- Rule order is important / avoid infinite loops.

- Consider moving calcs to Excel.

- Consider input/reporting models.

- Locate badly performing rules by commenting them all out in BIDS and reintroducing them one by one (from within BIDS), rather than toggling the active flag from within PBM, which is more tedious.

The day was rounded off by a great steak with the other UK BI partners at Ruth's Chris Steak House.

Dynamic Range, Time Property Filter = Empty Matrix - A bug?

I think I've found a bug in the way the Excel Add-In generates MDX under certain 'rolling' conditions.  The requirement I have is to be able to forecast at the day level for a rolling six months: starting from the current period (which is to be updated each week) and running for 180 days (~6 months).

To avoid requiring 180 columns, a dimension-property-based filter must be available to select the month in which to forecast.  This provides a more concise data entry form detailing up to 31 days of the selected month in which to add forecast values.

My form is dimensioned up as follows:

Dimension     Position
Employee      Filter
Time (Month)  Filter (Dimension Property)
Scenario      Filter
Location      Rows
Time (Day)    Columns

I set up the columns as a dynamic range to ensure that the forecast 'rolls' with changes in the current period.  The range was set from current member id + 0 : current member id + 180.  (Current Period is set to 16th September 2008, today.)

The simplified MDX that this produces is below:

select 
    {
        
        Ancestor([Time].[Base View].[MemberId].&[20080916], [Time].[Base View].[MemberId]).Lag(0)
        :
        Ancestor([Time].[Base View].[MemberId].&[20080916], [Time].[Base View].[MemberId]).Lag(-180)
    }
    *
    {
        [Measures].[Value]
    } on columns, 
    
    {
        descendants([Location].[Standard].[All].[All Locations],,after)
    } on rows 
from 
(
    select 
        {[Time].[Month].[All].[September 2008]} on columns from [LocationPlan]) 
where 

    {[Employee].[Employee].[All].[John Doe]}
    *
    {[Scenario].[All Members].[All].[Forecast]} 

The first element to notice is that the columns have been set to a range using Ancestor at the MemberId level and Lag to cover the 180 days:

Ancestor([Time].[Base View].[MemberId].&[20080916], [Time].[Base View].[MemberId]).Lag(0)
:
Ancestor([Time].[Base View].[MemberId].&[20080916], [Time].[Base View].[MemberId]).Lag(-180)

The next point to highlight is the sub-query that represents the selected time dimension property value (September 2008):

{[Time].[Month].[All].[September 2008]} on columns from [LocationPlan]

When you run this in SSMS, the following data set is returned:

[Screenshot: the query results in SSMS]

The Locations appear on the rows, the days appear on the columns - exactly as required.

Changing the sub-query filter to October 2008 - the next month in the range, and definitely covered by the -180-day lag (not sure why the Lead function isn't used here) - causes a problem: the results returned are now missing the day-level columns:

[Screenshot: the query results with the day-level columns missing]

The root of this problem is the column expression - if you replace it with a direct Lag on the current-period member, the expected results are returned:

select 
    {
        
        [Time].[Base View].[MemberId].&[20080916].Lag(0)
        :
        [Time].[Base View].[MemberId].&[20080916].Lag(-180)
    }
    *
    {
        [Measures].[Value]
    } on columns, 
    
    {
        descendants([Location].[Standard].[All].[All Locations],,after)
    } on rows 
from 
(
    select 
        {[Time].[Month].[All].[September 2008]} on columns from [LocationPlan]) 
where 

    {[Employee].[Employee].[All].[John Doe]}
    *
    {[Scenario].[All Members].[All].[Forecast]} 

[Screenshot: the expected results, including the day-level columns]

Now, the only workaround I can come up with is to build the form using a custom MDX formula, so I reckon this warrants raising a bug on Connect, which I've logged here:

https://connect.microsoft.com/feedback/ViewFeedback.aspx?FeedbackID=368206&SiteID=181

Unofficial PerformancePoint Planning Tips and Tricks

Wavesmash has posted a series of tips and tricks shared at a train-the-trainer event that took place in Denver recently.  As suggested, most of the 'nuggets' are from the attendees themselves rather than the course material, so on the plus side there are some real experience-based tips; however, I wouldn't treat them all as official tips and tricks - I certainly frowned at a couple, but that could be down to the explanation rather than the intent.

There's certainly some goodness, and one that made me smile:  Regular Refresh of model = happy modeler

http://performancepointing.blogspot.com/2008/08/train-trainer-helpful-tricks.html

http://performancepointing.blogspot.com/2008/08/tips-from-train-trainer-sessions-day-2.html

http://performancepointing.blogspot.com/2008/08/tips-from-train-trainer-sessions-day-3.html

http://performancepointing.blogspot.com/2008/08/tips-from-train-trainer-sessions-day-3_26.html

http://performancepointing.blogspot.com/2008/08/tips-from-train-trainer-sessions-day-4.html

PerformancePoint Server 2007 PPSCMD GUI

I've built a really simple GUI for a couple of the commands of the PPSCMD utility.  I always take so long to work out the syntax and navigate to the appropriate directory (yes, I ought to update the PATH environment variable) that I felt I could justify building a simple front end to help speed up its usage.

So far I've only implemented the MIGRATE and REPROCESS commands - I use these quite a lot outside of any automated SSIS package, so they seemed the most sensible to implement first.  I intend to extend the GUI to encompass some of the other commands, and I would welcome any feedback on prioritisation, usage, features and the inevitable bugs.  It's currently version 0.1 and more or less ready for 'Community Preview' - there are some omissions, such as full error handling and validation, that I intend to implement over time along with the other commands.

It's a .NET 3.5 application, so you will need to deploy it to a client where you are happy to install .NET 3.5 if it's not already present.

You can download version 0.1 from the link at the bottom.

Below are the screen shots:

Migrate

The migrate command: both the import and export variations can be set and executed directly from the GUI.  In addition, the command line is generated so you can cut and paste it into a command window, batch file or SSIS package.

[Screenshot: the Migrate tab]

Reprocess

Need to reprocess a model quickly?  Rather than wait for PBM/SSMS to open, you can reprocess a model directly from the GUI.  Just like Migrate, the command line is generated for cut and paste.

[Screenshot: the Reprocess tab]

Console

Any output you would normally see in the command window is reported in the console as the command is being executed.

[Screenshot: the Console tab]
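As an aside, if you're wondering how a front end like this can echo the command-window output live, the standard .NET Process API makes it straightforward.  Below is a hedged sketch of the general technique; the exe path and argument string are hypothetical placeholders, not the actual values the GUI uses:

    using System;
    using System.Diagnostics;

    class PpsCmdRunner
    {
        // Launches a console utility and streams its stdout line by line,
        // so a GUI can display the output while the command executes.
        // exePath and arguments are hypothetical placeholders.
        static void Run(string exePath, string arguments)
        {
            ProcessStartInfo psi = new ProcessStartInfo(exePath, arguments);
            psi.UseShellExecute = false;        // required for redirection
            psi.RedirectStandardOutput = true;  // capture the console output
            psi.CreateNoWindow = true;          // no flashing command window

            using (Process process = Process.Start(psi))
            {
                string line;
                while ((line = process.StandardOutput.ReadLine()) != null)
                {
                    Console.WriteLine(line);    // in a GUI, append to the console pane instead
                }
                process.WaitForExit();
            }
        }
    }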

Log

You can enable logging to a log file of your choice to record all commands processed through the GUI.  Useful for additional auditing and for creating batch files of multiple PPSCMD operations.

[Screenshot: the Log tab]
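On the batch-file point above, a hedged guess at how this could work: if each log entry is simply the full generated command line, the log file itself doubles as a replayable batch file.  A minimal sketch (the class, method name and parameters are hypothetical, not the GUI's actual implementation):

    using System;
    using System.IO;

    static class CommandLog
    {
        // Appends each generated PPSCMD command line to the chosen log file.
        // Because every entry is a complete command, the log can be renamed
        // to .bat and replayed as a batch of PPSCMD operations.
        public static void LogCommand(string logPath, string commandLine)
        {
            File.AppendAllText(logPath, commandLine + Environment.NewLine);
        }
    }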

Preferences

Preferences and options are set on the preferences dialog.

[Screenshot: the Preferences dialog]

Here's the link to the download:

PPSCMD GUI Installer.zip (1.53 mb)